Web3 needs its Windows 95 moment
Also: software may provide great leverage but it doesn't necessarily follow that software companies require fewer employees
Web3 usability sucks
Way back in August, 1995, a bunch of nerds stood on stage and attempted to dance to the Rolling Stones’ song “Start Me Up”:
This is the launch of Windows 95 by Microsoft. Here’s what Windows 95 looked like when a user first booted up their computer:
It’s hard for people in 2022 to understand, but this was a revolutionary computing environment. Microsoft took the graphical user interface Apple had made famous (and which was first developed at Xerox PARC) and applied it to Windows, by far the dominant operating system of the mid-90s. I started college in the fall of 1995, and I remember how gobsmacked everyone was: a ‘professional’ company like Microsoft had released what was basically a clone of Apple’s OS!
No longer did the office workers of the world have to deal with Windows 3.x or, worse, DOS. Now lawyers and accountants and consultants and ad people and anyone else in white-collar America circa 1995 could avail themselves of a modern computing platform whose visual metaphors made using the computer as simple as dialing a telephone. And that’s the key: Microsoft put in millions of man-hours to make computing simple. It abstracted away all the complexity inherent in traditional command-line interfaces. By making computing simple, Microsoft opened up computing for the masses.
If you read that last paragraph thinking, “this is ridiculous, GUIs suck. I can do much more faster in a command line than I can with a mouse,” well: you’re missing the point. Microsoft is in the business of selling Windows and ancillary services, and it realized pretty early on that Apple had the correct read on the average user. The average user was not interested in a Unix-like command line interface. The average user didn’t want to have to think about arcane syntax. The average user wanted to get shit done, and move on with her day.
And that’s the problem Web3 is currently dealing with. Advocates of Web3 constantly say that in order for their vision to be realized, “mass adoption” has to occur. Well, in order for mass adoption to occur, Web3 technology has to be dead simple for average users. One big problem here is that the people building the various tools and technologies that make up Web3 mainly talk to each other. They don’t have the first clue what an average user looks like.
Microsoft, like Apple, spent billions of dollars and millions of man-hours figuring out how to make computers work for average people, for whom a command line interface might as well be alien runes. The people presently building Web3 tech need to similarly figure out how to build technologies that average users can use. If the people building Web3 technologies can’t figure that out, then Web3 will fail.
Maybe you still doubt my claims. Here are six articles which suggest that I’m correct:
Web3 will only surpass web2 when it becomes less annoying to use
How bad UX is threatening to send web3 to the graveyard of ambition
Here’s Jake Brukhman, co-founder of crypto investment fund Coinfund, tweeting a mathematical model of web3 adoption:
In other words, adoption is roughly inversely proportional to friction. If adoption = 10 (i.e., friction = 0.1), that’s better than adoption = 2 (i.e., friction = 0.5). This is of course inexact (what are the units of friction?), but the intuition is solid: the less friction there is in using web3, the more adoption will occur. If you are an adoption maximizer, you want friction to be as low as possible. The quantitatively inclined among you will note that at zero friction the model gives undefined adoption, which is of course nonsensical. This is not meant to be a robust mathematical model; it is meant to capture an elemental truth about technology adoption: the less friction a technology involves, the more widely it gets adopted.
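To make that intuition concrete, here’s a rough sketch of the relationship; the notation below is my paraphrase of the model described above, not Brukhman’s tweet reproduced verbatim.

```latex
% A rough sketch of the adoption-vs-friction relationship described above
% (my paraphrase of the model, not Brukhman's exact notation).
\[
  \text{Adoption} \;\propto\; \frac{1}{\text{Friction}}
\]
% Worked examples from the paragraph above:
%   friction = 0.1  ->  adoption ~ 1/0.1 = 10
%   friction = 0.5  ->  adoption ~ 1/0.5 = 2
% And the degenerate case: as friction approaches zero, adoption blows up,
% which is why this is an intuition pump rather than a robust model.
\[
  \lim_{\text{Friction}\,\to\,0^{+}} \frac{1}{\text{Friction}} = \infty
\]
```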
Microsoft proved this relationship with its release of Windows 95. Web3 needs to have its Windows 95 moment, and it is not clear that it is anywhere close to one.
Software may be eating the world but it's not eating employees
Back in 2011, Marc Andreessen famously claimed that “software is eating the world.” He had this to say:
More and more major businesses and industries are being run on software and delivered as online services--from movies to agriculture to national defense. Many of the winners are Silicon Valley-style entrepreneurial technology companies that are invading and overturning established industry structures. Over the next 10 years, I expect many more industries to be disrupted by software, with new world-beating Silicon Valley companies doing the disruption in more cases than not.
Why is this happening now?
Six decades into the computer revolution, four decades since the invention of the microprocessor, and two decades into the rise of the modern Internet, all of the technology required to transform industries through software finally works and can be widely delivered at global scale.
I thought of this essay today when I saw the following tweet:
A common assumption about software-enabled business models is that software businesses require fewer staff than other companies: leverage is greater, so more revenue is generated with fewer people. And while it is true that there is bloat at the various FAANG companies (and that Twitter recently halved its workforce), I think people overstate the case when they say that software companies will necessarily hire fewer people than more traditional businesses.
It may be true that software gives people more leverage than any other tool, but it does not follow from that proposition that software companies require fewer human employees. It may be the case that software creates *more* work for people to do.