There is a growing divide between techno-optimists and techno-pessimists, the kind of polarized public argument I vaguely remember from growing up, when bible thumpers squared off against people who wanted evolution taught in school. The debate can be framed in any number of ways — a media "Techlash" vs. Silicon Valley, humanists vs. nerds, progressivism vs. material progress — but recently, in his "Techno-Optimist Manifesto," venture capitalist Marc Andreessen boiled it down to a war between decelerationists and accelerationists: those who want the pace of technological progress to slow, stagnate or reverse, and those who want the opposite.
In Andreessen's eyes, the debate has metastasized to touch nearly every corner of human endeavor. And in many ways it has: with the increasing rate of internet adoption, tech plays an ever-larger role in our daily lives. More and more of our interactions are mediated by social media or mobile communications tools. As cash falls by the wayside, almost all economic activity is routed through digital rails operated by banks or fintechs. We're streaming more, playing more online games and even taking work online.
While in many ways these developments have made modern life more convenient and efficient, tech also causes problems. Even if you buy into the idea that all technologies are neutral, just "tools" — an extremely forgiving view that instead faults human error when things go awry — it's hard not to think that there's something dystopian about the past two decades of progress. As many crypto advocates point out, as the internet grows, control over it, and over ever more of our lives, is concentrating in the hands of a few companies.
This is the point of departure for Chris Dixon's new book, "Read Write Own: Building the Next Era of the Internet," which was published Jan. 30. Dixon, a long-time colleague of Andreessen's who runs the crypto arm of storied VC firm a16z, traces the lineage of the internet to find where it went awry. What started out as a network of open, interoperable protocols, what's now known as Web1, has been sectioned off in Web2, an era in which essentially five companies control who gets to use what, when and why. Thankfully, Dixon argues, what comes next, Web3, offers real solutions.
Web3 is a bit of a buzzword. For Dixon, it is essentially the "ownership" layer that the web has been missing so far. While Facebook and Twitter did connect the world in some sense, they never allowed users to own their identities or accounts. It's a similar story across digital banking, blogging or really anything online with a sign-in and a password. With the rise of blockchains, which Dixon describes as virtual computers, users can finally take control of their digital lives — so long as they maintain control of their private keys.
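A rough way to see what "control of their private keys" means in practice: on a blockchain, an account is essentially a public key, and anything done with that account has to carry a signature that only the matching private key can produce. The sketch below is purely illustrative, using Python's third-party cryptography package and Ed25519 keys as stand-ins; it is not any particular network's key scheme, address format or transaction type.

```python
# Illustrative only: key-based account control, not a real blockchain's API.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# The private key is the user's sole credential; the public key acts as the account.
private_key = Ed25519PrivateKey.generate()
account = private_key.public_key()

# "Owning" the account means being able to sign actions with the private key...
action = b"transfer 1 token to alice"
signature = private_key.sign(action)

# ...and anyone (a node, a verifier, another user) can check that signature
# against the public key without ever seeing the private key.
try:
    account.verify(signature, action)
    print("accepted: signed by the account's owner")
except InvalidSignature:
    print("rejected: not signed by the account's owner")
```

Lose the private key and the account is unrecoverable; leak it and anyone can act as you. That is the flip side of the ownership Dixon describes.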
It might take a techno-optimist to think that blockchain can reverse the consolidation of the web, considering the trillions of dollars at play, how little adoption has taken place so far and the industry's battered reputation in the mainstream. But for Dixon, apart from crypto's ability to hand back control to everyday users, blockchains also offer a space for permissionless development. We may not know exactly what blockchains are for today, but so long as there are people as excited about the tech as Dixon is, we're likely to find out eventually.
CoinDesk spoke with Dixon to get a better sense of the three eras of the web, his work as a venture capitalist funding the next wave of crypto startups and whether he thinks direct democracy is over- or underrated.
The overall theme of the book is the evolution of the internet from Web1, which was dominated by open source protocols, to Web2, which was walled off and siloed, and now there is the re-decentralizing force of Web3 and crypto. Would you consider blockchain to be part of the wider open-source movement and in what ways are Web1 and Web3 different?
In the book I talk about inside-out versus outside-in technologies. The idea is that, if you look at the history of computing, there are technologies like the iPhone and AI that came from established institutions like Apple, Google and Stanford, and then there's a whole separate tradition of hackers at the fringe building stuff. Early PCs came out of the Homebrew Computer Club; that was Steve Jobs. They were outsiders. Open-source software, meaning Linux and the whole open-source stack, came from outside too. So did the World Wide Web: Tim Berners-Lee was a physicist at CERN, hardly central casting for the creator of a computing platform. Blockchains are very much in that tradition, inheriting it from deep believers in openness and shared systems, motivated by a distinct ethos.
A lot of people call Facebook a "platform" because you can technically build apps on it. Is there a better definition of what makes a platform, in the sense that blockchains won't kick you off the way Facebook did to Zynga?
Facebook may be a platform, but it's a shaky one. There's a long history of entrepreneurs who tried to build on top of Facebook and Twitter and felt essentially robbed when those companies changed the terms of service and the APIs. I think we're seeing the same thing with Apple right now, with Epic suing them and companies like Netflix and Spotify not creating apps for the Vision Pro. A platform is supposed to be a predictable, safe place where developers can build a real business with some degree of certainty. Think about the offline world, like starting a restaurant: you spend all this time and money, and you need some assurance that you can keep running it, that the landlord can't just jack up the rent or rewrite the lease. Online, that assurance is missing. What we have today is essentially five big landlords that arbitrarily change the rules and the rents. That's created a very inhospitable environment for independent developers and startups.
In the venture capital business, we invest in startups. We want to see a dynamic internet, one that's hospitable to startups, and I don't think that's the case today. One of the reasons I'm excited about blockchains is that I see them as a way to return to the original internet's predictable platforms: an environment hospitable to entrepreneurs and creators, where they can build direct, predictable relationships with their audiences. This is what the internet was supposed to be.
I really worry that the way we're headed now, there's going to be three or four big platforms like broadcast TV in the 70s — ABC, NBC, CBS. And everyone's going to spend their time in one of those silos. And to me, that's a tragic outcome for what was once this open, democratic network. And we should be doing all we can to counter that.
One of the major differences between Web1 and Web3 is the role of government and academia in developing the foundational internet protocols. Does Web3 need another DARPA-like effort to succeed?
I don't think so. There are enough sources of funding and other resources; I actually don't think that's the shortcoming. We need more entrepreneurs, we need more academics and we need clear policy, because many policy decisions are playing out in courts, and that takes years; it creates uncertainty and disincentivizes entrepreneurs. We want to be as inclusive as possible. If folks in academia and government want to get involved in a constructive way, that's awesome. Unfortunately, I think there's just a ton of misunderstanding around the blockchain space, which is a big reason I wrote the book. There's a lot more skepticism than is warranted.
You write that software is more like fiction than anything else, and you have said that information wants to be free. Does this imply that fiction or other creative works should be priced at the level of commodity inputs, or devalued broadly?
Let me maybe add some nuance. I don't know exactly in what context I said that, but I very strongly believe that creative people should be paid for their work. There's a section in the book on the media business where I call attention to the attention-monetization trade-off: in media, you want people to see your work, which means sharing it freely on the internet, but on the other hand, you want to charge for it.
I look at the video game industry as the most pioneering, in that it has figured out it's a better business to charge for complements to the game, like virtual goods, instead of charging for the game itself. League of Legends and Fortnite are two of the most successful games, and the games themselves are free. I suspect that as AI enables anybody to create high-quality illustrations for free, that's going to put downward pressure on the price of illustrations. So it's more important than ever to think about new business models for creative people that don't involve simply, quote-unquote, selling the game. They sell other things.
Information can be free in the sense that content can be free, and creative people can still get paid. NFTs are an obvious example, right? How do artists get paid in the offline world? Artists don't get paid by copyrighting the image of a painting; they want that image to propagate, and they sell the original paintings or photographs. They sell essentially the signed, authenticated version of the image, not the image itself. NFTs introduced a similar idea into the digital world. You can reconcile an internet where sharing drives the price of content to zero with business models that make sure creative people get paid...
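One way to picture the mechanism described above, where the image itself stays freely copyable while the authenticated original is what gets sold, is a bare-bones ownership registry. The sketch below is a toy, in-memory illustration with made-up names (Token, Registry, mint, transfer); it is not the ERC-721 standard or any real contract, but it shows how an NFT separates the content (a URI anyone can copy) from a scarce, transferable record of who owns the original.

```python
# Toy, in-memory sketch of NFT-style ownership (hypothetical names; not ERC-721
# or any real chain). The content stays freely copyable; the ownership record is scarce.
from dataclasses import dataclass

@dataclass
class Token:
    token_id: int
    content_uri: str  # pointer to the image; anyone can still view or copy it
    owner: str        # address of whoever holds the "authenticated original"

class Registry:
    def __init__(self) -> None:
        self.tokens: dict[int, Token] = {}
        self.next_id = 0

    def mint(self, creator: str, content_uri: str) -> int:
        """The artist issues a token pointing at their work and owns it initially."""
        token_id = self.next_id
        self.tokens[token_id] = Token(token_id, content_uri, creator)
        self.next_id += 1
        return token_id

    def transfer(self, token_id: int, sender: str, recipient: str) -> None:
        """Only the current owner can pass the token on, for example by selling it."""
        token = self.tokens[token_id]
        if token.owner != sender:
            raise PermissionError("only the current owner can transfer this token")
        token.owner = recipient

registry = Registry()
tid = registry.mint("artist_address", "ipfs://example/painting.png")  # hypothetical URI
registry.transfer(tid, "artist_address", "collector_address")         # the "sale"
print(registry.tokens[tid].owner)  # collector_address now holds the authenticated original
```

On an actual blockchain this registry would live in a smart contract, and transfers would be authorized by signatures from the owner's private key, which ties back to the ownership point above.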
Read the full interview on the web.
– D.K.
@danielgkuhn
daniel@coindesk.com