The Promise and Paradox of Decentralization
Plus! Valuation Arbitrage; Labor Liquidity; Pricing Power; Diff Jobs
Welcome to the Friday edition of The Diff! This newsletter goes out to 24,928 subscribers, up 98 from last week. In this issue:
The Promise and Paradox of Decentralization
The Promise and Paradox of Decentralization
There are, to varying extents, efforts to do all of the above. But what's worth considering is that all of these bans happen through centralized actors running on decentralized systems. The Internet was supposed to be a totally open set of protocols that anyone could interact with, and for a long time it was, and yet getting kicked off Facebook and Twitter means being exiled from public discourse. Banks are regulated, but every country can set up its own banking regulations, and different countries see the social role of banks very differently: for some they're a tool of economic planning, for others a way to connect local companies to global markets, for others a source of money that can be channeled to politically favored groups. The chip industry was historically very open to new entrants; the transistor was developed at Bell Labs, and for political reasons AT&T licensed it to all comers at a low price.
And yet all of these originally decentralized systems end up facing some degree of centralization.
The promise of a decentralized system is that anyone can build anything they want, because no one owns it. There are three important features of this: two downsides and a caveat.
The first downside to "Anyone can build anything" is that "anyone" means anyone, and the people to whom decentralized systems are the most attractive are the ones who are banned from other systems, often for good reasons. Free speech absolutists want to be defending Aleksandr Solzhenitsyn or Eugene V. Debs or even Lenny Bruce, but if you want to get a sense for the typical piece of content enabled by a totally uncensored communications system, check your spam folder.
The second downside is that a lack of ownership means that no one is responsible when things go wrong. Bitcoin doesn't have a way to censor your decision to move money from one address to another or even a protocol-level sense of whether you're buying Alpaca socks or a rocket launcher. (nb: You are almost certainly not buying a rocket launcher with Bitcoin; you are at best getting scammed, and at worst getting in trouble). But this also means that the protocol can't tell when you meant to send your life savings to your cold wallet and sent it to the wrong address instead.
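A detail worth making concrete: Bitcoin addresses do embed a checksum, so a typo in an address is caught before the transaction goes out. What no checksum can catch is a perfectly valid address that simply isn't the one you meant. A minimal Base58Check validator sketch (standard library only; the first address below is the well-known genesis-block address, the second is a one-character mutation of it):

```python
import hashlib

# Bitcoin's Base58 alphabet (no 0, O, I, or l, to avoid transcription errors)
B58 = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"

def b58decode(s: str) -> bytes:
    n = 0
    for ch in s:
        n = n * 58 + B58.index(ch)
    body = n.to_bytes((n.bit_length() + 7) // 8, "big")
    # leading '1' characters encode leading zero bytes
    pad = len(s) - len(s.lstrip("1"))
    return b"\x00" * pad + body

def checksum_ok(addr: str) -> bool:
    """True if the address's embedded 4-byte double-SHA256 checksum matches."""
    raw = b58decode(addr)
    payload, check = raw[:-4], raw[-4:]
    return hashlib.sha256(hashlib.sha256(payload).digest()).digest()[:4] == check

print(checksum_ok("1A1zP1eP5QGefi2DMPTfTL5SLmv7DivfNa"))  # True
print(checksum_ok("1A1zP1eP5QGefi2DMPTfTL5SLmv7DivfNb"))  # False
```

The checksum is a mechanical defense against fat fingers; it says nothing about whether the valid address you pasted belongs to your cold wallet or a stranger's.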
The caveat is that, to quote a Diff piece from last year ($), "[A]ny decentralized order requires a centralized substrate, and the more decentralized the approach is the more important it is that you can count on the underlying system." This applies to cryptocurrencies (you can trust the protocols) and to the Internet (ditto). It also applies, in an interesting way, to the increasingly decentralized world of free trade and free-flowing capital: money can only flow and deals can only be made if everyone has a consistent sense of property rights and contracts, and the definition of those concepts will typically be determined by whichever participant in a transaction has the more sophisticated financial and legal system. Globalization ends up being a bit like Henry Ford's pitch: you can use any economic paradigm you want, as long as it's the American one.1
An open, decentralized system is sparse: there's room for anything, which means that most of the possibility space consists of nothing at all. This creates a natural demand for onramps—Coinbase for figuring out Bitcoin, portals for figuring out the Internet, TSMC for turning chip designs into functioning products, and the like. These create a complicating dynamic: these onramps are built on an open system, but part of their function is to close off some of it. And the better they do that, the more value they can capture. Meanwhile, because they have a monopoly on what they've fenced off, their economics support providing a better user experience. Make a really helpful update to the Bitcoin wiki, and you can feel good about yourself and perhaps contribute to the price of Bitcoin in a highly indirect way; design and launch an improved onboarding process at Coinbase and your contribution is quantifiable, and affects your next cycle's equity grants.
This privatization proceeds in three different ways:
Some companies start building on an open protocol and then slowly make it their own. Chrome's decision to make the search bar synonymous with the address bar was a slight improvement in the interface, but also a very strong statement that Google had replaced manually typing in URLs as the default way to navigate. (The strongest evidence for this is the frequency of navigational searches; the most commonly-searched term on Google is "Facebook" and the most frequently searched term on Bing is "Google".) Typing in URLs is a sometimes-decent way to find what you're looking for, though the half-life of links is about two years. But guessing at URLs can be perilous. No one is responsible for fixing this, but Google is the closest approximation, because what they are responsible for is solving the "look for something on the Internet and get the wrong result" problem, a different but related one from the broader task of allowing arbitrary site owners to host arbitrary content. Privatizing is effective, and solves problems for users rather than protocol designers. But it does go against the spirit of the original network.
Twitter is another example, except that they basically created their own mostly-open protocol and then slowly took control of it. Tweets were originally a simple format, with 140 characters of text and a bit more metadata (this made them compatible with SMS, which had a 160-character limit). For a while, Twitter looked like it would operate a novel protocol and let third parties decide what to put on it and what to read off it. And then the company started to realize that an open system was also one where it was trivial to filter out ads and recommendations. So the outside partners gradually got cut off.
Facebook—the product, not the company—is a third example. It essentially supplied an element missing from the protocol: identity. When the original protocols were being hammered out, identity was a problem taken care of by the fact that getting online required steep hurdles; if everyone involved is affiliated with a university that trusts them to use a PDP-10 responsibly, spam or abuse were not big concerns. (Not that early usage was strictly professional.) With billions of people online, accessing relevant information can mean traversing a graph of people rather than links, so the identities of those people become important. To the user, if not to the network itself, a name and relationship is a far more important unique identifier than an IP address.
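"Traversing a graph of people rather than links" has a direct translation into code: a breadth-first walk over a friendship graph, where the unique identifiers are names rather than URLs. A hypothetical sketch (the names and friendships here are invented):

```python
from collections import deque

# A toy social graph: identity, not location, is the key.
friends = {
    "alice": ["bob", "carol"],
    "bob": ["alice", "dave"],
    "carol": ["alice"],
    "dave": ["bob", "erin"],
    "erin": ["dave"],
}

def reachable_within(start: str, hops: int) -> set:
    """People reachable from `start` in at most `hops` friendship steps."""
    seen = {start}
    frontier = deque([(start, 0)])
    while frontier:
        person, dist = frontier.popleft()
        if dist == hops:
            continue  # don't expand past the hop limit
        for nxt in friends.get(person, []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, dist + 1))
    return seen - {start}

print(sorted(reachable_within("alice", 2)))  # ['bob', 'carol', 'dave']
```

The point of the sketch is what's absent: no URLs, no servers, no addresses—just people and relationships, which is the layer the original protocols never standardized.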
This pattern raises a question: is centralization just a natural tendency of all networks? Are we destined to have a "decentralization sandwich," where there's a hard-to-change set of protocols, something open built on top of that, and a series of closed systems built on top of that, which are the only ones the average person interacts with?
One of the interesting alternatives to this is Urbit, which is trying to build a network in which identity exists from the start, and is neither free nor monopolized. One of Urbit's principles is that users own their data, and while they can use third-party services, they're sharing their data with services, not using a login to access information that lives on someone else's servers.
This is a nice theoretical solution to the "build your own" problem above: if someone tries to start Twitter-but-without-banning-X, they begin with "Twitter-but-it's-100%-X," which seems bad until they reach the next stage where trolls arrive and they start to learn that Online Censorship is mostly about spam and gore and only occasionally about politics. But if they're moving their data around, and their friends graph with it, then spinning up a separate instance of the same service, pulling in the same content from their friends, and whitelisting someone who's blacklisted elsewhere is a lot more feasible.
At least at first. One of the problems with owning your own data is that it ignores the question of what format that data is in. Data formats are sometimes considered carefully, but they can also be hammered out fast to get a minimum viable product out the door. Then they evolve over time, accumulating cruft and complexity, until they might as well be encrypted.
And indeed, what's to stop someone from having a proprietary format that's literally encrypted? Or one that gets changed to an encrypted version later on? This is not a trivial point; if you're ever in a position where you absolutely have to share some information that you want to keep secret, the smartest thing to do is share a huge amount of confusing and poorly-explained information, or use an annoying but technically accurate format. Strict and unalterable rules are no match for determined adversaries, whether they're trying to play a prank, make a point, or maximize shareholder value.
But the data format questions are not rhetorical: there is actually an answer of sorts, though it's a frustrating one. Software systems are not just lines of code, but the entire extended system of people who maintain, host, and build on platforms. And the norms of those communities can push back against market pressures. I've been spending an increasing amount of time around Urbit users and devs, and they really do view the current centralized model of the Internet as anathema. They are aware that the fun apps we use are all either designed to be addictive from the start or subject to relentless evolutionary pressure to get more addictive. There's a history of norms evolving in response to new addictive behavior. And Urbit's architecture is a lot closer to norms than to the clumsy mechanism of laws; instead of centralized services that own data, there's a set of 256 Urbit identities that can route messages between users. Instead of single points of failure, there are a few hundred fallback options; instead of the binary of on-Facebook-or-anathematized, there's a 2^8-point gradient.
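The 2^8 figure falls out of Urbit's address space: under Azimuth, a ship's default sponsor is just the low bits of its point number—planets (32-bit points) route through the star in their low 16 bits, stars through the galaxy in their low 8 bits. A sketch of the default case (assumption: ships can in practice escape to a different sponsor, which this ignores):

```python
def sponsor(point: int) -> int:
    """Default Azimuth sponsor: planets (>= 2^16) are sponsored by the
    star in their low 16 bits; stars (>= 2^8) by the galaxy in their
    low 8 bits; galaxies (0-255) sponsor themselves."""
    if point >= 1 << 16:        # planet
        return point % (1 << 16)
    if point >= 1 << 8:         # star
        return point % (1 << 8)
    return point                # galaxy

def routing_galaxy(point: int) -> int:
    # Follow the sponsorship chain down to one of the 256 galaxies.
    while point >= 1 << 8:
        point = sponsor(point)
    return point

print(routing_galaxy(0x12345))  # 69: planet -> star 0x2345 -> galaxy 0x45
```

So rather than one company's moderation decision, there are 256 possible routing endpoints, each a separate point of trust or failure.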
One possibility for the successor to the current Internet is that it will end up being a system designed to support communities that achieve cohesion based on which forms of addictive or socially corrosive technology they reject. Sort of a digital version of the Amish, not rejecting technology wholesale but carefully considering every new technology before adopting it. If you assume that the pace of change in consumer-facing Internet stays high, and that it gets more compelling over time, then in the long run the only options are a) be careful what you use, or b) accept that you're outsourcing a growing share of your decisions to product managers and growth hackers who do not necessarily have your interests at heart.
Norms are not a perfect solution to these problems (if they were, we'd only know about issues like novel addictive behaviors, monopolies, and the homogenization of entertainment from history books). And technology can't fully solve social problems. But it can certainly have an effect on them—"you can't solve a social problem with technology" is a good guideline, but technologies mitigate them all the time. What's a streetlight if not a technological solution to the social problem that there are places where it's dangerous to be out when it's dark? What's asymmetric-key cryptography if not a technological solution to the social problem that it's hard to keep secrets? There's still a fundamental tension with open systems, where they get mature enough to be useful at which point the median user's priority is getting things done rather than adhering to the network's original design principles. But a design that recognizes this from the beginning, and a community that wants to make it happen, can stave off decline for a long time. And that's the positive story of technology: we're all fighting against entropy, which is a hard fight to win, but there are long and fascinating stalemates ahead.
(Disclosure: I'm long decentralization by way of Bitcoin and Urbit. If you're on Urbit and would like to discuss The Diff, go to ~lableg-tadrex/the-diff.)
Thanks to Jonathan Blow for a very interesting discussion on this topic, in particular for some smart skepticism around the easy answers.
A complicated company will typically get valued on a sum-of-the-parts basis: apply a multiple to each of their businesses, add cash and securities, subtract debt, and you know roughly what to pay for the stock. Companies that have high cash balances from one part of their business, and that get a high multiple on other parts, are presented with an arbitrage opportunity: make deals that increase their enterprise value, in exchange for making investments that are enterprise value-neutral. Thus, the deal in which CME starts using Google Cloud, and Alphabet invests $1bn in CME preferred stock ($, WSJ). Moving $1bn from cash to securities doesn't really affect how Alphabet is valued, but reporting higher growth at Google Cloud does.
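The sum-of-the-parts arithmetic, and the arbitrage it enables, can be sketched directly. The segment names, revenues, and multiples below are invented for illustration, not Alphabet's actual figures:

```python
# Illustrative sum-of-the-parts model; all numbers are made up.
segments = {
    "ads": (150.0, 5.0),       # (revenue in $bn, revenue multiple)
    "cloud": (20.0, 12.0),     # high-growth segment gets a higher multiple
    "other_bets": (1.0, 3.0),
}
cash_and_securities = 140.0    # $bn
debt = 15.0                    # $bn

enterprise_value = sum(rev * mult for rev, mult in segments.values())
equity_value = enterprise_value + cash_and_securities - debt

# The arbitrage: shifting $1bn from cash into preferred stock is
# value-neutral on this model, but the $1bn of extra cloud revenue it
# helps secure gets capitalized at 12x in enterprise value.
equity_after_deal = (enterprise_value + 1.0 * 12.0) + cash_and_securities - debt
print(equity_value, equity_after_deal)  # 1118.0 1130.0
```

On these toy numbers, a value-neutral balance-sheet move that adds $1bn of revenue to the highest-multiple segment adds $12bn of implied equity value.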
A general tendency in software is that it's easier to capture value by being adjacent to high-dollar transactions rather than low-dollar ones. If nothing else, it's better to figure everything out in an environment where margins are high and then move downmarket. There's a large and diverse market for applicant-tracking software which focuses primarily on high-wage white-collar jobs, where the cost of software is a small percentage of compensation, while the opportunity cost of missing out on hiring is very high. But once the product has been figured out, it works well for lower-wage jobs, too, which has allowed Fountain to raise $85m for an applicant tracking system focused on hourly workers. The "Great Resignation" is not just people quitting lower-paying jobs, but shifting within the low-wage market. Whenever turnover rises and pricing gets uncertain, a well-informed middleman can collect larger margins (and in the long run, if Fountain controls a larger supply of workers, they'll make the market more efficient, ensuring that the economic minimum wage is less of an abstraction and more of a fact).
During the Long Deflation, many companies got good at driving higher margins through mix shift: prices didn't change much, but they got customers to buy more, or to add on higher-margin items when making regular low-priced purchases. That's a good strategy for growing margins when inputs are stable in price or getting cheaper, but it's a challenge when higher input prices turn some products from low-margin items into loss leaders. Large companies have been increasingly talking up their pricing power ($, Economist), and actually raising prices as well. One interesting possibility is that consumers in the developed world spent twenty years getting used to fairly stable prices as growth in China's share of overall output offset wage inflation there; Covid was a reset where some services weren't available for a while, or had dramatic drops in quality, and as spending patterns return to normal, flush consumers are less concerned with higher prices.
(For an earlier look at the Long Deflation, see this piece.)
The latest roles from companies in the Diff network:
A fintech company that gives people in developing markets better access to investments is looking for senior engineers with Typescript experience. (Remote)
Another fintech company is providing banking services to an underbanked but rapidly-growing economy; they're looking for product managers, ideally with payments experience. (London)
A company giving regular investors access to sophisticated systematic asset allocation strategies is looking for a writer who can help explain these strategies to regular investors. (Remote)
We're also looking for fundamental equity researchers with experience in alternative data—or people who have extensive data analysis experience and an interest in equities.
If you're interested in hearing more about one of these positions, please reach out. Diff Jobs is free for candidates, and even if you don't see a direct fit in one of the roles above, we're happy to chat and discuss other opportunities we're working on.
If you're hiring (or about to be) and you suspect that the people you're looking for are readers of The Diff, please reach out.
There is naturally some resistance to this, both because of conflicts with other countries' norms and because of those countries' natural desire to determine their own destiny. So it's a leaky abstraction, but it gets less leaky over time as economies get more integrated and thus more of the world has to standardize. The most direct example of this is the fact that the dollar's share of global invoices is 4.7x the US's share of global imports ($, Economist). In the same way that English is most likely to be the shared second language of any two randomly-selected people who don't speak it as a first language, the dollar is most likely to be the second most useful currency to anyone in a country that doesn't use dollars.