The Long, Long View of Interest Rates
The single most important variable in economics is the risk-free interest rate, i.e. the price of money. Over time, the available data indicate that money has gotten much, much cheaper. The trend is striking:
This chart really embeds two different trends: one misleading, and one likely to get only more extreme over time. The misleading piece is that measuring the rate of return on government debt in the 15th century and in the 21st is like measuring two completely unrelated markets. The second piece is that changes in the complexity of the economy, the age structure of the world's population, and the nature of reserve currencies all create larger global demand for savings.
First things first: a loan to the Duke of Burgundy in the 15th century and a loan to the US Treasury in 2023 are completely different things. In the latter case, it's a loan to a global hegemon that issues the world's most-accepted reserve currency, a currency in which that loan will be repaid. The Duke, by contrast, is a person, not a country. He collects taxes, but perhaps intermittently, and some fraction of Burgundy consists of his personal property. His big source of uncertain expenses is military campaigns, which are also one of the few legible investments that can produce some theoretical return. But they're usually a bad deal.
Many of the direct costs of conflict would have been paid for through non-market means. But some costs required hard currency, like ransoming captives. These ransoms could be absurdly pricey; this event from early in the time series led to a ransom of 3m crowns, which according to this page is about 75,000x the annual income of a typical farmer. Benchmarking this to US household income, it would be the equivalent of ransoming a national leader for ~$5.6bn—but that actually understates the cost, since taxes as a share of GDP were typically in the low single digits.
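As a rough check on that benchmark, here's a minimal sketch: the 75,000x multiple comes from the source above, and the ~$75,000 figure for US median household income is an assumed round number rather than an official statistic.

```python
# Rough benchmark of the 3m-crown ransom against modern US household income.
# The 75,000x multiple is from the source cited above; the ~$75,000 median
# household income is an assumed round number, not an official statistic.
ransom_multiple = 75_000        # ransom / typical farmer's annual income
us_household_income = 75_000    # assumed US median household income, in dollars

modern_equivalent = ransom_multiple * us_household_income
print(f"${modern_equivalent / 1e9:.1f}bn")  # ~$5.6bn, the figure quoted above
```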
So for the tiny set of people (like the Duke) who regularly borrowed large sums, their big lumpy expenses were probably ransoms, which means ransoms were potentially a big lumpy source of income as well. In other words, they took part in a physical contest which would lead to an unpredictable positive or negative payoff. 12% is expensive for a sovereign credit, but pretty cheap for a loan to an athlete whose primary source of income is wagering on the outcome of their own matches.
The process of replacing personal relationships with institutional ones has been gradual.[1] Over time, though, it's created more low-risk or even essentially risk-free lending opportunities. It's hard to draw a dividing line here, especially because some countries move in and out of risk-free status depending on the market's general fears and the specifics of their political situation (the spread between German and Italian rates, for example, is a good proxy for the market's view of how stable Europe is). In the case of the US, we really do have a stable, low-risk borrower—but even the US likes to periodically stress-test the market with debt ceiling fights.[2]
There were lower-risk bonds in the early period of the chart; the paper mentions people earning 5% lending to the governments of Florence and Venice, for example. But in practice, the only risk-free investment at the time was taking hard currency and literally burying it; a risk-free asset with a positive yield is a 19th-century innovation.[3]
Adjust for that, and you'd end up with a new chart: the rate of return on a risk-free investment was on average slightly negative (the inconvenience of hiding it, the risk of losing it, and the possibility, for some currencies, of devaluation). Then it jumped some time in the 19th century: governments still needed to borrow, and they were now larger and better at tax collection than their predecessors, but their bonds now competed with high-return private sector opportunities. And then we see a decline, or a reversion to the mean, starting in the 1980s: growth declined, rates declined, and the real rate of return on risk-free assets dropped. That's recently been disrupted again, with the resurgence in inflation since 2021.
But the other big driver of that secular-decline chart is still in place, and it pushes the equilibrium interest rate relentlessly lower. The biggest factor is the existence of retirement. There have always been old people, though not as many; this Wikipedia article cites some sources claiming that a reasonably well-off person who made it to 21 in late medieval England would live, on average, to 64. But consider the population in question: if they're rich, in an agrarian economy, they're living off of rent from inherited property. They were born retired. The key difference in modernity is that we took a luxury previously available only to the elite, i.e. the ability to live well entirely off the labor of others, and made it an option for anyone who chose to sock away enough money in their 401(k). (The "labor of others" is now some minuscule share of the profits from every company in whatever index the retiree in question has invested in, and their capital comes from forgone consumption when they themselves had a job, but the fundamentals are unchanged.) Longer lifespans that mostly lead to longer retirement rather than more working years will necessarily increase global savings; whether this old-age saving is mediated through private sector investments or through public pensions like Social Security, it creates an implicit asset on the retiree's (or future retiree's) economic balance sheet, a corresponding liability on the part of whoever is offering that income, and thus a demand for income-producing assets that match that liability.
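To make the balance-sheet point concrete, here's a minimal sketch (all numbers are assumed, purely for illustration) of the implicit liability a retirement income stream creates. The present value grows both as retirements get longer and as rates fall, so the demand for income-producing assets scales with both.

```python
# Illustrative sketch: the present value of a level retirement income stream is
# the implicit liability that someone's savings (or a pension plan) must fund.
# All inputs are assumed numbers, not estimates of any real pension.
def retirement_liability(annual_income: float, years_retired: int, rate: float) -> float:
    """Present value of a level annual income stream paid over a retirement."""
    return sum(annual_income / (1 + rate) ** t for t in range(1, years_retired + 1))

for years in (10, 20, 30):
    for rate in (0.05, 0.02):
        pv = retirement_liability(50_000, years, rate)
        print(f"{years}-year retirement discounted at {rate:.0%}: ${pv:,.0f}")
```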
Before retirement savings products and government old-age benefits existed, there were, of course, people who couldn't produce enough to survive but who still got enough to eat and somewhere to sleep. Economic growth is extremely hard to measure because part of what it entails is moving some actions into the market. This tends to bias GDP metrics upward, on average.[4] It also means that more of our economic balance sheets are measurable. Replace a family-, community-, or church-based obligation to care for the old with Social Security and Medicare, and one thing you've done is make that obligation more comprehensive, though at the cost of agency problems and other limitations.[5] But all that really means is that we see the savings-driven decline in real rates more clearly. In a system where the welfare state is replaced with informal obligations, people still need to save in case they're the ones who end up being obligated. In fact, they have to save more; as the development of Korea and China demonstrated, one way to encourage higher savings is to have weak employment protections and a minimal safety net.
Technology is also a driver of rates. But the direction is noisy. The more sophisticated the financial system, the more likely it is that deploying new technology will be inflationary. There are two forces at work: in the long run, new technology is deflationary, since we're getting more from less—the number of labor-hours required to illuminate a room for an hour, travel across the country, or get a nutritious meal has continuously declined. But when technology is being deployed, it's inflationary, because there's more demand for investment and labor. So asking whether the impact of a given technological development is net inflationary or deflationary over, say, the next decade amounts to asking: how quickly is it getting deployed? If we developed some radically transformative new technology, like a way to generate low-cost, low-emissions energy from trivial amounts of a fairly abundant natural resource, taking advantage of it would require spending money on construction labor, equipment, and raw materials, but would lead to energy abundance over time.
The surge of inflation in the 1970s is a case study in this. There were certainly geopolitical factors, like the oil embargoes, which were the proximate cause of higher inflation. But it was also a period of high household formation, both because the baby boomers were reaching their peak childbearing years and because of higher divorce rates. Increased household formation was essentially a new deployment period for cars, appliances, and other purchases tied to home ownership. But these purchases also reduce marginal costs; a dishwasher converts immediate raw materials into an annuity denominated in time. An earlier commodity supercycle in the early 20th century was more directly driven by the second industrial revolution reaching its 1-to-N phase, but was followed by the deflationary 1930s.
The last big feature in the real-rates model is the existence of a reserve currency. Early in the time series, there were reserve-like currencies; some kinds of money were good for transacting or paying taxes in a specific place, but a ducat or florin was useful just about anywhere, because Venetian and Florentine merchants were almost everywhere, and people who did business with them were everywhere else. But these were small, open economies, of the sort that can't absorb significant inflows. They were closer to the Swiss franc than to the dollar: everyone knew they were safe, but it wasn't possible for everyone in the world to denominate their savings in them.
What the dollar as a reserve currency does is to create demand for dollar-denominated savings from exporters, who a) want to keep their currency from appreciating too quickly, and b) want to have local dollar liquidity to ensure that they don't have a ruinous financial crisis if their exports slow down. The relevant exporters, and the policy consequences, have varied over time; sometimes the petrodollar is the dominant form, and sometimes it's manufacturing economies. But the direction persists, and as long as the dollar has such strong network effects, there will be foreign demand for dollar-denominated savings with minimal interest rate sensitivity.
Extremely long-term trends are important, because they're the closest thing we have to true economic fundamentals. If something was true under feudalism and democracy, in wartime and peacetime, in an agrarian economy, a manufacturing economy, and a services-based one, it's probably just a fact of economic life. The decline in real rates is noisy in the chart and noisier still in reality, but it's something we should accustom ourselves to: if people live longer than they work, and provide for their old age by saving money; if technological advances are deflationary over time and haven't been happening as often as they did at the peak; and if countries still grudgingly rely on the dollar; then the long-term set point for rates will decline over time. There will be interruptions to this, of course; the impact of the Singularity on treasury rates is not in the top ten list of most important things about it, but it's true that the deployment of AI will increase spending right away while reducing it later on. So, enjoy high rates for a while, at least if you happen to be enjoying them. The safe long-term bet is what it's been for the last 800 years.
Thanks to Will Eden for the Twitter comments that prompted this piece.
It happened on the lenders' side, too; a running theme in this history of the Medici is that their branch network was constrained by the number of effective, trustworthy cousins and nephews they could find. ↩︎
The paradox of the US's creditworthiness is that it makes US government debt a financial version of a Giffen Good, where anything that seriously threatens the US's ability to pay bonds makes other investments so much worse by comparison that bond prices actually go up. For people who own long-term treasury bonds, a skipped coupon would be more than offset by the capital gain from the bonds rising, at least if the problems eventually got worked out. ↩︎
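A rough illustration of that offset, with assumed numbers (a 30-year bond with a 4% coupon, and a flight-to-quality move from a 4% to a 3% yield); this is a sketch, not a model of actual Treasury pricing:

```python
# Illustrative only: compare one skipped coupon on a long bond to the capital
# gain from a flight-to-quality drop in yields. Coupon, maturity, and the size
# of the yield move are assumed numbers.
def bond_price(face: float, coupon_rate: float, yield_rate: float, years: int) -> float:
    """Price of an annual-pay bond: discounted coupons plus discounted principal."""
    coupons = sum(face * coupon_rate / (1 + yield_rate) ** t for t in range(1, years + 1))
    principal = face / (1 + yield_rate) ** years
    return coupons + principal

face = 100.0
price_before = bond_price(face, 0.04, 0.04, 30)   # priced at par
price_after = bond_price(face, 0.04, 0.03, 30)    # yields fall from 4% to 3%
skipped_coupon = face * 0.04                      # one missed payment
print(f"capital gain: {price_after - price_before:.1f} vs. skipped coupon: {skipped_coupon:.1f}")
```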
Though this raises a question of whether we talk about "risk-free" in the sense that there's literally no risk, or just that every financial actor behaves as if there isn't any risk. That matters, because when we're measuring the risk-free rate, we actually want investors to be deluded. A prudent Victorian investor buying perpetuities with a 3% yield might have imagined that the sun would never set on the British Empire forevermore, and that Britain would always be able to make good on its borrowing. They were wrong, but their capital loss is our data gain, because now we know what such an investor thought the indefinitely long-term rate of interest for a risk-free investment should be. ↩︎
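The reason a perpetuity's yield is such a clean read on that belief is the pricing identity itself. A minimal sketch, with the 3% consol treated as trading at par (an assumption for illustration):

```python
# Minimal sketch: a perpetuity's price is coupon / rate, so an observed price
# and coupon pin down the implied indefinitely-long-term discount rate.
# The at-par price is an assumption for illustration.
coupon = 3.0    # annual payment on a 3% perpetuity with 100 of face value
price = 100.0   # assumed to trade at par
implied_long_run_rate = coupon / price
print(f"implied long-run risk-free rate: {implied_long_run_rate:.1%}")  # 3.0%
```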
That's only on average, though. There's also some downward bias, depending on how heterogeneous people's desires are, in particular, how much necessary behavior they actively dislike. If someone switches from spending half a day each weekend cleaning house, catching up on laundry, etc., to hiring a maid to do this and spending the same time on some freelance project that happens to earn them as much after taxes as they pay for the maid, the usual GDP increase happens. But if they love the freelance project and actively loathe cleaning, economic aggregates can easily underestimate the value created. The same low transaction costs that enable more behavior to happen through the market also mean people have more choices about how they spend their time; there's an invisible consumer surplus creation trend that could be a substantial contributor to long-term wealth creation even if it doesn't show up in GDP. ↩︎
The government can easily promise a certain baseline income for everyone over 65, but it isn't nearly as good as family/community/religion when it comes to ensuring that retirees have a minimum number of social interactions with someone who cares about them. Universal Basic Visit-From-The-Allograndkids is probably not going to happen. ↩︎
Diff Jobs
Companies in the Diff network are actively looking for talent. A sampling of current open roles:
- A diversified prop trading firm with a uniquely collaborative team structure is looking for experienced traders and PMs. (Singapore or Austin, TX preferred)
- A company building the new pension of the 21st century and universal basic capital is looking for a GTM / growth lead. (NYC)
- A new health startup that gives customers affordable access to preventative care and lifestyle interventions seeks a founding engineer. 7+ years of JavaScript experience preferred (TypeScript is ideal), and payments experience is a plus. A great opportunity for anyone excited to make healthcare better by treating problems cost-effectively before they're catastrophic. (US, remote; Austin preferred)
- A new fintech startup wants to bring cross-border open banking to LATAM, and is looking for a founding engineer. (NYC)
- A company building ML-powered tools to accelerate developer productivity is looking for a mathematician. (Washington DC area)
Even if you don't see an exact match for your skills and interests right now, we're happy to talk early so we can let you know if a good opportunity comes up.
If you’re at a company that's looking for talent, we should talk! Diff Jobs works with companies across fintech, hard tech, consumer software, enterprise software, and other areas—any company where finding unusually effective people is a top priority.
Elsewhere
Insiders
The hedge fund Two Sigma says an employee made unauthorized changes to a trading algorithm, which earned one strategy $450m and cost another $170m ($, WSJ). There's an interesting pattern with rogue trading stories, which always makes them a fun laboratory for understanding incentives. A typical rogue trading strategy starts out small, secret, and profitable, but ends up losing money and getting publicized. Which leads to the distinct possibility that some firms tolerate them, or at least that some managers look the other way—if the profits are high, the incentive to defer applying the rules is high, too.
This particular case has a bit of both: the part where the rogue traders' profits speak for themselves and nobody investigates too closely, and the part where the losses are too big to ignore. But it raises the question: what sort of tweak would be profitable for one strategy and lose money for another? Of course, some of this could be a matter of timing, or of someone doing multiple rogue trading strategies with uneven payoffs. But then there's this:
Wu’s changes led to gains of $450 million in total for some Two Sigma funds—including those in which the firm’s own executives and employees invest, as well as those available to clients. But they also led to a total of $170 million in losses for other funds compared with how they otherwise would have fared—losses largely borne by clients. Two Sigma has made them whole.
Many funds will run different strategies for different investors, and will often run the highest-risk version for mostly or entirely internal capital. This always raises the question among investors of whether or not they're getting the firm's best ideas (the answer is: no, you're always getting the best ideas you can afford, but there is no long-term reason to expect "being an LP" to be a common way to accumulate vast wealth). But this is an interesting case where the incentive misalignment can create a problem. A change to strategies that lets one take advantage of opportunities ahead of another is a wealth transfer to whichever portfolio gets to move faster. In a more aggressive case, where portfolio A buys first and portfolio B then buys from portfolio A, it gets worse. That's not directly what investors were worried about—if they thought a manager would do that, they'd probably decline to invest at all. But it's hard to perfectly design incentives in a big organization, and what's economically rational at one level can lead to pathological behavior at another.
Threads
Adam Mosseri, who runs Meta's Twitter competitor Threads, hosted an ask-me-anything on Friday where he said, among other things, that he wants Threads to be the "de facto platform for public conversations online." That's a high bar, and one that requires a company to have a low discount rate. There are plenty of actions they can take right now that increase engagement but ultimately poison user loyalty, or carve off some user populations that will be reluctant to adopt the site. At least at the parent company level, Threads is the best-capitalized short-message platform, while Twitter is the worst, so whenever they're both faced with an opportunity to trade off short-term financial needs against long-term strategic ones, Threads will come out ahead. On the other hand, Musk has plenty of net worth, just less liquidity. But on still another hand, if he borrows against other assets, or sells them, he threatens his own cost-of-capital advantage.
Disclosure: Long META.
Where the Pandemic Didn't Happen
There are some events that represent a permanent inflection in a long-term trend, and others that feel that way at first but turn out to be a minor wiggle in the long-term chart. One example of the latter is Covid's impact on air travel, which looked for a while like it might have permanently crimped demand. As it turns out, the bigger problem is supply: China's aviation regulator is planning a new route list that represents a 34% increase over pre-pandemic levels. It's possible that, particularly in a more metrics-driven system like China's, one result of Covid was the realization that some things, including air travel, had been undervalued, and only losing them for a while made it clear how much.
Leftovers
Flexport, which has recently gone through a burst of cost-cutting, is in talks to buy the technology assets of recently wound-down Convoy. The companies have some high-level similarities—they're both well-funded logistics startups that raised significant sums to build an operations-and-software business in parallel. Which means that part of Flexport's calculation is not just that buying Convoy gets them some features they might have had on their roadmap already, but that it prevents someone else from buying them.
Opportunity Cost
The WSJ has a piece on students dropping out of school to start AI companies ($). There are two good models for dropping out:
- Realizing that the opportunity cost of staying in school is unacceptably high—in this case, that waiting a few more years to start or join a company means entering a sector where the companies are more mature and the options are struck at a higher valuation.
- Realizing that the opportunity cost of missing a semester or two is low. In a high-variance sector where startups can be wiped out overnight (RIP the "talk to your PDFs" industry), any given company will either take off quickly or be obvious roadkill in a month or two. So the decision is less about potentially becoming a dropout forever, and more about tolerating the worst-case scenario of graduating slightly later.
Disclosure: I dropped out; you'll be fine.