Empiricism Disappears into the Black Box

Plus! Back to Normal; Theoretical Profits; Ukraine and US Energy; "Altasia"; Rebirth of the Author; Diff Jobs

Empiricism Disappears into the Black Box

One of the big open questions about AI right now is: when we improve systems, do we get worse at understanding how they work? And, if so, does that put limits on how much better they can get?

Many technologies can be understood at a wide range of abstraction levels—really grokking the car means knowing a bit about how internal combustion works and a bit about how the layout of cities changes when new commuting options become available. But LLMs are already reaching the point where the bugs, and the explanations for those bugs, are extremely weird. We can't be many generations of LLMs away from models whose bugs defy all human understanding. We already have an ML researcher at OpenAI who doesn't know why models trained to answer English-language questions generalize so well to other languages![1]

But that's just a special case of a general characteristic of our civilization: the world embeds a lot of complexity that goes unseen until something breaks. And, like a large language model, it often gives you an output that makes sense from the input, with a process that's hard to parse.

We see situations like this all the time. For example, a few weeks ago, there was a massive ice storm in Austin, which weighed down lots of trees and broke many more branches (there are still branches neatly stacked up in many yards around my neighborhood). A particularly large branch fell down in our yard, and we needed a saw to get rid of it. Demand for saws in Austin was certainly not running at seasonally normal levels, and yet Home Depot had one in stock and we were able to pick it up.

And that's extremely impressive! If you talked to someone in, say, 1973 and told them roughly how rich America was circa 2023, how big the houses got, how much more physical stuff people were able to afford, etc., they might be able to grasp it. But the idea that a natural disaster wouldn't necessarily lead to a shortage of the equipment needed to manage such a disaster could easily blow their mind, because a meaningful component of the late twentieth century increase in wealth has come from improvements in supply chains.

For us to buy a saw that was conveniently in stock at a nearby store, and ready to pick up despite a spike in demand, a lot had to go right behind the scenes.

All that requires significant investment and attention, and it has to be generalized across the whole catalog, not just saws. Go back far enough in history, and retail inventory management is a fully manual process: a smart store operator might have an inkling of some unusual change in demand, and order accordingly, but they wouldn't catch every detail. My guess is that all the big stores run a bunch of regressions looking at the impact, over different lags, of every conceivable variable on demand, and that there's a wide range of products for which the weather forecast is a useful input. No single human being in the world needs to know specifically what function describes the correlation between forecasted wind speed and demand for various gardening implements; the organization as a whole just needs to know that it's in a competitive business where customers prize reliability.
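
To make that concrete, here's a minimal, purely illustrative sketch in Python of what "a bunch of regressions over different lags" could look like. The features (wind_forecast, freeze), the synthetic data, and the resulting coefficients are all invented for this example; a real retailer's demand model would be far richer.

    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)
    days = 365

    # Invented inputs: forecasted wind speed (mph) and a hard-freeze indicator.
    wind_forecast = rng.gamma(shape=2.0, scale=5.0, size=days)
    freeze = (rng.random(days) < 0.05).astype(float)

    # Invented demand for saws: a baseline, a same-day wind effect, freeze effects
    # at lags of 1-2 days (storm damage shows up after the storm), and noise.
    demand = (
        20
        + 0.8 * wind_forecast
        + 30 * np.roll(freeze, 1)
        + 15 * np.roll(freeze, 2)
        + rng.normal(0, 5, days)
    )

    # Design matrix: the same weather features at lags 0, 1, and 2 days.
    lags = [0, 1, 2]
    X = np.column_stack(
        [np.roll(wind_forecast, k) for k in lags]
        + [np.roll(freeze, k) for k in lags]
    )[max(lags):]
    y = demand[max(lags):]

    model = LinearRegression().fit(X, y)
    names = [f"wind_lag{k}" for k in lags] + [f"freeze_lag{k}" for k in lags]
    print(dict(zip(names, model.coef_.round(2))))

The point isn't this particular model; it's that the fitted coefficients live inside the system, and nobody at the store ever needs to look at them for the saw to show up on the shelf.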

And the net result of all of this is that consumers have to think a whole lot less. It's the brainpower equivalent of the post-ownership economy: You'll know next-to-nothing about how goods and services are provided, but that won't matter because you'll get what you want and you'll be happy.

Of course, it doesn't always work out that way. We all collectively learned a lot about the frailty of supply chains in 2020. On the other hand, supply chains are more resilient now, with planned reshoring, actual nearshoring and country diversification, and deeper transparency. And the supply chain shock from Covid and the policy responses thereto seems proportionately smaller than the impact of, say, the Thai floods in 2011 or the SK Hynix fire in 2013.

At this point, a champion Tradle player might be able to predict the third-order impact of some natural disaster or political disruption, but what they'd really have to predict is the gap between reality and what a competent supply chain manager would predict, which is a much harder question. This kind of complexity tends to increase fixed costs and reduce marginal costs in a wide range of businesses; it doesn't make sense for a smaller retailer to invest much in inventory management systems when they won't have the data or the redundancy to build better models. Meanwhile, the distributors that those small retailers work with will make such investments, meaning that the bigger ones will have a compounding advantage.

And the same story plays out in other areas: computing is getting more centralized in the sense that the largest platforms are taking share, and it's also getting more complex in the sense that a cloud computing service is harder to mentally model than a server in a closet. Plus, it's also getting cheaper, and some of that cheapness is downstream of other complex, opaque supply chains; the 700-step process for fabricating chips relies on many people, each with expertise in particular stages of the process, but none with comprehensive knowledge of all of it.

So throughout the economy, we can see areas where there's simultaneously 1) a growing number of people dependent on some centralized system, and 2) a shrinking number of people who fully understand that system—with sufficiently complex systems, the asymptote is zero. And yet, having a glancing familiarity with them sometimes becomes very important; the people who were excited about AI back in the GPT-2 days had a big head start in the last few months.

You have to have a mental taxonomy that looks something like this:

  1. There are topics that are important, and where you have some kind of advantage such that you can be among the better-informed people in the world on that subject. (There's variance in here between being an expert on a niche thing and knowing a bit more than average about a range.)
  2. Some topics you just have to discard by betting that they're some combination of too hard to figure out and too minor to matter.
  3. But some areas would ideally fit into category 1, except that they're changing fast enough that you can't plausibly catch up, and growing in importance fast enough that you'll wish you had. This forces you to fight some kind of intellectual rear-guard action where you hopefully have a friend or two who can keep up with the topic, and you periodically check in with them to see how optimistic or freaked-out they seem (or, of course, a newsletter that you open and passively read from time to time).

That's a weird and postmodern way to live, since it essentially means outsourcing a growing amount of your knowledge about technology and economics. But there's a meta-Malthusian dynamic at work, where the easy-to-grasp concepts get discovered early and implemented everywhere, so more of the growth and change comes from ideas that are harder to understand, both because they require so much contextual information and because the minimum intellectual horsepower to understand them is so high.

That's an environment that can be rife with fraud and exaggeration: keeping up with AI is a more than full-time job, much like keeping up with crypto was a few years ago. If there's a dire enough shortage of expertise, with the true experts focused on doing rather than funding the doers, it means that scammers who can fake expertise will be able to raise money relatively easily.

Meanwhile, one place where that's less likely to happen is when new technologies get applied within existing institutions—going back to the supply chain example about inventory management systems and modeling above, a company that claimed to offer this as a third-party service might be able to raise money on that basis without a working product, but an executive who claimed to be able to manage demand better than some legacy system would be continuously tested against the reality of that legacy system, and wouldn't be able to get by on promises forever.

A rise in the share of scams and overhyped companies doesn't preclude value creation. It describes the environment in any kind of technology boom; when the ideas are big enough and investors are aware of them, the most acute shortages are in a) the ability to navigate between the current reality and the promised future, and b) general skill at execution.

But that's also a preview of what a growing share of the economy will look like: given enough growth, the majority of the world economy will be dependent on a shrinking cohort of experts whose skills are so specialized that they're mutually unintelligible. Which is almost a throwback to the pre-modern economy: an agrarian civilization is dependent on photosynthesis, on the unnatural selection of plants and livestock, and sometimes on rituals that roughly approximate what the germ theory of disease would tell you was a good idea (it's not a coincidence that many of the longest-lived religions have taboos around avoiding certain foods and maintaining ritual cleanliness—the groups that didn't have such rules were more vulnerable to pandemics!).

The industrial revolution created many marvels, but they were marvels that an inquisitive, literate person could figure out pretty quickly, which meant that on the margin economic growth made the world more understandable rather than less: every time human and animal power was replaced by steam, it meant that a system dependent on the then-unknown ATP cycle was replaced by one that could be explained in a few simple diagrams. But we've passed that point: ongoing economic growth will mean increasing inscrutability, and it will be harder and harder to keep up.


  1. Which was probably because the full training set was not entirely English; working with data means putting a high prior probability on interesting results turning out to be an artifact of the data. But the lack of confidence is telling!

A Word From Our Sponsors

If time is money, why are you wasting it?

Maximize investor research and diligence returns with an end-to-end platform that solves inefficiencies. Tegus streamlines the information investors need to move quickly, build conviction and make better decisions to outperform the market.

Right now, The Diff readers can trial the Tegus platform for free at http://www.tegus.com/thediff.

Elsewhere

Back to Normal

You can measure how hard an industry was hit by the pandemic by how long it kept quoting performance "compared to 2019" instead of the traditional year-over-year comparison. US airlines may finally leave this behind: the seven-day US passenger count is now at 99.7% of its level in the same week of 2019. Movies, however, have not. Sometimes external disruptions temporarily shake up an industry but ultimately leave it with roughly the same economics, minus a few very bad years of losses. And other times, they're big enough to completely restructure it: theatrical releases will still be around for a long time as marketing events, just as video game retailers kept doing midnight release events well after gamers had mostly switched to downloading. But the theater model has been permanently impaired by the growth of streaming, which has created more ways to release a movie and meant that IP owners who also have streaming services have a stronger bargaining position with other distribution channels.

Theoretical Profits

Austrian bank Raiffeisen more than doubled its profits last year, but most of the growth came from a Russian operation that can't repatriate cash and is hard to sell ($, FT). Its CEO says, aptly, "We have very, very good results on the one hand, but on the other hand enormous problems." Selling the business means giving a good deal to someone close to Vladimir Putin, but keeping the business means both contributing to Russia's war effort and dealing with the PR fallout from doing so. When companies consider operating in politically unstable countries, they usually think of it as a high-risk situation where their investment could be rewarding, or could go to zero. But some political configurations mean that the investment is, in one sense, phenomenally profitable and, in another sense, more of a liability.

Ukraine and US Energy

US energy exports to Europe have been growing in both volume and price since the start of the war in Ukraine ($, WSJ). In the long run, oil is a global market where price discrepancies get arbitraged away by fairly cheap tankers and (eventually) pipelines, but in the short term it's a market that's prone to disruption because storage is such a small fraction of consumption; the market is always precariously balanced between shortage and glut, and sometimes that's more true regionally than globally. The US is 19% of global oil production, which is significant, but over the decade ending in 2021 it accounted for 150% of net oil production growth (that is, US output grew by more than total world output did, so the rest of the world's production shrank on net). Fracking has made the United States the effective global swing producer, a position whose significance is disproportionate to the exact amount of oil involved.

"Altasia"

The Economist highlights the trend of manufacturers moving away from China but staying in Asia ($, Economist): in aggregate, the "Altasia" countries have a larger labor force and larger exports to the US than China. One striking detail from the piece is that China's manufacturing labor was cheaper than almost every other Asian country's in the early 2000s, but has since roughly octupled in price and now costs 2.5x-4x as much as other Asian countries'. The fact that China's exports have kept growing over a period in which the country went from competing on cost to competing on investment and relationships is a striking example of how a steady growth trend is often composed of a series of tough, high-risk choices.

Rebirth of the Author

Short-story magazines are being overwhelmed by AI-generated submissions, apparently because writing GPT-generated fiction is a popular side hustle. Spam always scales faster than real content, but it usually scales in a detectable way; as LLMs get better, spotting it gets more human-intensive. What's notable is that if the AI-generated stories were trivially distinguishable from real ones, this would be no worse than the usual flow of spam or near-spam that winds up in all but the best-managed inboxes; zipping through a bunch of unwanted emails to triage an inbox into higher-priority stuff is an inconvenience, but a minor one.

Fiction may be a category where AI is good enough that the effort to spot it is no longer trivial, but it's not as good as the real thing. Maybe that's a temporary situation; once AI-generated fiction is indistinguishable from the median human-created submission, it could lead to harder questions about what a fiction publication is for. Conveniently, science fiction author Neal Stephenson recently gave an interview ($, FT) with a decent answer:

My theory is that when we experience art — whether it’s a video game or a Da Vinci painting or a movie — we’re taking in a huge number of micro decisions that were made by the artists for particular reasons. In that way, we’re communing with those artists, and that is really important. Something generated by AI might seem comparable to something produced by a human, which is why people are so excited. But you’re not having that awareness of communing with the creator. Remove that and it’s hollow and uninteresting.

Perhaps the next phase in fiction will be one where book tours constitute proof-of-work that the author is a human being and not a bunch of matrix multiplication—or perhaps some of the fun of reading AI-generated fiction will be the unique inferences we'll be able to make about models when they're less tethered to reality.

Diff Jobs

Companies in the Diff network are actively looking for talent. A sampling of current open roles:

Even if you don't see an exact match for your skills and interests right now, we're happy to talk early so we can let you know if a good opportunity comes up. We're always onboarding new companies, so the available roles change frequently.

If you're at a company that's looking for talent, we should talk! Diff Jobs works with companies across fintech, hard tech, consumer software, enterprise software, and other areas—any company where finding unusually effective people is a top priority.