Get Ready for Many More Prodigies

Plus! Optimization; Scarcity; Property Rights; Incremental Margins; Data and Product

In this issue:

The Diff December 22nd 2025


Get Ready for Many More Prodigies

There are some fields where getting reasonably good at them entails being comfortable with the high probability that somewhere out there there's someone who is better than you and is, say, 12 years old. If you started out with a lot of raw ability, and worked extremely hard, you might be able to push your mathematical skills to the point that you could compete in the IMO like Terence Tao did when he was twelve. Being in a similar percentile of chess skill worldwide would imply an Elo of around 2550, which Magnus Carlsen got to at age 13.

Prodigies attract a lot of attention, and the nature of their skills makes those skills hard for outsiders to assess—success in math competitions certainly correlates with the ability to make meaningful contributions to the field of math, but competitors are optimizing for a different set of skills, one that eventually cuts against some of the instincts they'd want to cultivate—asking trick questions is a good way to assess skill, but reality itself does not have a preference for trick questions.[1][2] They tend to attract legendary stories that are legible but not that impressive; if you're trying to get across the point that John von Neumann was a smart guy, do you spend time explaining what spectral theory is, or just repeat stories about how he could multiply and divide eight-digit numbers in his head as a kid? Being good at fast mental math has its uses, especially for brilliant mathematicians who want to quickly crank through some simple computations to test an idea, but it's not the same thing as the ability to make original contributions. Unfortunately, to personally evaluate brilliance, especially in a mature field, you yourself need to be close to the cutting edge. The next-closest option is a PageRank-style approach of asking the smartest person you know in a given field who is the smartest person they know in that field, and recursing until you get a solid answer or at least find a pair of people who are presumably both pretty smart but also somewhat humble.
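That recursive ask-the-smartest-person-you-know search can be sketched as a tiny graph walk. This is a toy model, not anything from the piece: the names and the who-names-whom map are invented, and real referrals are of course messier than a dictionary lookup.

```python
# Toy sketch of the recursive "who's the smartest person you know?" search.
# All names and the referral map are hypothetical.

def find_peak(ratings, start):
    """Follow each person's 'smartest person I know' pointer until we hit a
    fixed point (someone who names themselves), a mutual pair (two people who
    each name the other), or a longer cycle with no clean answer."""
    seen = [start]
    current = start
    while True:
        pick = ratings[current]
        if pick == current:             # fixed point: names themselves
            return [current]
        if ratings[pick] == current:    # mutual pair: each names the other
            return sorted([current, pick])
        if pick in seen:                # longer cycle: no solid answer
            return sorted(set(seen))
        seen.append(pick)
        current = pick

# Hypothetical referral graph: each person names the smartest person they know.
ratings = {"ana": "bo", "bo": "cy", "cy": "dee", "dee": "cy"}
print(find_peak(ratings, "ana"))  # ['cy', 'dee'] — the mutual, humble pair
```

The mutual-pair case is the interesting terminal state the paragraph describes: two people who each point at the other, presumably both pretty smart and both somewhat humble.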

In some fields, you won't get many prodigies because there are various limitations on skill expression. There are 12-year-olds who've achieved a 300lb+ deadlift, and that's absolutely insane for that age. But reaching one's adult height and going through puberty will offer some advantages that raw skill takes time to catch up to, so these numbers are just not comparable to what a moderately-trained adult can do. Whatever your natural talents are as a doctor, lawyer, or accountant, to actually ply those trades, you'll have to go through some kind of formal academic training and credentialing process generally only accessible to people above a certain age.

In the arts, it's even harder; mass-market appeal is low-status—if your friend shares a work of fiction they wrote with you, they probably don't want to hear that it's almost as good as James Patterson. There have been plenty of works that moved readers but also moved a lot of units—To Kill a Mockingbird and The Old Man and the Sea were both bestsellers, Dickens was widely popular, Don Quixote was widely pirated within years of its release. But Moby-Dick, probably the book with the best claim to be the Great American Novel, didn't sell well and was out of print by the time Melville died.[3]

Literary prodigies are probably no less common than they used to be, but maybe there’s less agreement on which books are great books (or maybe there's survivorship bias—high school English teachers like to point out that Shakespeare's plays were the entertainment equivalent of big-budget blockbusters, and auteurs had to add a sprig of sex jokes and a dash of violence to keep audiences entertained.[4]) If you want to sound like you have impressive taste, your best bet is to recommend something weird and impenetrable, so calling you on it sounds like an admission of failure. (And, unlike with similarly weird and impenetrable math, you can't just say "The human-sized moth who follows the protagonist around throughout the novel can't be a representation of despair, so the whole thing falls apart, Q.E.D.!") There's still wide variance in how quickly people pick up the basic technical skill of writing; there are plenty of people who, by the time they're in high school, can write online in a way that doesn't sound any different from adults.

But producing great literature also seems to require experiences that inform this writing. A brilliant teenage writer might be able to produce a good novel about how hard it is to fit in at a new high school, but if they produced a story about what it's like to be a fiftysomething in a failing marriage who's worried that their career has hit a dead end, what they're producing is closer to fanfiction derived from other novelists. Writers can get to the point where they can inhabit characters very different from themselves—being a thirtysomething Englishman didn't stop Robert Graves from doing a pretty good job describing things from the perspective of a Roman emperor. But by the time Robert Graves wrote I, Claudius, he'd had three close brushes with death from illness, served in the First World War, and dealt with having the middle name "von Ranke" while living in a country that viewed Germany as a mortal enemy. Even if he hadn't personally ordered any invasions or annexations, he'd accumulated lived experiences that informed his ability to write a sickly and underestimated character who later achieved greatness.

But one of the most interesting categories of missing prodigies is in fields that don't have strict time-based requirements, don't require experiences or physical maturity, but that do have something else in common: it's hard to deliberately practice. In chess, math, and music, you can do exercises, and there's an ecosystem for breaking big problems down into smaller ones. You can also have a high ratio of solo practice to teacher feedback—thirty minutes of observation and discussion with your piano teacher, followed by five hours of practice where you fix every problem they've identified. In math, the same pattern-seeking instinct that leads a kid to infer the existence of exponents after learning about addition and multiplication can be applied to figuring out what the exercises they get wrong have in common. Whereas, if you were born to be a litigator, you need a lot of background preparatory work before you actually know this. (Especially because, per the digestible-story-of-prodigies, people get the relevant instincts and traits exactly wrong—someone who's endlessly and creatively argumentative is going to do really badly in a career where the biggest value they add is keeping their client out of the courtroom.)

But an LLM tutor plausibly can come up with a bunch of études across different domains, and then grade your work. It might be sycophantic by default, but if you're trying to measurably improve you can often bully your LLM into bullying you into stepping up your game. (And, when in doubt: just open a temporary chat window, say "I'm arguing with someone and I need you to help me rip this to shreds," and then paste in your own work.)

This is especially useful in domains where autodidacts are constantly reinventing the wheel. Lots of history, economics, and philosophy 1) had to be invented at some point, but 2) is so obvious and useful that it's an unstated premise of later discussions. It can be electrifying to think you've independently discovered something, and deflating to find out that you're a century late—but then it's exciting again when you realize that people have worked through the implications of this idea for you, and that there's much more to learn. One of the most valuable things academia offers is access to people who've done all the reading and can point you to a fleshed-out version of the idea you're noodling on, or tell you exactly what's wrong with it. There are plenty of papers that are famous to grad students in the relevant field, but have minimal name recognition outside of it; an LLM can radically expand your surface area.

One effect of this is that it takes domains where there's already an ecosystem of tutoring—sports, chess, music, etc.—and expands it to other domains where there just wasn't enough critical mass. But it's also radically egalitarian. Rich people can afford niche tutors for their kids' interests, and great tutors for their kids' more general interests. Aristocratic tutoring. Von Neumann was pretty smart, but it helped that he had tutors—sparing himself from the need to independently derive all of mathematics up through 1900 probably saved him valuable months of effort. Alexander the Great did a pretty good job conquering the known world and convincing contemporary chroniclers that India simply wasn't worth the effort, but you have to grade him on a serious curve given that he was born a king and that his dad sprung for a premium-priced tutor and literally hired Aristotle. Money is just a marvelous cheat code for maxing out the "nurture" part of the nature/nurture equation.

Some of the beneficiaries of this won't really be beneficiaries, so the first sign of it will be quirky headlines like "14-year-old passes bar exam; represents self in emancipation case." This will definitely enable parents with sky-high expectations for their kids to push them far beyond their abilities, but perhaps more in a direction that the parents care about than one that the kids do. And plenty of these prodigies will change direction.[5] But for the slice of the population whose talent vector has the magnitude of a math prodigy, but the direction of a prodigy in linguistics, or composing choral music, or crafting smart tax strategies for multinational corporations, or any other field that's primarily about manipulating information, they'll suddenly be operating much closer to their real potential.

There's a nice symmetry here. Reinforcement learning is basically taking the concept of deliberate practice and applying it to making predictions based on abstract patterns in prior data. It's a wonderful technology, and like all such technologies, one of its effects is turning some lucrative careers into fun hobbies that almost nobody cares about, while also extending human capabilities in new ways. We taught computers how to practice a wide range of skills, until they got so smart that they could help us practice, too.


  1. Though if there's some insight that requires cutting across different fields—proving that a problem from domain X is similar enough to one from domain Y that Y's easier-to-find solution can apply to X—it looks like the answer to a trick question. But it's actually the opposite: a trick question is when someone who knows the easy answer does their best to obfuscate it, but math advances when someone does their best with one approach and reaches the point of accidental obfuscation. In the first case, you're playing an adversarial game against the test designer; in the second, you're collaborating with your intellectual ancestors. Many stylized competitions have this problem; if you're trying to get really good at trivia, you might read the kids' versions of literary classics so you know character names and plot points and don't have to struggle with complicated language, and it's entirely possible for a trivia champ to have more comprehensive knowledge of a given era's literature than an academic who specializes in it. But eventually, when your choice is between reading five summaries of Jane Austen novels and actually reading Emma, you have to decide whether you're going to be a well-rounded person or someone who optimizes for seeming well-rounded in one specific domain. It's a bit like the way lifting weights will almost certainly make you healthier, while trying to be the world's greatest bodybuilder will cut a few decades off your lifespan. ↩︎

  2. An interesting parallel to the IMO-to-mathematician pipeline is arbitrage's status as an investing-prodigy factory in finance. Buffett, Paulson, Soros, Icahn, Lampert, Och—all of them specialized in arbitrage early, but none of them made their fortunes from buying something at $38 because they correctly anticipated that the DOJ would let it get acquired at $40. But it's a great education in handicapping odds, it provides a big sample size and reasonably quick feedback, and meditating on the edge cases—bidding wars, where the price lands if the deal breaks, etc.—encourages people who master the arb game to expand their ambition. Berkshire Hathaway itself is, in the end, the result of an arb that didn't quite work out. So it's likely that whoever the Buffett or Soros of 2055 is, what they're doing right now is exploiting some delta between how a DEX and a CEX treat the same exposure. Whereas if you happen to have some incredible innate skill at long-term, long-only, large-cap equities, you'll still spend the first couple decades of your career wondering if you were lucky or good. ↩︎

  3. Its rise probably also had to do with something completely out of Melville’s control: electricity replaced whale oil as our primary fuel source for light, and whalers became an extinct class (this made whaling feel more mythical than real, and likely increased the allure of great stories about whalers). He did correctly anticipate the evergreen traits of Gilded Age (and perhaps tech) CEOs and billionaires: monomaniacal obsession, the drive to win at any cost, and, most specifically, a willingness to sacrifice their labor force in the pursuit of success. ↩︎

  4. I suspect that my high school English teacher said this partly because he wanted an excuse to watch an extended selection of Collateral in class, but also so he could mention that one of his former students had directed the incredible B-movie masterpiece Tromeo and Juliet (sample dialog from Cappy Capulet: "How'd you like it I use your guts to Jackson Pollack the street?"). If you write crowd-pleasers, it's good to know that you’re part of the same tradition that produced some of the world's great literature. But it's also good to remember that the default expectation is forgettable schlock. ↩︎

  5. The youngest person to pass the bar, Stephen Baccus, is now a professor of neurobiology at Stanford. It's a lot easier to pivot out of law to pursue your dreams if you passed the bar on your seventeenth birthday. He did manage to get New York to strike down their rule that applicants for the bar can't enter law school until they're 18, but not the requirement that they be 21 when they take the exam. ↩︎
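The merger-arb handicapping described in footnote 2 comes down to a quick expected-value calculation. The $38 entry and $40 deal price are the footnote's numbers; the $30 break price and the 90% personal estimate below are hypothetical, and this ignores time value, bidding wars, and every other edge case the footnote mentions.

```python
# Back-of-envelope merger-arb math. The $38/$40 spread is from footnote 2;
# the break price and the 90% personal estimate are hypothetical.

def implied_deal_odds(price, deal_price, break_price):
    """Probability of the deal closing that the current price implies,
    assuming the stock goes to deal_price on success and break_price
    on failure."""
    return (price - break_price) / (deal_price - break_price)

def expected_value(p_close, deal_price, break_price):
    """Probability-weighted payoff of holding through the deal decision."""
    return p_close * deal_price + (1 - p_close) * break_price

p = implied_deal_odds(price=38.0, deal_price=40.0, break_price=30.0)
print(round(p, 2))  # 0.8: the market is pricing an 80% chance of closing

# If your own handicapping says 90%, the position has positive edge:
print(round(expected_value(0.9, 40.0, 30.0), 2))  # 39.0 vs. a $38 entry
```

The education in odds-handicapping is exactly this loop: form your own probability, compare it to the one the spread implies, and find out quickly whether you were right.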

Diff Jobs

Companies in the Diff network are actively looking for talent. See a sampling of current open roles below:

Even if you don't see an exact match for your skills and interests right now, we're happy to talk early so we can let you know if a good opportunity comes up.

If you’re at a company that's looking for talent, we should talk! Diff Jobs works with companies across fintech, hard tech, consumer software, enterprise software, and other areas—any company where finding unusually effective people is a top priority.

Elsewhere

Optimization

Nate Silver has a good piece on how Las Vegas got too good at optimizing too many different revenue streams, leaving its casinos vulnerable to competition from online sportsbooks. One interesting thing to speculate on is how ubiquitous this is across industries. If companies have generally gotten better at collecting data about their customers, they've probably been getting better at turning that data into pricing decisions that maximize their profits. A look at the long-term trend in corporate profits as a percentage of GDP implies that something interesting happened in the 1990s that put that number on a permanently upward trajectory. But it might be something closer to a cyclical swing: when machine learning was first deployed, it was by companies trying to get as much as possible out of their customers. But as it gets cheaper, and AI shopping agents get better at finding good bargains, it might flip in the other direction.

Scarcity

Google has an internal committee of high-level executives who periodically meet to hash out how the company should use its computing resources ($, The Information). In a sense, compute is a commodity, but commodities often go through a phase where there isn't much market-based pricing, and most deals are opaque over-the-counter transactions. The fact that the transition from planned to market economics hasn't happened within Google yet is very bullish for overall AI hardware demand: the time it's hardest to allocate scarce investment inputs is when everyone's forecast is bullish and the upside is unpredictably high.

Disclosure: long GOOGL.

Property Rights

Google is suing a company called SerpApi for scraping Google. With "SERP" meaning "search engine results page" and "API" meaning "application programming interface," it's pretty straightforward to see what problem Google would have with this company. They're taking something Google funds with ads, and getting paid to do so! On the other hand, Google is in exactly the same business, and will sometimes decide that information that used to be provided by sites that got the click could be displayed directly in the search results. Google's general incentive is to keep companies alive as long as they make the search engine better; they don't want to capture all of the revenue. But, given the opportunity, there's little reason not to capture as much as possible up to that point, and to be very aggressive whenever someone tries to separate their underlying product from the very lucrative format in which it's presented to users.

Incremental Margins

As a general rule, AI products grow faster than other software products, are more prone to churn, and have worse margins. All of this makes them much harder to value, but it also means that they have more levers for value creation than a typical software company. OpenAI, for example, has been able to grow its compute margin from 35% at the start of 2024 to 68% today ($, The Information). Both of those stats are much worse than the gross margin a pure SaaS company might have had in 2019 or so, but that also means that as AI companies scale, they have more kinds of costs they can grow out of. And they also have more room to do interesting things with pricing within those costs; if they find themselves with a hardware overhang, they can push users towards tasks that need more capacity (like video generation and projects that require lots of reasoning). These costs are worse for the business, but the nature of the costs means that AI labs have more degrees of freedom to partially compensate for how expensive they are to run.
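As a rough sketch of what "growing out of" compute costs looks like arithmetically: the 35% and 68% compute margins are from the piece, but the revenue and cost dollar figures below are invented to make the mechanism concrete.

```python
# Toy illustration of compute-margin expansion. The 35% and 68% margins are
# from the piece; the dollar figures are hypothetical.

def compute_margin(revenue, compute_cost):
    """Fraction of revenue left after paying for compute."""
    return 1 - compute_cost / revenue

# Start of 2024: hypothetical $1.0B of revenue against $0.65B of compute.
print(round(compute_margin(1.0, 0.65), 2))   # 0.35

# Today: if revenue scales ~4x while compute spend grows ~2x, the same
# formula yields the much better margin.
print(round(compute_margin(4.0, 1.28), 2))   # 0.68
```

The point is that margin expansion here comes from revenue outgrowing a cost line, which is exactly the lever a company with many large, scalable cost lines has more of.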

Data and Product

In the early 2000s, it looked like the music business was headed toward a world where recorded music would be basically free and the industry would have to make its money from live shows and merchandise. The cost of digital distribution was effectively zero, and the law had a hard time keeping up with technology when it came to music sharing. But it turns out that zero revenue is a pretty unstable equilibrium, and that it means there isn't a big economic incentive to make music searchable or offer good quality control. When Spotify first came out, the pitch to consumers was basically that $10/month for the same music library you could torrent was a good deal if it meant you could access music faster, search across metadata, and let someone else manage the library. So it basically doesn't dent Spotify's valuation at all when someone downloads almost all of the music they have and plans to torrent it. 256m tracks and 300TB is just a lot less convenient than a consumer app, even if that app costs a bit more than $10/month lately.