- OpenAI co-founder Greg Brockman on becoming a machine learning practitioner (2019). This is a great piece, not just as an object-level look at a skillset many people are trying to pick up right now, but as a meta post about the psychology of becoming a beginner after an impressive career doing something else. This is another instance of the curse of comparative advantage: if an evil genie wanted to make sure you'd never make it as a writer, musician, or artist, they'd curse you with the world's greatest natural ability as an accountant or something. But over time, the relevance of skills decays, and those transitions are always painful. Brockman's is a classic story: impressive technical skill, but every hardware cycle made that specific expertise less relevant while simultaneously increasing the rewards for being able to pick up new expertise.
- Emmet Penney has an essay for Praxis (disclosure: I'm an investor) on the "industrial revolution" as a continuous process extending far beyond the time period in which it's typically discussed. The more important a technological shift is, a) the more likely people have been quietly working towards it for a long time, and b) the more profound the long-term effects are.
- Sam Bowman writes about Britain as a developing country. "Developing country" is very much a know-it-when-you-see-it judgment; it's easy to believe the magnitude of the GDP gap between the city that includes the Tenderloin and the city of Shenzhen, but the direction is counterintuitive. One model for the gap is that, contrary to the terminology, a "developed" country is one that continues to prioritize growth over other concerns; one way for a country to reach middle-income status is to start getting rich, and then take a long break while the rest of the world surpasses that level. Argentina, for example, was getting close to US GDP per capita in the early 20th century. In absolute terms the country is richer than it used to be, but the rest of the world moved much faster. And, as with many other areas, it's difficult to stand still: every year means that non-renewable inputs are a bit scarcer and a bit more expensive, and also means that the gap between what institutions are capable of and what they need to be capable of keeps widening.
- Dan Wang has notes on China and technology. One key point is the feedback mechanism: political repression risks slowing economic growth—after the Ant situation, companies have to think twice about taking the risk of building better payment and finance apps—while slower economic growth weakens the CCP's legitimacy and encourages more repression.
- Dylan Matthews at Vox profiles Anthropic, which split from OpenAI out of concern that OpenAI was more focused on capabilities than safety, and which has since made many advances in capabilities. Some of it's quite fun, like the descriptions of training a model by giving it a secret goal and then trying to build rules that thwart that goal, or the fact that you can get models to behave in certain ways just by asking. There's some interesting game theory with technologies like AI: if a powerful and malevolent one is technically feasible such that someone will probably end up developing it, the safety-first approach can be to develop powerful models faster in order to have time to make them less malevolent before they're widely deployed.
- And in this week's Capital Gains, we look at the important but sometimes fuzzy distinction between tradable and non-tradable goods. Over the last generation, there has been massive deflation in anything that can either be put on a container ship, or is made out of parts that can be put on such a ship. That's been very beneficial in many ways, but it has its drawbacks, and it's not a law of the universe that such goods will get endlessly cheaper while housing, healthcare, and education get relentlessly more expensive.
- Paper: Yes, this is by the same person who wrote Cod and Salt. It's a very detailed history of an important technology that's easy to take for granted. But the book is also interspersed with the author's insistence that changes in society lead to changes in technology rather than the other way around (for example, he claims that the Reformation wasn't catalyzed by the printing press, but that the printing press was catalyzed by the same forces that led to the Reformation). That's a good description of why technologies get deployed, but it falls flat in some places: it's not as if electrons in slightly impure blocks of silicon grudgingly decided to behave in useful ways just because society had gotten complex enough to put transistors to good use. The book is full of fun little details about how different polities and cultures negotiated the introduction of paper and the information-sharing it enabled, but it would be much more enjoyable if it were easier to be confident that all of them were true. The book makes some wild claims about language, like arguing that most medieval Europeans didn't know how to count past 2, mixes up the Ottoman and Seljuk Turks, and gets a few dates wrong; Amazon's one-star reviews, the Community Notes of pop nonfiction, include several people with different specialties calling out mistakes ranging from the reputations of calligraphers to the composition of Gutenberg's ink. In conclusion: handle with caution and double-check.
- Drop in any links or comments of interest to Diff readers.
- Per the first link above about Greg Brockman learning ML, I'm especially interested in good papers, firsthand reports, etc. on continuing to learn over time. Many people who are successful in a given field for a long time are really continuously reinventing themselves. It would be great to better understand how.
Companies in the Diff network are actively looking for talent. A sampling of current open roles:
- A successful crypto prop-trading firm is looking for new quantitative developers with experience building high-performance, scalable systems in C++. (Remote)
- A company reinventing the way Americans build wealth for the long run by enabling them to access "Universal Basic Capital" is looking for fullstack engineers with prior experience in fintech. (NYC)
- A firm using machine learning to customize investments is looking for a data engineer. (NYC)
- A hedge fund is looking for an experienced alternative data analyst who can help incorporate novel datasets into systematic strategies. (NYC)
- A vertically integrated PE-backed company applying a rigorous investment/operations approach to a high-growth industry is looking for an analyst who has banking experience. (Little Rock, AR—no remote, but relocation assistance is possible)
Even if you don't see an exact match for your skills and interests right now, we're happy to talk early so we can let you know if a good opportunity comes up.
If you’re at a company that's looking for talent, we should talk! Diff Jobs works with companies across fintech, hard tech, consumer software, enterprise software, and other areas—any company where finding unusually effective people is a top priority.