What Happened in Texas
Last week, I lost power for a day and a half and water for a bit longer. Once everything got back to normal, I went online and was promptly greeted with a few dozen mutually-incompatible explanations for what had happened and why: Deregulation! Too much regulation! ERCOT! Too much reliance on renewables! Too much reliance on natural gas! Insufficient winterization! A freakishly bad winter! Price signals that didn't respond to supply and demand! Prices that led to ruinous bills for power companies (and Griddy customers) while leading to windfall profits for electricity traders!
I'm always wary of fully fleshed-out policy suggestions that emerge within a day or two of a big news event. As it turns out, the US power grid is fiendishly hard to optimize, for reasons that are a mix of physics, technological transition, and historical accidents. An arbitrarily reliable grid is an engineering possibility, as long as the "cost" and "delivery date" lines in the contract are left blank, to be filled in with some very startling values later.
We're at an interesting historical inflection point, where it's possible to radically rethink the grid, and luckily Texas has some interesting traits—plenty of domestic capacity for natural gas and wind production, homes that are not designed with energy efficiency in mind, a deep suspicion of NIMBYs, and a GDP roughly that of Italy or Brazil. It's a great place to experiment with novel systems.
Making the Grid
But it's worth taking a step back to ask how we got here. In the earliest days of electrification, homes and streetcars were powered by what would today be termed a microgrid: small-scale power generators built for a single consumer. The usual math of the power business was that fuel was cheap but generators were expensive, and power was cheaper when it was made at a larger scale. These forces pushed power companies to consolidate, centralizing generation and selling it at the lowest possible price.
The pioneer of this model in the US was Samuel Insull, who worked as Thomas Edison's assistant and then quit to run Chicago Edison. Insull's model was to increase scale, cut prices, and encourage off-hours usage in order to balance loads. At the time, 5% utilization was about as expensive as 50% utilization, and willingness to use electricity was responsive to lower prices, so Insull cut relentlessly: from 20 cents per kilowatt-hour when he started to 10 cents five years later, to 2.5 cents a few years after that.1
Other acts of gridification were less about business and more about charismatic capex. The TVA, for example, was a demonstration of the New Deal's power to shift economic resources and alter institutional norms. Before it was built, electricity consumption in the Tennessee Valley was half the national average; after, it was 25% higher.2 In Russia, Lenin was enamored of electricity, and declared that "Communism is Soviet power plus the electrification of the whole country," a task at which he made a surprising amount of progress, despite some internal political conflicts. (He delegated some of it to Trotsky, and, just as in corporate America, when there's turnover at the CEO level, the prior CEO's favorite underlings sometimes get axed.)
The economics of grids make them natural monopolies, but the engineering behind them encourages integration. A solo power plant can't shift spare capacity to an under-served market it's not connected to, nor can it get help from a neighbor, but an integrated collection of power plants can. The power plants' status as a natural monopoly put them under regulatory scrutiny, and their status as a leverage-driven growth business—like fracking, they were a triumph of financial engineering as well as the normal kind of engineering—meant that many power companies went insolvent in the 1930s, and had to be reorganized.
The modern US power system is descended from this naturally-centralized and eventually-integrated network. Demand grew for decades after the Depression—more appliances meant more use per household, and higher usage meant more efficiencies and lower prices, which encouraged further growth. In a statistical sense, if every power plant and every power market is an independent variable, the system should get more predictable over time as N goes up. In practice, this is not true: some of the complexities of the power business scale superlinearly, some parts of the grid get less reliable as the median plant gets older, and renewables introduce new difficulties.
The Electricity Market
Like every market, the electricity market has to clear, but unlike other markets it has to clear constantly and instantaneously. One mental model: Amazon's logistics, minus the warehouses, mashed up with the classic film Speed.
Solar and wind produce cheap power, but they produce it at inconsistent and inconvenient times. They aren't why Texas lost power, but they are one reason that keeping a grid stable has gotten harder over time.
Renewables are getting cheaper: if you want to generate a certain amount of electricity over a year, solar is currently the cheapest way to do it. But electricity is not just consumed "over a year": demand fluctuates over the course of a day (usage peaks in the evening, when people are at home, repeatedly opening and closing their refrigerators, running AC, watching TV, etc.), from day to day, and over the course of a year (higher in the summer, for example, when there's more AC use). It can even fluctuate minute by minute: electric teakettles use a lot of power, and British utilities used to plan for a demand spike during the commercial breaks of popular TV shows.
Meanwhile, solar power works when there's sun, both in the very predictable sense that it operates best in daytime, and in the less predictable one that individual panels can be briefly blocked by a single cloud. Wind is intermittent in both directions: sometimes the wind stops blowing, and one particularly windy afternoon in 2010 produced the equivalent of almost two nuclear power plants' worth of output over the course of an hour.
None of this makes renewables untenable, and it entirely ignores the negative externalities of burning fossil fuels. But it illustrates the grid-level renewables tradeoff. All else being equal, a grid more dependent on solar and wind will need some combination of:
- Lots of batteries, which currently erases the cost advantage.
- Lots of natural gas, which can be turned off fairly quickly during surges and turned on at other times, which erases some of the environmental advantage.
- A tolerance for intermittent output, or
- A wholesale change in consumption patterns.
One way to look at the reliability question is that we have more tradeoffs than solutions; relying more on wind and solar reduces costs and reduces emissions (great!) but also means that we need some other source of electricity in the evening, when consumption is highest and both wind and solar are producing less.
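That evening gap can be made concrete with a toy model (all numbers here are invented for illustration, not real grid data): a stylized daily demand curve peaks in the evening, while a stylized solar curve peaks at noon, so the hardest hour for dispatchable plants to cover lands after sundown.

```python
import math

# Stylized 24-hour curves; every number is illustrative, not real grid data.
def demand(hour):
    """Demand in GW: a flat base load plus an evening peak around 7pm."""
    base = 40
    evening_peak = 15 * math.exp(-((hour - 19) ** 2) / 8)
    return base + evening_peak

def solar(hour):
    """Solar output in GW: zero at night, peaking at noon."""
    if 6 <= hour <= 18:
        return max(0.0, 20 * math.sin(math.pi * (hour - 6) / 12))
    return 0.0

# Net load (demand minus solar) is what dispatchable plants must cover.
net = [demand(h) - solar(h) for h in range(24)]
peak_net_hour = max(range(24), key=lambda h: net[h])
print(peak_net_hour)  # the hardest hour to cover falls in the evening, after sundown
```

The shape, not the numbers, is the point: solar shaves the midday net load to its trough but contributes nothing at the 7pm demand peak, which is exactly when some other source has to step in.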
And even within that reliability question, there is complexity. One of the most startling details in Gretchen Bakke's The Grid is the narrative of the great East Coast blackout of August 2003. It was caused by a cascade of correlated problems: trees interfered with some power lines, so power had to be shunted onto other lines, but power lines sag more when a) it's hot and b) more power passes through them. So every rerouting raises the risk of downed lines, and it's most likely to happen exactly when dependence on electricity is highest. Many human systems have the trait that the further they deviate from normal behavior, the harder they are to model—it's much harder than usual to predict what the S&P will do the day after it drops 10%, and, at a more tangible level, a protest march gets a lot less predictable after the first teargas canister or brick gets thrown. The same is sometimes true of other complex systems. The most common threats to the grid include untrimmed trees and curious squirrels.3 And the physics of weather and wires mean that Squirrel Risk rises on hot days, and rises faster with each squirrel incident.
There's another level of uncertainty embedded in this system. Any company that provides residential power is, implicitly, betting against volatility. Normally, it turns a profit by buying wholesale and selling retail (whether that "buying" consists of a derivatives contract or the capital expenditure of building a new plant). The price it pays is hard to control, and extreme fluctuations end up looking like a naked short squeeze: someone has promised to deliver something, and must pay whatever the market price is to get it. In this sense, most power companies are a sort of bank, in the fundamental business of borrowing short and lending long. But there is no Central Bank of Energy, and it's economically prohibitive to hold a capital buffer. If everyone in a business is short volatility and can't retain a capital buffer, they all risk blowing up. And that happened: gas traders had to scramble for cash early on, and now some power buyers are failing to deliver collateral and may be insolvent ($, FT).
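The shape of that short-volatility position is easy to sketch (the retail and wholesale rates below are hypothetical, though the $9/kWh figure matches ERCOT's wholesale price cap): a retailer earns a steady margin in normal times and takes an outsized loss when spot spikes.

```python
# Hypothetical retailer economics: sell at a fixed retail rate, buy at wholesale spot.
RETAIL_RATE = 0.12   # $/kWh charged to customers (illustrative)
NORMAL_SPOT = 0.03   # $/kWh typical wholesale cost (illustrative)
SPIKE_SPOT = 9.00    # $/kWh, the ERCOT wholesale price cap

def daily_pnl(spot_price, kwh_sold=1_000_000):
    """Profit for one day of sales at a given wholesale price."""
    return (RETAIL_RATE - spot_price) * kwh_sold

normal_day = daily_pnl(NORMAL_SPOT)   # a modest, steady profit
spike_day = daily_pnl(SPIKE_SPOT)     # a catastrophic loss at the cap
# How many normal days of profit does one capped day wipe out?
days_to_recover = -spike_day / normal_day
print(round(days_to_recover))
```

With these invented margins, a single day at the cap erases months of normal profits, which is the "short a naked squeeze" dynamic in miniature, and why a firm with no capital buffer can fail after one bad week.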
What's Special About Texas
So that's the general setup: a grid that was evolved rather than designed, and evolved for different constraints than it now faces, combined with a market that naturally produces extreme volatility from time to time.
Enter Texas' regulatory environment. Many fingers have been pointed at ERCOT (several directors have already resigned) and at the state's failure to mandate winterized power plants. But the actual proximate causes were on both the supply and demand sides.
On the demand side, Texas builds more houses than any other state, has cheap electricity that encourages electric heating, and doesn't insulate houses much because the cost of keeping them cool is relatively low.
This is a very cost-effective model, and even represents a sort of implicit state-level protectionism: why let Big Insulation reap the benefits of keeping Texans climate-controlled, when locally-sourced plant-based solutions—e.g. natural gas—are so abundant? This is slightly flippant, but it's a genuine concern: Texas is energy-rich, and the whole point of cheap energy is that it's a universal solution to many problems. There's a very direct long-term connection between cheap energy and better standards of living (consider the relative comfort of a) pushing a plough by hand, b) getting an ox, c) using a tractor, or d) sitting in your air-conditioned office and using your lightning-fast Internet connection to pull up a live feed from your TractorBot on one of your half-dozen monitors). Every rise in ease can be measured as a drop in the cost of a kilowatt-hour.
Every state adapts to its own constraints. The US used to build lots of cheap oil power plants, until the oil crisis of the 70s made them uneconomical. Hawaii still uses oil; it may be expensive to buy, but it’s still cheap to ship.
Texas' heavy reliance on local natural gas is one reason the state's energy is so affordable. Another is the growth of wind power; Texas is very much adopting the gas-plus-renewables combination that currently optimizes for cost and reliability.
What Texas does not optimize for is a sudden drop in temperature. The cold increased household energy use at the same time that it cut off the state's access to natural gas. Some people adapted fast. For example:
Chris Bird first saw the rumors Friday morning on Twitter.
Physical natural gas prices were soaring in Oklahoma amid a cold blast that was gripping much of the U.S. and only stood to get worse. Bird, owner of a small gas producer in Tulsa, called one trader who confirmed the heating fuel was going for a staggering $350 per million British thermal units. Then he called another who said it had risen to $395.
That’s all Bird needed to know. He and his production technician grabbed some winter clothes at the dollar store and drove the stretch of highway to Osage County some 20 miles north. They met up with a buddy who owns a propane torch and began melting ice off idled gas wells to get them back online.
“We’ve got four of us in the office turning on every single gas well that we’ve got,” Bird said. “We have old wells that haven’t produced in 10 years, and we’re like, ‘open the taps, let’s go.’”
But most producers were not quite so adaptable. So a few million homes, mine included, went dark.
Meanwhile, there's ERCOT: Texas has its own power grid, with its own policies, and one of those policies is not to pay for excess capacity. Would capacity payments have prevented the outage? Quite possibly! But the mechanics of a payment-for-extra-capacity system are also nontrivial, especially if that capacity needs to be used once every decade (or less often) rather than at annual peaks. Suppose ERCOT has a program that pays coal or gas power plants every day to remain available, and fines them if they're called on and unavailable. If the fine is low, the optimal strategy is to collect fees all the time and pay a big fine once a decade. At the right fee structure, it would be profitable for me to get paid continuously for promising the grid a few megawatt-hours on demand, then cut the state a check once a decade when they ask me for power and I come up empty-handed. On the other hand, steep fines create another optimal strategy: do exactly the same thing, pay out a dividend every year, and when the state asks for power, declare bankruptcy. Think of these power plants as a bond that pays, say, 15% annually and will default with zero recovery at some point in the next ten years or so. They might be a great diversifier for power-trading desks that make all their money in chaotic conditions and don't earn much when power markets are sedate.
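The incentive problem can be put in expected-value terms with some back-of-the-envelope numbers (every parameter below is invented for illustration, not a real ERCOT figure): compare honestly winterizing against collecting fees and eating the occasional fine, with and without the bankruptcy escape hatch.

```python
# Illustrative parameters, normalized so the annual capacity fee is 1.0.
ANNUAL_FEE = 1.0       # capacity payment collected per year
WINTERIZE_COST = 0.6   # annual cost of actually staying available
FINE = 3.0             # penalty when called on and unavailable (the "low fine" case)
P_CALLED = 0.1         # chance per year the grid actually calls: once a decade

def ev_honest():
    """Expected annual value of winterizing and always delivering."""
    return ANNUAL_FEE - WINTERIZE_COST

def ev_freeload(fine):
    """Expected annual value of collecting fees and paying the fine when caught."""
    return ANNUAL_FEE - P_CALLED * fine

def ev_freeload_bankrupt(fine, equity_at_risk=0.5):
    """With limited liability, the effective fine is capped at remaining equity."""
    return ANNUAL_FEE - P_CALLED * min(fine, equity_at_risk)

print(ev_honest(), ev_freeload(FINE), ev_freeload_bankrupt(30.0))
```

With these numbers, the low fine makes freeloading strictly better than winterizing; a tenfold-higher fine flips that—until the bankruptcy option caps the downside and makes freeloading dominant again. The fee schedule alone can't fix this; it interacts with limited liability.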
The kinks could be worked out, but it would probably take another crisis or two to learn what the optimal setup is. So, if climate pessimism is accurate, ERCOT could normalize its capacity payment policies and fix this problem by 2040 or so; if the optimists are right, and this year's storm was a freak once-a-century occurrence, Texas will know how to solve the next one around the year 2221. If you asked me to bet whether this will happen before we get a fusion-based grid where kilowatt-hours are quoted in nanopennies or millisatoshis, I'd offer even odds.
Texas could mandate that every gas producer winterize its wells, or have them all prepare a Chris Bird-style emergency plan to keep gas flowing no matter what. If the storm is a regular occurrence, that's prudent—but if it's a freakishly rare one, that's a lot of investment for a problem that rarely arises. This is roughly where the debate has settled: a smaller-scale but similar incident occurred in 2011, but it's very hard to reason statistically when N=2. Meanwhile, it's hard to estimate what it would cost to winterize the extended grid—"extended" because power plants, gas wellheads, storage tanks, compressors, and pipes would all need to be protected, and it would be highly embarrassing to spend a vast sum of money just to find out that some forgotten bottleneck in the system still froze solid.
But there are some interesting possibilities.
Opportunities: The Battery Fleet, Nuclear, LNG, and Bitcoin
From a driver's perspective, an electric vehicle is a way to get from point A to point B and incrementally save the world at the same time. From the grid's perspective, an electric vehicle is a battery with two wonderful properties: it has wheels, and somebody else paid for it. This makes electric vehicles a marvelous way to smooth out intraday demand for energy, or to absorb the peak production from wind and solar. Since the usual pattern for these vehicles is to drive them home after work, i.e. to run down the battery after wind and solar stop producing, they don't quite subsidize renewables, but they do make a great fit for baseload power, especially the sort that takes a long time to turn on and off—nuclear. Widespread electric vehicle usage creates a source of electricity demand in the usual demand trough from 10pm to 4am, so for a power source whose economics require constant usage, they're a direct subsidy. Nuclear has benefits for reliability, too: the fuel is easy to stockpile (coal is easy, too, albeit unsightly; natural gas is the most just-in-time of the power sources).
Natural gas is another market with multiple sources of demand. Some of it gets used locally, for electricity and heat, and some gets compressed and shipped out. LNG terminals are capital-intensive and very risky (in the time it takes to get one approved, funded, and constructed, natural gas can go from cheap to expensive or vice-versa, as a number of LNG entrepreneurs have learned). But LNG is another way to smooth out demand, since it globalizes the market: producing more gas and selling more LNG means that the next time some percentage of Texas' natural gas production is temporarily unavailable, it's a percentage of a bigger number. (This is terribly unfair to the LNG buyers, but there are other sources.)
The need to match supply and demand creates an interesting quirk in the energy market: there's a cost for electricity, but there's also a payment-for-not-using-electricity-you're-entitled-to, sometimes known by the euphonious term "Negawatts." As far as a residential electricity user is concerned, capacity that comes online because someone fired up a gas peaker plant is precisely identical to power that's only available because somebody else shut down their factory early. For most manufacturing, it's a very bad idea to just shut things down; machines tend to break in expensive ways when they suddenly lose power. But some businesses have a large electricity component to their cost structure, and are, to some approximation, in the business of converting dollars into kilowatt-hours, and converting those kilowatt-hours back into more dollars. Desalination and hydrogen are both energy-intensive, for example, and the outputs can be stockpiled, so they're a good source for negawatts.
And then there's Bitcoin mining, which is almost purely in the business of converting electricity into money, and which has the added bonus of converting local electricity into a global market for computation. This means that usage can precisely and directly respond to supply and demand fluctuations in the local market. In the early days of Bitcoin mining, this model would not have worked, because chips got obsolete so fast—the opportunity cost was too high to stop mining except in emergencies. But now that Bitcoin miners have zipped through the last few cycles of Moore's Law at breakneck speed, chips have a longer life and their economics are less affected by a few missed hours or days here and there—especially if their owners are getting paid for it. Pairing new generating capacity with new Bitcoin capacity is a way to offset some of the upfront cost (although it does mean taking a risk on Bitcoin price volatility, which is never easy). As real rates rise, capital for large projects like power plants is no longer effectively free, but in historical terms it's still quite cheap, and there are clearly many dollars looking for a positive-real-ROI home right now.
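The curtailment logic is simple enough to sketch (the breakeven figure is invented): a miner runs whenever expected mining revenue per kilowatt-hour exceeds the spot price of power, so local price spikes automatically shed its load—and a demand-response payment for not consuming raises the bar further.

```python
# Hypothetical miner economics; all figures are illustrative.
MINING_REVENUE_PER_KWH = 0.08  # expected $ of bitcoin mined per kWh consumed

def should_mine(spot_price, demand_response_payment=0.0):
    """Mine only if a kWh is worth more to the miner than to the grid.

    A payment for *not* consuming (a "negawatt") adds to the opportunity cost.
    """
    return MINING_REVENUE_PER_KWH > spot_price + demand_response_payment

print(should_mine(0.02))        # cheap overnight power: keep mining
print(should_mine(0.25))        # price spike: shut the rigs off
print(should_mine(0.05, 0.06))  # paid negawatts beat marginal mining profit
```

This is the sense in which mining load is unusually price-responsive: the shutdown decision is a single comparison with no shutdown cost beyond forgone revenue, unlike a factory whose machines break when the power cuts out.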
This entire shift is much easier if everyone gets a bit more exposure to power prices. If your Tesla is half as expensive to charge late at night—and if commuting is a lot pricier during a heatwave or a snowstorm—this will affect your behavior in prosocial ways: it encourages you to avoid straining the grid just when the dominoes are about to start falling. Managing your behavior around a constantly fluctuating cost of electricity sounds inconvenient, but the alternative is for grids to manage their supply around a constantly fluctuating load, and to sometimes reach the limits of their capacity. ERCOT might have kept more lights on for longer if it had convinced all 24 million of its customers to set their thermostats to 60 degrees, turn off all the lights, and unplug everything except refrigerators and medical equipment. Having 24 million arguments about conservation in parallel at two in the morning is probably beyond the capacity of any organization, but raising prices is something a grid operator can manage.
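From the car's side, the price response needs no argument at all, just a sort (the tariff below is a hypothetical time-of-use schedule, not any utility's real rates): a charger that picks the cheapest hours automatically shifts load into the overnight trough.

```python
# Hypothetical time-of-use tariff, $/kWh by hour of day: pricey evening peak,
# cheap overnight trough, a flat shoulder rate otherwise.
tariff = {h: 0.30 if 17 <= h <= 21 else 0.06 if h <= 5 else 0.12
          for h in range(24)}

def plan_charge(hours_needed, prices):
    """Pick the cheapest hours to charge; return (hours, total $/kW of draw)."""
    cheapest = sorted(prices, key=prices.get)[:hours_needed]
    return sorted(cheapest), sum(prices[h] for h in cheapest)

hours, cost = plan_charge(4, tariff)
naive_cost = sum(tariff[h] for h in range(17, 21))  # plugging in right after work
print(hours, cost, naive_cost)
```

Under this invented tariff, the price-aware charger lands entirely in the overnight hours at a fraction of the cost of plugging in at 5pm—exactly the load-shifting the grid wants, bought with a price signal rather than 24 million phone calls.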
Will Texas accept a glorious future of more fracking and new nuclear power plants—they're expensive now, but if you build ten in a row they'll start getting cheaper!—and slick new EVs and endless server farms solving math puzzles?
If we want to avoid the next blackout, and keep Texas' electricity wonderfully affordable, that is indeed what we'll do.
Further reading: I enjoyed Gretchen Bakke's book on the modern power grid and various efforts to improve or escape it. It's called The Grid. I also got a lot out of Philip Schewe's history of electrification in the US. It, too, is called The Grid. I learned a lot from this Nic Carter interview with Brannin McBee. This piece is a good look at how bad things could have gotten.
In the interest of full disclosure: I own Bitcoin and some uranium. I also live in a poorly-insulated house in Austin, Texas, where the heating and air conditioning are very affordable.
A Word From Our Sponsors
The On Deck Writer Fellowship is an eight-week remote program for internet writers who want to hone their craft, meet other writers, and grow an audience. The Fellowship combines:
- Weekly small-group writing workshops facilitated by expert writers and editors from our team to help hone your craft.
- Tactical sessions and guest lectures focused on finding your voice and niche, growing your audience, and monetizing your work. Some of our recent speakers include Marginal Revolution author Tyler Cowen, The Profile author Polina Marinova, Substack's Head of Writer Experience Nadia Eghbal, and Lenny's Newsletter author Lenny Rachitsky.
- An engaged, creative community of incredible writers from around the world to keep you inspired and accountable.
In the words of ODW1 alum Lyle McKeany, "It's not an overstatement to say that ODW has changed the trajectory of my writing life. I know that sounds hyperbolic, but it's not."
We're currently offering special early-bird pricing of $1,990 for our third cohort, which kicks off April 17.
For software that benefits from collecting user data, there's a fuzzy efficient frontier: addressing niche use cases is expensive, but it expands the sample size, and many of the most lucrative products are the ones that can handle extreme edge cases—Google's ability to disambiguate never-before-seen typos, or LinkedIn's power to identify the one person at a Fortune 50 company who works in corp dev, went to your alma mater, and knows one of your friends. In the voice assistant market, companies are building products that help people with atypical speech interact with voice assistants ($, WSJ). There might be a technical and business case for this; a model that matches human speech processing perfectly can deal with a lot of idiosyncrasy, while Siri, for example, enforces an inviolable semantic distinction between a "timer" (an increment of time) and an "alarm" (a specified time of day). A more humanoid speech detector could handle more nuance, and perhaps by leaning in to edge cases, voice assistants can build this more generalized model. On the other hand, there doesn't need to be a business case for this: it's a way for tech companies to improve lives in a way that other institutions can't.
The Economist highlights the messy economics of tech competition ($): each of the major megacap tech companies has a product line it's uniquely good at monetizing, and they're all complements to one another, so tech companies often cooperate in one domain (like Google paying for placement on iPhones) while competing in another (Apple and Google fighting to control the dominant mobile platform) and gearing up to stop cooperating (Apple poaching Google engineers and ramping up its search indexing). The mistake in tech antitrust is to look at the current situation as static; each of the major companies succeeds by commoditizing its complements, but the long-term result of this is that all the commoditizable components of the supply chain get squeezed out, and what's left is monopolies adjacent to monopolies. It's still possible to imagine a world where one of the big tech companies triumphs over all the rest—an Apple that makes its money from search, that treats iMessage as the core of a social network and scoops up the profits Facebook currently makes from advertising apps. But the aggregate resources pursuing that outcome for any one megacap tech company are outweighed by the aggregate efforts of the other big tech companies to prevent it.
From Quotation System to Market-Maker
Zillow is making its Zestimate the offer price from Zillow Offers; Rob Hahn has interesting thoughts on what a big development this is. There's a natural tension between running an asset-light, low-risk information business and running a capital-intensive market-maker that's subject to wide swings in inventory value. But the better the data is, the more of an edge that data represents for users. In financial history, there are some institutions that get dominant market share in a particular asset class, and a) broadly set prices in the market, while b) making most of their money on proprietary trading in the same market. This describes Drexel in high-yield bonds in the 80s—a "liquid" junk bond was one Drexel would make a market in. And it seems to describe Enron in the electricity market in the late 90s and early 2000s.
The obvious pattern there is that both Drexel and Enron collapsed and became synonymous with financial scandals, but in both cases the proximate cause was not their trading business. The non-obvious pattern is that both companies' regulatory issues turned into existential crises, because they had to fund large balance sheets for trading and couldn't raise sufficient money from skittish investors once the headlines turned bad. Zillow's comparatively simple business probably avoids the adverse-headline risk factor. Balance sheet risk remains, but the opportunity is lucrative; both Enron's trading operation and Drexel's junk bond desk went to great lengths to understate how much money they were making in their heydays.
GameStop is once again rallying, trading above $130 this morning compared to the mid-$40s earlier this week. Not-quite coincidentally, large growth and momentum stocks have been weaker in the last few days. When stock-picking funds have high leverage, low net exposure, and a policy of reducing aggregate exposure after losses, this is the natural correlation: a short squeeze in one stock makes other heavily-shorted stocks go up, and causes hedge fund favorites to drop. But not as much as in January: hedge funds have reduced their single-stock shorts, and replaced them with a) higher net long exposure, and b) more bets against indices ($, WSJ). As I wrote a few weeks ago:
The most likely outcome remains that long/short funds will use a bit less leverage, and choose to be biased a bit more towards making long investments—but with more capital, it will be safer for them to pursue short bets. The same fund structure that makes shorting safer in an environment where retail investors can push stocks up 100x in a year is also more suited to an environment in which there are more opportunities for shorting—relatively fewer outright frauds, but many more stocks that are temporarily overpriced.
The Bull Market in Conservation
Carbon emissions credit prices in Europe are rising ($, Economist). There are two drivers: end users, who buy credits to offset their current production, and speculators, who expect the supply of emissions credits to be curtailed in the future. It's the financial circle of life that one category of aggressive asset manager is getting short-squeezed in GameStop, while another category is implementing a short squeeze somewhere else. It will be interesting to see heavy emitters accumulate credits in advance of corporate decisions that affect the value of those credits. The largest polluters have a regulatory liability, but they have an information asset, and in a liquid enough market they can exploit that.
In other emissions-related news, a Japanese company that manages portfolios of ships is raising a $5.6bn green fund to buy LNG-fueled LNG carriers ($, Nikkei). Shipping is around 3% of global emissions, and is slightly sensitive to faster growth: one of the big cost-saving innovations in shipping has been moving ships more slowly, to save on fuel; this has to be balanced against the cost of keeping working capital tied up at sea for longer. When interest rates rise in an economic recovery, the relative benefit of slow shipping declines (and the unpredictable demand pickups that accompany these recoveries also increase the rewards of getting inventory to customers fast). If shipping companies expect to attract attention for their emissions, acting early is better.
This model probably sounds eerily familiar to anyone who has studied cloud computing, especially Amazon.com. In a nice bit of symmetry, both Bezos and Insull cut the unit cost of their flagship product to ten cents in order to stimulate demand. Jeff has not released his Amazon order history, but I have to wonder if he read an Insull biography in the early 2000s and did some thinking about the economics of idle servers. ↩
It was also a great example of one generation of technology bootstrapping the next. During the Second World War, they had to supply ridiculous amounts of energy to a mysterious project in Oak Ridge, whose purpose was not revealed even to the head of the TVA. It turned out to be uranium separation, which eventually resulted in the development of nuclear power. (And other things.) ↩
In a nice bit of symmetry with the observation before, squirrels have been responsible for two separate NASDAQ outages. ↩