Longreads + Open Thread

Longreads

  • Philo at MD&A has a great writeup on the midwit meme and the midwit trap: the tendency to look for sophisticated, elaborate explanations because the simple ones are suspiciously simple. One ironic thing is that the left tail of the distribution applies a simplistic answer, while the way to arrive at that same answer on the right tail is to think from first principles and realize that the impact of all possible causes on a single effect is probably distributed roughly according to Zipf's law, and that complex explanations implicitly equal-weight a bunch of factors that are unlikely to be equal-weighted in reality (a toy numeric illustration follows this list).1 If you have one explanation for something, you might be oversimplifying, but if you can rattle off a dozen you're even more likely to be overcomplicating.

  • Chris Gillett on how to read the news: every source has a motivation, and writers generally have some idea of what the story will be when they start choosing sources. So you can glean some useful information from who says what—and who is in a position to know what.2

  • Jodi Kantor and Arya Sundaram at the NYT have a detailed story on worker-tracking software and its increasing prominence in white collar jobs. While it's not necessarily true that every company is behaving rationally when it automatically micromanages behavior, it's worth considering the possibility that we really do systematically underestimate how much goofing off we do on the job, and that this software helps. There are lots of real-world examples of this—if you start meditating, you immediately become conscious of how distractible you normally are, and how hard it is to focus on any one thing; very few people look at the "time spent" feature on their smartphone and see a number they're fully satisfied with. And some of this software is essentially applying Atul Gawande-style checklists to more jobs—if you insist on taking all the easily-measured actions in the right order and at the right time, you're less likely to make an obvious mistake.

  • Bloomberg looks at the fall of what was once the world's biggest gold trader, who was convicted of spoofing and market manipulation earlier this month. Spoofing—entering a large order on one side of the market in order to execute the opposite trade, then canceling the large order—is an odd category of financial crime. For one thing, it's the sin-of-commission version of the perfectly acceptable sin-of-omission strategy of breaking up a large order into smaller ones in order to get a better price and avoid spooking the market; in one case, you're placing an order, and in the other, you're withholding one, but in both cases the intention is to obscure your actual goals. The other issue with spoofing is that it is, or should be, incredibly risky! If your plan is to sell a little bit of gold, and to do this you first place an enormous order to buy that you intend to cancel soon, you run the risk that someone will go ahead and fill that order. Generally, systematic investors are pretty good at spotting patterns in human traders' behavior, so it's somewhat surprising that there aren't systematic strategies that prey on spoofers.

  • Cliff Asness of AQR writes about what managed futures strategies are for, and why some strategies sold this way are not providing the return profile they're supposed to. It's a universal in finance that investors ask for uncorrelated returns, but after a sufficiently long bull market, what they actually want is a stream of returns that correlates with equities, because equities are what have gone up. The result is style drift, whether deliberate or accidental, in which hedging strategies slowly get less hedged.

    Traditional value investors sometimes have a quasi-religious objection to reading about systematic strategies, especially when those strategies incorporate the buy-high-sell-higher stuff as Asness's do. But it's very worthwhile for such investors to read AQR's work, for two reasons: first, value as a strategy has been in a decade-long drawdown, and it's good to put that performance in context by thinking of it as one signal among many that can go through difficult times. And second, value investing and systematic investing have a most recent common ancestor in the form of Benjamin Graham—whose junior employees spent a lot of time filling out standardized cards with financial statistics, applying a predetermined formula, and computing a fair price for investments. That's plenty systematic, and if Graham had been born a generation or two later he probably would have implemented the same strategy with computers and would have been seen as a quant.
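
As a toy illustration of the weighting point in the first item above: the numbers here are hypothetical, but they show how differently a Zipf-like weighting and an equal weighting treat a dozen candidate causes of one effect. Under Zipf, the single largest cause carries roughly a third of the total and the top three carry most of it; equal weighting assigns each cause about 8%.

```python
# Hypothetical illustration: Zipf-like weights vs. equal weights
# across a dozen candidate causes of a single effect.
n = 12
zipf_raw = [1 / k for k in range(1, n + 1)]           # k-th cause gets weight proportional to 1/k
zipf_weights = [w / sum(zipf_raw) for w in zipf_raw]  # normalize so weights sum to 1

print(f"Top cause, Zipf weighting:        {zipf_weights[0]:.0%}")        # ~32%
print(f"Top three causes, Zipf weighting: {sum(zipf_weights[:3]):.0%}")  # ~59%
print(f"Any one cause, equal weighting:   {1 / n:.0%}")                  # ~8%
```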

And also: I was on Will Jarvis' podcast, talking about bubbles, efficient markets, reflexivity, and more.

Books

  • Goldman Sachs: The Culture of Success: a history of Goldman from an ex-employee, this book is a good overview of how the firm came to be so dominant—a process of reputational compounding at first and financial compounding later on. The book opens somewhat ominously: a Goldman partner has a copy of the "tombstone" listing the 722 firms that underwrote Ford Motor Company's 1956 IPO. Every time one of those firms got bought or went under, the partner would cross its name off the list. Goldman has survived by being cautious in what businesses it enters and then aggressive once it chooses to get into them, a recipe that works but leads to the occasional scandal.
  • Money and Power: How Goldman Sachs Came to Rule the World: It's Goldman Sachs week. Culture of Success was written by a Goldman alumna, and is largely positive. Money and Power is more negative. (The author also worked in banking, at Lazard.) While the book sometimes goes out of its way to poke fun at Goldman and its partners' foibles, it's also quite well-researched, with lots of details that didn't seem to make it into other works. Goldman has been closely tied to powerful politicians for a long time, though whether that's an instance of undue influence or just evidence of a meritocracy that happens to recruit and promote unusually effective people is unclear. One notable thing about the book is that even though it takes some potshots at Goldman, it also spends lots of time defending the firm from hostile portrayals in other works.

Open Thread

  • Drop in any links or comments of interest to Diff readers.
  • Warren Buffett has noted that, for whatever reason, investment banks and trading firms tend to die off pretty fast. Goldman is a counterexample, of course, but it does seem to be an industry where there's been consolidation over time. Is this because the biggest ones are so dependent on a small number of people, who eventually retire, or is it because of the inherent leverage in the business? Or is there some other factor?

Reader Feedback

Last week's post asked about which companies benefit from network effects in multiple, non-overlapping networks. From Philo:

Maybe not quite the answer you're looking for, but media companies traditionally consist of a lot of non-overlapping network effects businesses, eg if you owned 50 local newspapers, you had 50 separate local networks of advertisers and consumers. The same would be true if you owned a portfolio of magazines or trade journals or cable networks and so on.

This is indeed not the answer I was looking for, but it's a very interesting one! Maybe there's some transferable talent in managing non-scalable network effects, leading to conglomerates that are in "media" broadly and don't get a lot of direct synergies between their various holdings. Capital Cities was a great example of this, with TV and radio stations, local newspapers, trade magazines, and cable—all of which have network effects, but which largely don't interact.

A Word From Our Sponsors

There's a set of white collar tasks which are highly repetitive, time consuming, and still difficult to automate because there isn't a standard format. Onboarding your customer's data is one such task. Each customer has their own, slightly different processes for managing information about their business, and so each time you onboard a new customer, you've got to adapt their dataset to your system.

Flatfile is the product that takes this complexity away. They have an embeddable drag and drop tool, so you can jump seamlessly from closing a deal to a clean dataset.


  1. A partial exception is any system where you're looking at deliberate interactions between two forces that push in opposite directions. In that case, the two powerful forces cancel out, and a bunch of smaller ones collectively make more of a difference specifically because they don't cancel. Political coalitions look like this sometimes; Joe Manchin is so influential because the main parties' votes roughly cancel each other out, for example. This also applies to diet: if your weight is stable for a long period and you decide to start drinking a can of Coke every day at lunch without changing any other aspect of your behavior, that ~6% of daily calorie intake can explain ~100% of your ensuing weight change. And the meta-footnote here is that yes, I know I'm undercomplicating things, but these are exactly the kinds of domains where there is in fact a lot of complexity, and where a decent working model is to assume an equilibrium first and then focus on what disturbs it. Which raises the further fun point that explaining changes can usually be simplistic, while explaining equilibria is complicated.
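
A rough back-of-the-envelope version of the Coke example, with my own assumed numbers (a ~140-calorie can against a ~2,300-calorie day that previously balanced expenditure): the can is a small share of total intake but the entire share of the new surplus, which is why it can account for essentially all of the subsequent change.

```python
# Assumed numbers, not the newsletter's: a small addition to a balanced calorie
# budget is a tiny share of the total but 100% of the resulting surplus.
baseline_intake = 2300      # kcal/day, assumed equal to expenditure (stable weight)
can_of_coke = 140           # kcal, typical 12 oz can

new_intake = baseline_intake + can_of_coke
share_of_intake = can_of_coke / new_intake    # the can as a share of all calories eaten
surplus = new_intake - baseline_intake        # calories above the old equilibrium
share_of_surplus = can_of_coke / surplus      # the can as a share of what drives the change

print(f"Share of daily intake:  {share_of_intake:.0%}")   # ~6%
print(f"Share of daily surplus: {share_of_surplus:.0%}")  # 100%
```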

  2. "Writers have an agenda in mind when they start writing" applies to this newsletter as well, of course, though we do our best to mitigate it. The best hedges against this are 1) to deliberately increase the surface area of raw information—economic datapoints, company financials—relative to summary writeups and pitches, and 2) to try to start with questions ("Why is X happening?") rather than conclusions ("X is happening, and that's terrible/awesome"). Of course, that's not a cure for biases, but it helps.