The Indelible Stamp of a Lowly Origin · by Byrne Hobart

In this issue:

  • The Indelible Stamp of a Lowly Origin

  • How the Chip Shortage Happened

  • New York Real Estate: More Data Points

  • Chronic and Acute Labor Shortages

  • Extrapolation

  • Texas and Trading

Every corporate finance decision balances two requirements: cash needs and signaling. A company that sells stock gets cash, but signals that the stock price is not cheap. A company choosing between dividends and buybacks is choosing whether it's more important to signal that its current profits will persist for a long time (so a dividend is a good idea) or that its current stock price is too cheap (so a buyback is).

Many companies put a lot of effort into mitigating the signaling cost of their moves. For example, a company that expects to issue lots of stock might 1) talk about how they still see many opportunities for growth, and 2) note that they like having a clean balance sheet with little debt. It's easier to do this in advance than after the fact (everyone has a good excuse for why their negative signal shouldn't be received as negatively as it was).

One corporate finance decision that's impossible to undo is the choice of how and when to go public. The ideal looks something like this:

  1. The company is founded, and grows with little to no outside capital.

  2. It raises some money from a financial sponsor. If it's remotely tech-adjacent, that means venture capital; if not, that might mean selling a large stake to a private equity firm.

  3. The company does a conventional IPO, with a standard lockup.

Any deviation from this scheme says something. Even the ostensibly positive signals can read as negative. Ubiquiti, for example, is a perennial target of short-seller writeups, and one of the points they often make is that the company has had suspiciously low capital needs, and its founder is firmly in control. (It had a VC on the cap table when it went public, but that was an oddly-structured secondary transaction in which Ubiquiti sold VCs $99.5m of stock and warrants and used the proceeds to buy back $100m of the same from existing employees.) Once a conventional path has been established, it's odd to deviate—strange to go public early, or to try to bootstrap from penny stock status to mainstream company; it's a bad sign to be underwritten by a broker with a reputation for junky offerings; it's dubious to take a company public through a reverse-merger with a mostly empty shell.

And yet, every one of these rules has exceptions:

  • Xero went public extremely early, raising just $11m in 2007, and only later raised venture capital to expand internationally. The company now has a $14bn market cap.

  • Stratton Oakmont, the most notorious penny stock brokerage, and the inspiration for The Wolf of Wall Street, mostly underwrote worthless companies. But it also took Steve Madden public; that company is now worth $3bn. (Madden himself served prison time for securities fraud along the way.)

  • WPP, the largest ad agency in the world, was originally Wire and Plastic Products plc, which made shopping baskets. Martin Sorrell, who had left Saatchi & Saatchi after running their M&A program, took control of the company and used it as a vehicle for acquiring other ad agencies. And Texas Instruments also went public through a reverse-merger in 1953, combining with a publicly traded company that owned rubber plantations but whose value mostly consisted of cash on hand and an NYSE listing.

These companies all had to overcome reputational headwinds, but none of them really suffer for it today. In Phil Fisher's Common Stocks and Uncommon Profits, there's a brief interlude about TI in the mid-1950s. Fisher cites a broker's report published after insiders sold a block of stock, which said "We agree with them and recommend the same course!" This kind of moderate snark is easier to get away with when a company has reputational problems. The report was wrong—in the second edition of Common Stocks and Uncommon Profits, Fisher takes a quick victory lap—and Texas Instruments has since become very legitimate indeed. It's compounded at about 13% since then, ignoring dividends, or about 3250x.
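The compounding claim above is easy to sanity-check. A minimal sketch, assuming a span of roughly 65 years (mid-1950s report to roughly 2021), which is an illustrative assumption rather than a figure from the text:

```python
# Sanity-check the Texas Instruments compounding arithmetic.
# The 65-year span is an assumed figure for illustration.
years = 65
total_multiple = 3250

# Implied annual growth rate for a 3250x return over 65 years
cagr = total_multiple ** (1 / years) - 1
print(f"{cagr:.1%}")  # roughly 13%, matching the article's figure

# And the other direction: compounding at 13% for 65 years
print(f"{1.13 ** years:,.0f}x")  # on the order of a few thousand x
```

Small differences in the assumed start date move the implied rate by only a few tenths of a percent, which is why "about 13%" and "about 3250x" are mutually consistent.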

The SPAC boom, like penny stocks, reverse mergers, and other spurts of retail investor enthusiasm, will tend to select for lower-quality companies than a conventional IPO process. But it also selects for high-variance companies that are willing to use a low-status financing method because the numbers happen to work. Someone who doesn't care that SPACs still have a mixed reputation, and that some investors are already adding every single SPAC merger to their short-ideas watchlist, is probably just trying to take advantage of the boom to raise money. But someone who doesn't know this has been focusing on other things, which may lead to better outcomes. The usual skew for equities is that median returns are worse than mean returns; outperformance is driven by a small number of companies (this has to be true mathematically, because a) indices are dominated by a fairly small number of companies, and b) the list of those companies changes over time). But the mean-median gap will be bigger for SPACs. 2022's list of companies with post-IPO performance of -90% or worse will probably have a healthy SPAC representation, but 2030's list of the best stocks of the decade may have some SPAC representation, too.
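The mean-median gap claim can be illustrated with a toy simulation. This is a hypothetical sketch, not real IPO or SPAC data: it models total returns as lognormal and shows that cranking up the variance widens the gap between the mean and the median outcome.

```python
import random

random.seed(0)

# Hypothetical illustration: model total returns as lognormal draws.
# A higher sigma stands in for a higher-variance cohort (SPAC-like)
# versus a lower-variance one (conventional-IPO-like).
def simulate_returns(n, sigma):
    return [random.lognormvariate(0.0, sigma) for _ in range(n)]

def mean(xs):
    return sum(xs) / len(xs)

def median(xs):
    s = sorted(xs)
    return s[len(s) // 2]

ipo_like = simulate_returns(10_000, sigma=0.5)
spac_like = simulate_returns(10_000, sigma=1.5)

# Both cohorts have a median near 1.0x (mu = 0), but the mean is
# pulled up by a handful of huge winners, and more so at high sigma.
print(mean(ipo_like) / median(ipo_like))    # modest mean-median gap
print(mean(spac_like) / median(spac_like))  # much larger gap
```

The point of the sketch is that the median company in the high-variance cohort can look worse even while the cohort's mean return is higher, which is exactly the shape the article predicts for SPACs.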


How the Chip Shortage Happened

Daniel Newman of Futurum Research has a good overview of how the chip shortage happened, and how different players reacted. The very short version is that it's a story of a recession that happened to increase PC demand, followed by a recovery that electronics companies were better able to react to, because of their short product lead times, than auto companies. One interesting subtext of this shortage is that it hasn't led to all that much inflation in chip prices. It's hard to get a good sense of chip spending per dollar of gross margin across all products, and "chips" are a broad category, but given the car industry's size and fixed costs, it seems that carmakers would be able to outbid other buyers for already-allocated capacity. That hasn't really happened, although they are certainly bidding for future production capacity. It's another example of weaker inflationary pressures: when supply is locked down, a spike in demand doesn't lead to high prices for the most inelastic components, just to production delays.

Meanwhile, Mule has a great look at the longer-term picture; even after the shortage resolves, automotive demand will be a growing piece of the overall demand for chips.

New York Real Estate: More Data Points

Using price data to track which cities people are entering or fleeing during Covid has been tricky. Rents are visible in close to real-time, but are artificially low since landlords cut deals in 2020 in order to keep people around for 2021 and onward. Meanwhile, single-family home prices are up, but there are complicated mix issues: people leaving high cost-of-living places may buy a home that's expensive for its area but cheap relative to where they were before. One interesting datapoint on New York: a Manhattan real estate developer is selling a portfolio of high-end Manhattan apartments ($, FT) for about 20% less than they were worth pre-pandemic.

In a model where expensive real estate measures the wage premium for working in one city over another, that's a good indicator that New York's high-earning tax base has indeed partly fled for other places.

Chronic and Acute Labor Shortages

For about the last decade—probably starting with Google's unilateral pay hike in 2010—there's been a shortage of programmers. This has led programmer wages to ratchet relentlessly upward, and since many programmers do work that's complementary to other programmers (the frontend app developer creates a feature that gives the data science team more information to play with), that demand keeps rising. Other kinds of labor shortages are more temporary and situational. Right now, for example, blue collar workers are benefiting from growth in e-commerce and housing ($, WSJ). At this end of the job market, skills are more directly fungible; booming demand for different kinds of low-priced labor means that labor doesn’t stay low-priced, at least for a while.


Extrapolation

In the early weeks of the pandemic, the most important fact for anyone in the world to understand was that anything with a doubling time of under a week would quickly be a big deal. But a few months into the pandemic, the modeling problem got more nuanced: when you're trying to measure the potential magnitude of a pandemic, it's useful to have wide confidence intervals so you know the worst-case scenario, but when you're deciding how to mitigate it, you want accurate short-term forecasts. Bloomberg profiles data scientist Youyang Gu, who built a model that 1) uses a standard susceptible-exposed-infectious-recovered approach (see Kevin Simler's simulation for a look at how these work), and 2) continuously trains the model based on measured deaths.
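The susceptible-exposed-infectious-recovered mechanism mentioned above can be sketched in a few lines. This is a generic discrete-time SEIR toy, not Gu's model; every parameter value below is an illustrative assumption.

```python
# Minimal discrete-time SEIR sketch. All parameter values are
# illustrative assumptions, not fitted values from any real model.
def seir(population, beta, incubation_days, infectious_days, days, seed_infections=100):
    s = population - seed_infections  # susceptible
    e = 0.0                           # exposed (infected, not yet infectious)
    i = float(seed_infections)        # infectious
    r = 0.0                           # recovered
    sigma = 1 / incubation_days       # rate of E -> I
    gamma = 1 / infectious_days       # rate of I -> R
    history = []
    for _ in range(days):
        new_exposed = beta * s * i / population  # S -> E: contacts with infectious people
        new_infectious = sigma * e               # E -> I
        new_recovered = gamma * i                # I -> R
        s -= new_exposed
        e += new_exposed - new_infectious
        i += new_infectious - new_recovered
        r += new_recovered
        history.append((s, e, i, r))
    return history

# With beta=0.5 and a 5-day infectious period, R0 = beta / gamma = 2.5,
# so the epidemic takes off before burning out.
curve = seir(population=1_000_000, beta=0.5, incubation_days=3, infectious_days=5, days=180)
peak_day = max(range(len(curve)), key=lambda d: curve[d][2])
print(peak_day, round(curve[-1][3]))  # day of peak infections, total recovered
```

Gu's twist, per the article, was not the compartment structure (which is standard) but continuously re-fitting it against measured deaths, the laggiest but cleanest signal.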

This is partly a story about an independent researcher coming up with a good approach to an important problem, but it's also a data science parable. A model could be trained based on case counts, percent-positive rates, hospitalizations, or deaths; deaths are the laggiest indicator, but they turn out to be the most reliable. In theory, data science is all about building really amazing models to extract signal from the noise, but in practice the biggest challenge seems to be getting and cleaning the right data. Even when the goal is to provide real-time information about a rapidly-changing situation, it can be better to use a reliable number that's released late than to use noisier data with more confounders that's available sooner.

Texas and Trading

Macquarie has raised its full-year earnings guidance from slightly lower year-over-year profits to a 5-10% increase ($, WSJ), entirely due to profits from commodity trading ahead of and during the Texas blackouts. One function of a trading desk is to look at the entire supply chain, spot changes first, and then figure out where those changes will have the largest price impact. Another, of course, is to make a market, quoting bid/ask spreads when people are desperate to trade. Macquarie played both roles.

What's more broadly interesting about this is what it says about the long-term model of investment banks, and about recent shifts in their exposures. A given investment bank will have many lines of business, some of which produce mediocre profits and a few of which mint money. If one trading desk at Macquarie can be responsible for ~7% of the entire company's annual profits in the space of a week or two, results are pretty skewed. The temptation is always to ask why a bank should bother with businesses that earn subpar returns, but if the bank is a bundle of options that have very spiky payouts, it makes sense to a) diversify all of its lines of business, and b) achieve economies of scale in a few firmwide tasks (getting cheap capital, recruiting people, and complying with regulations). In 2007 or so, banks seemed to be systematically short volatility: when the world got more normal, and when spreads between high-yield debt and safer instruments narrowed, they reported good numbers. When trends reversed, they tended to take big writedowns. Now, it's the risk-taking parts of banks that get the most upside from craziness.