The Incredible Flattening Yield Curve

This is a pretty amazing image, courtesy of J.P. Morgan Asset Management:

2018_0630_US_Yield_Curve
Source: J.P. Morgan Asset Management (obviously)

People are really starting to worry the Fed is going to invert the curve. Historically, an inverted curve (short rates above long rates) has been a pretty good recession indicator. I don’t have a particularly strong opinion about the direction of interest rates, especially now that we are above 2% on the 2-Year. But I do think this chart is telling us something.

If the curve is basically flat from 7 years on out to 30 years, that is not exactly a ringing endorsement of long-term growth and inflation prospects. I’ve heard from some fixed income people that it’s demand for long-dated paper from overseas buyers holding the 30-year yield down. I’ll buy that. But it’s still telling us something about supply and demand for capital along various time horizons.

Namely: we’ve got an awful lot of long duration capital out there looking for a home, and not enough opportunities to absorb it all.

2Q18 US Factor Returns

Below are my latest factor return charts. I update these at quarterly intervals, but the underlying data, from Ken French’s Data Library, lags by a month.

Not much to write home about this quarter. The divergence over the past few years between the Momentum and Market factors and the remaining, more value-oriented factors (Value (Price/Book), Operating Profitability, Conservative Investment) remains striking.

The Size factor has also performed well year-to-date. May was a particularly good month for Size (+4.78%) and Momentum (+4.02%). In traditional “style box” terms, this corresponds to small cap growth stocks.

2Q18_Factor_Averages
Source: Ken French’s Data Library
2Q18_Market
Source: Ken French’s Data Library
2Q18_Size
Source: Ken French’s Data Library
2Q18_Value
Source: Ken French’s Data Library
2Q18_Momentum
Source: Ken French’s Data Library
2Q18_Profitability
Source: Ken French’s Data Library
2Q18_Investment
Source: Ken French’s Data Library
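If you want to pull the underlying data yourself, here is a rough sketch of how I’d do it, assuming pandas_datareader’s “famafrench” reader and the dataset names below are still current (the columns map to the factors above: SMB is Size, HML is Value, RMW is Profitability, CMA is Investment, Mom is Momentum):

```python
# Sketch: pull the monthly Fama-French factor returns and average them by quarter.
import pandas_datareader.data as web

five_factor = web.DataReader("F-F_Research_Data_5_Factors_2x3", "famafrench",
                             start="2013-01-01")[0]
momentum = web.DataReader("F-F_Momentum_Factor", "famafrench",
                          start="2013-01-01")[0]

factors = five_factor.join(momentum)
factors.columns = factors.columns.str.strip()    # raw column names carry stray whitespace
factors.index = factors.index.to_timestamp()     # PeriodIndex -> timestamps for resampling

# Monthly long-short factor returns, in percent; average them by quarter.
quarterly_averages = factors.resample("Q").mean()
print(quarterly_averages.tail())
```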

Nerd Stuff: Factor Valuation Edition

I have to give Research Affiliates some serious props for their interactive (and, yes, free) online tools. I mentioned the asset allocation tool in a post from earlier this week. If you didn’t check out the tool then, you really should.

I did not realize until this morning that Research Affiliates also has a similar tool for factors, called Smart Beta Interactive. It allows you to slice and dice factor strategies as well as the underlying factors themselves. I highly recommend checking this one out, too.

Anyway, this post isn’t meant to be a Research Affiliates commercial. Instead, this is going to be a post on reflexivity. Behold, factor valuations for the US market:

RAFI_1Q18_Factor_Valuations
Source: Research Affiliates

Regarding their methodology, Research Affiliates states:

Just like stocks, bonds, sectors, countries, or any other financial instrument, equity factors and the strategies based on them can become cheap or expensive. We measure relative valuations of the long vs. short sides to estimate how cheap or expensive a factor is. We find that when relative valuation is low compared to its own history, that factor is positioned to outperform. When valuation is high it is likely to disappoint.
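Mechanically, the measure is simpler than it sounds. Here is a stylized sketch of the idea (my own simplification, not Research Affiliates’ exact methodology, which aggregates several valuation metrics): value the long leg of the factor relative to the short leg, then ask where today’s reading sits in its own history.

```python
import numpy as np
import pandas as pd

def relative_valuation_percentile(pb_long: pd.Series, pb_short: pd.Series) -> float:
    """pb_long and pb_short are time series of aggregate price-to-book for the
    long and short legs of a factor portfolio (illustrative inputs)."""
    rel = np.log(pb_long / pb_short)        # log ratio of long-leg to short-leg valuation
    return (rel <= rel.iloc[-1]).mean()     # rank of today vs. history: near 0 = cheap, near 1 = rich
```

The interesting part is the empirical finding: factors have tended to do well after this sort of measure gets low relative to its own history, and poorly after it gets high.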

This is reflexivity in action. Briefly, reflexivity is a concept popularized by George Soros. The idea is that by taking advantage of perceived opportunities in the markets, we change the nature of the opportunities. Howard Marks likens this to a golf course where the terrain changes in response to each shot.

Here’s how this happens in practice:

Step 1: Someone figures out something that generates excess returns. That person makes money hand over fist.

Step 2: Other people either figure the “something” out on their own or they copy the person who is making money hand over fist.

Step 3: As people pile into the trade, the “something” becomes more and more expensive.

Step 4: The “something” becomes fairly valued.

Step 5: The “something” becomes overvalued.

Step 6: People realize the “something” has gotten so expensive it cannot possibly generate a reasonable return in the future. If prices have gotten really out of hand (and particularly if leverage is involved) there will be a crash. Otherwise future returns may simply settle down to “meh” levels.

Step 7: As the “something” shows weaker and weaker performance, it gets cheaper and cheaper, until some contrarian sees a high enough expected return and starts buying. The cycle then repeats. Obviously these cycles vary dramatically in their magnitude and length.

I do not consider myself a quant by any means, but I think the two most important things for quants to understand are: 1) why a factor or strategy should work in the first place, explained in terms of basic economic or behavioral principles; and 2) reflexivity.

Many people believe AI is going to push humans out of the financial markets. There is some truth in this. Big mutual fund companies that have built businesses on the old “style box” approach to portfolio construction are in trouble. The quants can build similar funds with more targeted exposures, in a more tax efficient ETF wrapper, and with lower expenses.

What I think people underweight is the impact of reflexivity. If the AIs aren’t trained to understand reflexivity, they will cause some nasty losses at some point. Personally, I think there will be an AI-driven financial crisis some day, and that it will have its roots in AI herding behavior. We are probably a ways away from that. But technology moves pretty fast. So maybe it will come sooner than I think.

Anyway, back to factor valuations.

What stands out to me is Momentum and Illiquidity at the upper ends of their historical valuation ranges. On the Momentum side this is stuff like FANG or FAANMG or whatever the acronym happens to be this week. On the Illiquidity side it’s private equity and venture capital. If you have read past posts of mine you know I believe most private equity investors these days are lambs headed to slaughter.

There tends to be a lot of antipathy between quant and fundamental people. Even (perhaps especially) if they are co-workers. The fundamental people are afraid of the quants. Partly because they are afraid of the math (a less valid fear), and partly because they see the quants as a threat (a more valid fear). Quants, meanwhile, tend to believe the fundamental people are just winging it.

In reality I think this is more an issue of language barrier and professional rivalry than true disagreement over how markets work or what is happening in the markets at a given point in time. In my experience, the best fundamental investors employ quant-like pattern recognition in filtering and processing ideas. Many quants, meanwhile, are using the same variables the fundamental people look at to build their models.

Personally, I think anyone who wants to survive in the investment profession over the next twenty years is going to have to be something of a cyborg.

Though, come to think of it, that probably applies to every industry.

Lies, Damn Lies and Active Share

These days it is fashionable to evaluate investment managers on a statistic called active share. Active share measures how different a fund’s holdings are from those of a benchmark. Specifically, it compares the portfolio’s weights, holding by holding, to the weights of a benchmark index.

If I own everything in the S&P 500 in the same proportions as the index, my active share is 0%. In theory an index fund would have 0% active share, but transactional frictions will create small differences.

Anyway, if I want high active share I can get it in several ways:

  • Own things that aren’t in the benchmark
  • Refuse to own things that are in the benchmark
  • Underweight things versus the benchmark
  • Overweight things versus the benchmark

All active share can tell you is that a thing is different from a given index. Full stop.

Shrewd marketing people have done their best to distort this to mean “funds with high active share are better.” This is nonsense. If I pick 10 stocks outside the S&P 500 at random I will show an active share of 100%. You would have to be an idiot to buy my fund on the basis of its active share.
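To make that concrete, the standard calculation is half the sum of the absolute differences between the fund’s weights and the benchmark’s weights. A quick sketch with made-up weights:

```python
def active_share(fund: dict, bench: dict) -> float:
    """Half the sum of absolute weight differences across the union of holdings."""
    names = set(fund) | set(bench)
    return 0.5 * sum(abs(fund.get(n, 0.0) - bench.get(n, 0.0)) for n in names)

# Toy three-stock "benchmark" (weights are made up and sum to 1):
bench = {"StockA": 0.50, "StockB": 0.30, "StockC": 0.20}

print(active_share(bench, bench))   # 0.0 -> identical to the index
print(active_share({f"RandomPick{i}": 0.10 for i in range(10)}, bench))   # 1.0 -> 100% active share
# Nothing in the calculation knows whether those ten random picks are any good.
```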

Shrewd marketing people get traction with the notion that “funds with high active share are better” because it IS true that dramatic outperformance results from being 1) very different, and 2) very right. Very different on its own doesn’t get the job done. Being very different and very wrong, for example, is a ticket to the poorhouse.

Active share is a popular statistic because it is easy to calculate and easy to understand. People are always on the lookout for that “one weird trick” they can use to hack the system for more money, better looks and lots of sex.

Unfortunately that’s not how quantitative analysis works.

Quantitative analysis isn’t “pure” mathematical reasoning. It’s inductive reasoning. When we prove things in mathematics, we know they are true. We don’t actually prove anything using statistics. Rather, we “fail to reject the null hypothesis at such-and-such a confidence level.” This is the problem of induction.
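As a concrete example, here is roughly what that looks like for a strategy’s track record (simulated, hypothetical numbers):

```python
# Sketch: test whether a strategy's average monthly excess return is
# distinguishable from zero. The returns here are simulated, not real.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
excess_returns = rng.normal(0.002, 0.03, size=60)   # five years of monthly excess returns

t_stat, p_value = stats.ttest_1samp(excess_returns, popmean=0.0)
# A small p-value lets us reject "the true mean excess return is zero" at some
# confidence level. It never proves the edge exists, or that it will persist.
print(t_stat, p_value)
```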

Doing a statistical analysis of an investment strategy is like trying to assemble a puzzle where the pieces are constantly changing shape (albeit pretty slowly and by pretty small amounts). Active share is just one of those pieces. Even then you have to recognize that the results of the analysis are backward looking. There’s no guarantee those statistical relationships will persist in the future.

So, you know, caveat emptor.

Cause and Effect

This is a quick follow-on from an older post. That post discussed the issue of low interest rates and their impact on justified valuation multiples. I wrote:

A popular contrarian narrative in the markets is that central bankers have artificially suppressed interest rates, and that absent their interventions the “natural” rate of interest would be higher (implying a higher discount rate and thus lower sustainable valuation multiples). The key risk to this thesis is that low rates are not some exogenous happening imposed on the market by a bunch of cognac swilling technocrats, but rather a consequence of secular shifts in the global supply and demand for funds. Specifically, that these days there is a whole boatload of money out there that needs to be invested to fund future liabilities and too few attractive investment opportunities to absorb it all.

If low rates are actually a function of the supply and demand for funds, it doesn’t ultimately matter what central bankers intend to do with monetary policy. Market forces will keep rates low and elevated valuations will remain justified.

A friend questioned what I was driving at here, and whether it would be possible to falsify this thesis. For the record, I have no idea whether I’m right or wrong. I’m just trying to envision different possibilities.

That said, I am pretty sure the answer lies in the shape of the yield curve.

As many, many, many commentators have already observed, the Treasury yield curve hasn’t made a parallel shift upward as the Fed raised short-term rates. The short end of the curve has come up pretty significantly but the long end has basically held steady. This is important because central banks tend to have less influence over long-term rates than short-term rates.

chart
Source: Bondsupermart.com

As the Fed continues to shrink its balance sheet, what we would hope to see is the yield curve making a nice, steady, parallel shift upward. What we do not want to see is the 30-year Treasury yield stuck at 3%. The 30-year Treasury yield stuck at 3%, in the absence of Fed intervention, would support the theory that there are structural factors holding down future expected returns. Namely: an excess supply of financial capital relative to opportunities.

My previous posts on this subject have dealt with the risks of naively extrapolating very low interest rates forever. You can attack the issue from different angles but each case more or less boils down to overpaying for risky cash flows.

What I have not done is explored strategies for taking advantage of such an environment. As with the risks, you can attack the issue from a number of different angles. But again, they share a common thread. Here each strategy more or less boils down to taking on duration.

I want to examine this further in a future post, but here is a little teaser…

Duration is most commonly used to analyze interest rate risk in the fixed income world. But the concept can also be applied to other asset classes. Long duration equities are things like venture capital and development stage biotech companies, where cash flows are but a twinkle in your eye when you invest. Long duration equities usually can’t sustain themselves without repeated infusions of investor cash. They thrive when capital is cheap. They die when capital gets expensive.
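A quick illustration of the mechanics, using simple discounting and made-up rates: the further out the cash flow, the harder its present value gets hit when the discount rate rises.

```python
# Present value of a $100 cash flow at different horizons and discount rates.
def pv(cash_flow: float, rate: float, years: float) -> float:
    return cash_flow / (1 + rate) ** years

for years in (2, 10, 30):
    low, high = pv(100, 0.03, years), pv(100, 0.06, years)
    print(f"{years}-year cash flow: PV at 3% = {low:.1f}, at 6% = {high:.1f}, "
          f"drop = {1 - high / low:.0%}")
# The 30-year cash flow loses well over half its value; the 2-year barely notices.
```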

If you knew capital was going to remain cheap forever, you would probably want to make long duration equities a significant portion of your portfolio. You could get comfortable investing in really big ideas that would take a long time to be profitable. I am talking about massive, capital intensive projects with the potential to change the world (think SpaceX).

And here’s where I might start getting a little loopy…

…because what if an excess of financial capital is a precondition for tackling the really big projects that will advance us as a species?

Futures Did Not Crash The Bitcoin Price

Longtime readers know I am largely a crypto skeptic. Specifically, I am one of those annoying the-principles-underlying-crypto-are-undeniably-transformational-but-I-am-skeptical-of-the-investability-today people. However, there is one crypto myth I really cannot abide: that the start of futures trading is responsible for the crypto drawdown that started in January 2018.

I first wrote about this issue of Bitcoin futures here.

But this isn’t that complicated. The reality is that BTC futures volumes are low. Like really low in comparison to volumes for BTC overall. Below is the data for BTC:

BTC_Trading_Volume
Source: data.bitcoinity.org

And here is the data from CME Group for its contract (note: multiply by 5 because each CME contract is for 5 BTC):

CME_BTC_Futures_Volume
Source: CME Group

Let’s double that to account for the fact that CBOE offers its own Bitcoin futures. Even then you are talking about maybe 30,000 BTC worth of total volume. I struggle to believe these meager volumes are pushing the market around.

More importantly, just because I sell a BTC futures contract does not mean the spot price of BTC automatically drops.

Someone has got to push the sell button in the spot market for that to happen. My selling of BTC futures does not in and of itself compel anyone to sell spot BTC. It may encourage someone to come in and take a position based on how my order impacts the term structure of BTC futures in relation to the spot price. But that is a very different proposition from “I am short Bitcoin futures so now the spot market is falling.”

Now maybe there is data out there, beyond someone simply lining up dates, that shows a clear causal relationship between the BTC selloff and the start of futures trading, but I have yet to see it (if you have such data, please get in touch).

Otherwise, repeat after me: correlation is not causation.

Deworsification [WONKISH]

If you are not interested in the mathematics of portfolio construction you can safely skip this post. This is a (relatively) plain language summary of a research paper published in The Financial Analysts Journal. It is not investment advice and should not be used as the basis for any investment decision.

One issue I have been interested in for a long time is overdiversification in investment portfolios. We are conditioned by portfolio theory to accept diversification as a universal good. However, depending on the investor’s objectives, diversification can be counterproductive, particularly when higher-cost investment strategies are involved. This post examines the research paper “What Free Lunch? The Costs of Over Diversification” by Shawn McKay, Robert Shapiro, and Ric Thomas, which offers a rigorous treatment of the issue.

Summary

The authors use empirical and simulated data to develop a framework for assessing the optimal number of active managers in an investment allocation. They find that as one adds managers to an allocation, active risk (a.k.a. “tracking error”) decreases while investment management expenses remain constant or even increase. This leads to the problem of “overdiversification” or, more colloquially, “deworsification.”

Source: McKay, et al.

The authors propose two measures to analyze the impact of overdiversification:

Fees For Active Risk (FAR) = Fees / Active Risk

Fees For Active Share (FAS) = Fees / Active Share

All else equal, one would like the FAR and FAS ratios to be as low as possible.

Source: McKay, et al.
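Here is a quick sketch of the two ratios with made-up numbers for a concentrated lineup versus a heavily diversified one (the figures are illustrative, not from the paper):

```python
# Fees and active risk (tracking error) in percent, active share as a fraction.
def far(fee: float, active_risk: float) -> float:
    return fee / active_risk       # fees paid per unit of active risk

def fas(fee: float, active_share: float) -> float:
    return fee / active_share      # fees paid per unit of active share

# As managers are added, fees stay roughly flat while tracking error and active
# share diversify away, so both ratios deteriorate (rise). Illustrative inputs:
lineups = {"3 managers":  {"fee": 0.60, "active_risk": 4.0, "active_share": 0.85},
           "15 managers": {"fee": 0.60, "active_risk": 1.5, "active_share": 0.45}}
for label, p in lineups.items():
    print(label,
          round(far(p["fee"], p["active_risk"]), 2),
          round(fas(p["fee"], p["active_share"]), 2))
```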

However, perhaps the most important conclusion the authors reach is that as active risk decreases, the security selection skill needed to deliver outperformance versus a benchmark rises exponentially:

Holding breadth [portfolio size] constant allows us to develop a framework that illustrates the trade-offs between active risk and the information coefficient for various levels of expected return. Each line in Figure 5 is an isometric line, highlighting various combinations that give a fixed level of expected return. The curve at the bottom shows all combinations of active risk and the information coefficient in which the excess return equals 1% when holding breadth constant at 100. The two other lines show the same trade-offs for breadth levels of 60 and 20, respectively.

As expected, the required information coefficient increases as tracking error declines, but it rises exponentially as we approach lower levels of active risk. Allowing for greater breadth shifts the line downward, beneficially, but in all cases, there is a similar convex relationship.

Active_Risk_vs_Skill
Source: McKay, et al.
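To see the convexity, here is a rough sketch using the classic “fundamental law of active management” approximation, expected excess return ≈ IC × √breadth × active risk. This is a simplification of the paper’s framework, for intuition only.

```python
import numpy as np

def required_ic(target_excess: float, breadth: int, active_risk: float) -> float:
    # Skill (information coefficient) needed to hit a target excess return,
    # given breadth and active risk, under the fundamental-law approximation.
    return target_excess / (np.sqrt(breadth) * active_risk)

breadth = 100                                     # matches the bottom curve in the quote
for active_risk in (4.0, 2.0, 1.0, 0.5):          # tracking error, in percent
    ic = required_ic(1.0, breadth, active_risk)   # skill needed for 1% excess return
    print(f"active risk {active_risk}%: required IC = {ic:.3f}")
# Halving active risk doubles the required skill -- hence the hockey-stick shape.
```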

Practical Implications

  • The more diversified your allocation, the more difficult the relative performance game gets due to increasing fee drag on decreasing levels of active risk.
  • Investors who are aiming for significant outperformance via active management should concentrate capital with a small number of managers.
  • Investors who desire highly diversified portfolios are thus better off allocating to passive, factor, and enhanced-index funds than to dozens of highly active equity managers.
  • Capacity and fiduciary constraints make it especially challenging for large, capacity-constrained investors such as pension funds to generate substantial alpha at the portfolio level, as it is imprudent for them to run highly concentrated portfolios. For these investors in particular, a core-satellite approach likely makes sense.