The quantitative analyst and the fundamental stock picker, from different worlds and often working on separate floors, are increasingly putting their heads together to produce better results for investors.
That’s why the term “quantamental,” a blend of the two styles, may soon become common parlance among ordinary investors.
“There’s a very important synergy that is not hard to recognize coming from the quant perspective and coming from the fundamental perspective,” said Savita Subramanian, head of U.S. equity and quantitative strategy at Bank of America Merrill Lynch in New York.
Buffett vs. Simons
Fundamental investors track corporate earnings, balance sheets, industry trends, the economy and other sources of information to make informed investment decisions. Billionaires Warren Buffett and Charlie Munger of Berkshire Hathaway epitomize the midnight-oil-burning fundamental approach, with their focus on the ins and outs of a corporate balance sheet combined with their application of common-sense judgment. The finance world has worked that way for decades.
Quantitative analysis, on the other hand, uses mathematical and statistical modeling that pulls in a sometimes-dizzying array of inputs to screen investment ideas. The rising popularity of quantamental investing reflects advances on the quantitative front. Pioneering quant-focused hedge-fund managers, such as Renaissance Technologies’ James Simons, are more famous than ever, at least in the financial world. Colleges are even offering courses of study, such as the Hoboken, N.J.-based Stevens Institute of Technology’s bachelor’s degree in quantitative finance.
Quantitative hedge funds will soon top $1 trillion: They had $967 billion in assets under management as of the second quarter, roughly double the figure in 2010, according to Hedge Fund Research in Chicago. The number of quant-oriented hedge funds recently surpassed 2,000.
At the same time, fundamental stock pickers are under pressure from low-cost, passive, index-tracking strategies that have capitalized on the inability of the majority of active managers to beat their benchmarks in the long run. Data from New York-based S&P Dow Jones Indices, for example, found that nearly 97% of active, large-cap equity mutual funds underperformed their benchmark, after fees, in the five years through 2017.
Meeting of the minds
But now a growing number of fundamental money managers who had relied on traditional research to make investment decisions are beginning to incorporate computer-based methods that use algorithms to sift through reams of often arcane data in search of trading signals. Still, they’re not scrapping fundamental analysis, which relies on expertise and human judgment to add value.
Long-only, active managers with $680 billion in assets under management use a blend of quantitative and fundamental analysis, more than double that of a decade ago, according to data-analytics company eVestment in Atlanta. (“Long” refers to a bet on rising securities prices; “short,” the opposite.) Among high-profile investors, New York-based fund manager BlackRock in 2017 announced it was moving $30 billion in assets to strategies that would rely on more computers and fewer humans.
The combination makes sense, proponents say, because quants scour data such as forecasts for sales and corporate earnings or macroeconomic reports looking for anomalies. They need that data, produced by fundamental analysts who know their companies and sectors well, for their analysis. Fundamental analysts, meanwhile, can benefit from the clinical approach taken by the quants.
“Experience has taught me that if you look at even the most fundamentally oriented fund managers, they have a need for systematic output that shields them from being emotionally attached to particular investment ideas,” said Brian Cho, the head of quantitative research at Chiron Investment Management, a New York money-management firm that uses quantamental techniques.
An example of the quantamental approach is offered by Bank of America Merrill Lynch’s “alpha surprise” model, which applies a quantitative overlay to forecasts by the firm’s fundamental analysts. The strategy has beaten the benchmark S&P 500 Index (.SPX) in 23 of the past 30 years, Bank of America’s Subramanian said.
Fundamental analysts “fly around and visit the companies and talk to management and analyze the nitty-gritty,” she said. “Then we say, ‘OK, removing the emotion around the buy or sell decision, what stocks from this disciplined screening framework look like the highest-conviction picks from our fundamental team?’ ”
Data on the alpha surprise model go back to 1986, underlining the fact that the blending of quantitative and fundamental analysis isn’t brand new. But it’s just one way that the average investor can now use a growing toolbox of quantitative techniques. Analysts say the past decade in particular, with the advent of big data and the computer power to rapidly crunch a variety of data sets relatively cheaply, has helped drive interest in quant strategies.
“I don’t see a real fine line between fundamental and quant — it’s a spectrum,” said Dan Culloton, director of equity strategies at the Chicago investment-research firm Morningstar. “Every quant model begins with a fundamental decision about what factors to include.”
Competition from passive products
Meanwhile, a number of drivers are contributing to the growth of quantamental investing.
Those include competition from passive products that are forcing fundamental managers to justify their fees, said Alon Bochman, a partner in New York-based Genpact’s capital-markets consulting practice.
Index-based mutual funds and exchange-traded funds (ETFs) have swelled with assets, as companies including Vanguard Group argue that the great majority of traditional fund managers fail to beat their benchmarks. The Vanguard 500 Index ETF (VOO), which mirrors the benchmark S&P 500 Index, costs an investor only 4 basis points a year in fees, or 40 cents for every $1,000 invested. A typical large-cap mutual fund costs about 1.25% a year in fees, 30 times the Vanguard fund’s fees.
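The fee gap is easy to verify, and it compounds over time. A minimal sketch of the arithmetic, using the figures cited above (4 basis points vs. 1.25%) plus a hypothetical 7% gross annual return for the compounding example:

```python
def annual_fee(balance, expense_ratio):
    """Dollar cost of an expense ratio on a given balance."""
    return balance * expense_ratio

def terminal_value(principal, gross_return, expense_ratio, years):
    """Ending balance when fees compound against returns each year."""
    return principal * (1 + gross_return - expense_ratio) ** years

# 4 basis points on $1,000, as cited for the Vanguard 500 Index ETF
print(annual_fee(1_000, 0.0004))   # about 40 cents a year

# A typical 1.25% large-cap mutual fund fee on the same $1,000
print(annual_fee(1_000, 0.0125))   # about $12.50 a year

# The gap compounds: $10,000 over 30 years at an assumed 7% gross return
print(terminal_value(10_000, 0.07, 0.0004, 30))  # low-cost index fund
print(terminal_value(10_000, 0.07, 0.0125, 30))  # typical active fund
```

The 1.25% fee is roughly 31 times the 0.04% fee, consistent with the "30 times" comparison above; the 7% return is an assumption for illustration only.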
Advances in machine learning and big-data analysis have further fueled the popularity of quant techniques, said Lin William Cong, assistant professor of finance at the University of Chicago’s Booth School of Business.
Machine learning is a form of artificial intelligence that centers on computer programs that adapt to feedback without human interference. Big data is a broad term, referring to massive data sets that can be crunched in the hunt for patterns and trends.
Recipe for disaster
But while the symbiotic relationship between quantitative and fundamental analysis is accepted at a firm such as Bank of America Merrill Lynch, which has used those techniques for decades, or at a shop like Chiron that is built around a quantamental approach, that acceptance isn't universal.
Often, the temptation is for a firm to hire a group of quants, set up a data-science team, give them a budget and resources, and tell them to build their own models, said Bochman. That can be a recipe for disaster, he said, due to clashing cultures.
Fundamental analysts, while hardly technophobes, are often wary of computer-driven investing techniques, insisting that human judgment can't be replicated. That can make it difficult for fundamental stock pickers to mix with quantitative analysts, who are often better versed in mathematics or computer science than in the basics of the securities markets.
“The cultural issue is really, really hard to get through, and the way that it comes across to us is people from a fundamental background are very skeptical of automated solutions,” Bochman said.
That's because automation can be "psychologically threatening" to fundamental managers, and also because it has been employed poorly in the past, he said.
Among the biggest convulsions of the past two decades was the 1998 implosion of Long-Term Capital Management, the firm whose founders included two Nobel Prize winners but whose computer-driven interest-rate-arbitrage bets went awry due to the firm’s reliance on heavy leverage. The Russian financial crisis lit the fuse, and the U.S. Federal Reserve, in a dramatic move, organized a bank-led bailout to ward off a worldwide disaster.
“There have been so many movements that promised the Holy Grail through a computer, and they blew up spectacularly,” Bochman said.
The inroads of quantitative strategies reflect, in part, the advances in big data over the past decade. The ability to crunch large sets of data — along with access to new types of data — offers plenty of temptations.
One of the more curious pieces of data: satellite images that can be used to gauge retail sales by observing mall parking lots for cars. Even more nuanced information is at play. Software can now sift through earnings-call transcripts searching for changes in tone or word patterns. And those products can potentially be used for big advantages by a skilled analyst.
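Commercial transcript-mining tools are proprietary, but the core idea, flagging a shift toward cautionary language between quarters, can be sketched in a few lines. The word list and sample sentences here are made up for illustration; production systems use dictionaries running to thousands of terms:

```python
import re

# Illustrative list of cautionary words; a real dictionary is far larger
CAUTION_WORDS = {"headwinds", "uncertainty", "challenging", "softness", "delay"}

def caution_score(transcript: str) -> float:
    """Share of a transcript's words drawn from the caution list."""
    words = re.findall(r"[a-z']+", transcript.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in CAUTION_WORDS)
    return hits / len(words)

# Hypothetical snippets from two consecutive earnings calls
q1 = "Demand was strong and margins expanded across all segments."
q2 = "We see headwinds and uncertainty; the quarter was challenging."

# A jump in the score between quarters flags a change in tone for review
print(caution_score(q1), caution_score(q2))
```

A human analyst would then read the flagged passages; the screen only narrows the search.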
At the same time, “just because you have more data and you can do more analysis doesn’t mean you gain more insight,” said Chiron Investment Management’s Cho.
Rather than bolting on quantitative technology, a fundamental manager should instead take a hard look at each step of her investing process and look for ways to improve it, Bochman said.
For example, a fundamental stock picker might glean investment ideas from colleagues, news stories, conferences and a wide range of other sources. Quantitative analysis could then be used to capture all the ideas that ever came across a manager’s desk and let the computer sort them. The sources can then be judged according to which are the most productive.
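That kind of bookkeeping is straightforward to automate. A minimal sketch of ranking idea sources by how their ideas subsequently performed; the source names, idea counts and returns are invented for illustration:

```python
from collections import defaultdict

# Hypothetical log of ideas: (source, subsequent return of the idea)
idea_log = [
    ("conference", 0.08), ("colleague", -0.02), ("news", 0.01),
    ("conference", 0.05), ("colleague", 0.12), ("news", -0.04),
    ("colleague", 0.03),
]

def rank_sources(log):
    """Rank idea sources by the average return of the ideas they produced."""
    by_source = defaultdict(list)
    for source, ret in log:
        by_source[source].append(ret)
    averages = {s: sum(rets) / len(rets) for s, rets in by_source.items()}
    return sorted(averages.items(), key=lambda kv: kv[1], reverse=True)

# Most productive source first
print(rank_sources(idea_log))
```

With a real idea log, the same ranking tells a manager which channels deserve more of her attention.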
A fundamental stock picker might be wary of a quant team simply telling her a stock is a “buy.” But if the quant team can identify screening techniques that produce significantly better returns, the stock picker is more likely to become a believer, Bochman said. That same feedback loop can be applied to other parts of the process, including due diligence, investment committees, reviews, portfolio construction and so on.
And then there’s the issue of cost.
Simply put, it’s cheaper to run a computer model than it is to hire and train a well-educated person, Bank of America Merrill Lynch’s Subramanian said, noting that virtually every industry has seen massive cost cutting and automation. And, besides, there are tasks that computers can perform better than a person, such as calculating thousands of correlations among stocks and sectors.
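Pairwise correlation across thousands of names is exactly the kind of rote computation machines handle well. A toy version with made-up daily returns for three hypothetical tickers (a production system would run NumPy or pandas over large return matrices, but the math is the same):

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two return series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Made-up daily returns for three hypothetical stocks
returns = {
    "AAA": [0.010, -0.020, 0.015, 0.005],
    "BBB": [0.012, -0.018, 0.013, 0.004],   # moves with AAA
    "CCC": [-0.010, 0.020, -0.012, -0.003], # moves against AAA
}

# Every pairwise correlation; the same loop scales to thousands of names
names = list(returns)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        print(a, b, round(pearson(returns[a], returns[b]), 3))
```

For N stocks there are N(N-1)/2 pairs, so 2,000 names already mean nearly two million correlations, which is trivial for a computer and impossible by hand.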
In fact, the growth of quantamental techniques and advances in artificial intelligence and other technologies means that the number of analysts and others employed on Wall Street in the next 10 to 20 years is likely to fall sharply as productivity increases, Bochman said. For investors, that should mean cheaper products that perform better.
Time will tell if that promise is fulfilled. In the meantime, investors should probably exercise some caution and, as always, try to make informed choices.
“The investing public has to understand that not all strategies work all the time,” said Chiron’s Cho, “and the quantamental process is not going to be immune to that.”