Illustration by Sébastien Thibault for Fortune
By Ryan Bradley
December 15, 2017

It’s possible, without squinting, to see the past decade as one of unprecedented technological growth. It ushered in not only the smartphone but, with it, a cornucopia of apps for every imaginable want, need, or obsession. It was a decade in which Uber became a noun, a verb, and an analogy too—as in, the “Uber of X.” In these same 10 years, virtual reality headsets and drones joined the roster of everyday playthings; millennial friendships were saved by Venmo; and a completely stateless digital currency became an investor darling, even if most of Bitcoin’s fans still don’t quite grasp how it works. Genetic editing became extraordinarily accessible thanks to a tool named Crispr, and A.I. went from mastering Go to becoming the “self” in self-driving cars. There’s now an entire economy built on the mobile connectedness of the devices in our pockets. In 2007, most of these innovations didn’t exist.

So given all that, it would seem absurd to suggest that innovation is somehow less potent than it once was—that, somehow, all those new ideas we’re generating are offering less bang for the buck. But that is precisely what one group of economists has concluded. And their argument is worth assessing, because it has enormous implications for businesses, investors, and society as a whole.

Over the past decade, even as innovation has soared, growth in overall productivity as well as economic output has slowed. Both remain at anemic levels even after recent upticks, hovering at annual rates between 1% and 2% since the Great Recession, whereas historical averages are closer to 5% for GDP growth and 3% for productivity growth. Meanwhile, year by year, income inequality has widened. The economic benefits of the business world’s new ideas, in other words, do not seem to be widely shared: the innovations that were meant to transform the way we live aren’t unleashing the kinds of broad economic gains that actually make life better for most people.


Put another way: There’s a gap between the ambitions of our innovations and the scale of their impact. Understanding the causes of that gap could help us close it—and, in turn, convert more of business’s big ideas into the kind of broadly shared prosperity we’ve been pining for.


We have been in a period of remarkable, near-exponential growth since the beginning of the industrial age. Economists have explained this growth as the result, in part, of a more or less constant number of researchers and scientists diligently doing what researchers and scientists do. That constant, the thinking went, could be counted on to produce a steady stream of new ideas that would, in turn, generate economic expansion. So long as the number of researchers and scientists stays roughly constant, so does the potential for very strong growth.

But lately, cracks have begun appearing in this economic theory. This summer, a group of economics professors and graduate students at MIT and Stanford took a closer look at the link between research spending and economic growth. The title of their working paper, published by the National Bureau of Economic Research, got right to the point: “Are Ideas Getting Harder to Find?” Reading the paper, one quickly learns that, yes, ideas are getting harder to find, and a lot more expensive too.

The best example of the old theory—that constant research leads to exponential growth—is Moore’s law, which holds that the number of transistors on a computer chip, and with it processing power, doubles approximately every two years. This has pretty much been the case for more than 40 years, as happily evidenced by our mobile phones, which hold more sheer computing power than many buildings full of 1970s computers.

The problem, however, is that the research effort driving such a tremendous change hasn’t been a constant at all. In fact, as the “Ideas” paper’s authors found, sustaining that pace of progress now requires 78 times the inflation-adjusted funding it did when Nixon was President. The computers may be much smaller, but now it’s the people driving their advances who fill many, many buildings.
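
To make that arithmetic concrete, here is a minimal sketch—in Python, with illustrative figures rather than the paper’s actual data—of the ratio the economists track: idea output (here, the steady Moore’s-law doubling pace) divided by research effort (inflation-adjusted R&D funding). Only the 78x figure comes from the reporting above; the function name and sample numbers are assumptions for illustration.

```python
# Sketch of the "research productivity" ratio behind the 78x figure:
# idea output divided by research effort. Numbers are illustrative.

def research_productivity(idea_output_growth: float, research_effort: float) -> float:
    """Ideas produced per unit of (inflation-adjusted) research effort."""
    return idea_output_growth / research_effort

# Moore's law: chip density keeps doubling every ~2 years, so the idea
# output -- roughly a 35% annual growth rate -- has held steady...
annual_growth = 0.35
effort_early_1970s = 1.0   # normalize early-1970s research effort to 1
effort_today = 78.0        # ...while research effort is now 78x larger (per the article)

then = research_productivity(annual_growth, effort_early_1970s)
now = research_productivity(annual_growth, effort_today)
print(f"Research productivity has fallen by a factor of {then / now:.0f}")  # -> 78
```

Because the output side of the ratio—the doubling pace—has barely changed, the decline in research productivity is simply the increase in research effort.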


This trend isn’t limited to the tech sector. The study’s authors also looked at agriculture, specifically the yields of corn, soybean, cotton, and wheat. They compared the annual yields with funding for research into improving crop productivity. Spending increased, and so did yields, but not in the way you might expect. While the average yield doubled between 1960 and 2015, the annual inflation-adjusted investment in research to improve those yields increased at least threefold. In some cases—for certain crops, over certain years—the research investment increased by a factor of 25. Agricultural businesses seem to be spending more and more on R&D, all while getting less and less out of it.
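
The same back-of-the-envelope ratio can be applied to the crop figures above. The snippet below uses only the article’s round numbers and treats R&D spending as the measure of research effort; the paper’s own calculations rely on more careful data, so this is just a rough bound on the implied decline.

```python
# Rough bound on the drop in agricultural research productivity,
# using only the article's round numbers (assumptions noted above).

yield_growth = 2.0                          # average yields roughly doubled, 1960-2015
rd_growth_low, rd_growth_high = 3.0, 25.0   # research investment grew 3x to 25x

# Output grew 2x while input grew 3x-25x, so output per research dollar fell:
decline_low = rd_growth_low / yield_growth    # ~1.5x less productive
decline_high = rd_growth_high / yield_growth  # ~12.5x less productive
print(f"Implied decline in research productivity: {decline_low:.1f}x to {decline_high:.1f}x")
```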

It’s the same story in other fields too. The Stanford and MIT economists also looked at productivity in medical research—specifically in cancer. They found declines in productivity across the board. Even as more papers are published and more clinical trials are mounted, growth in the number of lives saved per 100,000 people continues to slow. While individual moments of progress bring astonishment and relief (as we’ve seen to some extent in cancer immunotherapy), on the whole more effort is yielding less impact. We would need to publish even more papers, and spend even more on clinical trials, just to keep up the pace of lifesaving advances we achieved decades ago.

Sure, increases in research spending tend to pay off in the short term for individual companies. A Fortune study of the S&P 500 showed that of the 155 companies that reported increased R&D spending between 2007 and 2017, more than two-thirds—108 in all—had stock returns that beat the index’s. But generating stock market outperformance isn’t the same thing as having a broad, positive impact on the economy as a whole. Across all publicly traded companies, the “Ideas” paper’s authors found that to produce the same rate of growth our economy experienced 30 years ago, companies would need to spend 15 times as much on R&D. The bottom line: Big ideas—the ones that truly drive economic growth or raise the standard of living—are harder, and costlier, to find than ever before.

Why is that so? One reason such innovations are increasingly difficult to come by is that, as a field advances, the base of fundamental knowledge a newcomer must master grows. Just becoming proficient in many areas of science or industry today requires a significantly larger investment in education and training than it did a generation ago.

Another factor is the cost of pure research. Yes, that’s growing too. Equipment has generally become more expensive and, relatedly, exclusive. Fewer people have access to it. And just as we’ve moved from single computers to supercomputers, we’ve moved from a culture of individual researchers making breakthroughs to one of massive teams of highly educated and compensated experts attempting to solve far more complicated problems. We’re way past the low-hanging fruit, trying to build the tools and systems that will get us to the tippy-top of the tree.


The “Ideas” paper, and others like it, might make it sound as though we’re staring down the barrel of the end of innovation. Do not despair. It is entirely possible, even likely, that we’re simply approaching the end of boom times in particular fields—the tops of certain already well-trimmed trees. From the invention of the computer to the rise of IT and the Internet, we have had the undercurrent of Moore’s law churning away. Now that current is slowing down—but others could someday lend their velocity to a growing economy.

“It’s not true that because we’ve hit a hard place, we’ve reached peak ideas,” says Michael Webb, a Ph.D. candidate in economics at Stanford and one of the study’s authors. “A good way to think of it, perhaps, is like prospecting for oil. Within a given oilfield, you get most of the oil out, and it becomes really expensive to get the rest. We’ve been pumping out the oil from IT for a very long time now, but there’s a whole new oilfield out there we haven’t found yet.”

One such field, Webb suggests, is genomics, which is experiencing a flourishing of new applications thanks to the gene-editing tool known as Crispr (see “America’s Big Ideas in Waiting”). We’re still in the very early, exploratory days of what may become the medical and economic equivalent of a rich oilfield: cheap and abundant ideas that pay off in reduced medical costs and longer productive life spans. The same goes for virtual reality, which has been around longer and may end up being the medium of choice in the future—driving explosive growth.

Which brings up a secondary but no less important point: There are plenty of current technologies that won’t look cost-effective unless and until they really do become transformative—at which point, in hindsight, they will appear to be relative bargains. The Internet was a geeky academics’ hobby until the mid-1990s. Today it’s an economic pillar.


Nearly every economist I spoke to pointed to artificial intelligence as an especially promising source of ideas that could catalyze enormous leaps in growth and productivity. A.I. is also a great example of how expensive new ideas can be: The computing power required is enormous, the expertise tremendous, and the jobs incredibly well-paying (and thus costly for employers).

Benjamin Jones, an economist at Northwestern who studies entrepreneurship and has a forthcoming paper on A.I., says that, in perhaps a counterintuitive way, population growth can help reduce the cost of generating new ideas. “We’re fully able to throw more people at the problem,” he explains. Costs of employment should level off, in theory at least, as China and India—by far the two most populous nations—continue to integrate into the research engines feeding the global economy.

But having more people to throw at complex problems helps us only if more people are highly educated. In other words: The talent base grows (making ideas cheaper) only if we make a point of cultivating and supporting talent. As the evolutionary biologist Stephen Jay Gould once put it, “I am, somehow, less interested in the weight and convolutions of Einstein’s brain than in the near certainty that people of equal talent have lived and died in cotton fields and sweatshops.”

John Van Reenen, an economist at MIT and another of the “Ideas” paper’s authors, has studied talent cultivation in the U.S. by examining who becomes an inventor. He found that children born into the top 1% of the income distribution were 10 times as likely to become inventors as those born into the bottom 50%. He also found that men were more likely to become inventors, and minorities less likely.

Van Reenen’s research nicely illustrates a problem of opportunity. If we believe that a smart kid from anywhere could become an inventor—a belief that is integral to both the economy and the story we tell ourselves about the American dream—then we need to support the policies that make that possible. “Many kids don’t even get the opportunity to imagine what it would be like to become an inventor, what that job is, what it looks like,” says Van Reenen. “A huge part of this is education, exposure, allowing someone to dream of better possibilities for themselves.”

Van Reenen realizes that his solution might not seem as sexy as erecting a steel-and-glass building filled with brilliant researchers. But policy fixes as simple as better funding for underfunded public schools could be a far cheaper way to close the innovation gap than other, more complex remedies. In an increasingly knowledge-fueled economy, Van Reenen says, “This is the low-hanging fruit.”


America’s Big Ideas in Waiting

by Erica Fry

Crispr

Crispr-Cas9, a gene-editing system.
Molekuul—Science Photo Library/Getty Images

Earlier this year, researchers in Oregon changed the DNA of human embryos. Using Crispr, the breakthrough genome-editing technology that acts like a sort of molecular scissors, the scientists repaired a genetic mutation—one that causes a heritable heart condition—in the embryos’ DNA. That once-unthinkable, almost godlike feat is just one of a number of stunning Crispr experiments that in recent years have taken the scientific world by storm.

The rapidly evolving technology—at its core, an RNA-guided bacterial protein—allows scientists to snip and edit problematic DNA with relative speed and ease, an ability that holds tantalizing promise for treating (or even curing) genetic disease, whether in humans, plants, or animals. Think drought-resistant crops and children free of their parents’ hereditary diseases. There are still plenty of kinks to work out, but scientists have already used Crispr to, among other things, restore hearing in mice, create low-fat pigs, and delay the browning of mushrooms.

Virtual Reality

An Airbus staffer views a cockpit with VR.
Kyodo News—Getty Images

Virtual reality may seem like a nifty, if still somewhat clunky, technology for video enthusiasts and gamers—albeit one with side benefits for online shopping, chatting, and entertainment. But leisure isn’t all those headsets are good for. VR and its cousin, augmented reality, are also changing—and improving—the way we perform at work.

The practical applications for immersive, virtual experience are myriad—from speeding and enriching product development (engineers and manufacturers can preview and tinker virtually) to training medical students and sharpening the skills of surgeons (who can practice before tricky procedures). Athletes use it to prep for big games; real estate professionals use it to show homes and spaces; and journalists and entertainers use it to enhance their content. Corporations from Airbus to Ford to Marriott to Carnival (the cruise line) have all found uses.

After all, experience is the best teacher, and increasingly—often because of VR—you don’t have to depend on the real world to get it.

Artificial Intelligence

Amazon's A.I.-powered Echo Dot.
Courtesy of Amazon

After decades of fantasizing—and fretting—about the rise of artificial intelligence, the revolution (if not the singularity) finally seems nigh. From voice-activated assistants like the Amazon Echo to self-driving cars, machines are swiftly getting smarter, thanks to relatively recent breakthroughs in “deep learning.” Deep learning enables software—fed ever-accumulating reams of data and ever-increasing computing power—to recognize patterns and perform complex tasks much more quickly and reliably than humans ever could.
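
To give a flavor of what “recognizing patterns from data” means in practice, here is a deliberately tiny sketch: a two-layer neural network, written in plain Python with NumPy, that learns the classic XOR pattern from four labeled examples. It is a toy—real deep learning systems use millions of parameters and specialized hardware—but the basic loop is the same: feed data forward, measure the error, nudge the weights.

```python
# Toy two-layer neural network (plain NumPy) that learns the XOR pattern.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # four input examples
y = np.array([[0], [1], [1], [0]], dtype=float)              # XOR labels

W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)   # input -> hidden layer
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)   # hidden -> output layer

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 2.0
for _ in range(10_000):
    # Forward pass: predictions from the current weights.
    h = sigmoid(X @ W1 + b1)
    pred = sigmoid(h @ W2 + b2)

    # Backward pass: gradient of the mean-squared error for each weight.
    d_out = (pred - y) * pred * (1 - pred)
    d_hid = (d_out @ W2.T) * h * (1 - h)

    # Nudge the weights downhill.
    W2 -= lr * (h.T @ d_out) / len(X)
    b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * (X.T @ d_hid) / len(X)
    b1 -= lr * d_hid.mean(axis=0)

final = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
print(np.round(final.ravel(), 2))  # approaches [0, 1, 1, 0] as the network learns XOR
```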

That means machines may be better suited to the time-consuming activities we humans do imperfectly—like picking stocks, diagnosing disease, detecting fraud, and identifying drug targets. In some cases, machines will make us more productive in the jobs we have; in others, they’ll make us obsolete. The stakes in this race are so high that Russian President Vladimir Putin has framed the matter as a modern-day space race: Whoever leads in A.I., he told a group of students earlier this year, will rule the world.


A version of this article appears in the Dec. 15, 2017 issue of Fortune with the headline “Closing America’s Idea Gap.”
