The phrase diminishing returns is such a cliché that few people give it much thought. Picking out the pecans from a bowl of salted nuts gives diminishing returns: The pieces of pecan in the bowl get rarer and smaller. The fingers keep finding almonds, hazelnuts, cashews, or even—God forbid—Brazil nuts. Gradually the bowl, like a moribund gold mine, ceases to yield decent returns of pecan.
Now imagine a bowl of nuts that has the opposite character. The more pecans you take, the larger and more numerous they grow. That has been the human experience for the last 100,000 years. The global nut bowl has yielded ever more pecans.
Nobody predicted this. The pioneers of political economy expected eventual stagnation. Adam Smith, David Ricardo, and Robert Malthus all predicted that diminishing returns would eventually set in, that the improvement in living standards they were seeing would peter out. “The discovery, and useful application of machinery, always leads to the increase of the net produce of the country, although it may not, and will not, after an inconsiderable interval, increase the value of that net produce,” said Ricardo, who perceived an inexorable tendency toward what he called a “stationary state.” John Stuart Mill, conceding that returns were showing no signs of diminishing in the 1840s, put it down to luck. Innovation, he said, was an external factor, a cause but not an effect of economic growth.
Even Mill’s modest optimism was not shared by his successors. They feared that as discovery began to slow, competition would drive profits out of the increasingly perfect market until all that was left was rent and monopoly. With Smith’s invisible hand guiding myriad market participants possessing perfect information to profitless equilibria and vanishing returns, neoclassical economics gloomily forecast the end of growth.
To explain the modern global economy’s bottomless nut bowl, you must explain where the perpetual innovation machine and its increasing returns came from. They were not planned, directed, or ordered. They emerged, evolved, bottom up, from specialization and exchange. The accelerated exchange of ideas and people made possible by technology fueled the accelerating growth of wealth that has characterized the last century. Politicians, capitalists, and officials are flotsam bobbing upriver on the tide of invention.
Even so, the generation of new useful knowledge is far from uniform, steady, or continuous. Innovation is like a bush fire that burns brightly for a short time, then dies down before flaring up somewhere else. Fifty thousand years ago, the hottest hot spot was west Asia (ovens, bows and arrows); 10,000 years ago, the Fertile Crescent (farming, pottery); 5,000 years ago, Mesopotamia (metal, cities); 2,000 years ago, India (textiles, zero); 1,000 years ago, China (porcelain, printing); 500 years ago, Italy (double-entry bookkeeping, Leonardo); 400 years ago, the Low Countries (the Amsterdam Exchange Bank); 300 years ago, France (Canal du Midi); 200 years ago, England (steam); 100 years ago, Germany (fertilizer); 75 years ago, America (mass production); 50 years ago, California (credit card); 25 years ago, Japan (Walkman). No place remains for long the leader in knowledge creation.
Just as the bush fire breaks out in different parts of the world at different times, so it leaps from technology to technology. Today, as during the printing revolution of 500 years ago, communication is aflame with increasing returns, but transport is spluttering with diminishing returns. A greater and greater amount of effort is needed to squeeze the next few miles per gallon out of vehicles of any kind, whereas each additional tranche of megabits comes more cheaply.
But the greatest impact of an increasing-return wave comes long after the technology is invented. It comes when the technology is democratized. Gutenberg’s printing press took decades to generate the Reformation. Today’s container ships go not much faster than a 19th-century steamship, and today’s Internet sends each pulse little quicker than a 19th-century telegraph—but everybody is using them, not just the rich. Jets travel at the same speeds they did in the 1970s, but budget airlines are new.
So what is the flywheel of the perpetual innovation machine that drives the modern world? Why has innovation become routine? How was it that, in Alfred North Whitehead’s words, “The greatest invention of the 19th century was the invention of the method of invention?”
Francis Bacon was the first to make the case that inventors are applying the work of discoverers and that science is the father of invention. Modern politicians agree. The recipe for making new ideas is easy, they say: Pour public money into science, which is a public good because nobody will pay for the generation of ideas if the taxpayer does not, then watch new technologies emerge from the downstream end of the pipe.
It used to be popular to argue that the European scientific revolution of the 17th century unleashed the rational curiosity of the educated classes, whose theories were then applied in the form of new technologies, which in turn allowed standards of living to rise. But history shows this account is backward. Few of the inventions that made the industrial revolution owed anything to theory.
It is true that England had a scientific revolution in the late 1600s, but the influence of scientists like Isaac Newton and Robert Hooke on what happened in England’s manufacturing industry in the following century was negligible. The industry that was transformed first and most, cotton spinning and weaving, was of little interest to scientists. The jennies, gins, frames, mules, and looms that revolutionized the working of cotton were invented by tinkering businessmen, not thinking boffins. It has been said that nothing in their designs would have puzzled Archimedes.
Even the later stages of the industrial revolution are replete with examples of technologies that were developed in remarkable ignorance of why they worked. This was especially true in the biological world. Aspirin was curing headaches for more than a century before anybody had the faintest idea of how it did so. Penicillin’s ability to kill bacteria was finally understood around the time bacteria learned to defeat it.
Most technological change comes from attempts to improve existing technology. It happens on the shop floor among apprentices and mechanics or in the workplace among the users of computer programs, and only rarely as a result of the application and transfer of knowledge from the ivory tower.
Perhaps money drives the innovation engine. The way to incentivize innovation, any Silicon Valley venture capitalist will tell you, is to bring capital and talent together. For most of history, people have been adept at keeping them apart.
In imperial Rome, scores of unknown slaves no doubt knew how to make better olive presses, better watermills, and better wool looms, while scores of plutocrats knew how to save, invest, and consume. But the two lived miles apart, separated by venal middlemen who had no desire to bring them together. An anecdote repeated by several Roman authors drives home the point. A man demonstrates to Emperor Tiberius his invention of an unbreakable, malleable form of glass, hoping for a reward. Tiberius asks if anybody else knows his secret and is assured nobody does. So Tiberius beheads the man to prevent the new material from reducing the relative value of gold to that of mud. The moral of the tale—whether it is true or not—is not just that Roman inventors received negative reward for their pains but that venture capital was so scarce that the only way to get a new idea funded was to go to the emperor.
The flowering of innovation in 18th-century Britain and late-20th-century California was powered by immigrants attracted to vast accumulations of wealth and efficient capital markets. The financing of innovation gradually moved inside firms as the 20th century progressed, baked into the budgets of companies haunted by the Schumpeterian fear that a rival’s innovation could pull their whole market out from under them, and dazzled by dreams of doing the same to their rivals. But companies are perpetually discovering that their R&D budgets get captured by defensive and complacent corporate bureaucrats. The history of the computer industry is littered with examples of big opportunities missed by dominant players, which thereby find themselves challenged by fast-growing new rivals. IBM and Digital Equipment suffered this fate, and so will Apple, Microsoft, and Google. The great innovators are still usually outsiders.
Money is certainly important in driving innovation, but it is by no means paramount. Even in the most entrepreneurial economies, very little saving finds its way to innovators. Victorian British inventors lived under a regime that spent a large proportion of its outgoings on interest payments, signaling that the safest thing rich folk could do with their money was lend it to the government and collect interest paid out of taxes on trade. Today plenty of money is wasted on research that does not develop, and plenty of discoveries are made without the application of much money. When Mark Zuckerberg invented Facebook in 2004 as a Harvard student, he needed very little R&D funding. Even when he expanded it into a business, his first investment of $500,000 from PayPal founder Peter Thiel was tiny compared with what entrepreneurs needed in the age of steam or railways.
Perhaps property is the answer. Somebody will not invest time and effort in planting a crop in his field if he cannot expect to profit from it, a fact that Stalin, Mao, and Robert Mugabe learned the hard way. Surely nobody will invest time and effort in developing a new tool or building a new kind of organization if he cannot keep at least some of the rewards.
Yet intellectual property is very different from real property, because it is useless if you keep it to yourself, and an abstract concept can be infinitely shared. These features create an apparent dilemma for those who would encourage inventors. People get rich by selling each other things (and services), not ideas. Manufacture the best bicycles, and you profit handsomely; come up with the idea of the bicycle, and you get nothing because it is soon copied. If innovators are people who make ideas, rather than things, how can they profit from them? Does society need to invent a special mechanism to surround new ideas with fences, to make them more like houses and fields?
There is little evidence that patents really drive inventors to invent. In the second half of the 19th century, neither Holland nor Switzerland had a patent system, yet both countries flourished and attracted inventors. The list of significant 20th-century inventions that were never patented includes the automatic transmission, Bakelite, ballpoint pens, cellophane, cyclotrons, gyrocompasses, jet engines, magnetic recording, power steering, safety razors, and zippers. By contrast, the Wright brothers effectively grounded the nascent aircraft industry in the United States by enthusiastically defending their 1906 patent on powered flying machines.
Intellectual property can help. A patent can be a godsend to a small firm trying to break into the market of an established giant. In the pharmaceutical industry, where government insists on a massively expensive regime of testing for safety and efficacy before a product launch, innovation without some form of patent would be impossible. But modern patent systems are too often a gauntlet of phantom tollbooths, raising fees from passing inventors and thus damaging enterprise. And intellectual property does very little to explain why some times and places are more innovative than others.
Governments can take credit for a list of big inventions, from nuclear weapons to the Internet, from radar to satellite navigation. Yet government is also notorious for its ability to misread technical change. In America a truly breathtaking outburst of government-led idiocy appeared in the 1980s under the name of Sematech. Based on the premise that the future lay in big companies manufacturing memory chips (which increasingly were being made in Asia), it poured $100 million into chip manufacturers on the condition that they stop competing with each other and pool their efforts to stay in what was fast becoming a commodity business. As late as 1988, industrial planners were still criticizing the fragmented companies of Silicon Valley as “chronically entrepreneurial” and incapable of long-term investing.
There are some things, like large hadron colliders and moon missions, that probably no private company would be allowed by its shareholders to provide, but are we so sure that even these would not catch the fancy of a Buffett, Gates, or Mittal if they were not already being funded by taxpayers? Can you be sure that if NASA had not existed, some rich man would not by now have spent his fortune on a man-on-the-moon program for the prestige alone? Public funding crowds out the possibility of knowing an answer to that question.
A large 2003 study by the Organization for Economic Cooperation and Development concluded that government spending on R&D has no observable effect on economic growth. Indeed, it “crowds out resources that could be alternatively used by the private sector, including private R&D.” Governments have almost completely ignored this rather astonishing conclusion.
It is the ever-increasing exchange of ideas that causes the ever-increasing rate of innovation in the modern world.
Innovators are in the business of sharing. It is the most important thing they do, for unless they share their innovation it can have no benefit for them or for anybody else. And the one activity that got much easier to do after about 1800, and has gotten dramatically easier recently, is sharing. Travel and communication disseminated information much faster and further. Newspapers, technical journals, and telegraphs spread ideas as fast as they spread gossip. In a recent survey of 46 major inventions by the economists Rajshree Agarwal and Michael Gort, the time it took for the first competing copy to appear fell steadily from 33 years in 1895 to three years in 1975. And the speed has increased ever since.
When Hero of Alexandria invented a steam engine in the first century A.D. and employed it in opening temple doors, news of his invention spread so slowly and to so few people that it may never have reached the ears of cart designers. Ptolemaic astronomy was ingenious and precise, if not quite accurate, but it was never used for navigation because astronomers and sailors did not meet. The secret of the modern world is its gigantic interconnectedness. Ideas are having sex with other ideas from all over the planet with ever-increasing promiscuity. The telephone had sex with the computer and spawned the Internet.
Technologies emerge from the coming together of existing technologies into wholes that are greater than the sum of their parts. Henry Ford once candidly admitted that he had invented nothing new: He had “simply assembled into a car the discoveries of other men behind whom were centuries of work.” Inventors like to deny their ancestors, exaggerating the unfathered nature of their breakthroughs, the better to claim the full glory (and sometimes the patents) for themselves. Thus Americans learn that Edison invented the incandescent light bulb out of thin air, when his less commercially slick forerunners, Joseph Swan in Britain and Alexander Lodygin in Russia, deserve at least to share the credit.
End users, too, have joined in the mating frenzy. Adam Smith recounted the tale of a boy whose job was to open and close the valve on a steam engine and who, to save time, rigged up a device to do it for him. He no doubt went to his grave without imparting the idea to others, or would have had it not been immortalized by the Scottish sage. Today he would have shared his “patch” with like-minded others on a chat site and eventually gotten credit for the innovation in his own Wikipedia entry.
We may soon be living in a post-capitalist, post-corporate world where individuals are free to come together in temporary aggregations to share, collaborate, and innovate, and where websites enable people to find employers, employees, customers, and clients anywhere in the world. This is also, as the evolutionary psychologist Geoffrey Miller reminds us, a world that will put “infinite production ability in the service of infinite human lust, gluttony, sloth, wrath, greed, envy, and pride.” But that is roughly what the elite said about cars, cotton factories, and (I’m guessing) wheat and hand axes too.
Were it not for this inexhaustible river of invention and discovery irrigating the fragile crop of human welfare, living standards would stagnate. Even with population tamed, fossil energy tapped, and trade free, the human race could quickly discover the limits to growth without new knowledge. Trade would sort out who was best at making what; exchange could spread the division of labor to best effect, and fuel could amplify the efforts of every factory hand, but eventually there would be a slowing of growth. A menacing equilibrium would loom.
In that sense, Ricardo and Mill were right. But so long as it can hop from country to country and from industry to industry, discovery is a fast-breeder chain reaction; innovation is a feedback loop; invention is a self-fulfilling prophecy. Equilibrium and stagnation are not only avoidable in a free-exchanging world. They are impossible.
Matt Ridley (rationaloptimist.com) is the author of The Rational Optimist, from which this article is adapted. Copyright © 2010 by Matt Ridley. Reprinted by arrangement with Harper, an imprint of HarperCollins Publishers. This column first appeared at Reason.com.