The statement “everything that could be invented has been invented” is frequently misattributed to the late-nineteenth-century American patent commissioner Charles Holland Duell. The Economist once credited him with the remark, and sites such as “kool kwotes” still reproduce it. In fact, Duell believed the opposite. “In my opinion,” he wrote at the turn of the century, “all previous advances in the various lines of invention will appear totally insignificant when compared with those which the present century will witness. I almost wish that I might live my life over again to see the wonders which are at the threshold.” While this prediction turned out to be on the money, the belief that “the end of invention” is near is very much alive in our age, despite ample evidence of accelerating technological progress.

Pessimism is most prevalent among economists such as Northwestern University professor Robert J. Gordon, who expects growth to slow to a small fraction of what it was in the past. Gordon predicts that the disposable income of the bottom 99 percent of Americans will grow at just 0.2 percent per year—one-tenth the average rate of U.S. economic growth in the twentieth century. Innovation, he maintains, will not be enough to offset the headwinds that will buffet Western industrialized economies in the next half-century—aging populations, declining educational achievement, and rising inequality. And he is not alone in this dismal view. In The End of Science, published in 1996, journalist John Horgan declared that “the modern era of rapid scientific and technological progress appears to be not a permanent feature of reality, but an aberration, a fluke. . . . Science is unlikely to make any significant additions to the knowledge it has already generated. There will be no revelations in the future comparable to those bestowed upon us by Darwin or Einstein or Watson and Crick.”
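To see what is at stake in that gap, consider a rough, back-of-the-envelope compounding comparison; it assumes, consistent with the one-tenth figure above, a twentieth-century benchmark of roughly 2 percent growth per year.

$$
(1.02)^{100} \approx 7.2 \qquad \text{versus} \qquad (1.002)^{100} \approx 1.2
$$

At the historical rate, real income multiplies roughly sevenfold over a century; at the projected rate, it rises by barely a fifth.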

Certainly, it is difficult to know exactly in which direction technological change will move and how significant it will be. Much as in evolutionary biology, all we know is history. Yet something can be learned from the past, and it tells us that such pessimism is mistaken. The future of technology is likely to be bright.

The first thing to note is that the twentieth century experienced probably as many headwinds, albeit of a different kind, as Gordon foresees for the twenty-first. Industrialized nations fought two massive world wars and experienced the Great Depression, the Cold War, and the rise of totalitarian regimes in much of Europe and Asia. In the past, such catastrophes might have been enough to set economies back for hundreds of years or even to condemn entire societies to stagnation or barbarism. Yet none of them could stop the power of ever-faster innovation in the twentieth century to stimulate rapid growth in much of the industrialized and industrializing world.

Keep in mind, too, that economic growth, measured as the growth of income per capita (corrected for inflation), is not the best measure of what technological change does. True, technology increases productivity by making it possible to produce goods and services more efficiently (at lower cost). But much of what it does is to put on the market new products (or vastly improved ones) that may be quite inexpensive relative to their benefits. Many of the most important inventions of the late nineteenth and twentieth centuries are things that we would not want to do without today; yet they had little effect on the national accounts because they were so inexpensive: aspirin, lightbulbs, water chlorination, bicycles, lithium batteries, wheeled suitcases, contact lenses, digital music, and more.

Further, our outdated conventions of national income accounting fail to capture fully the many ways in which technology can transform human life for the better. For instance, national income calculations do not count “leisure” as a valuable good. People who are not working are not producing, and this is simply “bad,” in Gordon’s view, because they are not adding to economic output. But it may well be that a leisurely life is the best “monopoly profit,” as the Nobel laureate economist John Hicks observed as early as 1935. And thanks to new technology, leisure—even involuntary leisure such as unemployment—can be more enjoyable than ever before. At little cost, anyone can now watch a bewildering array of sports events, movies, and operas from the comfort and safety of a living room on a high-definition flat-screen TV. If the technology of the twentieth century did anything, it vastly augmented our ability to have a good time when we are not working. Yet, while the average individual in an industrialized country nowadays has far more leisure hours and many more enjoyable things in his or her life than the typical person did a century ago, such things hardly show up in the national income statistics.

Many pessimists make their predictions by extrapolating from current technological trends. Jan Vijg, a distinguished geneticist with an interest in history, is disappointed that airplanes do not fly any faster than they did 50 years ago and that the basic design of our automobiles has not changed. He notes that improvements still take place but tend to be marginal and subject to diminishing returns. Some of the most convenient devices that appeared in the twentieth century, such as air-conditioning and antibiotics, are already in use everywhere. Maybe the low-hanging technological fruit has all been picked, as George Mason University economist Tyler Cowen has said, and there is little left to invent.

But consider those things that ignited rapid progress in the past. The historian of science Derek Price emphasized that the tools that technology makes available to science help determine the rate of progress. We typically see scientific discoveries as a causal factor in technology: as physics and chemistry improve, inventors can design new products and materials. But the reverse is equally true: as scientists get better tools (made, say, by instrument makers and lens grinders), they can advance knowledge, which in turn leads to technological progress. This creates a virtuous circle that has been responsible in the past for the miraculous, technology-driven events that created modern economies. It is not easy to pinpoint when that virtuous circle started, but one salient event occurred in the seventeenth century, when microscopes and telescopes first emerged and enabled scientists to see what no human had ever seen. The development of the barometer led to the discovery of the atmosphere, soon to be followed by steam (that is, atmospheric) engines. The process accelerated after 1750. Another example: the greatest breakthrough in nineteenth-century medicine—the discovery of the germ theory of disease—was made possible by improved microscopes, which reduced optical aberration. Modern economic growth would surely have fizzled out had it not been for the way science and technology reinforced each other.

If this historical model holds some truth, the best may still be to come for modern societies. Only in recent decades has science learned to harness high-powered computing and to store massive amounts of searchable (and thus accessible) data at negligible cost. The vast array of instruments and machines that can see, analyze, and manipulate entities at the sub-cellular and sub-molecular level promises advances in areas that can be predicted only vaguely. But these tools, to beat Cowen’s metaphor into the ground, allow us to build taller ladders to pick higher-hanging fruit. We can also plant new trees that will grow fruits that no one today can imagine.

A second reason technological progress will continue unabated has to do with the emergence of a competitive global marketplace, which will encourage the spread of new technology from its originating locations to other users who do not wish to be left behind. The idea that such competition leads to more rapid development is nothing new. The great eighteenth-century historian Edward Gibbon noted: “Europe is now divided into twelve powerful, though unequal, kingdoms, three respectable commonwealths, and a variety of smaller, though independent, states. . . . In peace, the progress of knowledge and industry is accelerated by the emulation of so many active rivals.”

Of course, competition among states can occasionally lead to war, which threatens to wipe out the gains of human ingenuity. The competitive process among nations is thus a double-edged sword. The launching of the Soviet Sputnik in 1957 triggered a huge effort by the United States to expand research and technical education—but five years later, the superpowers came frighteningly close to a nuclear confrontation. Most states today realize that peaceful interstate competition in the marketplace requires staying current with the most advanced technology—but terrorists and rogue states want to stay current, too, for very different reasons.

All the same, this economic and political competition is what will protect progress from the forces amassed against it. Vijg rightly points out that much promising innovation is hampered by regulation and resistance, some of it of an ideological nature. Advances in pharmaceuticals, for instance, have slowed to a trickle because of regulatory agencies’ and large corporations’ fear of taking risks, concerns about possible lawsuits, and pressure from single-issue groups such as animal-rights activists. Antinuclear and environmental groups slow down progress in other areas. If the United States were the entire world, this might well impede progress to the point where it would be no match for Gordon’s headwinds. But it is not. The argument that “if we don’t do this, someone else will” should prove more powerful than the concerns of groups that regard a new technology with suspicion.

In 1955, Fortune magazine published The Fabulous Future: America in 1980, an essay collection in which some leading American businessmen and intellectuals were asked to make predictions. In some ways, they were overly optimistic, especially in their faith in nuclear power, which made John von Neumann predict that by 1980, “energy may be free, much like unmetered air.” They also failed to foresee some of the most radical breakthroughs: communication satellites, lasers, and fiber-optic technology. Since then, Raymond Kurzweil’s prediction of a coming “singularity” (a cosmic merging of human minds and supercomputers) and Jeremy Rifkin’s vision of “the end of work” (recently reformulated with better logic by MIT economist Erik Brynjolfsson) have generated much skepticism about the wisdom of making specific predictions about the future.

All the same, I will venture a guess about one feature of the future: technology will go “small.” Twentieth-century technology was primarily about “large” things. The technological historian Vaclav Smil has pointed out in a set of remarkable books that the diesel engine and the gas turbine were the main technological breakthroughs that made globalization possible (he calls them “prime movers”). To these, we can add shipping containers, satellite communications, and oil-drilling platforms (among others). Energy was generated by massive power stations. Materials were produced by gigantic steel mills. Huge airplanes and tall cell towers embodied what the twentieth century could do. But the twenty-first century may be very different. Our ability to manipulate molecules and little pieces of DNA is made possible by tools that would make Robert Hooke’s jaw drop: from the electron microscopes developed in the 1930s by two German engineers (Ernst Ruska and Max Knoll) to the scanning-tunneling and ion-beam devices used today for nanotechnology. Manipulating DNA molecules, sorting cells, and sequencing and splicing genes may offer a better path to a better future than building supersonic planes.

If technology were to go small, it would hardly mean that it would be unimportant. Exciting breakthroughs will come from genetic modification of living beings. While people have always been able to change the looks and features of animals and plants, genetic modification lets us fine-tune those traits to order. It is to selective breeding what a surgeon’s scalpel is to a heavy ax. There is some reluctance to do this kind of modification, perhaps because it is so radical and the full effects cannot be known. Some people are queasy about creating new species, and not without good reason. But the opportunities are so dazzling that someone will take the risk. If Europeans continue to resist what they call “Frankenfoods,” some other nation will develop—and profit from—them. Genetically modified crops may be capable of withstanding rising global temperatures, thus helping us adapt to climate change. Scientists may also be able to “teach” more plants to manufacture their own fertilizer and to protect themselves from harmful insects, thus mitigating the environmental damage done by pesticides and other harmful aspects of modern agriculture.

Something comparable is on the horizon in materials science. For thousands of years, the human race was constrained and defined by the materials at its disposal, hence terms such as the “Bronze Age.” Many technological ideas could not be realized because the materials available to inventors simply were not adequate to translate design into reality. But modern materials, increasingly developed at the nanotechnological level, promise to deliver custom-ordered properties in terms of hardness, resilience, elasticity, and so on. New resins, advanced ceramics, new solids, carbon nanotubes—all are being developed or perfected, and an air of excitement permeates sites such as MaterialsToday.com. Supercomputers can now be used to solve the equations of quantum mechanics that predict the properties of new artificial materials—heralding a revolution in materials science that may make the synthetic substances of the twentieth century seem as primitive as Stone Age tools by comparison. Not all those options will pan out, of course, and there is always a danger of abuse or accidents. New technology is never without risk—but neither is technological stagnation.

Another area in which “small may be beautiful” is 3-D printing, already on the near horizon. It promises to deliver “mass customization” on an unprecedented scale, in which an ever-growing array of medical devices and consumer products can be designed according to the exact specifications of the buyer, from artificial limbs to tablecloths. But even more important may be the expanding use of robots. We should not think of robots as the clunky metal creatures of 1950s sci-fi movies inspired by Isaac Asimov or the automated machinery in assembly plants. Robots, sensors, and actuators can be made in any size or shape, to hear and see what we cannot, go where we cannot go, and perform household tasks of any kind, from walking dogs to making beds to taking care of children. Mass customization will make technologies available for a range of household tasks that were traditionally carried out by housewives and servants.

Dismissing progress because airplanes have not gotten any faster is like complaining in 1890 that technology had failed to make horses run faster. In a world of instant communications with growing bandwidth, the role that distance plays may be quite different from what it was in the past, in ways we are only starting to imagine. Whether we are facing the “death of distance,” as British writer Frances Cairncross predicted in 2001, is hard to say, but geography certainly is not the impediment to human activities that it used to be. One exciting development that may radically change daily life is that more and more work can be carried out from a distance. If workers are spending much of their day in front of a computer terminal, it may not matter where that terminal is located, whether it is in a wireless neighborhood café or their child’s bedroom at home.

The factory system, which emerged in the late eighteenth century, imposed on its workers a discipline of time and space like nothing seen before. If that discipline were to be seriously modified, people would not necessarily do less work but merely different work. But if it meant less time wasted in rush-hour traffic and more opportunities to multitask, such a shift would be a significant improvement in people’s lives—one that would, incidentally, not show up in productivity statistics. At present, the opportunity to work from home applies primarily to workers in the knowledge services, but even non-knowledge jobs are being automated and roboticized at a rapid rate; the control and supervision of machinery on the shop floors of manufacturing plants or large retail warehouses are increasingly performed by workers in front of monitors. The end of the factory system does not mean that everyone will work from home; it may just mean greater flexibility in where and when workers carry out their tasks. Such a sea change will be massively important to the quality of life of many millions of people worldwide (not to mention the environment), yet it will not necessarily register as economic “growth.”

Why is it so crucial that technological progress continue? Are people in the industrialized world not rich enough? Why not just try to share our capabilities with nations that have been less fortunate? Why keep piling more and more technology on top of what we already have? Not even the most ardent techno-enthusiasts would argue that innovation is an unmitigated blessing. New technology disrupts life in many ways; it forces people to abandon familiar and comfortable practices, causes skills and equipment to become obsolete, and alienates those members of the population who have a harder time adapting. The “digital divide” separates the young and the old, the educated and the uneducated, the urban and the rural. It makes many people miserable, frustrated, and disconnected. But we have no choice but to continue innovating technologically. If we do not, someone else will.

There is, moreover, a deeper reason that technological progress must keep going. Progress has not only been disruptive; it has also been untidy. It has not been a straight line toward a better life. As Edward Tenner pointed out in his pathbreaking book Why Things Bite Back, the history of technology is permeated with unintended consequences and negative side effects of innovation. How could it not be? After all, if every possible implication of a new technology were known beforehand, it would hardly be an innovation. Some cases of technology creating an unexpected mess are notorious, such as asbestos (originally touted as a fireproof and totally safe new building material) and the lead added to gasoline to prevent engine knock. To deal with such negative effects, we need not less but more innovation—to clean up the mess of earlier technological change where something went awry. Much like medication, technological progress almost always has side effects, but bad side effects are rarely a good reason to forgo the medication, and they are a very good reason to invest in the search for second-generation drugs. To a large extent, technical innovation is a form of adaptation—not only to externally changing circumstances but also to previous adaptations.

The history of technological change provides endless examples of this. Burning fossil fuels has had unintended consequences—air pollution, above all. New technology has cleaned up much of that pollution, but even our reliance on cleaner-burning hydrocarbons has created what may be the mother of all bite-backs: changes in the earth’s climate. Here, too, technological solutions could be on the horizon, whether through geo-engineering or the rapid, continuing improvement of non-carbon-emitting energy generation. Such developments, requiring trillions of dollars of investment, could unleash decades of economic growth. Dealing with the messy aftermath of previous innovations should be counted as a growth-inducing tailwind—one that Gordon fails to take into account.

Antibiotics, perhaps the most significant medical breakthrough of the twentieth century, have their own built-in bite-back mechanism: the ability of germs to mutate and develop drug resistance. We urgently need new antibiotics using different mechanisms. Given how much more we now know about molecular and cellular biology compared with what Alexander Fleming knew in 1928, these new antibiotics seem feasible (see “Germs and the City,” Spring 2007).

In many instances of technology-induced environmental damage, too, the answer is not less but more and better technology. Ecological deterioration and over-irrigation have caused water salinization, and a third of the world’s rice paddies have salt problems. But scientists are developing salt-tolerant rice varieties through advanced mutation-inducing techniques. Even more striking is the bite-back involving nitrate fertilizer. Industrial nitrogen fixing was developed in Germany before World War I, and it has provided the world with an unlimited supply of nitrates, essential to agriculture. But nitrate pollution has become increasingly acute in many waterways and oceans. The technological fix to this problem is to allow plants to fix their own nitrogen or, more accurately, to “coach” them to live symbiotically with nitrogen-fixing bacteria, as a few plants—such as clover—do naturally.

Consider, too, the bite-back that has occurred in the effect of technology on the human body. Through most of history, the majority of people have lived on the edge of subsistence, continually worrying about whether they would get enough to eat. Fats and sugar were luxuries, not the source of worries about diabetes. Through new agricultural techniques and improved crops, agriculture can now feed most societies more than adequately (though too many pockets of malnutrition and famine remain). The bite-back for the well fed has been, of course, an epidemic of obesity and the many physiological and psychological costs it entails. Yet obesity is also not a problem beyond our reach: some people are more prone to it than others. Once we find out in greater detail why, the road to radical and effective weight control may open. Scientists know more all the time about the hundreds of species of microbiota that inhabit our gut and the genes that regulate our digestive system, and they are learning how to manipulate the metabolic factors that determine who will gain weight.

Bite-back from new technology occurs at every stage of life. We are living much longer than we did a century ago and having far fewer children. Population aging is considered a headwind. An older population, after all, is less employable, less productive, and possibly less entrepreneurial. It also contributes to rising medical costs. Ironically, in the early nineteenth century, most economists were pessimistic because they perceived rising fertility and population growth to be the headwind that would stop economic growth. Today it is falling fertility and declining mortality (and the resulting aging of the population) that seem to be the problem. But medical technology has been able to slow down the onset of decrepitude. Older people not only live longer; they remain functional for longer—gadgets help them walk, see, hear, and stay active to an age that, in earlier centuries, would have condemned them to helplessness. Alzheimer’s and other forms of dementia are the subject of feverish research. The retirement age of 65 is not carved in stone; it was created in a time when life was shorter. Another bite-back, perhaps less likely to be resolved by invention, is the sharp fall in birthrates, made possible in part by improved contraceptive technology. Industrialized nations, in particular, are experiencing this decline. But more and more women who would have been fated to infertility in the past can now conceive.

In short, technology will continue to develop and change human life and society at a rate that may well dwarf even the dazzling developments of the twentieth century. Not everyone will like the disruptions that this progress will bring. It is fair to worry that what we gain as consumers, viewers, patients, and citizens, we may lose as workers. The fear that this progress will create problems that no one can envisage is equally realistic. Yet technological progress still beats the alternatives; we cannot do without it. Whether it will be accompanied by economic growth as traditionally measured is hard to know, but so what?

Photo: Despite wonders like the Hubble Telescope, prominent thinkers believe that the greatest technological breakthroughs may have already occurred. (F&A ARCHIVE/THE ART ARCHIVE AT ART RESOURCE, NY)
