There’s reason to think not. Times have changed since Einstein worked by day as a clerk in the Swiss Patent Office, and by night as a World-Historical Physicist. Back then, science and technology were still largely the province of private patrons and individual inventors working in their basement labs. These days, vast networks of laboratories sponsored by governments, universities, big corporations and wealthy venture capitalists are all pushing to find the new new thing. Discovery and invention, in the developed countries at least, have become regularized. The insights of individuals are still important, of course, but the overall effort relies less on any one genius. “In the late 19th century, you had predominantly the private inventor,” says Yale historian Daniel Kevles. “Now you have the organized inventor. Even the big conceptual scientific leaps are much less likely to occur nowadays. Scientific fields are crowded with geniuses. Everybody’s working at the big problems all the time.”
What that means is that our society has become more steady and trustworthy in producing earth-shattering advances. The need for breakthroughs on demand started during World War II, when the U.S. military wanted its antiaircraft shells to inflict damage even when they missed enemy planes. Nobody knew how to get the shells to explode in midair at just the right moment, so the Pentagon funded the applied-physics lab at Johns Hopkins University in Baltimore, Maryland, and staffed it with experts in plastics, electromagnetics and other specialties. Their proximity fuse was decisive in winning the war. And the effort to make it, along with the Manhattan Project to build the atomic bomb, set the precedent for organizing an array of expertise according to the need for a particular invention.
This shift in the methodology of discovery has complicated matters. It’s chiefly responsible for the burgeoning complexity of machines, but also for the growing complexity of the act of inventing and building. The Pentagon awards a contract for a new jet fighter to a prime contractor, which passes the various systems and subsystems and components down through layers of subcontractors. “Henry Ford could understand every piece of his assembly line,” says Don Kash, a technology expert at George Mason University in Fairfax, Virginia. “Nobody can do that at Toyota.”
What’s different now, though, is how comfortable we’ve become with such complexity. Innovation is part of our lives in a way it hasn’t been for previous generations. In 1970, Alvin Toffler argued in “Future Shock” that technology was changing society so quickly that a person in the span of a single lifetime would find himself a stranger in his own culture. Toffler’s book struck home because many people felt that new technologies–in those days television, the birth-control pill and the transistor–were bringing about change at a pace that was disorienting and not a little disturbing. These days we’ve learned how to ride the rocket of innovation. “My father thought the world would be the same,” says Kash. “My children wake up every day thinking the world will be different.”
It’s important not to confuse these two trends. The fact that we have grown accustomed to, even jaded by, our scientific and technological progress doesn’t mean it hasn’t been mind-boggling. Certainly it’s easy to criticize the overheated rhetoric of the Internet boom. But we shouldn’t forget that even if the Web didn’t quite change everything, it certainly changed a lot of things. The past 10 years saw one of the most concentrated bursts of innovation in our history–not just the Internet, but the decoding of the human genome and the cloning of a sheep named Dolly come to mind.
It’s reasonable to assume, then, that the next decade will render even those radical changes forgettable. Science-fiction writer Arthur C. Clarke has said that he seldom predicts the future; he merely extrapolates from the present. A decade ago, with the bloom just coming off Japan, few people predicted that the American economy would so thoroughly dominate the world’s. A similarly small number of visionaries now argue that by 2012, Europe will have supplanted the United States as the prime mover in the global economy. The Internet opened new worlds by linking your PC to other computers. These days its reach is already beginning to spread to tiny chips embedded in everyday objects (and even the human body). “Grid” computing will soon make it possible to spread massive computing tasks over many machines. The Internet will become even more present in our daily lives, and we’ll continue to grapple with the issues of security and privacy.
Based on what’s happening in today’s labs, perhaps the biggest Next Big Thing will come from the field of genetics. No need for any lightning bolts of insight to shake things up. In vitro fertilization already gives scientists the ability to create an embryo in a petri dish. Should the technology get good enough to make many embryos at once, genetic screening techniques, which already exist, will allow scientists to pick the one with the most highly prized traits. Outlawing eugenics in the United States or Europe won’t help much if the technology is being practiced in China. Rogue scientists are even now scrambling to create the first human clone; the event is likely to be disturbing. On the positive side, diagnostic tests using gene chips and other technologies may tell us if we’re susceptible to specific diseases or how we’ll respond to certain drugs. Armed with this information, doctors may be able to tailor our diets and our treatments to our own genetic idiosyncrasies.
All this change will have an even broader impact than what we’ve experienced in the past: the steady churn of technological advances builds upon itself, sometimes with unanticipated results. Birth control, which has made it commonplace to have fewer children later in life, means the world’s population will get grayer, posing problems for governments and opportunities for business. As the number of women in the work force grows, too, men will have to cope with a diminished status. (Since they’ll live longer, though, they’ll have plenty of time to work it out.)
Perhaps most important, the flourishing of innovation increases the likelihood of further innovation, whether by corporate labs or wild-maned inventors. Einstein, for instance, used the technology around him as a mental springboard for his thoughts about physics and the nature of time. To synchronize clocks in all the continent’s far-flung railway stations, European engineers sent signals from Paris and Berlin out through wires and radio links. Some of the many inventions needed for such a system–signal relays, electromechanical devices to reset the clocks and so forth–might have passed across Einstein’s desk at the patent office. “Every day Einstein took the short stroll from his house, left down the Kramgasse, to the patent office,” writes science historian Peter Galison of Harvard. “Every day he must have seen the great clock towers that presided over Bern with their coordinated clocks, and the myriad of street clocks branched proudly to the central telegraph office.” Perhaps his great mental leap didn’t come completely from left field: it may have had its genesis in the inventions of the day. Today’s budding scientists have an even more remarkable panorama to peruse. What seeds of change are they sowing?