Essay on Scientific Discoveries of the 20th and 21st Centuries

Inventors and inventions

by Chris Woodford. Last updated: January 31, 2018.

Have you ever dreamed of becoming a great inventor—of having a fantastically clever idea that changes society for the better and makes you rich in the process? The history of technology is, in many ways, a story of great inventors and their brilliant inventions. Think of Thomas Edison and the light bulb, Henry Ford and the mass-produced car, or, more recently, Tim Berners-Lee and the World Wide Web. Inventing isn't just about coming up with a great idea; that's the easy part! There's also the matter of turning an idea into a product that sells enough to recoup the cost of putting it on the market. And there's the ever-present problem of stopping other people from copying and profiting from your ideas. Inventing is a difficult and often exhausting life; many inventors have died penniless and disappointed after struggling for decades with ideas they couldn't make work. Today, many lone inventors find they can no longer compete and most inventions are now developed by giant, powerful corporations. So, are inventors in danger of going extinct? Or will society always have a place for brave new ideas and stunning new inventions? Let's take a closer look and find out!

Photo: The wheel is probably the greatest invention of all time, used in everything from cars and planes to wind turbines and computer hard drives. Even so, no-one knows who invented it or when.

What is invention?

Artwork: Thomas Edison's original patent for the electric lamp, granted in January 1880. This wasn't the first electric light, but it was the first really practical and commercially successful one. Courtesy of US Patent and Trademark Office.

That sounds like a trivial question, but it's worth pausing a moment to consider what "invention" really means. One of my dictionaries says an inventor is someone who comes up with an idea for the first time. Another describes an inventor as a person of "unique intuition or genius" who devises an original product, process, or machine. Dictionary definitions like these are badly out of date—and probably always have been. Since at least the time of Thomas Edison (the mid-to-late 19th century), invention has been as much about manufacturing and marketing inventions successfully as about having great ideas in the first place.

Some of the most famous inventors in history turn out, on closer inspection, not to have originated ideas but to have developed existing ones and made them stunningly successful. Edison himself didn't invent electric light, but he did develop the first commercially successful, long-lasting electric light bulb. (By creating a huge market for this product, he created a similarly huge demand for electricity, which he was busily generating in the world's first power plants.) In much the same way, Italian inventor Guglielmo Marconi can't really be described as the inventor of radio. Other people, including German Heinrich Hertz and Englishman Oliver Lodge, had already successfully demonstrated the science behind it and sent the first radio messages. What Marconi did was to turn radio into a much more practical technology and sell it to the world through bold and daring demonstrations. These days, we'd call him an entrepreneur—a self-starting businessperson who has the drive and determination to turn a great idea into a stunning commercial success.

Photo: Guglielmo Marconi didn't so much "invent" radio as make it practical and popular. Photo courtesy of US Library of Congress.

It's important not to underestimate the commercial side of inventing. It takes a lot of money to develop an invention, manufacture it, market it successfully, and protect it with patents. In our gadget-packed homes and workplaces, modern inventions seldom do completely original jobs. More often, they have to compete with and replace some existing gadget or invention to which we've already become attached and accustomed. When James Dyson launched his bagless cyclone vacuum cleaner, the problem he faced was convincing people that it was better than the old-fashioned vacuums they already had. Why should they spend a fortune buying a new machine when the one they already owned was perfectly satisfactory? Successful inventions have to dislodge existing ones, both from our minds (which often find it hard to imagine new ways of doing things) and from their hold on the marketplace (which they may have dominated for years or decades). That's another reason why inventing is so difficult and expensive—and another reason why it's increasingly the province of giant corporations with plenty of time and money to spend.

How and why do people invent things?

According to the well-known saying, "necessity is the mother of invention"; in other words, people invent things because society has difficult problems that need solving. There's some truth in this, though less than you might suppose. It would be more accurate to say that inventions succeed when they do useful jobs that people recognize need doing. But the reasons inventions appear in the first place often have little or nothing to do with "necessity," especially in the modern age when virtually every need we have is satisfied by any number of existing gadgets and machines. Where, then, do inventions come from and why do people invent them?

Scientific breakthroughs

Artwork: The discovery of how DNA worked revolutionized crime-fighting and forensic science—and will have huge impacts on medical science and technology in the future. Picture of a DNA double helix based on an artwork courtesy of US National Library of Medicine.

Some inventions appear because of scientific breakthroughs. DNA fingerprinting (the process by which detectives take human samples at crime scenes and use them to identify criminals) is one good example. It only became possible after the mid-20th century when scientists understood what DNA was and how it worked: the scientific discovery made possible the new forensic technology. The same is true of many other inventions. Marconi's technological development of radio followed on directly from the scientific work done by Lodge, Hertz, James Clerk Maxwell, Michael Faraday, and numerous other scientists who fathomed out the mysteries of electricity and magnetism during the 19th century. Generally, scientists are more interested in advancing human knowledge than in commercializing their discoveries; it takes a determined entrepreneur like Marconi or Edison to recognize the wider, social value of an idea—and turn theoretical science into practical technology.

Trial and error

But it would be very wrong to suggest that inventions (practical technologies) always follow on from scientific discoveries (often abstract, impractical theories). Many of the world's greatest inventors lacked any scientific training and perfected their ideas through trial and error. The scientific reasons why their inventions succeeded or failed were only discovered long afterward. Engines (which are machines that burn fuel to release heat energy that can make something move) are a good example of this. The first engines, powered by steam, were developed entirely by trial and error in the 18th century by such people as Thomas Newcomen and James Watt. The scientific theory of how these engines worked, and how they could be improved, was only figured out about a century later by Frenchman Nicolas Sadi Carnot. Thomas Edison, one of the most prolific inventors of all time, famously told the world that "Genius is one percent inspiration and 99 percent perspiration"; he had little or no scientific training and owed much of his success to persistence and determination (when he came to develop his electric light, he tested no fewer than 6000 different materials to find the perfect filament).

Photo: Steam engines weren't developed scientifically: they evolved slowly and gradually by trial and error. As Nicolas Sadi Carnot later pointed out, they could be extremely inefficient machines—which meant they used a huge amount of fuel (coal) to power themselves. But that didn't matter in an age where coal was relatively cheap and abundant and people cared less about pollution.

Inventions that evolve

Some inventions are never really invented at all—they have no single inventor. You can comb your way through thousands of years of history, from the abacus to the iPhone, and find not a single person who could indisputably be credited as the sole inventor of the computer. That's because computers are inventions that have evolved over time. People have needed to calculate things for as long as they've traded with one another, but the way we've done this has constantly changed. Mechanical calculators based on levers and gears gave way to electronic calculators in the middle decades of the 20th century. As newer, smaller electronic components were developed, computers became smaller too. Now, many of us own cellphones that double up as pocket computers, but there's no single person we can thank for it. Cars evolved in much the same way. You could thank Henry Ford for making them popular and affordable, Karl Benz for putting gasoline engines on carts to make motorized carriages, or Nikolaus Otto for inventing modern engines in the first place—but the idea of vehicles running on wheels is thousands of years old and its original inventor (or inventors) has long since been lost to history.

Accidental inventions

Artwork: VELCRO®: George de Mestral chanced on the idea of a clothing fastener entirely by accident. Here's a drawing from his invention US Patent 3,009,235: Separable fastening device (filed 1958, granted 1961) courtesy of US Patent and Trademark Office.

Some inventions happen through pure luck. When Swiss inventor George de Mestral was walking through the countryside, he noticed how burrs from plants stuck to his clothes and were hard to pull away. That gave him the idea for the brilliant two-part clothing fastener that he called VELCRO®. Another inventor who got lucky was Percy Spencer. He was experimenting with a device called a magnetron, which turns electricity into microwave radiation for radar equipment (used for detecting and locating ships and planes), when he noticed that a chocolate bar in his pocket had started to melt. He realized the microwave radiation was generating heat that was cooking (and melting) the food—and that gave him the idea for the microwave oven. Teflon®, the super-slippery nonstick coating, was also discovered by accident when Roy Plunkett unexpectedly made some strange white goo in a chemical laboratory. Its amazing nonstick properties were only discovered and put to use later. All these inventions, and numerous others, were chance discoveries produced by accidents or mistakes.

Photo: The Teflon coating that makes this frying pan nonstick was another accidental invention.

Advantageous inventions

From IBM and Sony to Goodyear and AT&T, many of the world's biggest, best-known corporations have been built on the back of a single great invention. IBM, for example, grew out of an earlier company selling intricate mechanical census-counting machines developed by Herman Hollerith; Sony made its name selling cheap, high-quality radios made with tiny transistors; Goodyear owes its name (and its chief product) to Charles Goodyear, a hapless inventor who finally developed durable, modern, "Vulcanized" rubber after a lifetime of trial and error; AT&T can trace its roots back to the telephone patented by Alexander Graham Bell in 1876. But a modern company can't survive and thrive on one great idea alone. That's why so many companies have huge research and development laboratories where inspired scientists and engineers are constantly trying to come up with better ideas than the ones on which their original success was founded. As marketing genius Theodore Levitt pointed out in the 1960s, visionary companies need the courage to try to put themselves out of business by coming up with new products that make their existing ones obsolete; companies that rest on their laurels will be put out of business by their inventive competitors. This kind of corporate invention—companies trying to out-invent themselves and one another—is very much the way the world works now.

The world of corporate invention

Photo: Inventors have to start somewhere: The Apple ][ computer made Steve Jobs and Steve Wozniak rich and famous, but the pair started out making and selling their original Apple I in a garage belonging to Jobs' parents.

There are probably more people trying to invent things now than at any time in history, but relatively few of them are lone geniuses struggling away in home workshops and garages. There will always be room for lucky individuals who have great ideas and get rich by turning them into world-beating products. But the odds are stacked increasingly against them. It's unlikely you'll get anywhere tinkering away in your garage trying to invent a personal computer that will change the world, the way Steve Wozniak and Steve Jobs did back in the mid-1970s when they put together the first Apple Computer. To do that, you'd have to set yourself up in competition with—guess who—Apple Computer (which became the world's richest company in 2011), staffed with legions of brilliantly creative scientists, engineers, and designers, and with billions of dollars to spend on research and development. Really prolific inventors might file a few dozen patent applications during their lifetime, if they're lucky; but the world's most inventive company, IBM, files several thousand patent applications every single year. Companies like IBM have to keep on inventing to keep themselves in business: inventions are the fuel that keeps them going.

Photos: Nylon—the power behind your toothbrush: Could anyone develop such a fantastic material tinkering away in a garage? Not likely. In our sophisticated 21st-century world, it takes well-funded corporate research labs to come up with amazing new chemical materials like this. Read how it was developed by Wallace Carothers for DuPont in our article on nylon.

Think of inventions in the 19th century and you'll come across lone inventors like Charles Goodyear, Thomas Edison, Alexander Graham Bell, George Eastman (of Kodak)—and many more like them. But think of inventing in the 20th and 21st centuries and you'll come across inventive corporations instead—such companies as DuPont (the chemical company that gave us nylon, Teflon®, Kevlar®, Nomex®, and many more amazing synthetic materials), Bell Labs (where transistors, solar cells, lasers, CD players, digital cellphones, commercial fax machines, and CCD light sensors were developed), and 3M (pioneers of Scotchgard textile protector and Post-It® Notes, to name only two of their best-known products). It was Thomas Edison who transformed the world of inventing, from lone inventors to inventive corporations, when he established the world's first invention "factory" at Menlo Park, New Jersey, in 1876.

These days, corporations dominate our world, and they dominate the world of inventing in exactly the same way. If it's your dream to become a great inventor, go for it and good luck to you—but be prepared to take on some very stiff, very well-funded, corporate competition. If you succeed, congratulations: maybe you'll prove to be the founder of the next Apple, AT&T, or IBM!

Find out more

Books

Practical guides—for younger readers

  • Get Inventing! by Mary Colson. Raintree, 2014. A simple introduction probably best for ages 7–9.
  • The Kids' Invention Book by Arlene Erlbach. Lerner Publications, 2011/Scholastic, 2001. A great introduction to the "serious" business of inventing, including things like how to enter an inventing contest and how to file a patent.
  • Kids Inventing!: A Handbook for Young Inventors by Susan Casey. Jossey-Bass, 2005. What do you do when you have a great idea? How do you turn that into a real invention? This book introduces children to the practicalities of inventing.

Inventions and inventors, past and present—for younger readers

  • The Way Things Work Now by David Macaulay. DK/Houghton Mifflin, 2016. Want to be an inventor? You'll need a good grasp of how things work—and there's no better place to start than here. This is the latest edition of the classic introduction to mechanical, electronic, and digital inventions (for which I worked as a consultant).
  • Eyewitness Invention by Lionel Bender. Dorling Kindersley (DK), 2013. A good but now seriously dated account of classic and ancient inventions, missing essential information about modern technologies and most inventions that have appeared since the 1990s. Many excellent photos of old inventions from the Science Museum in London, England.
  • Inventors and Inventions. Marshall Cavendish, 2008. A great encyclopedia set for schools for which I wrote quite a lot of the longer articles and biographies, including detailed pieces about Bell, Benz, Edison, Morse, Tesla, and many other famous inventors.
  • 1000 Inventions and Discoveries by Roger Bridgman. Dorling Kindersley (DK), 2006. A whistle-stop tour of almost every invention you can think of, presented as "bite-sized information" for younger readers.

Practical guides—for older readers

  • Hardcore Inventing: Invent, Protect, Promote, and Profit from Your Ideas by Robert Yonover and Ellie Crowe. Skyhorse, 2014. A no-nonsense, illustrated guide.
  • Inventing For Dummies by Pamela Riddle Bird. John Wiley & Sons, 2004. A clear guide to developing, protecting, and marketing your own inventions.
  • Patently Female: From AZT to TV Dinners, Stories of Women Inventors and Their Breakthrough Ideas by Ethlie Ann Vare and Greg Ptacek. Wiley, 2002. A look at the contributions female inventors have made to modern life through such things as the dishwasher.

Videos

  • An Inventor's Story by Dr John C Taylor. The British inventor of kettle thermostats describes his life.
  • The Dyson Story: James Dyson explains how a deep commitment to engineering and practical technology can solve human problems.

Articles

  • This Guy Turns Kids' Fanciful Inventions Into Real Products by Margaret Rhodes. Wired, January 29, 2016. How Dominic Wilcox is trying to encourage a new generation of inventors by transforming wacky children's ideas into working products.
  • 10 inventors who didn't get mega-rich from their inventions: BBC News, July 4, 2013. Some of the world's greatest inventions barely earned pennies for their inventors.
  • Designers Versus Inventors by Alice Rawsthorn. The New York Times, April 21, 2013. Do we design things... or do we invent them? Sometimes we do both—and the distinction is worth making.
  • Taking an Invention From Idea to the Store Shelf by Alina Tugend. The New York Times, August 23, 2013. A good overview of what you can do to turn a brilliant idea into marketable reality.
  • Techstar Interview: Dean Kamen by Bill Robinson. The Huffington Post, March 20, 2012. A detailed interview with the prolific US inventor of the Segway, iBOT wheelchair, and others.
  • The Richard Dimbleby Lecture: Engineering the Future by James Dyson. BBC News, December 9, 2004. James Dyson explains why manufacturing is crucial to a successful economy—and why he considers engineering to be the future.

The Golden Quarter

We live in a golden age of technological, medical, scientific and social progress. Look at our computers! Look at our phones! Twenty years ago, the internet was a creaky machine for geeks. Now we can’t imagine life without it. We are on the verge of medical breakthroughs that would have seemed like magic only half a century ago: cloned organs, stem-cell therapies to repair our very DNA. Even now, life expectancy in some rich countries is improving by five hours a day. A day! Surely immortality, or something very like it, is just around the corner.
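A quick back-of-envelope check puts that rate in perspective (the arithmetic below simply extends the essay’s own five-hours-a-day figure; it is an illustration, not a new data source):

    5 hours/day × 365 days ≈ 1,825 hours ≈ 76 days of extra life expectancy per calendar year

In other words, life expectancy rising by roughly two and a half months every year, or about two years per decade, broadly in line with the long-run gains demographers have reported for the richest countries.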

The notion that our 21st-century world is one of accelerating advances is so dominant that it seems churlish to challenge it. Almost every week we read about ‘new hopes’ for cancer sufferers, developments in the lab that might lead to new cures, talk of a new era of space tourism and super-jets that can fly round the world in a few hours. Yet a moment’s thought tells us that this vision of unparalleled innovation can’t be right, that many of these breathless reports of progress are in fact mere hype, speculation – even fantasy.

Yet there once was an age when speculation matched reality. It spluttered to a halt more than 40 years ago. Most of what has happened since has been merely incremental improvements upon what came before. That true age of innovation – I’ll call it the Golden Quarter – ran from approximately 1945 to 1971. Just about everything that defines the modern world either came about, or had its seeds sown, during this time. The Pill. Electronics. Computers and the birth of the internet. Nuclear power. Television. Antibiotics. Space travel. Civil rights.

There is more. Feminism. Teenagers. The Green Revolution in agriculture. Decolonisation. Popular music. Mass aviation. The birth of the gay rights movement. Cheap, reliable and safe automobiles. High-speed trains. We put a man on the Moon, sent a probe to Mars, beat smallpox and discovered the double-helix key of life. The Golden Quarter was a unique period of less than a single human generation, a time when innovation appeared to be running on a mix of dragster fuel and dilithium crystals.

Today, progress is defined almost entirely by consumer-driven, often banal improvements in information technology. The US economist Tyler Cowen, in his essay The Great Stagnation (2011), argues that, in the US at least, a technological plateau has been reached. Sure, our phones are great, but that’s not the same as being able to fly across the Atlantic in eight hours or eliminating smallpox. As the US technologist Peter Thiel once put it: ‘We wanted flying cars, we got 140 characters.’

Economists describe this extraordinary period in terms of increases in wealth. After the Second World War came a quarter-century boom; GDP per head in the US and Europe rocketed. New industrial powerhouses arose: Japan rebuilt itself from the ashes, and Germany experienced its Wirtschaftswunder. Even the Communist world got richer. This growth has been attributed to massive postwar government stimulus plus a happy nexus of low fuel prices, population growth and high Cold War military spending.

But alongside this was that extraordinary burst of human ingenuity and societal change. This is commented upon less often, perhaps because it is so obvious, or maybe it is seen as a simple consequence of the economics. We saw the biggest advances in science and technology: if you were a biologist, physicist or materials scientist, there was no better time to be working. But we also saw a shift in social attitudes every bit as profound. In even the most enlightened societies before 1945, attitudes to race, sexuality and women’s rights were what we would now consider antediluvian. By 1971, those old prejudices were on the back foot. Simply put, the world had changed.

But surely progress today is real? Well, take a look around. Look up and the airliners you see are basically updated versions of the ones flying in the 1960s – slightly quieter TriStars with better avionics. In 1971, a regular airliner took eight hours to fly from London to New York; it still does. And in 1971, there was one airliner that could do the trip in three hours. Now, Concorde is dead. Our cars are faster, safer and use less fuel than they did in 1971, but there has been no paradigm shift.

And yes, we are living longer, but this has disappointingly little to do with any recent breakthroughs. Since 1970, the US Federal Government has spent more than $100 billion on what President Richard Nixon dubbed the ‘War on Cancer’. Far more has been spent globally, with most wealthy nations boasting well-funded cancer-research bodies. Despite these billions in investment, this war has been a spectacular failure. In the US, the death rates for all kinds of cancer dropped by only 5 per cent in the period 1950-2005, according to the National Center for Health Statistics. Even if you strip out confounding variables such as age (more people are living long enough to get cancer) and better diagnosis, the blunt fact is that, with most kinds of cancer, your chances in 2014 are not much better than they were in 1974. In many cases, your treatment will be pretty much the same.

For the past 20 years, as a science writer, I have covered such extraordinary medical advances as gene therapy, cloned replacement organs, stem-cell therapy, life-extension technologies, the promised spin-offs from genomics and tailored medicine. None of these new treatments is yet routinely available. The paralyzed still cannot walk, the blind still cannot see. The human genome was decoded (one post-Golden Quarter triumph) nearly 15 years ago and we’re still waiting to see the benefits that, at the time, were confidently asserted to be ‘a decade away’. We still have no real idea how to treat chronic addiction or dementia. The recent history of psychiatric medicine is, according to one eminent British psychiatrist I spoke to, ‘the history of ever-better placebos’. And most recent advances in longevity have come about by the simple expedient of getting people to give up smoking, eat better, and take drugs to control blood pressure.

There has been no new Green Revolution. We still drive steel cars powered by burning petroleum spirit or, worse, diesel. There has been no new materials revolution since the Golden Quarter’s advances in plastics, semi-conductors, new alloys and composite materials. After the dizzying breakthroughs of the early- to mid-20th century, physics seems (Higgs boson aside) to have ground to a halt. String Theory is apparently our best hope of reconciling Albert Einstein with the Quantum world, but as yet, no one has any idea if it is even testable. And nobody has been to the Moon for 42 years.

Why has progress stopped? Why, for that matter, did it start when it did, in the dying embers of the Second World War?

One explanation is that the Golden Quarter was the simple result of economic growth and technological spinoffs from the Second World War. It is certainly true that the war sped the development of several weaponisable technologies and medical advances. The Apollo space programme probably could not have happened when it did without the aerospace engineer Wernher von Braun and the V-2 ballistic missile. But penicillin, the jet engine and even the nuclear bomb were on the drawing board before the first shots were fired. They would have happened anyway.

Conflict spurs innovation, and the Cold War played its part – we would never have got to the Moon without it. But someone has to pay for everything. The economic boom came to an end in the 1970s with the collapse of the 1944 Bretton Woods monetary agreements and the oil shocks. So did the great age of innovation. Case closed, you might say.

And yet, something doesn’t quite fit. The 1970s recession was temporary: we came out of it soon enough. What’s more, in terms of Gross World Product, the world is between two and three times richer now than it was then. There is more than enough money for a new Apollo, a new Concorde and a new Green Revolution. So if rapid economic growth drove innovation in the 1950s and ’60s, why has it not done so since?

In The Great Stagnation, Cowen argues that progress ground to a halt because the ‘low-hanging fruit’ had been plucked off. These fruits include the cultivation of unused land, mass education, and the capitalisation by technologists of the scientific breakthroughs made in the 19th century. It is possible that the advances we saw in the period 1945-1970 were similarly quick wins, and that further progress is much harder. Going from the prop-airliners of the 1930s to the jets of the 1960s was, perhaps, just easier than going from today’s aircraft to something much better.

But history suggests that this explanation is fanciful. During periods of technological and scientific expansion, it has often seemed that a plateau has been reached, only for a new discovery to shatter old paradigms completely. The most famous example was when, in 1900, Lord Kelvin declared physics to be more or less over, just a few years before Einstein proved him comprehensively wrong. As late as the turn of the 20th century, it was still unclear how powered, heavier-than-air aircraft would develop, with several competing theories left floundering in the wake of the Wright brothers’ triumph (which no one saw coming).

Lack of money, then, is not the reason that innovation has stalled. What we do with our money might be, however. Capitalism was once the great engine of progress. It was capitalism in the 18th and 19th centuries that built roads and railways, steam engines and telegraphs (another golden era). Capital drove the industrial revolution.

Now, wealth is concentrated in the hands of a tiny elite. A report by Credit Suisse in October 2014 found that the richest 1 per cent of humans own half the world’s assets. That has consequences. Firstly, there is a lot more for the hyper-rich to spend their money on today than there was in the golden age of philanthropy in the 19th century. The superyachts, fast cars, private jets and other gewgaws of Planet Rich simply did not exist when people such as Andrew Carnegie walked the earth and, though they are no doubt nice to have, these fripperies don’t much advance the frontiers of knowledge. Furthermore, as the French economist Thomas Piketty pointed out in Capital in the Twenty-First Century (2014), money now begets money more than at any time in recent history. When wealth accumulates so spectacularly by doing nothing, there is less impetus to invest in genuine innovation.
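Putting rough numbers on Piketty’s point makes the mechanism clearer (the figures below are illustrative assumptions, not Piketty’s own data): suppose invested capital returns about 5 per cent a year while the wider economy grows at about 1.5 per cent. The rule of 72 gives the doubling times:

    capital at 5 per cent: 72 / 5 ≈ 14 years to double
    economy at 1.5 per cent: 72 / 1.5 ≈ 48 years to double

When a passive fortune compounds more than three times faster than the economy around it, sitting on wealth beats betting it on risky new inventions almost by default.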

During the Golden Quarter, inequality in the world’s economic powerhouses was, remarkably, declining. In the UK, the decline continued for a few years longer, reaching a historic low point in 1977. Is it possible that there could be some relationship between equality and innovation? Here’s a sketch of how that might work.

As success comes to be defined by the amount of money one can generate in the very short term, progress is in turn defined not by making things better, but by rendering them obsolete as rapidly as possible so that the next iteration of phones, cars or operating systems can be sold to a willing market.

In particular, when share prices are almost entirely dependent on growth (as opposed to market share or profit), built-in obsolescence becomes an important driver of ‘innovation’. Half a century ago, makers of telephones, TVs and cars prospered by building products that their buyers knew (or at least believed) would last for many years. No one sells a smartphone on that basis today; the new ideal is to render your own products obsolete as fast as possible. Thus the purpose of the iPhone 6 is not to be better than the iPhone 5, but to make aspirational people buy a new iPhone (and feel better for doing so). In a very unequal society, aspiration becomes a powerful force. This is new, and the paradoxical result is that true innovation, as opposed to its marketing proxy, is stymied. In the 1960s, venture capital was willing to take risks, particularly in the emerging electronic technologies. Now it is more conservative, funding start-ups that offer incremental improvements on what has gone before.

But there is more to it than inequality and the failure of capital.

During the Golden Quarter, we saw a boom in public spending on research and innovation. The taxpayers of Europe, the US and elsewhere replaced the great 19th‑century venture capitalists. And so we find that nearly all the advances of this period came either from tax-funded universities or from popular movements. The first electronic computers came not from the labs of IBM but from the universities of Manchester and Pennsylvania. (Even the 19th-century analytical engine of Charles Babbage was directly funded by the British government.) The early internet came out of the University of California, not Bell or Xerox. Later on, the world wide web arose not from Apple or Microsoft but from CERN, a wholly public institution. In short, the great advances in medicine, materials, aviation and spaceflight were nearly all pump-primed by public investment. But since the 1970s, an assumption has been made that the private sector is the best place to innovate.

The story of the past four decades might seem to cast doubt on that belief. And yet we cannot pin the stagnation of ingenuity on a decline in public funding. Tax spending on research and development has, in general, increased in real and relative terms in most industrialised nations even since the end of the Golden Quarter. There must be another reason why this increased investment is not paying more dividends.

Could it be that the missing part of the jigsaw is our attitude towards risk? Nothing ventured, nothing gained, as the saying goes. Many of the achievements of the Golden Quarter just wouldn’t be attempted now. The assault on smallpox, spearheaded by a worldwide vaccination campaign, probably killed several thousand people, though it saved tens of millions more. In the 1960s, new medicines were rushed to market. Not all of them worked and a few (thalidomide) had disastrous consequences. But the overall result was a medical boom that brought huge benefits to millions. Today, this is impossible.

The time for a new drug candidate to gain approval in the US rose from less than eight years in the 1960s to nearly 13 years by the 1990s. Many promising new treatments now take 20 years or more to reach the market. In 2011, several medical charities and research institutes in the UK accused EU-driven clinical regulations of ‘stifling medical advances’. It would not be an exaggeration to say that people are dying in the cause of making medicine safer.

Risk-aversion has become a potent weapon in the war against progress on other fronts. In 1992, the Swiss genetic engineer Ingo Potrykus developed a variety of rice in which the grain, rather than the leaves, contains a large concentration of beta-carotene, the precursor the body uses to make vitamin A. Deficiency in this vitamin causes blindness and death among hundreds of thousands every year in the developing world. And yet, thanks to a well-funded fear-mongering campaign by anti-GM fundamentalists, the world has not seen the benefits of this invention.

In the energy sector, civilian nuclear technology was hobbled by a series of high-profile ‘disasters’, including Three Mile Island (which killed no one) and Chernobyl (which killed only dozens). These incidents caused a global hiatus in research that could, by now, have given us safe, cheap and low-carbon energy. The climate change crisis, which might kill millions, is one of the prices we are paying for 40 years of risk-aversion.

Apollo almost certainly couldn’t happen today. That’s not because people aren’t interested in going to the Moon any more, but because the risk – calculated at a couple-of-per-cent chance of astronauts dying – would be unacceptable. Boeing took a huge risk when it developed the 747, an extraordinary 1960s machine that went from drawing board to flight in under five years. Its modern equivalent, the Airbus A380 (only slightly larger and slightly slower), first flew in 2005 – 15 years after the project go-ahead. Scientists and technologists were generally celebrated 50 years ago, when people remembered what the world was like before penicillin, vaccination, modern dentistry, affordable cars and TV. Now, we are distrustful and suspicious – we have forgotten just how dreadful the world was pre-Golden Quarter.

Risk played its part, too, in the massive postwar shift in social attitudes. People, often the young, were prepared to take huge, physical risks to right the wrongs of the pre-war world. The early civil rights and anti-war protestors faced tear gas or worse. In the 1960s, feminists faced social ridicule, media opprobrium and violent hostility. Now, mirroring the incremental changes seen in technology, social progress all too often finds itself down the blind alleyways of political correctness. Student bodies used to be hotbeds of dissent, even revolution; today’s hyper-conformist youth is more interested in policing language and stifling debate that counters the prevailing wisdom. Forty years ago a burgeoning media allowed dissent to flower. Today’s very different social media seems, despite democratic appearances, to be enforcing a climate of timidity and encouraging groupthink.

Does any of this really matter? So what if the white heat of technological progress is cooling off a bit? The world is, in general, far safer, healthier, wealthier and nicer than it has ever been. The recent past was grim; the distant past disgusting. As Steven Pinker and others have argued, levels of violence in most human societies had been declining since well before the Golden Quarter and have continued to decline since.

We are living longer. Civil rights have become so entrenched that gay marriage is being legalised across the world and any old-style racist thinking is met with widespread revulsion. The world is better in 2014 than it was in 1971.

And yes, we have seen some impressive technological advances. The modern internet is a wonder, more impressive in many ways than Apollo. We might have lost Concorde but you can fly across the Atlantic for a couple of days’ wages – remarkable. Sci-fi visions of the future often had improbable spacecraft and flying cars but, even in Blade Runner’s Los Angeles of 2019, Rick Deckard had to use a payphone to call Rachael.

But it could have been so much better. If the pace of change had continued, we could be living in a world where Alzheimer’s was treatable, where clean nuclear power had ended the threat of climate change, where the brilliance of genetics was used to bring the benefits of cheap and healthy food to the bottom billion, and where cancer really was on the back foot. Forget colonies on the Moon; if the Golden Quarter had become the Golden Century, the battery in your magic smartphone might even last more than a day.

Michael Hanlon was a science journalist whose work appeared in The Sunday Times and The Daily Telegraph, among others. His last book was In the Interests of Safety (2014), co-written with Tracey Brown. He lived in London.

Originally published at aeon.co.
