  • I didn't always love unintended consequences,

  • but I've really learned to appreciate them.

  • I've learned that they're really the essence

  • of what makes for progress,

  • even when they seem to be terrible.

  • And I'd like to review

  • just how unintended consequences

  • play the part that they do.

  • Let's go to 40,000 years before the present,

  • to the time of the cultural explosion,

  • when music, art, technology,

  • so many of the things that we're enjoying today,

  • so many of the things that are being demonstrated at TED

  • were born.

  • And the anthropologist Randall White

  • has made a very interesting observation:

  • that if our ancestors

  • 40,000 years ago

  • had been able to see

  • what they had done,

  • they wouldn't have really understood it.

  • They were responding

  • to immediate concerns.

  • They were making it possible for us

  • to do what we do,

  • and yet, they didn't really understand

  • how they did it.

  • Now let's advance to 10,000 years before the present.

  • And this is when it really gets interesting.

  • What about the domestication of grains?

  • What about the origins of agriculture?

  • What would our ancestors 10,000 years ago

  • have said

  • if they really had technology assessment?

  • And I could just imagine the committees

  • reporting back to them

  • on where agriculture was going to take humanity,

  • at least in the next few hundred years.

  • It was really bad news.

  • First of all, worse nutrition,

  • maybe shorter life spans.

  • It was simply awful for women.

  • The skeletal remains from that period

  • have shown that they were grinding grain morning, noon and night.

  • And politically, it was awful.

  • It was the beginning of a much higher degree

  • of inequality among people.

  • If there had been rational technology assessment then,

  • I think they very well might have said,

  • "Let's call the whole thing off."

  • Even now, our choices are having unintended effects.

  • Historically, for example,

  • chopsticks -- according to one Japanese anthropologist

  • who wrote a dissertation about it

  • at the University of Michigan --

  • resulted in long-term changes

  • in the dentition, in the teeth,

  • of the Japanese public.

  • And we are also changing our teeth right now.

  • There is evidence

  • that the human mouth and teeth

  • are growing smaller all the time.

  • That's not necessarily a bad unintended consequence.

  • But I think from the point of view of a Neanderthal,

  • there would have been a lot of disapproval

  • of the wimpish choppers that we now have.

  • So these things are kind of relative

  • to where you or your ancestors happen to stand.

  • In the ancient world

  • there was a lot of respect for unintended consequences,

  • and there was a very healthy sense of caution,

  • reflected in the Tree of Knowledge,

  • in Pandora's Box,

  • and especially in the myth of Prometheus

  • that's been so important

  • in recent metaphors about technology.

  • And that's all very true.

  • The physicians of the ancient world --

  • especially the Egyptians,

  • who started medicine as we know it --

  • were very conscious

  • of what they could and couldn't treat.

  • And the translations of the surviving texts say,

  • "This I will not treat. This I cannot treat."

  • They were very conscious.

  • So were the followers of Hippocrates.

  • The Hippocratic manuscripts also --

  • repeatedly, according to recent studies --

  • show how important it is not to do harm.

  • More recently,

  • Harvey Cushing,

  • who really developed neurosurgery as we know it,

  • who changed it from a field of medicine

  • that had a majority of deaths resulting from surgery

  • to one in which there was a hopeful outlook,

  • he was very conscious

  • that he was not always going to do the right thing.

  • But he did his best,

  • and he kept meticulous records

  • that let him transform that branch of medicine.

  • Now if we look forward a bit

  • to the 19th century,

  • we find a new style of technology.

  • What we find is,

  • no longer simple tools,

  • but systems.

  • We find more and more

  • complex arrangements of machines

  • that make it harder and harder

  • to diagnose what's going on.

  • And the first people who saw that

  • were the telegraphers of the mid-19th century,

  • who were the original hackers.

  • Thomas Edison would have been very, very comfortable

  • in the atmosphere of a software firm today.

  • And these hackers had a word

  • for those mysterious faults in telegraph systems:

  • they called them bugs.

  • That was the origin of the word "bug."

  • This consciousness, though,

  • was a little slow to seep through the general population,

  • even people who were very, very well informed.

  • Samuel Clemens, Mark Twain,

  • was a big investor

  • in the most complex machine of all time --

  • at least until 1918 --

  • registered with the U.S. Patent Office.

  • That was the Paige typesetter.

  • The Paige typesetter

  • had 18,000 parts.

  • The patent had 64 pages of text

  • and 271 figures.

  • It was such a beautiful machine

  • because it did everything that a human being did

  • in setting type --

  • including returning the type to its place,

  • which was a very difficult thing.

  • And Mark Twain, who knew all about typesetting,

  • really was smitten by this machine.

  • Unfortunately, he was smitten in more ways than one,

  • because it made him bankrupt,

  • and he had to tour the world speaking

  • to recoup his money.

  • And this was an important thing

  • about 19th century technology,

  • that all these relationships among parts

  • could make the most brilliant idea fall apart,

  • even when judged by the most expert people.

  • Now there is something else, though, in the early 20th century

  • that made things even more complicated.

  • And that was that safety technology itself

  • could be a source of danger.

  • The lesson of the Titanic, for a lot of contemporaries,

  • was that you must have enough lifeboats

  • for everyone on the ship.

  • And this was the result

  • of the tragic loss of lives

  • of people who could not get into them.

  • However, there was another case, the Eastland,

  • a ship that capsized in the Chicago River in 1915,

  • and it killed 841 people --

  • that was 14 more

  • than the passenger toll of the Titanic.

  • The reason for it, in part, was

  • the extra lifeboats that were added,

  • which made this already unstable ship

  • even more unstable.

  • And that again proves

  • that when you're talking about unintended consequences,

  • it's not that easy to know

  • the right lessons to draw.

  • It's really a question of the system, how the ship was loaded,

  • the ballast and many other things.

  • So the 20th century, then,

  • saw how much more complex reality was,

  • but it also saw a positive side.

  • It saw that invention

  • could actually benefit from emergencies.

  • It could benefit

  • from tragedies.

  • And my favorite example of that --

  • which is not really widely known

  • as a technological miracle,

  • but it may be one of the greatest of all time --

  • was the scaling up of penicillin in the Second World War.

  • Penicillin was discovered in 1928,

  • but even by 1940,

  • no commercially and medically useful quantities of it

  • were being produced.

  • A number of pharmaceutical companies were working on it.

  • They were working on it independently,

  • and they weren't getting anywhere.

  • And the Government Research Bureau

  • brought representatives together

  • and told them that this is something

  • that has to be done.

  • And not only did they do it,

  • but within two years,

  • they scaled up penicillin

  • from preparation in one-liter flasks

  • to 10,000-gallon vats.

  • That was how quickly penicillin was produced

  • and became one of the greatest medical advances of all time.

  • In the Second World War, too,

  • the existence

  • of solar radio emission

  • was demonstrated by studies of interference

  • that was detected by the radar stations of Great Britain.

  • So there were benefits in calamities --

  • benefits to pure science,

  • as well as to applied science

  • and medicine.

  • Now when we come to the period after the Second World War,

  • unintended consequences get even more interesting.

  • And my favorite example of that

  • occurred beginning in 1976,

  • when it was discovered

  • that the bacterium causing Legionnaires' disease

  • had always been present in natural waters,

  • but it was the water temperature

  • in heating, ventilating and air conditioning systems

  • that was just right

  • for the maximum reproduction

  • of Legionella bacillus.

  • Well, technology to the rescue.

  • So chemists got to work,

  • and they developed a bactericide

  • that became widely used in those systems.

  • But something else happened in the early 1980s,

  • and that was that there was a mysterious epidemic

  • of failures of tape drives

  • all over the United States.

  • And IBM, which made them,

  • just didn't know what to do.

  • They commissioned a group of their best scientists

  • to investigate,

  • and what they found was

  • that all these tape drives

  • were located near ventilation ducts.

  • What happened was the bactericide was formulated

  • with minute traces of tin.

  • And these tin particles were deposited on the tape heads

  • and were crashing the tape heads.

  • So they reformulated the bactericide.

  • But what's interesting to me

  • is that this was the first case

  • of a mechanical device

  • suffering, at least indirectly, from a human disease.

  • So it shows that we're really all in this together.

  • (Laughter)

  • In fact, it also shows something interesting,

  • that although our capabilities and technology

  • have been expanding geometrically,

  • unfortunately, our ability to model their long-term behavior,

  • which has also been increasing,

  • has been increasing only arithmetically.

  • So one of the characteristic problems of our time

  • is how to close this gap

  • between capabilities and foresight.

  • One other very positive consequence

  • of 20th century technology, though,

  • was the way in which other kinds of calamities

  • could lead to positive advances.

  • There are two historians of business

  • at the University of Maryland,