INDY Week 3.09.16


THE END: WHAT RELIGION AND SCIENCE TELL US ABOUT THE APOCALYPSE By Phil Torres (Pitchstone Publishing, 288 pp.)

Apocalypse How?

NANOBOTS, BIOTECH, AND A.I. ARE THE NEAR FUTURE—OR THE END

BY BRIAN HOWE

Humanity has long been obsessed with the end of the world. All of the major religions are built on eschatological bedrock; parsing the apocalypse as an act of God befitted a world threatened mainly by natural disasters. But as these metaphysical beliefs wane in the uniquely imperiled twenty-first century, we are waking up to a new, almost incredible, but very real set of threats to our species' survival. Instead of a volcanic eruption or meteor strike, we have to worry—the day after tomorrow, if not quite today—about being wiped out by designer pathogens, subjugated by superintelligent computers, or mulched into gray goo by self-replicating nanobots. Because these dangers emerge from our own godlike technological powers, we can no longer turn to a distant deity, or even something as vague as fate, for succor or recrimination. If the worst comes to pass, we'll have only ourselves to blame.

Carrboro's Phil Torres illuminates this fascinating, unnerving terrain in his new book, The End: What Religion and Science Tell Us About the Apocalypse. Torres, who specializes in existential risk studies, trained in philosophy at the University of Maryland and Harvard and in neuroscience at Brandeis. His learning fuels a book that is academically rigorous but accessible to a general audience, aided by his personal story of coming to a scholarly interest in the apocalypse through a religious upbringing. The INDY recently sat down with Torres for a pleasant chat about humanity's extinction.

INDY: How did you get interested in the apocalypse?

PHIL TORRES: I grew up in a fundamentalist evangelical family; the philosophy was sort of Dispensationalist. There was always this background of eschatological expectation—lots of discussion about the Rapture. I have a vivid memory of being afraid Bill Clinton was the Antichrist. Eschatology is the heart of the major world religions. If you were to excise it, there really wouldn't be anything left. It's the ultimate hope.

Around 2001, there was this new field, existential risk studies, that began to pop up at universities, notably Oxford and Cambridge. To an extent, their warnings are similar to those of religious people, but with a completely different basis—one in science and evidence. Apocalyptic tendencies go back at least to Zoroaster, two millennia before Christ. But since 1945 and the atomic bomb, when the possibility of secular apocalypse emerged on the world stage, there has been a proliferation of risks associated with biotechnology, nanotechnology, and artificial intelligence. So first I was drawn in by religion, and then, ultimately, fascinated by the possibility of human extinction in scenarios based on legitimate science.

Tell us more about existential risk studies.

We used to be haunted by improbable worst-case scenarios, like volcanic eruptions, but the likelihood of extinction was quite low. Many say 1945—or, you could argue, earlier, when automobiles were adopted en masse, burning fossil fuels—was the first big-picture hazard, and the fact that we survived the Cold War, a lot of scholars say, was luck. We entered a new epoch in which self-annihilation was a probability, and right behind it was climate change. Peering into the future, you can see the threat rising. Existential risk studies is focused on understanding these risks and determining strategies for eliminating them. It's necessarily partly speculative, but if feeling around in the dark is the best we can do, let's do it. Now is the time to think as clearly as we can about these threats.

What are the existential risks of biotechnology and synthetic biology?

The primary danger is the creation of designer pathogens. Nature has a check on lethality: if a pathogen is too lethal, it's not going to get to the next host, so there's natural selection. But if you were to weaponize Ebola to make it more contagious, you don't need selfish genes. You can give it properties it would never naturally acquire. Biotechnology is about modifying natural structures; synthetic biology is about creating new structures. It allows you to create entirely new germs. It's entirely plausible that the lethality of rabies, the contagiousness of the common cold, and so on could be consolidated in a single microbe. If effectively aerosolized, it could create a pandemic of unprecedented proportions.

What about nanotechnology?

One danger is the creation of nanofactories, which, theoretically, could be small enough to fit on your desktop. You feed it a really simple molecule and, moving the atoms individually, it would assemble objects. All the properties around us—transparency, hardness, softness—are just built up combinatorially from the atomic level. So a nanofactory could build virtually any object: a new computer, clothes. I mentioned beneficial things, but you could also print out weapons. This will enable individuals to acquire arsenals.

How is it different than 3-D printing?

3-D printing is also called additive printing; you feed it plastic and it builds an object. But this is about synthesizing the material from the atom up—plastics, metals, whatever. The other risk of nanotechnology is building immensely more powerful computers, which could feed into the danger of superintelligence. And one more risk involves autonomous nanobots rather than a nanofactory.

Superintelligent nanobots?

Kind of the opposite, actually—these nanobots are really dumb. Rather than moving molecules around in a factory, these are microscopic robots you can program. Imagine swallowing the surgeon so it can fix your heart. But the other possibility is nanobots designed to self-replicate. You drop them on a table and they start manipulating its molecules to create clones. A terrorist with the ultimate suicide wish could release a swarm of mindlessly, exponentially reproducing micromachines and not implausibly turn the Earth to dust. Scientists call this the "gray goo scenario." That's an issue of extraordinary power and increasing accessibility, which also applies to biotechnology and synthetic biology.

The risks of nuclear weapons seem pretty self-evident.

The existential issue is primarily the creation

