Monday, March 25, 2002

Technology and Ethics in the 21st Century

On Monday 19 November 2001, I attended a panel discussion at Washington National Cathedral: "Are We Becoming an Endangered Species? Technology and Ethics in the 21st Century." I will attempt to capture the flow of the discussion between the four panelists and then offer some comments of my own on the important topics that were raised.

Judy Woodruff, anchor and senior correspondent at CNN, served as moderator. In addition to daily reporting, Woodruff anchors "Inside Politics," and she moderated several Republican presidential debates for CNN's Election 2000 coverage. She introduced the four panel members:

Bill Joy is co-founder of Sun Microsystems and a pioneer in the development of the Internet. He entered this debate with his article, "Why the Future Doesn't Need Us," in the April 2000 issue of Wired.

Ray Kurzweil invented the first flatbed scanner, the first text-to-speech synthesizer, and pattern-recognition software designed to teach computers to recognize the abstract patterns that dominate human thinking. He is the author of The Age of Spiritual Machines: When Computers Exceed Human Intelligence. Kurzweil came prepared with a briefing paper.

Anne Foerst is visiting professor of Theology and Computer Science at St. Bonaventure University. She recently served as a research scientist at MIT's Artificial Intelligence Laboratory and as theological advisor on MIT's projects to develop embodied, autonomous robots.

Bill McKibben, a former staff writer for The New Yorker, is author of The End of Nature, the first book for a general audience about global warming, as well as The Age of Missing Information. (He also has the distinction of being classified as an "environmental wacko" by Rush Limbaugh.)

Woodruff introduced the topic by referring to Bill Joy's article: "It is most of all the power of destructive self-replication in genetics, nanotechnology, and robotics (GNR) that should give us pause." Such technology makes for cheap production of goods, but an accident could destroy the world. It could also be used for warfare. It does not reproduce by means of succeeding generations – its memory lives forever, making for an inhuman intelligence. Meanwhile, dangerous information is widely available on the internet – a boon to science, but also a threat through availability to terrorists. Are we an endangered species?

Bill Joy: Yes, we are becoming an endangered species. Technology poses the danger of catastrophe. For example, research is currently under way to combine genetic material from the HIV and Ebola viruses. Can we construct technological fixes to technological dangers? Defense is difficult – for instance, bacteria are becoming resistant to antibiotics. We cannot look to technology to save us from technology.

Ray Kurzweil: Perhaps we should relinquish certain kinds of technology, on the grounds that we already have enough. If we had told people of the 18th century what the 20th century would look like, they would have said, "Let's not go there." But there were also advantages to technology – it has relieved suffering and extended life spans. There is still suffering that technology can relieve. And I cannot agree that defensive technology is necessarily weaker than the dangers.

Anne Foerst: We are created in God's image. When we are creative, we partake of God's mark in us. We cannot forbid technology. It is our nature to want to know things. Forbidding it would just push it underground, where it would no longer be subject to public scrutiny. Technology is often seen as "other" – as our opponent. But it is really an extension of us. The end of paradise came when man began judging what was good and evil – our knowledge is limited, unlike God's.

Bill McKibben: Are we at a moment that is in continuity with past development? Or are we, rather, at a special moment of discontinuity? [McKibben argues for the latter.] We have let technology grow too large, to the point where we can interfere with nature at the macro level. Theologians are now talking about a "trans-human" or "post-human" future. This marks the current moment as special – different from every moment since the world began.

(From here on, a paragraph might jump from one point to another, as a panelist responds in order to various comments by the other panelists, and my hurried notes do not provide much introductory and transitional material.)

Joy: If crazed individuals can determine the future, this is undemocratic – it gives us no say in our future. The Enlightenment had a moral basis in the relief of suffering, but now we have new gadgets just for convenience. The moral foundation of science is now absent.

Kurzweil: We cannot relinquish nanotechnology, without relinquishing all technology. Miniaturization is an ongoing trend in all technological fields, not a distinct field in itself. Banning technology would condemn billions to suffering. The fact that we have kept software viruses to a minor nuisance level shows that technological defense can work. We must increase the resources we devote to expansion of defensive technologies.

McKibben: Let's look at some concrete examples. Genetic manipulation can let us raise the intelligence or the height of our offspring. Nanotechnology that allows the construction of anything from carbon can produce technology more intelligent than humans. We are not philosophically equipped to come out on the other side of this looking anything like human beings – some tech advocates are quite candid about this. It is time for human beings to mature in a way that we have not done previously.

Kurzweil: The future will not be post-human, but post-biological.

Foerst: I bring perspectives from two fields – MIT robotics and theology. We as humans are part of nature – we eat, and drink, and die like all other animals. But we are distinct, in that we have cognition and intelligence. Experiments have shown that our cognition is embodied – that's what it means to be human. Our bodies are a means towards the end of community – we cannot download our intelligence and be disembodied humans. If we are in the image of God, we are relational – i.e., partners. We try to build machines that are partners, and they will be suitable partners, more like us than current machines, with built-in ("embedded") feelings and ethics. Even the most advanced current technology cannot match the embeddedness of insects. We cannot merge with technology, but we can interact with it, as we do with chimps and dolphins. In creating a new species of robots in our own image, we will be fulfilling our own being in God's image.

Joy [citing as an example of hubris someone whose name I did not catch]: He wants to find that part of us that makes us believe in God – but only in order to decide whether to keep it or not. Why are we still doing research on anthrax? Vaccines already exist. (Though malaria grew worse as the parasite evolved resistance to existing treatments.) We must see the world as it really is – not an idealized world of unlimited scientific progress.

Kurzweil: We are already close to breakthroughs against disease. We need to have laws and ethics to limit and direct scientific research. We are seeing the next step of evolutionary progress – increasing complexity, freeing thinking from the limitations of biological form, is a spiritual quest. It would be immoral not to allow technological progress. We are the only species that uses language and whose technology progresses.

McKibben: The lust for technological development is a glib exercise in spiritual development. Humans have long tried to decide how to limit ourselves. Invention is not what makes us human, but the ability to decide not to do something that we are capable of.

After this, there was a period for questions from the audience.

A member of Friends of the Earth: We want to stop genetic engineering, cloning, and designer babies. Joy responds: The advice, "Think twice before doing something dangerous," is, unfortunately, not in the American spirit. Foerst responds: There was recently an ad in Ivy League magazines offering $50,000 for an egg from a woman who meets particular requirements. The problem is that our society has an ideal type and punishes those who do not meet it.

Another environmentalist: Animals do not have the option of developing technological defenses to survive. Joy responds: Alfred Nobel wanted to improve the world, but his explosives killed his brother. We can't afford progress because we can't afford the budget for defense against it. We can't even fight antibiotic-resistant bacteria. Perhaps we need a market mechanism to make labs take out insurance against terrorist use of their experimental materials. Kurzweil responds: We are scared of the duality of our own nature. There are both good and bad sides of technology.

A progressive: Progress is not only technological development, but also improving equity in the social system – in this case, greater access to the benefits of technology. Foerst responds: The "world-wide web" is a misnomer – it is not world-wide. It promises a sort of Kingdom of God that it cannot deliver. Communities are more inclusive – we have learned to accept that people are different from us. People from other cultures are offended by the strength of Western values, which displace other value systems. Kurzweil responds: Technological advances reduce prices. The lagging edge of technology also moves ahead, even if it remains behind the leading edge.

A literature professor, whose class was reading Frankenstein: When humans begin to create life, is this the ultimate evil? Foerst responds: I interpret the story differently. It is not about hubris and a monster, but about denying a creature's personhood because of external criteria.

A questioner asked about the Kyoto Treaty. McKibben responds: It is not good that we were the only country to opt out of this and other treaties. There is no need to create a race of robots to interact with – we have six billion people to interact with. We are now breeding super-salmon four times the size of natural salmon, and they are already escaping and interbreeding with wild salmon. This is just one more example of our re-making the world according to our own self-serving criteria.

A final questioner: What needs to happen in the education system to bring about the moral decision-making required? Foerst responds: We need more interdisciplinarity. Science and the humanities need to learn from each other. We need to stop teaching current science as objective truth and teach the history and philosophy of science with it. This will make for more responsible citizens.

My comments

When Ray Kurzweil says, "The future will not be post-human, but post-biological," he reveals quite candidly the heresy that he is advocating: Gnosticism. The notion that our humanity is separable from our biology is irreconcilable with the incarnational principle that lies at the heart of Christianity – and of reality itself. A non-biological being, by definition, cannot be human (though perhaps it could be a person).

But others, with better semantic grounding than Kurzweil, are just as Gnostic in explicitly advocating a post-human future. Here is where hard technology meets the soft New Age – in the notion that we must "evolve" to a "higher level," shedding our obsolete humanity along the way. (This use of "evolve" has nothing to do with the more familiar use of the term in biology, but those who advocate post-humanity enjoy at least sounding like they are being scientific.) This is just another statement of Gnostic religion.

As always, Gnostic religion is opposed by Incarnational religion. For Christians, our humanity is not something to outgrow and discard, but something to fulfill, perfect, and glorify, by the grace of God. Gnosticism relies on the obvious but false premise that divinity and humanity are mutually exclusive and, moreover, mutually antagonistic. For the Gnostic, in order to become more divine one must become less human. But we look to the example of Christ, who was fully human and fully divine. In him, we see that it is only by fulfilling our God-given human nature that we can hope to become divine.

Kurzweil also expresses the modern notion that suffering is the ultimate (only?) evil, and that the alleviation of suffering trumps every other concern. I suppose that downloading our brains into machines would eliminate all physical suffering. But suffering is part of being human. While we ought to work to relieve the suffering of our brothers and sisters, we must not sacrifice our humanity or theirs in pursuit of this goal. Rather, we must remember that Christ, who shared our humanity, also shared our suffering.

The other panelists all opposed Kurzweil's gospel of technological salvation, at least at times and in certain respects. McKibben showed particular awareness of the importance of defending and preserving our humanity in the face of technological threats. Joy warned against the amorality and hubris that always seem to accompany modernity and technology. And Foerst emphasized the incarnate nature of our humanity and of community.

On the other hand, I think Foerst is naive in her vision of a future race of robot buddies for mankind. She says this creative urge is just evidence that we are made in God's image. I would suggest it is evidence that we are still falling for the serpent's lie that we can become like God by satisfying our curiosity.