Institute on Biotechnology and the Human Future


Another View of the Singularity Summit: Is the "Singularity" for Everyone?

Dawn M. Willow, J.D.
Legal Fellow
Institute on Biotechnology and the Human Future and the Center on Nanotechnology and Society at Chicago-Kent College of Law/Illinois Institute of Technology


When world chess champion Garry Kasparov was defeated by a computer in 1997, and when computers became capable of mimicking the styles of Bach, Mozart, and Chopin so closely that music aficionados could hardly distinguish between the human-composed and the machine-composed scores, the distinction between man and machine became obscured: computers seemed to demonstrate skills that require creativity, a quality long thought to be uniquely human. But while computers may become capable of more and more human-like tasks, or may exceed certain human cognitive abilities, can artificial intelligence ever capture human ingenuity? Are we approaching technological changes that will merge biological and non-biological intelligence, fuse the man-machine relationship, and blur the lines between reality and virtual reality? These ideas and others were the topics of discussion at the Singularity Summit, held on May 13, 2006, at Stanford University. The event drew more than 1,800 technophiles, scientists, academics, entrepreneurs, and curious observers from across the country.

Ray Kurzweil, pre-eminent inventor and author of The Singularity Is Near: When Humans Transcend Biology, keynoted the event. He spoke non-dogmatically about accelerating change, exponential growth, and the paradigm shifts of science that are, in theory, leading to the "Singularity." One panelist referred to the "Singularity" as the "intelligence explosion." Put another way, the concept of the "Singularity," as described by the Singularity Institute for Artificial Intelligence, is the "capab[ility] of technologically creating smarter-than-human intelligence, perhaps through enhancement of the human brain, direct links between computers and the brain, or Artificial Intelligence. This event is called the 'Singularity' by analogy with the singularity at the center of a black hole: just as our current model of physics breaks down when it attempts to describe the center of a black hole, our model of the future breaks down once the future contains smarter-than-human minds."

The field of artificial intelligence (A.I.) brings together neuroanatomy, evolutionary psychology, mathematics, nanotechnology, and computing, among many other disciplines. A.I. technology is used to create culturally liberating inventions such as real-time language translators and self-driving cars. But will other applications create a technological trap for their creators? In order to answer both technical questions concerning the advancement of A.I. and philosophical questions concerning the future of humankind in an A.I.-dominated world, speakers addressed: the ways the brain differs from conventional computers; what distinguishes humans from other primates; and how the laws of technological acceleration may transform humanity.

For example, Kurzweil pointed out that much of human intelligence is based on pattern recognition (the quintessential example of self-organization, i.e., the process by which the internal organization of a system increases in complexity without being guided or managed by an outside source), and that replicating this skill via A.I. presents a great challenge for scientists. However, through reverse engineering of the human brain and the leveraging of pattern recognition, Kurzweil ambitiously predicted that A.I. will surpass the human mind in just a few decades. Kurzweil explained that reverse engineering of the auditory cortex could derive principles of operation, which can be expressed as mathematics and simulated by computer programming, and that, thereby, the brain's capacity could be compressed into about 20 megabytes of data. He further predicted that, by 2020, the power of the human brain could be contained in a $1,000 personal computer.

These Singularity predictions were welcomed by many transhumanists and post-human futurists in attendance, who seem to be working towards the ultimate goal of creating a utopia characterized by immortality, which they postulate may be realized by "uploading" the content of one's brain onto a computational substrate in order to exist in non-corporeal form somewhere in cyberspace, where one could "live" forever. Nevertheless, Douglas Hofstadter, professor of cognitive science and computer science, and adjunct professor of history and philosophy of science, philosophy, comparative literature, and psychology at Indiana University, offered a critique of these Singularity predictions that blur the lines between science and science fiction. Hofstadter expressed concern about the idea of humans becoming software entities inside of computing hardware. For instance, if sentient life can exist in silicon substrates, could the Singularity bring about a "planned obsolescence" of humans as we exist today?

The idea of existing or achieving happiness in a transhuman state facilitated by A.I. raises the questions of what it means to be human and the purpose and essence of our existence, as well as how, how long, and where we should exist. During the question-and-answer session, one audience member asked the panel whether any religious studies had been undertaken to address the implications of A.I. After a moment of silence (not in reverence), one panelist wryly stated something along the lines of: "I hope not."

Perhaps a transhumanist would find no need for religion in a world where we can upload ourselves into cyber-heaven. While some scientists may find religion a stagnating factor in technological progress, and while science and religion may arrive at different conclusions about our human nature, Singularity enthusiasts and people of faith both share a strong and hopeful vision for the future of humanity, although the means by which this vision should become reality differ.
