
January 11, 2008

Addressing Existential Risks Associated With The Singularity

Ryan's comments on my Singularity Institute for Artificial Intelligence article were very thought-provoking. Some of the coming technological advances carry existential risks, and I wanted to solidify my strategy for addressing those risks.

I have tried to be careful about which organizations I associate with, guided by my philosophy of actively enabling the Singularity in a safe manner. The Singularity Institute for Artificial Intelligence has a goal of promoting the development of friendly Artificial General Intelligence. The Lifeboat Foundation is a think tank that contemplates the risks associated with the Singularity and acts as a voice of reason in promoting a balanced approach to these technologies.

You can see a thought experiment going on right now at the Lifeboat Foundation: a poll assessing specific existential risks (runaway nanotech, unfriendly artificial intelligence, asteroid impacts, etc.) and my thoughts on how to allocate a hypothetical $100M budget among them.
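
To make the budget question concrete, here is a minimal sketch (in Python) of one way to turn risk estimates into an allocation: score each risk by probability times severity, then split the money proportionally. The specific probabilities and severities below are placeholders I made up for illustration, not figures from the poll.

```python
# A rough sketch of scoring existential risks and splitting a
# hypothetical $100M budget among them. The probability and severity
# figures are illustrative placeholders, not numbers from the poll.

BUDGET = 100_000_000  # the hypothetical $100M

# risk -> (assumed chance of occurring, assumed relative severity 0..1)
risks = {
    "runaway nanotech": (0.05, 1.0),
    "unfriendly AI": (0.10, 1.0),
    "asteroid impact": (0.01, 0.8),
    "bioweapons/pandemics": (0.10, 0.6),
}

# Expected-harm score for each risk, then a proportional budget split.
scores = {name: p * sev for name, (p, sev) in risks.items()}
total = sum(scores.values())

for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name:22s} ${BUDGET * score / total:,.0f}")
```

Proportional allocation is only one defensible rule, of course; a real budget would also weigh how tractable each risk is, i.e., how much harm each marginal dollar actually removes.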

The Lifeboat Foundation has seven active programs (plus a number of planned programs):

AsteroidShield
To protect against devastating asteroid strikes.

BioShield
To protect against bioweapons and pandemics.

InternetShield
As the Internet grows in importance, an attack on it could cause physical as well as informational damage. Today, an attack on hospital systems or electric utilities could lead to deaths; in the future, an attack could alter the output of nanofactories worldwide, causing massive loss of life.

LifeShield Bunkers
To develop fallback positions on Earth in case programs such as our BioShield and NanoShield fail globally or locally.

NanoShield
To protect against ecophages and nonreplicating nanoweapons.

SecurityPreserver
To prevent nuclear, biological, and nanotechnological attacks by using surveillance and sousveillance to identify terrorists before they can launch their attacks.

Space Habitats
To build fail-safes against global existential risks by encouraging the spread of sustainable human civilization beyond Earth.

As with anything that can be put to inappropriate use, I believe undesirable entities will try to exploit these technologies no matter what we do, so it is paramount that we stay ahead by developing them first and by building defenses against their misuse.

Only a small minority of the general public knows about these ideas at this point because of relatively low media coverage. Over the next few years, as the time horizon for deploying these technologies shortens, the general public will begin calling for more regulation, and organizations like the SIAI and the Lifeboat Foundation are the ones they will turn to for guidance.

Suggested Posts

Singularity Institute for Artificial Intelligence
What is the Singularity?
The Singularity Effect
Upcoming Artificial Intelligence Events
Distributed Computing Projects and the Singularity

Chris K. Haley, NestedUniverse.net. Subscribe: get free RSS or email updates here.
