Singularity Institute for Artificial Intelligence


Recently, I continued my philosophy of active participation in causes that I support by donating to the Singularity Institute for Artificial Intelligence (SIAI) at the Associate Donor level. What is the SIAI? Their vision is stated simply and elegantly:

"In the coming decades, humanity will likely create a powerful artificial intelligence. The Singularity Institute for Artificial Intelligence exists to confront this urgent challenge, both the opportunity and the risk."

I spoke with Tyler Emerson, the Executive Director of the Institute, about its goals and philosophy, and I came away deeply impressed. Tyler is a remarkable individual who shares this extraordinary vision of the future, is committed to building public support for it, and is taking action because of the immense benefits it could bring to mankind. Hearing his philosophy firsthand was a satisfying and motivating experience.

What will the creation of a superintelligence bring? The best analogy I know compares that question to asking a group of chimpanzees what the discovery of electricity would lead to: they simply lack the conceptual resources to understand the question. This is not meant to degrade the human experience in any way. Human history is remarkable and beautiful beyond compare. As humans, we have an instinct that drives us to improve our quality of life, and creating a superintelligence is the next logical step. However, by definition, a superintelligence is more intelligent than we are, and we may never fully understand its reasoning unless we choose to merge with it through the various technologies that will make this possible.

SIAI has an excellent article on the benefits, goals, and ways to manage the risks associated with creating a superintelligence. Here's an excerpt to whet your appetite:

"A smarter-than-human AI absorbs all unused computing power on the then-existent Internet in a matter of hours; uses this computing power and smarter-than-human design ability to crack the protein folding problem for artificial proteins in a few more hours; emails separate rush orders to a dozen online peptide synthesis labs, and in two days receives via FedEx a set of proteins which, mixed together, self-assemble into an acoustically controlled nanodevice which can build more advanced nanotechnology."

Imagine a smarter-than-human intelligence going on to create cures for cancer, AIDS, and other diseases, as well as solving a whole host of problems that have vexed us for years.

I strongly encourage you to support SIAI in any way that you can. I was motivated to lend my support because I do not want to watch one of the most important events in human history as a bystander. I want to take an active part in it, and forever be associated with this remarkable journey.

Related Posts

The Singularity Summit 2008
Singularity Institute for Artificial Intelligence
What is the Singularity?
The Singularity Effect
Upcoming Artificial Intelligence Events

Chris K. Haley, NestedUniverse.net.

4 thoughts on “Singularity Institute for Artificial Intelligence”

  1. Don’t you ever feel that these types of scientific projects are opening a can of worms we shouldn’t be opening? I am all for bettering the human race through scientific discoveries, but some of this research is downright frightening, IMHO.
    – Ryan

  2. That’s a great, thought-provoking question. You’ve given me an idea for my next article – I’d like to solidify my strategy in that regard.
    There are a significant number of existential risks associated with these technologies, and that’s why I have been careful about whom I associate myself with. The Singularity Institute has a goal of developing Friendly Artificial General Intelligence. Likewise, the main reason I joined the Lifeboat Foundation is that it provides a way to thoughtfully manage the existential risks associated with the Singularity. As with anything that has the power to be misused, I believe that undesirable entities will try to exploit these technologies no matter what we do, so it’s best that we stay ahead of the game by developing them first and by developing ways to protect ourselves using them.
    Only a small minority of the general public has knowledge of these ideas at this point because of relatively low media coverage. Over the next few years, as the time horizon for deploying them shortens, the general public will begin calling for more government regulation, and it’s organizations like the SIAI and Lifeboat Foundation that they will turn to for guidance.

  3. How exactly does the Lifeboat Foundation “manage existential risks”? From what I have read so far, it appears the outcomes of this research are unknown. And if you don’t know what the potential outcomes are, how can you mitigate the risk?

Comments are closed.