
David's Abstract and Citations

Abstract

The emergence of pure or strong Artificial Intelligence, whose intelligence far exceeds that of humans, is known as the (Technological) Singularity. Its occurrence is as assured as the continued fulfillment of the trend predicted by Moore's law: that the number of transistors that can fit on a chip doubles approximately every two years.
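
As a rough illustration of the doubling trend the abstract relies on, the following Python sketch (not part of the original abstract) projects transistor counts under Moore's law; the 2010 baseline of one billion transistors per chip and the two-year doubling period are assumed round figures used only for illustration.

  # Illustrative sketch of Moore's law: transistor counts doubling roughly every
  # two years. The 2010 baseline of 1e9 transistors is an assumed round figure.

  def transistors(year, base_year=2010, base_count=1_000_000_000, doubling_period=2.0):
      """Projected transistors per chip, assuming a doubling every two years."""
      return base_count * 2 ** ((year - base_year) / doubling_period)

  if __name__ == "__main__":
      for year in (2010, 2020, 2030, 2040):
          print(f"{year}: ~{transistors(year):.2e} transistors per chip")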

This is no coincidence; the biggest obstacle to the development of AI has been the limit of hardware processing speeds. As the years go by, however, and the number of transistors on an integrated circuit keeps increasing, the world will see the birth of intelligent machines. At first, humans will have the greater intelligence. Then, for a brief moment in time, machine intelligence will match that of humans. Soon after, however, will come a great explosion in machine intelligence, reaching heights that are likely inconceivable to our minds. This is inevitable.

In my paper, I will argue for several predictions about the future of AI (that is, the various possibilities and probabilities of the Singularity occurring) and how those futures might come about from the world as it is now.

Rough Draft

PowerPoint Presentation

Citations

Dyson, George B. Darwin Among the Machines. Cambridge, MA: Perseus Books, 1998. Print.

Kurzweil, Ray. "The Law of Accelerating Returns." 2001.

Vinge, Vernor. "What If the Singularity Does NOT Happen?" 2007.
