Wednesday, March 18, 2015

Robots Taking Over The World!: Not Such A Crazy Fear ...

In his 1942 short story "Runaround," Isaac Asimov introduced the Three Laws of Robotics. Their simple message, that robots should never harm humans or disobey orders, became a guiding principle for robotics and, more recently, artificial intelligence.
Now, 73 years later, luminaries of technology and science are questioning whether the emerging AI industry can possibly uphold the so-called Three Laws for the long term.
[Photo caption: Stephen Hawking (left) has said that full AI "could spell the end of the human race." Elon Musk (center) and Bill Gates share similar fears.]
Esteemed British scientist Stephen Hawking issued perhaps the starkest assessment, telling the BBC late last year, "The development of full artificial intelligence could spell the end of the human race."
Hawking's message echoed comments made earlier in the year by Tesla (NASDAQ:TSLA) CEO Elon Musk, who told an audience of students at the Massachusetts Institute of Technology in October that AI might be humankind's "biggest existential threat."
Then, in January, none other than Bill Gates weighed in, saying during an "ask me anything" session on Reddit that he doesn't understand why there isn't more concern about AI's path.
So what gives? Is AI pushing humanity toward a grim ending?
The answer depends on whom you ask. And if you ask well-known Silicon Valley analyst Rob Enderle, the answer is an almost certain "yes."
Enderle sits on the Futurists Board and the Social Factors Board of the nonprofit Lifeboat Foundation, whose mission is to safeguard humankind from existential threats, AI included. Enderle says the group operates under the assumption that the "singularity," the point at which AI matches human intelligence, will arrive around 2040. He says that society should take the concerns of Hawking, Musk and Gates very seriously.
Read the rest of the story HERE.

If you like what you see, please "Like" us on Facebook either here or here. Please follow us on Twitter here.

