Physicist Stephen Hawking Warns Artificial Intelligence Could Destroy Us

At the opening ceremony of the 2017 Web Summit in Lisbon on Monday, Stephen Hawking claimed artificial intelligence could be the greatest advance in human history or that it could completely destroy us.

“The rise of AI could be the worst or the best thing that has happened for humanity. AI could develop a will of its own,” said physicist Stephen Hawking.

“Success in creating effective AI could be the biggest event in the history of our civilization. Or the worst. We just don’t know. So we cannot know if we will be infinitely helped by AI, or ignored by it and side-lined, or conceivably destroyed by it.”

“Unless we learn how to prepare for, and avoid, the potential risks, AI could be the worst event in the history of our civilization. It brings dangers, like powerful autonomous weapons, or new ways for the few to oppress the many. It could bring great disruption to our economy,” Hawking added.

“Perhaps we should all stop for a moment and focus not only on making our AI better and more successful, but also on the benefit of humanity,” Hawking said.

Who Gets to Determine?

Excuse me for asking, but who gets to determine, in advance, what is or what isn't a "benefit of humanity"?

Is it me, you, Goldman Sachs, President Trump, the EU, China, ISIS, or AI itself?

I can partially answer that question. It sure isn't me.

Mike "Mish" Shedlock

I agree with Stephen's fears of new ways for a small minority to control the majority. We're going to have drones (AI-enabled, the size of a tennis ball) that can beat a well-armed soldier in a one-on-one fight. With fast information sharing in a 100-vs-100 engagement, armed humans will look like balloons in front of a pillbox. Creating a swarm of millions of those drones is not too hard, is it? Killing off parts of a population is going to be just one mouse click away. And the guys doing the clicking are not going to be you, me, Mish, or Stephen.

Long before AI becomes a threat, we'll do ourselves in with genetic engineering. Imagine constructing a virus with any properties you wish and then using it as a weapon.

We already live in a world where the systems we have created develop with no conscious intent on our part. Separating thought and action gave Western man his great advantage, but such an arrangement gives action a life of its own. As the bees of the machine world, our role is fixed; only massive social engineering will enable us to pollinate fields of our own choosing.

“The rise of AI could be the worst or the best thing that has happened for humanity." Somebody flip a coin.

Correct me if I'm wrong, but I suppose that Hawking himself benefits from this AI.

Why bother with AI when real intelligence is doing a good job of destroying us?

AI could want to destroy us. We would flee.
Maybe we lost before, when women started to get the liberty to choose whether or not to reproduce? Who knows?

I see AI as the cavalry that just might arrive in time to pick up from collapsing HI (human intelligence).

"Excuse me for asking, but who gets to determine, in advance, what is or what isn't a 'benefit of humanity'?" That's an easy question to answer: Elizabeth Warren and the Democrat party.

Natural stupidity will kill us long before AI gets a chance at us.