Stephen Hawking Says Earth Will ‘Be A Sizzling Ball Of Fire By 2600’ If AI Doesn’t Kill Us All First

Speaking via video at the 2017 Tencent WE Summit in Beijing, Stephen Hawking says that if artificial intelligence doesn’t kill us all first, the Earth will be a ‘sizzling ball of fire by 2600.’

Man, what a Debbie Downer.

So what does the genius suggest we all do about this impending apocalypse? Be even more environmentally friendly? Hells, naw, he says we need to just get the flock off this dying planet.

Metro reports:

Prof Hawking says our only chance is to escape our planet, and that his Breakthrough Starshot project could be the first step on the way.

Professor Hawking said that he hoped that tiny spaceships propelled by beams of light could reach Alpha Centauri within his audience’s lifetime.

Hawking says, “Such a system could reach Mars in less than an hour, or reach Pluto in days, pass Voyager in under a week and reach Alpha Centauri in just over 20 years.”
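For the curious, those numbers roughly check out. Here’s a quick back-of-the-envelope sketch, assuming the roughly 20 percent of light speed that Breakthrough Starshot has said it is aiming for, plus some approximate distances; the specific figures below are illustrative assumptions, not numbers from Hawking’s talk.

```python
# Rough travel times for a light-sail probe cruising at ~0.2c.
# Speed and distances are approximate assumptions for illustration only.

C_KM_S = 299_792.458           # speed of light in km/s
SPEED_KM_S = 0.2 * C_KM_S      # ~20% of light speed (Breakthrough Starshot's stated goal)

# Approximate one-way distances in kilometres (planetary values vary with orbital positions).
DISTANCES_KM = {
    "Mars (when relatively close)": 7.8e7,
    "Pluto (average)": 5.9e9,
    "Voyager 1 (circa 2017)": 2.1e10,
    "Alpha Centauri": 4.37 * 9.461e12,   # 4.37 light-years
}

for target, km in DISTANCES_KM.items():
    seconds = km / SPEED_KM_S
    if seconds < 86_400:
        print(f"{target}: about {seconds / 3600:.1f} hours")
    elif seconds < 86_400 * 365:
        print(f"{target}: about {seconds / 86_400:.1f} days")
    else:
        print(f"{target}: about {seconds / (86_400 * 365.25):.1f} years")
```

Run it and you get roughly 20 minutes to Mars, a bit over a day to Pluto, about four days to pass Voyager 1, and just under 22 years to Alpha Centauri, which lines up with Hawking’s figures.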

Back in June, at the Starmus Festival in Norway, Hawking similarly stated…

“We have given our planet the disastrous gift of climate change. Rising temperatures, reduction of the polar ice caps, deforestation and decimation of animal species. We can be an ignorant, unthinking lot.

“We are running out of space and the only places to go to are other worlds. It is time to explore other solar systems. Spreading out may be the only thing that saves us from ourselves. I am convinced that humans need to leave Earth.”

Alrighty then!

But wait, Hawking didn’t just stop there with his doomsday prophecies. No, sir. CNET reports that he also dropped more bad news for us humans in another speech at the Tencent WE Summit when discussing artificial intelligence…

“Success in creating effective AI could be the biggest event in the history of our civilization. Or the worst. We just don’t know. So we cannot know if we will be infinitely helped by AI, or ignored by it and sidelined, or conceivably destroyed by it.”

He continued…

“Unless we learn how to prepare for, and avoid, the potential risks, AI could be the worst event in the history of our civilization. It brings dangers like powerful autonomous weapons, or new ways for the few to oppress the many. It could bring great disruption to our economy.”

This pretty much echoes what Hawking said about AI in a recent interview with Wired magazine:

“I fear that AI may replace humans altogether. If people design computer viruses, someone will design AI that improves and replicates itself. This will be a new form of life that outperforms humans.”

Sheesh, Stephen, do you need a hug or something?

Hawking wasn’t all gloom and doom though. Despite his warnings, he says he is still an optimist.

“Perhaps with the tools of this new technological revolution, we will be able to undo some of the damage done to the natural world by the last one, industrialization. We will aim to finally eradicate disease and poverty. Every aspect of our lives will be transformed.”

Frankly, I am not all that concerned. I am more worried about the world ending on November 19th when Planet X, Nibiru, comes crashing into our planet.

Perhaps that super smart Russian guy who says he’s from Mars can help us out. I mean, certainly he knows more than someone like Stephen Hawking, right? He’s from Mars, for God’s sake!