Mankind is doomed if it does not leave the Earth - VIDEO
Thursday, 26 February 2015 09:00

If humanity is to survive in the long term, it needs to find a way to leave planet Earth. Familiar words? This time they came from the famous astrophysicist Stephen Hawking. People may have less than 200 years to figure out how to escape their home planet, Hawking said in a recent interview with Big Think. Otherwise, our species could face extinction.

"It will be difficult enough to avoid disaster in the next hundred years, let alone the next thousand or million years," said Hawking. "Our only chance of long-term survival is not to remain on planet Earth, but to spread out into space."

People who remain on Earth face two types of disaster risk, says Hawking. The first type we could create ourselves, for example by causing radical climate change or by building nuclear or biological weapons. A number of cosmic phenomena could also wipe us off the face of the Earth. An asteroid colliding with the Earth would kill much of the population and leave the rest of the planet uninhabitable. A gamma-ray burst from a nearby supernova in the Milky Way could likewise be devastating to life on Earth.

Life on Earth may also be threatened by extraterrestrial civilizations, as Hawking likes to point out. Hostile aliens could seize the planet and its resources for their own use. For the survival of our species, it would be safer to have a backup plan in the form of other worlds.

"The human race should not keep all its eggs in one basket, or on one planet. Let's hope we do not drop the basket before we have spread the load."

Recently, Hawking was asked which human flaw he would change and which virtue he would improve, if it were possible. He said: "Most of all I would like to correct human aggression. It gave an advantage for survival in caveman days, to get more food, territory or a partner with whom to continue the race, but today it threatens to destroy us all."

In November, Elon Musk, CEO of SpaceX and Tesla, warned that something dangerous could happen as a result of machines with artificial intelligence within as little as five years. Previously, he had argued that developing autonomous thinking machines is akin to "summoning a demon." Speaking at the AeroAstro Centennial Symposium in October, Musk described artificial intelligence as our "biggest existential threat." He said: "I think we should be very careful with artificial intelligence. If I had to guess at what our biggest existential threat is, it's probably that. We need to be very careful with artificial intelligence. I am increasingly inclined to think that there should be some regulatory oversight, perhaps at the national and international level, just to make sure that we do not do something very foolish."

"With artificial intelligence, we are summoning the demon. You know those stories where a guy with a pentagram and holy water thinks he can control the demon? It doesn't work out."

Returning to Hawking: "The quality I would most like to improve is empathy, compassion. It keeps us in a state of peace and love." The professor also added that the space race is "life insurance" for mankind and must continue.

"Sending people to the Moon changed the future of the human race in ways we do not yet suspect," he said. "It did not solve our immediate problems on planet Earth, but it allowed us to look at them from a different angle, from the outside as well as from within."

"I believe that the long-term future of the human race must be in space, and that it represents important life insurance for our future survival, because colonizing other planets could prevent the extinction of mankind."

In December 2014, Hawking issued another warning: that artificial intelligence could mean the end of the human race. Speaking at an event in London, the physicist told the BBC that "the development of full artificial intelligence could spell the end of the human race."

In January, a group of scientists and entrepreneurs, including Elon Musk and Stephen Hawking, signed an open letter pledging to promote research into AI safety. The letter warns that without safeguards of any kind, the development of intelligent machines could mean a dark future for mankind.


