Stephen Hawking's warnings: What he predicted for the future
Stephen Hawking's fame was founded on his research into general relativity and black holes. But he often stepped outside his own field, using his recognition to highlight what he saw as the great challenges and existential threats facing humanity in the coming decades. His pronouncements drove headlines in the media, and they sometimes proved controversial.
Leaving Earth
Hawking was clearly troubled that we were putting all our eggs in one basket - that basket being Earth. For decades, Hawking had been calling for humans to begin the process of permanently settling other planets. It made news headlines again and again.
Hawking's rationale was that humankind would eventually fall victim to an extinction-level catastrophe - perhaps sooner rather than later. What worried him were so-called low-probability, high-impact events - a large asteroid striking our planet is the classic example. But Hawking perceived a host of other potential threats: artificial intelligence, climate change, GM viruses and nuclear war, to name a few.
In 2016, he told the BBC: "Although the chance of a disaster to planet Earth in a given year may be quite low, it adds up over time, and becomes a near certainty in the next thousand or 10,000 years."
Here, Hawking's views dovetailed with those of entrepreneur Elon Musk, another science superstar whose cogitations attract widespread attention. In 2013, Musk told a conference: "Either we spread Earth to other planets, or we risk going extinct. An extinction event is inevitable and we're increasingly doing ourselves in."
In line with his thoughts on the matter, Hawking also attached his name to a project researching technologies for interstellar travel - the Breakthrough Starshot initiative.
Rise of the machines?
Hawking recognised the great opportunities that arose from advances in artificial intelligence, but also warned about the dangers. In 2014, he told the BBC that "the development of full artificial intelligence could spell the end of the human race".
Hawking said the primitive forms of artificial intelligence developed so far had already proved very useful; indeed, the tech he used to communicate incorporated a basic form of AI. But Hawking feared the consequences of advanced forms of machine intelligence that could match or surpass humans.
Some academics thought the comments drew on outdated science fiction tropes. Others, such as Prof Bradley Love, from UCL, agreed there were risks: "Clever AI will create tremendous wealth for society, but will leave many people without jobs," he told The Conversation.
But he added: "If we are going to worry about the future of humanity we should focus on the real challenges, such as climate change and weapons of mass destruction rather than fanciful killer AI robots."
Tipping point
Hawking regarded global warming as one of the biggest threats to life on the planet. The physicist was particularly fearful of a so-called tipping point, where global warming would become irreversible. He also expressed concern about America's decision to pull out of the Paris Agreement.
"We are close to the tipping point where global warming becomes irreversible. Trump's action could push the Earth over the brink, to become like Venus, with a temperature of 250 degrees, and raining sulphuric acid," he told BBC News.
Whatever one makes of that particular comparison, Hawking was in plentiful company in regarding global warming as one of the great challenges of the centuries to come.
Shhhh, keep it down
There's a whole field of science, known as Seti (the Search for Extra-Terrestrial Intelligence), dedicated to listening for signals from intelligent beings elsewhere in the Universe. But Hawking cautioned against trying to actively hail any alien civilisations that might be out there. In 2010, he told the Discovery Channel that aliens might simply raid Earth for resources and then move on.
"If aliens visit us, the outcome would be much as when Columbus landed in America, which didn't turn out well for the Native Americans," he said.
"We only have to look at ourselves to see how intelligent life might develop into something we wouldn't want to meet."
Others saw the logic in Hawking's comments. Ian Stewart, a mathematician at Warwick University, commented: "Lots of people think that because they would be so wise and knowledgeable, they would be peaceful. I don't think you can assume that."
Controversial headlines
The media attention gave him an unprecedented platform. But some in the scientific community were occasionally less enthusiastic about the resulting headlines than the journalists who wrote them. Indeed, I've been asked in the past why the British media seemed to hang on Hawking's every word.
Prof Sir Martin Rees, the Astronomer Royal, said: "He had robust common sense, and was ready to express forceful political opinions.
"However, a downside of his iconic status was that that his comments attracted exaggerated attention even on topics where he had no special expertise - for instance philosophy, or the dangers from aliens or from intelligent machines."
But many would also argue that, beyond individual statements or headlines, Hawking had a unique ability to connect with the public.
They would say that the "hype" this sometimes generated was an inevitable by-product of his household name status. Instead, we should focus on a greater good - his ability to bring science to the attention of people who might otherwise never have given it a second thought.
It's testament to his success as a communicator that the mourning for this champion of rational thinking extends far beyond the scientific community.