Hawking's AI warning: A prescient voice on humanity's future
On Wednesday (8th January), Stephen Hawking would have turned 83. Unfortunately, the world-renowned physicist passed away in 2018. Before his death, he shared his vision of the future and drew attention to the threat posed by the development of artificial intelligence (AI).
Stephen Hawking is one of the most famous physicists in history. Together with Roger Penrose, he developed the singularity theorems, and he hypothesized that black holes can emit radiation. Despite his illness, amyotrophic lateral sclerosis, which confined him to a wheelchair and eventually left him able to communicate only through a speech synthesizer, the scientist remained remarkably active and left a significant scientific legacy.
Hawking also shared his observations on artificial intelligence, and his vision of its development was very grim. Until the end of his life, the scientist warned humanity of the danger. In a 2014 BBC interview, he said that "the development of full artificial intelligence could spell the end of the human race."
Hawking on AI: a warning he repeated until his death
AI could develop independently, redesigning itself at an ever-increasing pace, while humans, limited by slow biological evolution, could not compete and would be displaced, Hawking warned at a time when AI was still in its infancy.
His words were recently recalled by the website ladbible.com. Along with a hundred other experts, the physicist signed a letter to the UN warning against the development of AI, and he held to this view until his final days.
"I fear that AI may completely replace humans," he told Wired magazine in the last year of his life.
In the book Brief Answers to the Big Questions, published several months after his death, he wrote: "We may face an intelligence explosion that could ultimately lead to machines surpassing us in intelligence by more than we surpass snails. It's tempting to dismiss the notion of highly intelligent machines as pure science fiction, but that would be a mistake, and potentially the biggest mistake in our history."