In March this year, several well-known tech figures and AI researchers, including Elon Musk and Apple co-founder Steve Wozniak, penned an open letter calling on AI labs worldwide to pause the development of large-scale AI systems, citing “profound risks to society and humanity.”
Now, a new group of prominent AI experts and industry leaders has issued a stark warning about the potential dangers of artificial intelligence, calling for urgent action to mitigate the risk of extinction from AI. The one-sentence statement, posted on the Center for AI Safety website, reads: “Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war.” Google AI veteran Geoffrey Hinton, also known as the ‘Godfather of AI,’ is at the top of the list of signatories.
OpenAI CEO Sam Altman and Google DeepMind co-founder Demis Hassabis are also among the signatories.
The statement comes amid a surge of innovation and competition in the field of large language models (LLMs), such as OpenAI’s ChatGPT, Microsoft’s Bing AI and Google’s Bard, which can generate realistic and varied text on a wide range of topics. While these models offer several potential benefits, they also raise serious challenges around ethics, security, authenticity and creativity.
The statement aims to raise awareness and spark a conversation about the potential long-term and existential risks of AI, which may not be fully understood by its creators or by general users. It echoes the sentiments of US President Joe Biden, who recently said that “it remains to be seen” whether AI is dangerous, while also stressing the need for tech companies to ensure the safety of their products before releasing them to the public.
Image credit: Shutterstock
Source: Center for AI Safety Via: Engadget