
Elon Musk And Other Tech Officials Call For 'Pause On Giant AI Experiments'

The release of GPT-4 by San Francisco-based company OpenAI prompted the open letter, which has already received more than 1,300 signatures, including those of Elon Musk and Apple co-founder Steve Wozniak.

Twitter chief and tech billionaire Elon Musk has signed an open letter calling for a pause in the development of powerful artificial intelligence (AI) systems, to allow time to ensure they are safe.

The open letter, titled 'Pause Giant AI Experiments', is being circulated by the Future of Life Institute, a US-based organisation whose stated aim is "to steer transformative technologies away from extreme, large-scale risks and towards benefiting life." The institute has also received funding from Musk.

The release of GPT-4 by San Francisco-based company OpenAI prompted the open letter, which has already received more than 1,300 signatures, including those of Elon Musk, Apple co-founder Steve Wozniak, and 2020 US presidential candidate Andrew Yang.

"AI systems with human-competitive intelligence can pose profound risks to society and humanity. Powerful AI systems should be developed only once we are confident that their effects will be positive and their risks will be manageable," said the letter. 

A founding investor in OpenAI, Musk served on the organisation's board for a number of years, and his automaker Tesla is also developing AI systems to support, among other things, its self-driving technology.

The letter quoted a blog post by OpenAI CEO Sam Altman: "at some point, it may be important to get independent review before starting to train future systems".

"We agree. That point is now. Therefore, we call on all AI labs to immediately pause for at least 6 months the training of AI systems more powerful than GPT-4," the letter noted. 


"This pause should be public and verifiable, and include all key actors. If such a pause cannot be enacted quickly, governments should step in and institute a moratorium," it said further.  

The letter added that AI labs and independent experts should use this pause to jointly develop and implement a set of shared safety protocols for advanced AI design and development, rigorously audited and overseen by independent outside experts. These protocols should ensure that systems adhering to them are safe beyond a reasonable doubt.

However, the letter did not detail the dangers of GPT-4.
