
Musk Warns of War Against The Machines

Elon Musk and more than 1,000 other tech figures have signed a letter calling on Artificial Intelligence labs to halt the development of powerful new AI systems for at least six months. The signatories worry that AI systems are rapidly outpacing our ability to manage them, and they ask developers to “immediately pause for at least 6 months the training of AI systems more powerful than GPT-4.” Their extensive list of concerns includes AI systems making jobs obsolete and tech-savvy dissidents using them to spread convincing and divisive propaganda. Although the tech world seemed united in its call for caution, one name was notably absent from the letter: Sam Altman, CEO of OpenAI, the company whose work at the forefront of AI development prompted these concerns in the first place.


FOX NEWS: Elon Musk, Apple co-founder, other tech experts call for pause on ‘giant AI experiments’: ‘Dangerous race’

By Chris Pandolfo; March 29, 2023

Elon Musk, Steve Wozniak, and a host of other tech leaders and artificial intelligence experts are urging AI labs to pause development of powerful new AI systems in an open letter citing potential risks to society.

The letter asks AI developers to “immediately pause for at least 6 months the training of AI systems more powerful than GPT-4.” It was issued by the Future of Life Institute and signed by more than 1,000 people, including Musk, who argued that safety protocols need to be developed by independent overseers to guide the future of AI systems. GPT-4 is the latest deep learning model from OpenAI, which “exhibits human-level performance on various professional and academic benchmarks,” according to the lab. 

“Powerful AI systems should be developed only once we are confident that their effects will be positive and their risks will be manageable,” the letter said.

The letter warns that at this stage, no one “can understand, predict, or reliably control” the powerful new tools developed in AI labs. The undersigned tech experts cite the risks of propaganda and lies spread through AI-generated articles that look real, and even the possibility that AI programs can outperform workers and make jobs obsolete.

“AI labs and independent experts should use this pause to jointly develop and implement a set of shared safety protocols for advanced AI design and development that are rigorously audited and overseen by independent outside experts,” the letter states.

“In parallel, AI developers must work with policymakers to dramatically accelerate development of robust AI governance systems.”

The signatories, who include Stability AI CEO Emad Mostaque, researchers at Alphabet-owned DeepMind, and AI heavyweights Yoshua Bengio and Stuart Russell, emphasize that AI development in general should not be paused, writing that their letter calls for “merely a stepping back from the dangerous race to ever-larger unpredictable black-box models with emergent capabilities.”

According to the European Union’s transparency register, the Future of Life Institute is primarily funded by the Musk Foundation, as well as London-based effective altruism group Founders Pledge, and Silicon Valley Community Foundation.

Musk, whose electric car company Tesla uses AI for its autopilot system, has previously raised concerns about the rapid development of AI. 

Since its release last year, Microsoft-backed OpenAI’s ChatGPT has prompted rivals to accelerate developing similar large language models, and companies to integrate generative AI models into their products.

Notably absent from the letter’s signatories was Sam Altman, CEO of OpenAI. 

Photo: Michael Gonzalez/Getty Images
