China Observer
Technology

Musk and others warn that fast “big AI projects” could be hazardous to humanity.

March 30, 2023 · 4 Min Read
Updated 30/03/23 at 10:16 AM

Hundreds of prominent artificial intelligence (AI) researchers and technology figures, including Tesla chief Elon Musk, have signed an open letter urging AI labs to pause work on giant AI systems, ringing alarm bells over the “profound risks” these systems pose to society and humanity.

According to the letter, published by the nonprofit Future of Life Institute, AI labs are currently locked in an “out-of-control race” to develop and deploy machine learning systems “that no one — not even their creators — can understand, predict, or reliably control.”

“AI systems with human-competitive intelligence can pose profound risks to society and humanity,” said the open letter.

“Powerful AI systems should be developed only once we are confident that their effects will be positive and their risks will be manageable,” the letter continues.

The signatories argue that development of such powerful systems should proceed only at a pace that gives researchers enough time to confirm they are safe.

Among the signatories of the letter are author Yuval Noah Harari, Apple co-founder Steve Wozniak, Skype co-founder Jaan Tallinn, politician Andrew Yang, and several well-known AI researchers and CEOs, including Stuart Russell, Yoshua Bengio, Gary Marcus, and Emad Mostaque.

The letter was mainly prompted by the release of GPT-4 from the San Francisco firm OpenAI.

The company says its latest model is much more powerful than the previous version, which was used to power ChatGPT, a bot capable of generating tracts of text from the briefest of prompts.

“Therefore, we call on all AI labs to immediately pause for at least 6 months the training of AI systems more powerful than GPT-4,” says the letter. “This pause should be public and verifiable, and include all key actors. If such a pause cannot be enacted quickly, governments should step in and institute a moratorium.”

Musk was an early investor in OpenAI and spent years on its board; his carmaker Tesla develops AI systems to power its self-driving technology, among other applications.

The letter, hosted by the Musk-funded Future of Life Institute, was signed by prominent critics as well as competitors of OpenAI like Stability AI chief Emad Mostaque.

The letter quoted from a blog written by OpenAI founder Sam Altman, who suggested that “at some point, it may be important to get independent review before starting to train future systems”.

“We agree. That point is now,” the authors of the open letter wrote.

They called for governments to step in and impose a moratorium if companies failed to agree.

The six-month pause, they wrote, should be used to develop shared safety protocols and AI governance systems, and to refocus research on making AI systems more accurate, safe, “trustworthy and loyal”.

The letter did not detail the specific dangers posed by GPT-4.

admin March 30, 2023