
In an open letter signed by Elon Musk, the co-founder of roughly six companies, including electric carmaker Tesla and rocket producer SpaceX, parent company of Starlink, which now operates in Africa, the advancement of AI is cited as a risk to society.

The concern centers on developing systems more powerful than OpenAI’s newly launched GPT-4. ChatGPT (Generative Pre-trained Transformer) was only the tip of the iceberg. There’s more to come, and it scares the billionaire and the more than 1,000 experts who signed the open letter.

“Powerful AI systems should be developed only once we are confident that their effects will be positive and their risks will be manageable,” states the letter, issued by the Future of Life Institute.

Musk, along with artificial intelligence experts and industry executives, is calling for a six-month pause in developing more advanced AI systems.

However, some prominent executives in the AI space didn’t sign the letter, among them OpenAI CEO Sam Altman, Alphabet CEO Sundar Pichai, and Microsoft CEO Satya Nadella.

Although the future looks frightening if AI continues to grow this rapidly, we can’t dismiss the good it does, such as making human work easier and more efficient. AI today engages in human-like dialogue, composes music, and drafts and summarizes documents.

While an AI pause sounds like a good idea, some, like James Grimmelmann, a professor of digital and information law at Cornell University, think Elon Musk is nothing but a hypocrite.

“It is … deeply hypocritical for Elon Musk to sign on given how hard Tesla has fought against accountability for the defective AI in its self-driving cars,” Grimmelmann said. “A pause is a good idea, but the letter is vague and doesn’t take the regulatory problems seriously.”

Notably, Musk is also one of the co-founders of OpenAI. Perhaps he sees the whole AI board and prays that the rest of the world sees it too.

By Elijah Christopher
