Elon Musk says AI is 'dangerous technology' and needs regulating to ensure it's 'operating within the public interest'

Elon Musk in San Francisco in January. Justin Sullivan/Getty Images
  • Elon Musk expressed his concerns about artificial intelligence at Tesla's investor day on Wednesday.

  • The Tesla CEO said he "may have done some things" that accelerated the "dangerous technology."

  • His comments were prompted by an investor asking whether AI could help Tesla make cars.

Elon Musk doesn't think artificial intelligence will help Tesla make cars "anytime soon."

He made the comment at Tesla's investor day on Wednesday in response to a question from a shareholder.

But he reiterated his concerns about the technology. "I'm a little worried about AI stuff. I think it's something we should be concerned about," Musk said.

"We need some kind of regulatory authority or something overseeing AI development and making sure it's operating within the public interest."

Musk's thoughts on AI chimed with the views of OpenAI's chief technology officer, Mira Murati, who's said that AI tools should be regulated as they could be used by "bad actors."

Musk described AI as "quite a dangerous technology" in his response to the investor, adding he feared he "may have done things to accelerate it."

Last month, Musk said that unchecked AI could pose a threat to society in an address at the World Government Summit in Dubai. In 2018 he said the two things that most stressed him out were production difficulties with the Tesla Model 3 and the dangers of AI.

Musk cofounded OpenAI, the company behind ChatGPT, the chatbot that has generated much attention since its release last November. Some people have used the AI tool for side hustles, while others have used it to write cover letters.

Insider's Adam Rogers wrote about how ChatGPT and similar AI tools, such as Microsoft's Bing, are "bullshit engines" and why they shouldn't be trusted.

Read the original article on Business Insider