OpenAI’s 2023 Breach Led to Hackers Stealing Company’s AI Secrets: Report

A hacker gained access to the internal messaging systems at OpenAI last year and stole details about the design of the company’s artificial intelligence technologies, the New York Times reported on Thursday.

The hacker lifted details from discussions in an online forum where employees talked about OpenAI’s latest technologies, the report said, citing two people familiar with the incident.

However, they did not get into the systems where OpenAI, the firm behind chatbot sensation ChatGPT, houses and builds its AI, the report added.

Microsoft Corp-backed OpenAI did not immediately respond to a Reuters request for comment.

OpenAI executives informed employees of the breach at an all-hands meeting in April last year and also told the company’s board, according to the report, but decided not to share the news publicly as no information about customers or partners had been stolen.

OpenAI executives did not consider the incident a national security threat, believing the hacker was a private individual with no known ties to a foreign government, the report said. The San Francisco-based company did not inform federal law enforcement agencies about the breach, it added.

OpenAI in May said it had disrupted five covert influence operations that sought to use its AI models for “deceptive activity” across the internet, the latest to stir safety concerns about the potential misuse of the technology.

The Biden administration was poised to open a new front in its effort to safeguard U.S. AI technology from China and Russia, with preliminary plans to place guardrails around the most advanced AI models, including ChatGPT, Reuters earlier reported, citing sources.

In May, 16 companies developing AI pledged at a global meeting to develop the technology safely at a time when regulators are scrambling to keep up with rapid innovation and emerging risks.

© Thomson Reuters 2024
