ChatGPT Bug Reportedly Exposed Users' Chat Histories
The AI chatbot ChatGPT has transformed how people view artificial intelligence technology and how regulators think about monitoring it to protect against risks.
Created by US startup OpenAI, ChatGPT appeared in November and was quickly seized upon by users amazed at its ability to answer difficult questions clearly, write sonnets or code, and provide information on loaded issues.
ChatGPT has even passed medical and legal exams set for human students, scoring high marks.
But the technology also comes with many risks as it and similar competitor models are integrated into commercial applications.
According to a Bloomberg story, a recent bug appears to have given other users of OpenAI's ChatGPT access to the titles of some users' previous chats with the AI chatbot.
OpenAI temporarily shut down its ChatGPT service on Monday morning after receiving reports of the bug.
An OpenAI spokesperson told the news outlet that the titles were visible in the user-history sidebar that typically appears on the left side of the ChatGPT webpage. The chatbot was temporarily disabled after the company heard these reports, the spokesperson said. The substance of the other users' conversations was not visible.
According to reports, OpenAI attributed the error to a bug in unnamed open-source software.
The news report mentioned that only the titles of past chats with the chatbot were visible, not the full text of the conversations.
Last week, OpenAI released GPT-4, the next-generation version of the technology that powers ChatGPT and Microsoft's new Bing search engine. In the first day after it was unveiled, GPT-4 stunned many users in early tests and a company demo with its ability to draft lawsuits, pass standardized exams, and build a working website from a hand-drawn sketch.
It remains to be seen how the global community will respond to the growing power of artificial intelligence and what measures will be taken to counter the technology's negative impacts and technical failures.
(With inputs from agencies)