Within two months of its launch, ChatGPT surpassed 100 million users, making it the fastest-growing consumer application to date.
However, with millions of users actively logging into the conversational AI service and thousands more joining daily, the potential impact of a data breach grows significantly.
On March 20, OpenAI deliberately took the service offline to fix a bug that caused the first data breach ever reported for ChatGPT.
That Monday, the platform underwent an urgent maintenance session after a flaw allowed some users to see the titles of other users’ chat histories.
During the roughly 10 hours the AI bot was offline for further investigation, a more severe security concern came to light.
The chat history bug might have inadvertently exposed personal information belonging to approximately 1.2 percent of ChatGPT Plus subscribers who were active during a 9-hour window.
In an official statement, OpenAI said that during this 9-hour window, “it was possible for some users to see another active user’s first and last name, email address, payment address, the last four digits (only) of a credit card number, and credit card expiration date.”
OpenAI attributed Monday’s ChatGPT outage and data breach, in which users saw other users’ details and chat queries, to a bug in an open-source Redis client library.
In essence, OpenAI employs Redis for caching user data.
On March 20, at 0100 PT, a change to the service’s systems triggered a surge of activity that, combined with the client-library flaw, caused the cache to return incorrect information.
The data leakage continued until 1000 PT on the same day.
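To see how a client-library flaw can make a cache hand one user another user’s data, consider a simplified sketch below. This is not OpenAI’s code: it is a toy model, assuming (as the reported bug suggests) that requests and replies travel in FIFO order over a shared connection, so a request that is cancelled after being sent but before its reply is read leaves that reply on the wire for the next caller.

```python
from collections import deque

class Connection:
    """Toy model of a pooled cache connection: requests and replies
    travel in strict FIFO order over the same connection."""
    def __init__(self):
        self._pending = deque()  # replies the "server" has queued up

    def send(self, key):
        # The server immediately queues the reply for this key.
        self._pending.append(f"data-for-{key}")

    def read(self):
        # Read the oldest unconsumed reply on the connection.
        return self._pending.popleft()

def get(conn, key, cancelled=False):
    """Fetch `key` through the shared connection. If the request is
    cancelled after sending but before reading, its reply is left
    sitting on the connection -- the bug condition."""
    conn.send(key)
    if cancelled:
        return None  # caller gave up; the reply is never consumed
    return conn.read()

conn = Connection()
get(conn, "alice-session", cancelled=True)  # cancelled mid-flight
leaked = get(conn, "bob-session")           # reads Alice's reply!
print(leaked)  # -> data-for-alice-session
```

Once the connection falls out of sync like this, every subsequent reply is shifted by one request, which matches the reported symptom of users seeing data belonging to someone else.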
Addressing users on Twitter, OpenAI CEO Sam Altman formally apologized for the data breach, saying the team “feel awful about this.”
The company has since addressed the issue and shared technical details in a report.
However, while fixing the bug, the company discovered that the same flaw might also have been responsible for a breach involving more personal data.