Exactly the quality of article I would expect from the Daily Mail, and the title, very informative, not editorialized at all, yep.

Btw, only the last four digits of the card number were visible, per the official announcement: https://openai.com/blog/march-20-chatgpt-outage
As ChatGPT/OpenAI's products grow in popularity, so will their value to hackers. I have no doubt people using GPT are discussing sensitive details about their businesses and personal lives with it.

So OpenAI should take cybersecurity seriously. Credit card details are nothing compared to the chat logs; the chat logs will be of high value.

I've also seen the idea floating around, especially with typed languages like TypeScript, that developers write just the signature of a function and have GPT/Copilot implement it (a sketch of that workflow is below). If developers trust the output and don't care… what are the chances someone can trick GPT into producing unsafe code? There are attack vectors via the chat interface, the training data, physical attacks via employees, or phishing an OpenAI employee to gain covert access to the infra/model.

If I were an intelligence agency, gaining covert access to OpenAI's backend would be a primary objective.
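To make that workflow concrete, here is a minimal, hypothetical TypeScript sketch (the Db interface, function names, and query style are my own illustration, not anything Copilot or OpenAI actually produced): the developer writes only the typed signature, an assistant fills in the body, and the result type-checks while still being unsafe.

    // Hypothetical sketch: the developer writes only this interface and the
    // typed signature below, then lets an assistant fill in the body.
    interface Db {
      query(sql: string, params?: unknown[]): Promise<{ id: number; email: string }[]>;
    }

    // Type-checks fine and the reviewer sees the intended signature, but the
    // generated body interpolates user input straight into the query (SQL injection).
    async function findUserByEmail(db: Db, email: string) {
      return db.query(`SELECT id, email FROM users WHERE email = '${email}'`);
    }

    // The parameterized version, which the signature alone cannot enforce:
    async function findUserByEmailSafe(db: Db, email: string) {
      return db.query("SELECT id, email FROM users WHERE email = $1", [email]);
    }

The point is just that the type system validates the shape of the code, not its safety, so "trust the output and don't care" is exactly where a subtly unsafe suggestion slips through.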
> We took ChatGPT offline Monday to fix a bug in an open source library...
Which library was that?

Update: Nvm, it was redis-py.
https://openai.com/blog/march-20-chatgpt-outage
I just signed up for GPT Plus after weeks of deliberation. Even with this news (the issue seems to have been resolved), I am happy to have access to Plus. It's amazing.