
OpenAI will reward you with up to $20,000 for finding ChatGPT bugs

Rewards range from $200 to $20,000

OpenAI is committed to making the ChatGPT experience better for all users. The company has announced a new bug bounty program that invites the public to find bugs in ChatGPT, including vulnerabilities and security flaws.

“We are inviting the global community of security researchers, ethical hackers, and technology enthusiasts to help us identify and address vulnerabilities in our systems,” wrote OpenAI. “We are excited to build on our coordinated disclosure commitments by offering incentives for qualifying vulnerability information. Your expertise and vigilance will have a direct impact on keeping our systems and users secure.”

OpenAI is partnering with Bugcrowd, a crowdsourced cybersecurity platform, to manage the submission of bugs and the eventual reward process. The bounty program is open to all, and rewards range from $200 to $20,000 USD (about $269 to $26,876 CAD) for low-severity and exceptional discoveries, respectively.

ChatGPT has experienced several bugs in the past. In a recent incident, the entire system was taken offline after users reported seeing the titles of other users’ conversations in their chat history. Further, Twitter user @rez0__ discovered over 80 secret plugins while hacking ChatGPT.

It’s worth noting that not all issues reported to the company will be eligible for a reward. OpenAI has stated that issues such as jailbreaking or getting the model to say or pretend to do harmful things will not qualify. It remains to be seen how successful OpenAI’s Bug Bounty Program will be in mitigating security risks for ChatGPT. However, the initiative highlights the importance of cybersecurity and OpenAI’s commitment to making AI safe for all.

Click here to participate in OpenAI’s Bug Bounty Program.

Source: OpenAI

