1 link tagged with all of: ai-research + safety + openai + bio-bounty + jailbreak
Links
OpenAI is inviting applications for its bio bug bounty program, which tests universal jailbreaks against ChatGPT agents. Participants can earn up to $25,000 for identifying a jailbreak prompt that elicits answers to bio/chem safety questions; applications open on July 17, 2025.