Do you care about this?
WormGPT 4 offers lifetime access for $220, enabling users to generate malware and phishing tools without needing advanced skills. While it simplifies certain cybercrime tasks, human intervention is still necessary to bypass security measures. Another model, KawaiiGPT, is even more accessible as it's free on GitHub.
If you do, here's more
WormGPT 4 is a new AI model that costs $220 for lifetime access and is designed specifically for cybercrime. The tool sidesteps the restrictions built into mainstream AI models, letting attackers generate malware and phishing schemes more easily. Sales began in late September 2025, and the model has gained traction on Telegram and underground forums. Researchers from Palo Alto Networks demonstrated its capabilities by prompting WormGPT to create a ransomware script that encrypts PDF files on a Windows system, complete with a ransom note and options for data theft via Tor.
KawaiiGPT, another malicious AI tool, is free and available on GitHub, lowering the barrier to entry for cybercriminals even further. It presents itself as a "cyber pentesting waifu" and can generate convincing phishing emails. One example was a fraudulent message from a bank urging users to verify their account information, directing victims to a site designed to harvest sensitive data. The researchers also tested KawaiiGPT on more technical tasks, such as generating a Python script for lateral movement on Linux systems, which could facilitate unauthorized access and data theft.
While these models simplify certain aspects of cyberattacks, their output still requires human refinement to evade detection by security measures. For example, the ransomware generated by WormGPT needs manual tweaks to avoid being flagged by traditional protections. Although these AI tools are not fully autonomous, their emergence represents a significant shift in how cybercriminals can execute attacks, combining automation with social engineering tactics.