5 min read | Saved February 14, 2026
Do you care about this?
Research reveals that over 4,500 Clawdbot/Moltbot instances are publicly exposed, allowing attackers to extract sensitive data such as API keys and WhatsApp session credentials. The vulnerabilities stem from insecure design, misconfigured dashboards, and excessive permissions. Users should take immediate action to mitigate these risks.
If you do, here's more
Over 4,500 instances of Clawdbot/Moltbot, an AI assistant designed for personal task management, have been found exposed online, concentrated primarily in the US, Germany, Singapore, and China. Security research indicates that these instances, meant for internal use, are reachable over public IPs, leaving sensitive data vulnerable. Attackers can exfiltrate credentials, including API keys from .env files and session tokens from creds.json files, enabling unauthorized surveillance, particularly via WhatsApp.
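The exposure pattern described above usually comes down to a bind address: a service listening on 0.0.0.0 answers on every network interface, while one bound to 127.0.0.1 is reachable only from the local machine. The sketch below, a minimal port-reachability check, illustrates the distinction; the port number and addresses in the commented usage are assumptions for illustration, not Clawdbot/Moltbot's actual defaults.

```python
import socket

def is_port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical check: a dashboard safely bound to localhost should answer
# on 127.0.0.1 but NOT on the machine's public address. The port 18789 and
# the example public IP are placeholders, not real Clawdbot/Moltbot values.
# for iface in ("127.0.0.1", "203.0.113.7"):
#     print(iface, is_port_open(iface, 18789))
```

A quick check like this against your own host's public address is a cheap way to verify that an "internal" dashboard is not, in fact, internet-facing.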
The vulnerabilities stem from several design flaws in Clawdbot/Moltbot. One significant issue is the "exec tool" feature, which lets users execute shell commands without adequate security controls, meaning malicious actors can manipulate the system through simple chat commands. Researchers demonstrated how easily sensitive data could be accessed by extracting .env and creds.json files, which hold critical authentication material for various messaging platforms. The potential for live monitoring of incoming messages and user interactions poses serious privacy and security risks.
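The class of flaw behind the exec tool can be sketched as follows. This is an illustrative anti-pattern and a possible mitigation, not Clawdbot/Moltbot's actual code: the function names and the allowlist are assumptions made for the example.

```python
import shlex
import subprocess

def handle_chat_insecure(message: str) -> str:
    """Anti-pattern: chat text is passed straight to a shell.
    A message like 'cat .env' or 'cat creds.json' dumps secrets,
    and shell metacharacters (;, |, &&) allow arbitrary chaining."""
    return subprocess.run(message, shell=True,
                          capture_output=True, text=True).stdout

# Hypothetical allowlist of exact, argument-free commands.
ALLOWED = {"date", "uptime"}

def handle_chat_safer(message: str) -> str:
    """Sketch of a mitigation: run only pre-approved commands, without a
    shell, so arguments, paths, and pipes cannot be injected via chat."""
    argv = shlex.split(message)
    if len(argv) != 1 or argv[0] not in ALLOWED:
        raise PermissionError(f"command not allowed: {message!r}")
    return subprocess.run(argv, capture_output=True, text=True).stdout
```

The safer variant trades flexibility for containment: anything not explicitly approved, including reads of .env or creds.json, is refused before a process is ever spawned.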
The findings highlight a broader issue where the rapid adoption of AI tools outpaces the security measures necessary to protect sensitive information. While Clawdbot/Moltbot offers impressive functionalities across multiple messaging platforms, the lack of proper security has left users exposed to significant risks. The research emphasizes the importance of addressing these gaps to prevent potential system takeovers and unauthorized data access.