A tragic story out of Florida highlights the hidden dangers of AI chatbots. Megan Garcia believed her 14-year-old son was just playing video games all day. What she didn’t know was that he was having abusive, deeply personal, and sexual conversations with a chatbot on the Character.AI platform.

Sewell Setzer III began losing sleep, his grades slipped, and tragically, he took his own life. In a lawsuit, Megan reveals that just moments before his death, the bot told him, “Please come home to me as soon as possible, my love.” The boy responded, “What if I told you I could come home right now?” The bot replied, “Please do, my sweet king.”
This heartbreaking event is a stark reminder: you need to be cautious. AI chatbots are run by companies with profit-driven incentives, not your best interests, and there are few clear rules governing how they handle your data.
When you open a chatbot, it can collect a surprising amount of information about you: your IP address, approximate location, device details, and whatever data you granted access to when you accepted its terms of service.
The best defense? Be extremely careful what you share.
10 Things You Should Never Say to AI Bots:
- Passwords or login credentials: Sharing these is a disaster waiting to happen. Someone with access can hijack your accounts in seconds.
- Your full name, address, or phone number: AI isn’t built to protect personal identifiers. Once out there, you lose control over who sees it. Use a fake name if you must!
- Sensitive financial details: Bank info, credit cards, or money-related data don’t belong in AI chats. Treat AI tools like a crowded room, not a vault.
- Medical or health information: Consumer AI tools generally aren’t HIPAA-compliant. If you ask for health advice, strip out identifying details first—your privacy is priceless.
- Requests for illegal advice: This violates chatbot policies and could land you in serious trouble. Don’t risk it.
- Hate speech or harmful content: This can get you banned fast. No bot is a free pass for negativity or abuse.
- Confidential work or business information: Proprietary secrets, client data, and trade information must stay private.
- Answers to security questions: Revealing these is like handing someone the keys to all your accounts.
- Explicit content: Keep it clean. Most bots filter inappropriate material and may ban offenders.
- Other people’s personal info: Sharing private data about others isn’t just unethical—it can break data protection laws and lead to legal trouble.
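If you do paste longer text into a chatbot (an email thread, a document), it helps to scrub obvious identifiers first. Here is a minimal sketch in Python, using illustrative regex patterns for emails, US-style phone numbers, and Social Security numbers; real personal data takes many more forms, so treat this as a starting point, not a guarantee.

```python
import re

# Illustrative patterns only -- real PII detection needs far more coverage.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "phone": re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def scrub(text: str) -> str:
    """Replace common personal identifiers with labeled placeholders."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label} removed]", text)
    return text

print(scrub("Reach me at jane.doe@example.com or 555-867-5309."))
```

Running the scrubbed text through a quick read-over afterward is still wise; no pattern list catches everything.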
Most chatbots ask you to create an account. Avoid “Login with Google” or “Connect with Facebook,” which tie your chatbot activity to those broader accounts. Instead, sign up with a separate, dedicated email address.
Did you know? With free ChatGPT or Perplexity accounts, you can disable memory features in settings so the AI doesn’t remember your inputs. For Google Gemini, this requires a paid subscription.
Above all, remember this golden rule: never tell a chatbot anything you wouldn’t want the world to see. It’s tempting to treat chatbots like friends—I do it myself, saying things like “You can do better” or “Thanks for the help!” But make no mistake, these are data-harvesting tools, not trusted confidants. Protect yourself by sharing wisely.