These days, people are turning to ChatGPT for everything from help writing a professional email to couples therapy. While it can be a helpful tool, there are some limits to keep in mind. Experts caution that there are a few specific things we should never share with the artificial intelligence platform.
When you reveal something to a chatbot, “you lose possession of it,” explains Jennifer King, a fellow at the Stanford Institute for Human-Centered Artificial Intelligence. So if you want to protect your privacy, this is the personal or sensitive information you should avoid sharing with ChatGPT or an AI chatbot.
- Identity information - This includes things like your Social Security number, driver’s license number, date of birth, and even your address and phone number. An OpenAI spokeswoman says, “We want our AI models to learn about the world, not private individuals, and we actively minimize the collection of personal information.” Even so, it’s safer not to share identifying information at all.
- Medical results - The healthcare industry values confidentiality and works to protect patients’ personal information, but AI chatbots typically aren’t covered by those confidentiality protections. If you want ChatGPT to interpret lab work or medical test results, King recommends cropping or editing the document so you’re sharing only the test results themselves.
- Financial accounts - Sharing bank account and investment account numbers is never a good idea, as those numbers can be stolen and used to access your money.
- Login information - You may be tempted to share your account usernames and passwords with ChatGPT when it’s helping with tasks, but that’s risky. AI agents don’t keep information secure, so you’re better off withholding login details and using a password manager instead.
- Private corporate information - If you’re using ChatGPT or another chatbot for something at work, like drafting emails or editing documents, experts warn you could accidentally expose client data or trade secrets. To protect your privacy, delete every conversation once it’s over; according to Jason Clinton, Anthropic’s chief information security officer, companies usually purge “deleted” data permanently after 30 days.
Source: NY Post