PII Guardrails Q&A
FAQ for the Iron Book GenAI Guard Browser Extension
🚨 What's The Danger? Understanding PII and Secret Risks in GenAI
When you send messages to ChatGPT, Gemini, Perplexity, or other AI services, your data doesn't just disappear into a secure vault. It gets processed, stored, and potentially exposed in ways that could seriously compromise your privacy and security.
This extension is designed to help you stay safe while using AI services. Always think twice before sharing sensitive information, and when in doubt, don't send it.
Need audit logs, dashboards & centralized policy control? Get Iron Book Enterprise!
What We Detect and Why It Matters
Remember: Once It's Out, It's Out Forever.
AI systems are designed to learn and remember. Once your sensitive data is included in their training data or conversation logs, it becomes part of their permanent knowledge base.
There's no way to "un-share" information once it's been sent to an AI service.
Below is a high-level list of what Iron Book GenAI Guard identifies and notifies you about.
Personally Identifiable Information (PII)
PII (Personally Identifiable Information) is any data that can be used to identify a specific individual, either directly or indirectly.
Email Addresses
- Risk: Email addresses are unique identifiers that can be used for targeted attacks, spam, and identity theft
- AI Storage: Often stored in training data and chat logs for months or years
- Consequences: Phishing attacks, account takeovers, and personal information correlation
Social Security Numbers (SSN)
- Risk: The most sensitive personal identifier in the US
- AI Storage: Permanently stored in AI training data and conversation logs
- Consequences: Identity theft, financial fraud, and permanent exposure of your most sensitive data
Phone Numbers
- Risk: Can be used for SIM swapping, targeted attacks, and personal identification
- AI Storage: Stored alongside other personal data, creating comprehensive profiles
- Consequences: Account takeovers, harassment, and personal safety risks
Credit Card Numbers
- Risk: Direct access to your financial accounts
- AI Storage: Stored in training data and conversation logs
- Consequences: Unauthorized charges, financial fraud, and credit damage
Bank Account Numbers
- Risk: Direct access to your financial assets
- AI Storage: Permanently stored in AI systems
- Consequences: Unauthorized withdrawals, account takeovers, and financial loss
Addresses
- Risk: Physical location exposure and personal safety concerns
- AI Storage: Stored with other personal data
- Consequences: Stalking, harassment, and physical security risks
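At a high level, detectors for these PII types are pattern-based: known formats like SSNs, phone numbers, and card numbers can be matched with regular expressions. The sketch below is purely illustrative (the patterns and names are assumptions, not Iron Book GenAI Guard's actual rules), and production detectors layer on validation such as Luhn checks for card numbers to cut false positives:

```python
import re

# Illustrative patterns only -- a real detector uses stricter rules
# plus validation (e.g., Luhn checksums for card numbers).
PII_PATTERNS = {
    "email":       re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "ssn":         re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "phone":       re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def find_pii(text: str) -> list[tuple[str, str]]:
    """Return (pii_type, matched_text) pairs found in a prompt."""
    hits = []
    for pii_type, pattern in PII_PATTERNS.items():
        for match in pattern.finditer(text):
            hits.append((pii_type, match.group()))
    return hits
```

A guard running checks like these client-side can warn you before the prompt ever leaves the browser.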
Secrets and Credentials
API Keys and Access Tokens
- Risk: Direct access to your accounts and services
- AI Storage: Stored in training data and conversation logs
- Consequences: Unauthorized access to your accounts, data breaches, and service abuse
AWS Access Keys
- Risk: Full access to your cloud infrastructure and data
- AI Storage: Permanently stored in AI systems
- Consequences: Data breaches, unauthorized resource usage, and massive financial costs
GitHub Personal Access Tokens
- Risk: Access to your code repositories and private projects
- AI Storage: Stored in training data
- Consequences: Code theft, intellectual property exposure, and security vulnerabilities
Database Connection Strings
- Risk: Direct access to your databases and sensitive data
- AI Storage: Stored in AI training data
- Consequences: Data breaches, customer information exposure, and regulatory violations
Passwords
- Risk: Direct access to your accounts
- AI Storage: Stored in training data and conversation logs
- Consequences: Account takeovers, data breaches, and identity theft
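Many credentials carry recognizable signatures, which is what makes them detectable at all: AWS access key IDs begin with `AKIA`, classic GitHub personal access tokens begin with `ghp_`, and database connection strings embed `user:password@host`. A minimal sketch of signature-based secret scanning (illustrative patterns, not the extension's actual ruleset; real scanners combine many more vendor formats plus entropy heuristics):

```python
import re

# Illustrative secret signatures -- real scanners track many more
# vendor-specific formats and add entropy checks for generic keys.
SECRET_PATTERNS = {
    # AWS access key IDs: "AKIA" followed by 16 uppercase chars/digits
    "aws_access_key": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    # Classic GitHub personal access tokens: "ghp_" + 36 chars
    "github_pat": re.compile(r"\bghp_[A-Za-z0-9]{36}\b"),
    # Credential-bearing connection strings: scheme://user:password@host
    "connection_string": re.compile(r"\b\w+://[^\s:]+:[^\s@]+@\S+"),
}

def find_secrets(text: str) -> list[str]:
    """Return the names of secret types present in the text."""
    return [name for name, pattern in SECRET_PATTERNS.items()
            if pattern.search(text)]
```

Because these formats are so distinctive, a single pasted log line or config snippet is often enough to leak a working credential.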
⚠️ Why AI Services Are Particularly Dangerous
Permanent Storage
- Your data is stored indefinitely in AI training datasets
- Even if you delete your account, your data remains in the AI model
- No way to "un-train" an AI model once your data is included
Data Correlation
- AI services can correlate your data across multiple conversations
- They build comprehensive profiles of your personal information
- One small piece of PII can unlock your entire digital identity
Third-Party Access
- AI companies often share data with partners and contractors
- Your data may be accessed by employees, researchers, or other parties
- Data breaches at AI companies expose your most sensitive information
Legal and Regulatory Issues
- Sharing PII with AI services may violate privacy laws (GDPR, CCPA, etc.)
- You may be liable for data breaches caused by sharing sensitive information
- Regulatory fines can be substantial for improper data handling
🎯 Best Practices for Safe AI Usage
Remember, identity theft can cost thousands of dollars to resolve, not to mention cause irreparable financial and reputation harm to both individuals and businesses.
For businesses in particular, data breach fines can reach millions of dollars, and a compromised security posture can cost you future business opportunities as well.
Before Sending Any Message:
- Review your content for personal information
- Use generic examples instead of real data
- Replace sensitive data with placeholders
- Consider the consequences of permanent data storage
Safe Alternatives:
- Use fake email addresses (e.g., user@example.com)
- Replace SSNs with: XXX-XX-XXXX
- Use test credit cards: 4111-1111-1111-1111
- Replace real names with: John Doe or Jane Smith
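The substitutions above can be applied automatically before a prompt leaves your machine. A minimal, illustrative sketch of such a redaction pass (hypothetical patterns and placeholders, not the extension's implementation):

```python
import re

# Illustrative redaction pass: swap detected values for the safe
# placeholders suggested above before a prompt is sent.
REDACTIONS = [
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"), "user@example.com"),
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "XXX-XX-XXXX"),
    (re.compile(r"\b(?:\d[ -]?){15}\d\b"), "4111-1111-1111-1111"),
]

def redact(prompt: str) -> str:
    """Replace sensitive values with placeholders, leaving the rest intact."""
    for pattern, placeholder in REDACTIONS:
        prompt = pattern.sub(placeholder, prompt)
    return prompt
```

Substituting placeholders keeps the prompt's structure intact, so the AI can still answer your question without ever seeing the real values.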
Iron Book GenAI Guard helps you make informed decisions about what you share, protecting your privacy and security in an AI-powered world.