Use AI Without
Exposing PII
AI Vault is a privacy proxy that sits between your application and LLM APIs. It detects and tokenizes PII in prompts before they reach the model, then detokenizes responses on the way back. Your users get AI-powered features. Their data never leaves your control.
Three Steps. Zero PII Exposure.
AI Vault intercepts all LLM traffic, tokenizes PII before it leaves your environment, and detokenizes responses on the way back. The LLM never sees real data.
App Sends Prompt
Your application sends a prompt to AI Vault instead of directly to the LLM. Just change the base URL in your SDK configuration. No other code changes needed.
PII Detected & Tokenized
AI Vault scans the content for PII entities like names, SSNs, emails, addresses, phone numbers, and credit cards. Each entity is replaced with a reversible token via Enigma Vault's Data Vault.
Sanitized & Returned
The sanitized prompt goes to Claude, GPT, or your chosen LLM. When the response comes back, AI Vault detokenizes any tokens in the output and returns the original content to your app.
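The detect, tokenize, and detokenize round trip described above can be sketched in a few lines of Python. This is a toy illustration with regex-based detection and an in-memory token map, not AI Vault's actual engine:

```python
import re

# Toy PII patterns; AI Vault's real detector covers many more entity types
# (names, addresses, phone numbers, credit cards, ...).
PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def tokenize(text, vault):
    """Replace each detected PII value with a reversible token, recording the mapping."""
    for entity, pattern in PATTERNS.items():
        for match in set(pattern.findall(text)):
            token = f"[{entity}_{len(vault):04x}]"
            vault[token] = match
            text = text.replace(match, token)
    return text

def detokenize(text, vault):
    """Restore original values in the LLM's response on the way back."""
    for token, original in vault.items():
        text = text.replace(token, original)
    return text

vault = {}
prompt = "Contact jane@example.com about SSN 123-45-6789"
safe = tokenize(prompt, vault)       # this is all the LLM ever sees
restored = detokenize(safe, vault)   # original content returns to your app
assert "123-45-6789" not in safe
assert restored == prompt
```

In the real product the token mapping lives in Enigma Vault's Data Vault rather than in process memory, so tokens stay reversible across requests.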
See PII Tokenization in Action
Select a sample prompt or type your own. Watch as AI Vault detects PII, tokenizes it, and produces a clean response.
Text, Documents, and Batch
Three processing modes built on the same tokenization engine. From real-time chat to bulk document pipelines.
Text & Chat Proxy
Real-time proxy for chat and text completion APIs, compatible with the OpenAI and Anthropic SDKs. Both streaming and non-streaming modes are supported.
- OpenAI-compatible API format
- Streaming support (SSE)
- Session-level consistent token mapping
- Configurable PII detection rules
- 50–100 ms overhead per request
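Session-level consistent token mapping means the same entity always maps to the same token within a session, so the LLM can still follow repeated references to the same person. A minimal sketch of the idea (hypothetical, not AI Vault's implementation):

```python
import hashlib

class SessionTokenMap:
    """Map each PII value to one stable token per session."""

    def __init__(self, session_id):
        self.session_id = session_id
        self.forward = {}   # value -> token
        self.reverse = {}   # token -> value

    def token_for(self, entity_type, value):
        if value not in self.forward:
            # Derive a short, session-scoped identifier for the token suffix.
            digest = hashlib.sha256(
                f"{self.session_id}:{value}".encode()
            ).hexdigest()[:4]
            token = f"[{entity_type}_{digest}]"
            self.forward[value] = token
            self.reverse[token] = value
        return self.forward[value]

session = SessionTokenMap("sess-42")
t1 = session.token_for("PERSON", "John Smith")
t2 = session.token_for("PERSON", "John Smith")
assert t1 == t2  # repeated mentions share one token, preserving coreference
```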
Document Processing
Upload documents for AI analysis with PII automatically stripped before the content reaches the LLM. Original files stored encrypted via File Vault.
- PDF, DOCX, XLSX, and image support
- AWS Textract integration for OCR
- Encrypted file storage via File Vault
- Async job queue with status polling
- Presigned URL result delivery
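The async flow is: upload a document, receive a job ID, poll for status, then fetch the sanitized result from a presigned URL. A hedged sketch of the client-side polling loop follows; the status function is stubbed for illustration, and the response fields are illustrative rather than the documented API shape:

```python
import time

def poll_until_done(get_status, job_id, interval=0.0, max_attempts=10):
    """Poll a job's status until it completes (or give up after max_attempts)."""
    for _ in range(max_attempts):
        status = get_status(job_id)
        if status["state"] == "complete":
            return status["result_url"]  # presigned URL for the sanitized output
        if status["state"] == "failed":
            raise RuntimeError(f"job {job_id} failed")
        time.sleep(interval)
    raise TimeoutError(f"job {job_id} did not finish")

# Stubbed status responses; a real client would call AI Vault's
# job-status endpoint over HTTPS instead.
states = iter([
    {"state": "queued"},
    {"state": "processing"},
    {"state": "complete", "result_url": "https://example.com/presigned"},
])
url = poll_until_done(lambda job_id: next(states), "job-123")
assert url.startswith("https://")
```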
Batch Pipeline
Bulk process datasets and training data through the tokenization pipeline. Clean PII from large volumes before fine-tuning or embedding workflows.
- CSV and JSONL bulk processing
- S3 trigger and SQS queue integration
- 10,000+ files per day throughput
- Progress tracking via webhooks
- Configurable entity type filtering
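Entity-type filtering lets you choose which PII classes get tokenized in a batch run, for example scrubbing SSNs from training data while leaving emails intact. A toy sketch of that filtering applied to JSONL records (patterns and field names are illustrative only):

```python
import json
import re

# Toy patterns standing in for the configurable detection rules.
PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def scrub_record(record, enabled_entities):
    """Tokenize only the configured entity types in a record's text field."""
    text = record["text"]
    for entity in enabled_entities:
        text = PATTERNS[entity].sub(f"[{entity}]", text)
    return {**record, "text": text}

lines = ['{"id": 1, "text": "Mail jane@example.com, SSN 123-45-6789"}']
cleaned = [
    scrub_record(json.loads(line), enabled_entities=["SSN"]) for line in lines
]
assert "123-45-6789" not in cleaned[0]["text"]
assert "jane@example.com" in cleaned[0]["text"]  # EMAIL filtering disabled
```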
One Line Change.
Full PII Protection.
AI Vault is API-compatible with OpenAI and Anthropic SDKs. Just change your base URL. No wrapper library, no code refactor. Your existing prompts, tools, and workflows work exactly the same — minus the PII.
```python
# BEFORE: PII goes directly to the LLM
from anthropic import Anthropic

client = Anthropic()

# AFTER: PII is tokenized before it reaches the LLM
from anthropic import Anthropic

client = Anthropic(
    base_url="https://ai.enigmavault.io/v1"  # only change
)

# Your code stays exactly the same
resp = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    messages=[{
        "role": "user",
        "content": "Summarize this claim for John Smith SSN 123-45-6789"
    }]
)

# AI Vault tokenized "John Smith" & "123-45-6789"
# LLM sees: "...for [PERSON_7f3a] SSN [SSN_9d4e]"
# Response detokenized before returning to your app
```
Built for Regulated Industries
AI Vault is designed for teams that need AI capabilities but operate under strict data privacy requirements.
Healthcare
Analyze patient records, generate clinical summaries, and extract medical codes with LLMs while keeping PHI out of third-party AI systems.
Insurance
Automate claims processing and policy analysis with AI while tokenizing policyholder SSNs, addresses, and payment information.
Financial Services
Power AI-driven risk assessments, fraud detection, and customer support bots while keeping account numbers and transaction details tokenized.
Legal Tech
Use LLMs for contract analysis, legal research, and document review while ensuring client names, case details, and privileged information stay tokenized.
Enterprise-Grade Protection
AI Vault inherits the full security posture of the Enigma Vault platform. Every layer is designed around zero-trust principles.
AES-256 Tokenization
Every PII entity is encrypted using AES-256 with unique initialization vectors. Tokens are stored in the Data Vault with per-tenant key isolation.
Certified Compliance
PCI DSS Level 1 certified and SOC 2 Type II audited. Reduce your compliance scope by keeping PII out of LLM provider systems entirely.
Multi-Tenant Isolation
Per-tenant data partitioning, isolated encryption keys, and dedicated token mappings. No cross-tenant data leakage is possible.
Full Audit Trail
Every tokenization and detokenization operation is logged with client ID, timestamp, and entity type. Complete audit trail for compliance teams.
Let's Talk AI Privacy
Interested in AI Vault? Fill out the form and our team will get back to you within one business day. We can walk you through the architecture, discuss your use case, and set up a proof of concept.
- (877) 977-2083
30 Broad St., Suite 14114
New York, NY 10004