# Quick Start
Get OGuardAI running and make your first API call in under a minute
## Start the Server
```bash
docker run -p 8080:8080 \
  -e GUARDAI_PORT=8080 \
  -e GUARDAI_SESSION_SECRET=change-me-32-byte-secret-value!! \
  ghcr.io/oronts/oronts-guardai/oguardai-server:latest
```

One binary. Built-in regex detectors. Sealed sessions. No Python, no Redis.
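Before going beyond local testing, replace the placeholder secret. Judging by the placeholder above, any high-entropy 32-character value should work (an assumption; see the Configuration reference for the exact requirements). One quick way to generate one:

```python
import secrets

# Generate a random 32-character value for GUARDAI_SESSION_SECRET.
# Assumption: any high-entropy 32-byte string is accepted -- check the
# Configuration reference for the server's exact requirements.
print(secrets.token_urlsafe(24))  # 24 random bytes -> 32 URL-safe characters
```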
## How It Works
### Check Health
```bash
curl http://localhost:8080/v1/health
# {"status":"healthy","version":"0.1.0","uptime_seconds":1.2}
```

### Transform Text
Send sensitive text to OGuardAI. It detects PII, replaces it with semantic tokens, and returns the safe version along with an encrypted session state blob.
```bash
curl -X POST http://localhost:8080/v1/transform \
  -H "Content-Type: application/json" \
  -d '{"input": "Contact julia@firma.de about invoice #12345"}'
```

Response:
```json
{
  "safe_text": "Contact {{email:e_001}} about invoice #12345",
  "session_id": "01916a3e-7b2c-7000-8000-000000000001",
  "session_state": "eyJhbGci...",
  "entities": [
    {
      "token": "{{email:e_001}}",
      "type": "email",
      "span": { "start": 8, "end": 23 }
    }
  ]
}
```

### Rehydrate (Restore)
After the LLM responds using the tokenized text, restore the original values:
```bash
curl -X POST http://localhost:8080/v1/rehydrate \
  -H "Content-Type: application/json" \
  -d '{
    "output": "I have emailed {{email:e_001}} about the invoice.",
    "session_state": "eyJhbGci..."
  }'
```

Response:
```json
{
  "restored_text": "I have emailed julia@firma.de about the invoice."
}
```

### What Happened
- OGuardAI detected `julia@firma.de` as an email entity
- Replaced it with `{{email:e_001}}` -- a semantic token the LLM can reason about
- Stored the mapping in an AES-256-GCM encrypted session blob
- On rehydrate, decrypted the blob and restored the original value
The LLM never saw the real email address.
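Here is the same flow from Python, as a minimal sketch using `requests`. The endpoints and response fields are exactly the ones shown above; the LLM call is stubbed with a placeholder string.

```python
import requests

BASE = "http://localhost:8080/v1"

# 1. Transform: swap PII for semantic tokens before the text reaches the LLM.
t = requests.post(f"{BASE}/transform",
                  json={"input": "Contact julia@firma.de about invoice #12345"})
t.raise_for_status()
body = t.json()
safe_text = body["safe_text"]          # "Contact {{email:e_001}} about invoice #12345"
session_state = body["session_state"]  # encrypted entity mapping, opaque to the caller

# 2. Call your LLM with safe_text here. Stubbed: pretend it echoed the token back.
llm_output = "I have emailed {{email:e_001}} about the invoice."

# 3. Rehydrate: decrypt the session state and restore the original values.
r = requests.post(f"{BASE}/rehydrate",
                  json={"output": llm_output, "session_state": session_state})
r.raise_for_status()
print(r.json()["restored_text"])  # "I have emailed julia@firma.de about the invoice."
```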
### Detect Only
You can also run detection without transformation to see what entities OGuardAI finds:
```bash
curl -X POST http://localhost:8080/v1/detect \
  -H "Content-Type: application/json" \
  -d '{"input": "SSN: 123-45-6789, email: julia@firma.de"}'
```
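Programmatically, a Python sketch of the same call. Note the assumption: this guide doesn't show the detect response, so the code assumes `/v1/detect` returns an `entities` array shaped like the one `/v1/transform` returns. Verify against a real response.

```python
import requests

# Sketch: assumes /v1/detect returns an "entities" array shaped like the
# one /v1/transform returns (type + span). Verify against a real response.
resp = requests.post("http://localhost:8080/v1/detect",
                     json={"input": "SSN: 123-45-6789, email: julia@firma.de"})
resp.raise_for_status()
for ent in resp.json().get("entities", []):
    print(ent["type"], ent["span"])
```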
## Round-Trip Test

Run a full transform-then-rehydrate cycle to verify end-to-end behavior:
```bash
# Transform
RESPONSE=$(curl -s -X POST http://localhost:8080/v1/transform \
  -H "Content-Type: application/json" \
  -d '{"input": "Contact Julia at julia@firma.de or call 555-0123."}')
echo "$RESPONSE" | jq .

# Rehydrate
SESSION_STATE=$(echo "$RESPONSE" | jq -r '.session_state')
curl -s -X POST http://localhost:8080/v1/rehydrate \
  -H "Content-Type: application/json" \
  -d "{\"output\": $(echo "$RESPONSE" | jq '.safe_text'), \"session_state\": \"$SESSION_STATE\"}" \
  | jq .
```
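The same round trip as a self-contained Python check that fails loudly if restoration isn't lossless (a sketch using `requests`; with no LLM rewriting anything in between, the rehydrated text should match the input exactly):

```python
import requests

BASE = "http://localhost:8080/v1"
original = "Contact Julia at julia@firma.de or call 555-0123."

t = requests.post(f"{BASE}/transform", json={"input": original}).json()

# Feed the tokenized text straight back through rehydrate. With no LLM in
# the loop, the round trip should reproduce the input byte-for-byte.
r = requests.post(f"{BASE}/rehydrate",
                  json={"output": t["safe_text"],
                        "session_state": t["session_state"]}).json()

assert r["restored_text"] == original, r["restored_text"]
print("round trip OK")
```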
## Using the CLI

If you have the OGuardAI CLI installed, you can transform and detect from the command line:
```bash
oguardai transform --input "Contact julia@firma.de"
oguardai detect --input "SSN: 123-45-6789"
```
## Using the OpenAI Proxy

For the fastest integration with existing OpenAI code, start the proxy and change one URL:
```bash
# Start the proxy
oguardai-proxy --target https://api.openai.com --policy default --port 8081
```

```python
from openai import OpenAI

client = OpenAI(
    api_key="sk-...",
    base_url="http://localhost:8081/v1"  # Only change needed
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Help customer Julia Schneider (julia@firma.de)"}]
)

# OpenAI sees: "Help customer {{person:p_001}} ({{email:e_001}})"
# Response is automatically restored before reaching your code
```
## Docker Compose (Full Stack)

For a full local setup with NER detection and Redis sessions:
```bash
GUARDAI_SESSION_SECRET=your-production-secret \
docker compose -f deploy/docker/docker-compose.yml up --build
```

This starts the Rust API server (port 3000), the Python NER detector (port 9090), and Redis (port 6379).
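Note the port change: against the compose stack, point your calls at 3000 instead of 8080. A quick check (assuming the stack exposes the same `/v1/health` endpoint as the single-binary server):

```python
import requests

# The compose stack serves the API on port 3000, not 8080.
print(requests.get("http://localhost:3000/v1/health").json())
```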
## API Reference
| Endpoint | Purpose |
|---|---|
| `POST /v1/transform` | Transform text, return safe output + session state |
| `POST /v1/rehydrate` | Restore tokens using session state |
| `POST /v1/detect` | Detect entities only (no transformation) |
| `POST /v1/evaluate-policy` | Evaluate policy against entities |
| `GET /v1/health` | Health check |
| `GET /v1/capabilities` | List entity types, languages, detectors |
| `POST /v1/transform/stream` | Stream transform with SSE |
| `POST /v1/rehydrate/stream` | Stream rehydrate with SSE |
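The two streaming endpoints speak Server-Sent Events. This guide doesn't document the event payload schema, so the sketch below just prints raw SSE lines as they arrive; only the endpoint path and the SSE transport are taken from the table above.

```python
import requests

# Stream a transform over SSE and print raw event lines. The event payload
# schema isn't documented in this guide, so no parsing is attempted.
with requests.post("http://localhost:8080/v1/transform/stream",
                   json={"input": "Contact julia@firma.de"},
                   headers={"Accept": "text/event-stream"},
                   stream=True) as resp:
    resp.raise_for_status()
    for line in resp.iter_lines():
        if line:
            print(line.decode())
```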
## Next Steps
- Installation -- explore all deployment options
- Configuration -- configuration file and environment variable reference
- SDK Guide -- integrate with Python or TypeScript
- OpenAI Proxy -- transparent proxy for OpenAI/Anthropic
- Security Guarantees -- what OGuardAI does and does not guarantee