Security & Data Handling
A plain-English overview of where dusto data lives, how it's protected, and where we're still building. Dusto is in early access — we'd rather be honest about gaps than claim certifications we don't hold.
The infrastructure dusto runs on.
Dusto runs primarily in AWS us-east-1 (Lambda, DynamoDB, S3, SES, Cognito, KMS), with payments on Stripe and LLM inference via OpenAI (orchestrator, drafters, agent planner) and AWS Bedrock (Amazon Nova Lite for the urgency classifier). The full list is at /legal/subprocessors.
At rest and in transit.
At rest
S3 buckets are encrypted with AES-256 (server-side, AWS-managed). DynamoDB encryption at rest is enabled with AWS-owned KMS keys. Cognito verification codes are encrypted with a dedicated KMS key.
In transit
TLS everywhere. All AWS service calls go over HTTPS. Cognito and API Gateway are HTTPS-only.
How you log in.
AWS Cognito with email + password (12+ characters; upper, lower, and numeric required) or Google sign-in. Refresh tokens are valid for 30 days; ID and access tokens for 1 hour. MFA is not currently enforced — it is on the roadmap.
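The password policy above can be sketched as a simple validator. This is illustrative only — the function name and structure are not dusto's code, and Cognito enforces the real policy server-side:

```python
import re

def meets_password_policy(password: str) -> bool:
    """Check the stated policy: 12+ characters with at least one
    uppercase letter, one lowercase letter, and one digit.
    Illustrative sketch; Cognito enforces the real rules."""
    return (
        len(password) >= 12
        and re.search(r"[A-Z]", password) is not None
        and re.search(r"[a-z]", password) is not None
        and re.search(r"[0-9]", password) is not None
    )
```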
How accounts stay separated.
Every API endpoint scopes its queries by the Cognito user ID extracted from a verified JWT. There is no cross-account access between dusto users.
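A minimal sketch of that per-user scoping, assuming a hypothetical table name and key schema (the real JWT verification happens before the handler runs; `verified_claims` here stands in for its output):

```python
def extract_user_id(verified_claims: dict) -> str:
    # Cognito puts the stable user ID in the "sub" claim of a verified JWT.
    return verified_claims["sub"]

def build_scoped_query(verified_claims: dict) -> dict:
    # Every query is keyed by the caller's own user ID, so one account can
    # never read another account's items. Table and key names are illustrative.
    return {
        "TableName": "dusto-emails",
        "KeyConditionExpression": "userId = :uid",
        "ExpressionAttributeValues": {
            ":uid": {"S": extract_user_id(verified_claims)}
        },
    }
```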
What we send to the model.
The agent processes your email content to generate digests, alerts, and replies. Today we send content to two providers: OpenAI for the orchestrator, drafters, and agent planner; and AWS Bedrock (Amazon Nova Lite) for the per-message urgency classifier.
We do not redact email bodies before sending them to OpenAI; the urgency classifier receives only a bounded excerpt plus triage headers. Each provider's API terms govern their handling of the content. We are working toward formal no-training data-processing agreements with each and will publish updates as they land. If you need to keep email content out of an LLM entirely, dusto is not the right product today.
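The "bounded excerpt plus triage headers" input to the classifier can be sketched like this. The 2,000-character cap and the header set are assumptions for illustration, not dusto's actual limits:

```python
EXCERPT_CHARS = 2000  # illustrative cap, not the real limit
TRIAGE_HEADERS = ("From", "To", "Subject", "Date")  # illustrative header set

def classifier_input(headers: dict, body: str) -> dict:
    # The urgency classifier never sees the full body: only a bounded
    # excerpt plus a small set of triage headers.
    return {
        "headers": {h: headers[h] for h in TRIAGE_HEADERS if h in headers},
        "excerpt": body[:EXCERPT_CHARS],
    }
```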
How long we keep things.
- Raw inbound MIME in S3: 21 days, auto-expired by S3 lifecycle rule.
- Parsed inbound bodies in S3: 21 days, auto-expired.
- Parsed outbound bodies in S3: kept until you delete your account — these are the emails dusto produced for you, and the same content is also delivered to your destination inbox.
- Agent traces (curation runs, prompts, tool calls): 90 days, auto-expired by DynamoDB TTL.
- Account, billing ledger, and dedup metadata: until you delete your account.
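The auto-expirations above are driven by S3 lifecycle rules and DynamoDB TTL, both of which reduce to an expiry timestamp computed at write time. A sketch, with the retention windows taken from the list above:

```python
from datetime import datetime, timedelta, timezone

INBOUND_BODY_DAYS = 21  # raw and parsed inbound bodies in S3
AGENT_TRACE_DAYS = 90   # agent traces in DynamoDB

def ttl_epoch(written_at: datetime, days: int) -> int:
    # DynamoDB TTL expects an epoch-seconds attribute; items past this
    # timestamp are deleted automatically. S3 lifecycle rules are
    # configured per-bucket, but the arithmetic is the same.
    return int((written_at + timedelta(days=days)).timestamp())
```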
What 'delete' actually does.
DELETE /me performs a cascading hard delete: DynamoDB rows (your account, emails, traces, verification tokens), S3 objects (raw MIME and parsed bodies), Stripe customer (best-effort detach), and your Cognito user. Immediate and irreversible. Export first if you want a copy.
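The cascade can be sketched as an ordered sequence of steps. The helpers here are stand-ins for real AWS and Stripe calls, and the step names are illustrative; the order follows the sequence described above:

```python
def delete_account(user_id: str, actions: dict) -> list:
    # Hard-delete cascade, in order. `actions` maps step names to callables
    # (stand-ins for real AWS/Stripe calls). Returns the steps performed.
    order = [
        "delete_dynamodb_rows",    # account, emails, traces, tokens
        "delete_s3_objects",       # raw MIME and parsed bodies
        "detach_stripe_customer",  # best-effort
        "delete_cognito_user",     # the login itself
    ]
    for step in order:
        actions[step](user_id)
    return order
```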
What we log, what we don't.
Structured logs include account identifiers and resource paths. They do not include email bodies, subjects, or addresses today, and we run a redaction safety rail to keep it that way. Logs are stored in AWS CloudWatch.
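A redaction safety rail of the kind described can be sketched as a filter applied before a record is emitted. The regex and replacement token here are illustrative, not dusto's actual rail:

```python
import re

# Loose pattern for things that look like email addresses. Illustrative;
# a real rail would typically also check known-sensitive field names.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def redact(record: str) -> str:
    # Strip anything email-shaped before the record reaches CloudWatch.
    return EMAIL_RE.sub("[redacted-email]", record)
```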
What we keep a copy of.
DynamoDB point-in-time recovery is currently disabled at our launch scale; account state can be reconstructed from Cognito and Stripe if needed. The S3 raw-mail bucket has no separate backup layer — the 21-day expiration is intentional. Terraform state is stored encrypted in S3, with a DynamoDB table providing state locking.
Third parties we rely on.
See /legal/subprocessors for the full list. Material changes will be posted there and emailed to account holders.
Reporting a security issue.
Please email admin@spicadust.com. We don't run a paid bug bounty during early access; we appreciate responsible disclosure and will credit you on request.
What we don't have yet.
In the spirit of not overstating: here's what dusto is not today.
- No SOC 2 or ISO 27001 certification yet.
- No MFA enforcement yet (planned, opt-in first).
- Our LLM providers (OpenAI, AWS Bedrock) do not yet have no-training contractual carve-outs — we are working toward formal DPAs with each.
- Single-region deployment (AWS us-east-1); no geographic redundancy.
- Not yet self-certified under the EU–US Data Privacy Framework. For v1, transfers of EU/UK personal data rely on Standard Contractual Clauses (with the UK Addendum where applicable); DPF certification is on the v1.x roadmap.
- No published bug-bounty program; responsible disclosure to admin@spicadust.com is appreciated and we credit on request.
We're working through these. This page will update as they land.