# Storage Backends

Briefcase provides immutable storage backends for long-term audit retention.
These are separate from the Rust-backed `SqliteBackend`: they write to
object storage with Object Lock for tamper-proof WORM (write-once-read-many) compliance.
All backends inherit from `PythonStorageBackend` and implement:

```python
async def write(self, key: str, record: bytes) -> StorageReceipt
```
A `StorageReceipt` contains:

```python
@dataclass
class StorageReceipt:
    key: str                # Object key (path) in the bucket
    hash: str               # SHA-256 hex digest of the written bytes
    timestamp: datetime     # UTC timestamp of the write
    backend_metadata: dict  # Provider-specific metadata (bucket, region, etc.)
```
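Any object-storage target can be wrapped by subclassing `PythonStorageBackend` and returning a `StorageReceipt` from `write()`. Below is a minimal local-filesystem sketch for illustration only: the `LocalFileStorage` class and its `root` parameter are hypothetical, and the `StorageReceipt` dataclass is redefined inline so the snippet runs on its own (in real use it would come from `briefcase.storage`).

```python
import asyncio
import hashlib
from dataclasses import dataclass, field
from datetime import datetime, timezone
from pathlib import Path

# Stand-in for briefcase's StorageReceipt, redefined so the sketch is self-contained.
@dataclass
class StorageReceipt:
    key: str
    hash: str
    timestamp: datetime
    backend_metadata: dict = field(default_factory=dict)

class LocalFileStorage:  # would subclass PythonStorageBackend in real use
    """Illustrative backend: writes each record to a file under `root`."""

    def __init__(self, root: str):
        self.root = Path(root)

    async def write(self, key: str, record: bytes) -> StorageReceipt:
        path = self.root / key
        path.parent.mkdir(parents=True, exist_ok=True)
        path.write_bytes(record)
        return StorageReceipt(
            key=key,
            hash=hashlib.sha256(record).hexdigest(),
            timestamp=datetime.now(timezone.utc),
            backend_metadata={"path": str(path)},
        )

receipt = asyncio.run(
    LocalFileStorage("/tmp/briefcase-demo").write("decisions/abc.json", b"{}")
)
print(receipt.hash)  # SHA-256 of the written bytes
```

The production backends below follow the same contract, swapping the filesystem write for an S3 `PutObject` call.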
## S3ObjectLockStorage

Stores decision records as JSON objects in an AWS S3 bucket with Object Lock retention. Records become immutable for the specified retention period (WORM compliance).
### Prerequisites

```shell
pip install boto3
```

Your S3 bucket must have Object Lock enabled (it cannot be enabled after creation).
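Because Object Lock can only be requested at creation time, the bucket has to be created with the flag up front. A CLI fragment showing this (bucket name is a placeholder; regions other than `us-east-1` also need a `--create-bucket-configuration` argument):

```shell
# Object Lock must be requested when the bucket is created, not after.
aws s3api create-bucket \
    --bucket decisions-archive \
    --object-lock-enabled-for-bucket
```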
### Constructor

```python
from briefcase.storage import S3ObjectLockStorage

storage = S3ObjectLockStorage(
    bucket="decisions-archive",
    region="us-east-1",
    retention_days=2555,  # 7 years (typical regulatory requirement)
    access_key=None,      # AWS access key ID (or use env/IAM role)
    secret_key=None,      # AWS secret access key
    endpoint_url=None,    # Custom endpoint (for MinIO, LocalStack, etc.)
)
```
| Parameter | Default | Description |
|---|---|---|
| `bucket` | required | S3 bucket name (must have Object Lock enabled) |
| `region` | `"us-east-1"` | AWS region |
| `retention_days` | `2555` | Retention period in days |
| `access_key` | `None` | Explicit AWS access key ID. Falls back to environment/IAM. |
| `secret_key` | `None` | Explicit AWS secret access key. Falls back to environment/IAM. |
| `endpoint_url` | `None` | Override for MinIO, LocalStack, or custom S3-compatible stores. |
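The default of 2555 days is 7 × 365. When configuring a different policy, a tiny helper (hypothetical, not part of the library) keeps the intent readable:

```python
def retention_days(years: int) -> int:
    """Convert a retention period in years to days (ignoring leap days)."""
    return years * 365

print(retention_days(7))  # 2555
```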
### Example

```python
import asyncio
import json
import os

from briefcase.storage import S3ObjectLockStorage

storage = S3ObjectLockStorage(
    bucket="my-ai-decisions",
    region="us-east-1",
    retention_days=2555,
    access_key=os.environ["AWS_ACCESS_KEY_ID"],
    secret_key=os.environ["AWS_SECRET_ACCESS_KEY"],
)

record = {
    "decision_id": "abc123",
    "function_name": "claims_review",
    "outputs": {"decision": "approve", "confidence": 0.93},
}

key = f"briefcase/decisions/{record['decision_id']}.json"
record_bytes = json.dumps(record).encode()

receipt = asyncio.run(storage.write(key, record_bytes))
print(f"Stored at: {receipt.key}")
print(f"Hash: {receipt.hash}")
```
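Because `receipt.hash` is the SHA-256 of the bytes that were written, anyone holding the original record can later verify integrity by recomputing the digest. A self-contained sketch of that check (no network access; the comparison against a live receipt is shown as a comment):

```python
import hashlib
import json

record = {
    "decision_id": "abc123",
    "function_name": "claims_review",
    "outputs": {"decision": "approve", "confidence": 0.93},
}
record_bytes = json.dumps(record).encode()

# Recompute the digest locally...
expected = hashlib.sha256(record_bytes).hexdigest()
print(expected)

# ...and compare it against the stored receipt after a write:
# assert receipt.hash == expected
```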
### Object Keys

Records are stored with keys in the format:

```
briefcase/{YYYY}/{MM}/{DD}/{decision_id}.json
```
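The dated prefix can be derived from the write timestamp. A sketch of the key construction (the `decision_key` helper name is illustrative, not a library function):

```python
from datetime import datetime, timezone

def decision_key(decision_id: str, ts: datetime) -> str:
    """Build a key like briefcase/{YYYY}/{MM}/{DD}/{decision_id}.json."""
    return f"briefcase/{ts:%Y/%m/%d}/{decision_id}.json"

ts = datetime(2025, 3, 9, tzinfo=timezone.utc)
print(decision_key("abc123", ts))  # briefcase/2025/03/09/abc123.json
```

The date-based prefix keeps listings cheap: retrieving a day's records is a single prefix query.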
## DOSpacesStorage

DigitalOcean Spaces is S3-compatible. `DOSpacesStorage` inherits from
`S3ObjectLockStorage` and sets the endpoint URL automatically.
### Constructor

```python
from briefcase.storage import DOSpacesStorage

storage = DOSpacesStorage(
    bucket="decisions-archive",
    region_name="nyc3",  # DigitalOcean region (nyc3, ams3, sgp1, etc.)
    retention_days=2555,
    access_key=None,
    secret_key=None,
)
```
The endpoint is set to `https://{region_name}.digitaloceanspaces.com` automatically.
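The endpoint presumably reduces to simple interpolation of the region name, something like:

```python
def spaces_endpoint(region_name: str) -> str:
    # Mirrors the URL pattern stated above; helper name is illustrative.
    return f"https://{region_name}.digitaloceanspaces.com"

print(spaces_endpoint("nyc3"))  # https://nyc3.digitaloceanspaces.com
```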
### Example

```python
import os

from briefcase.storage import DOSpacesStorage

storage = DOSpacesStorage(
    bucket="my-ai-decisions",
    region_name="nyc3",
    access_key=os.environ["DO_SPACES_ACCESS_KEY"],
    secret_key=os.environ["DO_SPACES_SECRET_KEY"],
)
```
## SQLite Backend (Rust Core)

For development and testing, use the Rust-backed `SqliteBackend` from
`briefcase_ai`. This provides full decision snapshot storage with replay:

```python
import briefcase_ai

briefcase_ai.init()

# In-memory (testing)
db = briefcase_ai.SqliteBackend.in_memory()

# Persistent
db = briefcase_ai.SqliteBackend("decisions.db")

# Save and load
decision_id = db.save_decision(snapshot)
snapshot = db.load_decision(decision_id)
```
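Conceptually, the save/load round-trip is a keyed blob store: saving assigns an ID, loading retrieves the snapshot by that ID. A stdlib `sqlite3` sketch of that contract (this is *not* the `briefcase_ai` implementation, only an illustration of the round-trip):

```python
import sqlite3
import uuid

# Toy stand-in for SqliteBackend's save/load behavior.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE decisions (id TEXT PRIMARY KEY, snapshot BLOB)")

def save_decision(snapshot: bytes) -> str:
    """Store a snapshot and return its generated decision ID."""
    decision_id = uuid.uuid4().hex
    db.execute("INSERT INTO decisions VALUES (?, ?)", (decision_id, snapshot))
    return decision_id

def load_decision(decision_id: str) -> bytes:
    """Fetch a previously saved snapshot by ID."""
    row = db.execute(
        "SELECT snapshot FROM decisions WHERE id = ?", (decision_id,)
    ).fetchone()
    return row[0]

decision_id = save_decision(b'{"decision": "approve"}')
print(load_decision(decision_id))  # b'{"decision": "approve"}'
```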
## Configuring Storage via `setup()`

```python
from briefcase.config import setup
from briefcase.storage import S3ObjectLockStorage

setup(
    storage=S3ObjectLockStorage(
        bucket="my-decisions",
        region="us-east-1",
    )
)
```
## See Also
- Infrastructure — Exporters — ship to observability platforms
- Infrastructure — Events — publish to webhooks and Kafka
- End-to-End Workflow