Hanzo S3
S3-compatible object storage for AI infrastructure
Hanzo S3 is a high-performance, S3-compatible object storage service built for AI workloads. It provides durable, scalable storage for model artifacts, training datasets, media assets, event streams, and application data across the Hanzo platform.
Console: hanzo.space
S3 API: s3.hanzo.ai / api.hanzo.space
Server: github.com/hanzoai/s3
Console: github.com/hanzos3/console
CLI: github.com/hanzos3/cli
Features
- S3 API Compatible: Drop-in replacement for Amazon S3 -- works with any S3 client, SDK, or tool
- Object Versioning: Track and restore previous versions of any object
- Lifecycle Management: Automated expiration, transition, and tiering rules
- Erasure Coding: Configurable data redundancy across drives and nodes
- Server-Side Encryption: SSE-S3 and SSE-KMS for data at rest; TLS for data in transit
- OIDC SSO: Single sign-on via hanzo.id for console access
- Bucket Policies: Fine-grained S3-compatible IAM policy documents
- Multi-Tenancy: Isolated namespaces and access boundaries for teams and services
- Event Notifications: Webhook and queue notifications on object mutations
- High Throughput: Designed to saturate NVMe and modern network hardware
Architecture
┌─────────────────────────────────────────────────────────────────┐
│ Clients │
│ ┌──────────┐ ┌──────────┐ ┌──────────┐ ┌──────────────┐ │
│ │ s3 CLI │ │ AWS SDKs │ │ Hanzo │ │ Console │ │
│ │ │ │ boto3 │ │ S3 SDKs │ │ hanzo.space │ │
│ └────┬─────┘ └────┬─────┘ └────┬─────┘ └──────┬───────┘ │
│ │ │ │ │ │
└───────┼──────────────┼──────────────┼───────────────┼───────────┘
│ │ │ │
▼ ▼ ▼ ▼
┌─────────────────────────────────────────────────────────────────┐
│ s3.hanzo.ai / api.hanzo.space │
│ ┌─────────────────────────────────────────────────────────┐ │
│ │ Hanzo S3 Server │ │
│ │ ┌──────────┐ ┌──────────┐ ┌──────────┐ ┌──────────┐ │ │
│ │ │ S3 API │ │ IAM / │ │ Erasure │ │ Lifecycle│ │ │
│ │ │ Handler │ │ Policies │ │ Coding │ │ Manager │ │ │
│ │ └──────────┘ └──────────┘ └──────────┘ └──────────┘ │ │
│ │ ┌──────────┐ ┌──────────┐ ┌──────────┐ ┌──────────┐ │ │
│ │ │ SSE-KMS │ │ Version │ │ Event │ │ Console │ │ │
│ │ │ Encrypt │ │ Control │ │ Notify │ │ UI :9001 │ │ │
│ │ └──────────┘ └──────────┘ └──────────┘ └──────────┘ │ │
│ └─────────────────────────────────────────────────────────┘ │
│ │ │
│ ┌─────────┴─────────┐ │
│ │ Persistent Data │ │
│ │ (NVMe / Block) │ │
│ └───────────────────┘ │
└─────────────────────────────────────────────────────────────────┘
Quick Start
Install the CLI
# macOS
brew install hanzos3/tap/s3
# Linux (amd64)
curl -fsSL https://dl.hanzo.ai/s3/cli/linux-amd64/s3 -o /usr/local/bin/s3
chmod +x /usr/local/bin/s3
# From source
go install github.com/hanzos3/cli/cmd/s3@latest
Configure
# Add Hanzo S3 as an alias
s3 alias set hanzo https://s3.hanzo.ai $HANZO_S3_ACCESS_KEY $HANZO_S3_SECRET_KEY
# Verify connectivity
s3 admin info hanzo
Basic Operations
# Create a bucket
s3 mb hanzo/my-models
# Upload a model artifact
s3 cp ./model.safetensors hanzo/my-models/v1/
# List objects
s3 ls hanzo/my-models/
# Download
s3 cp hanzo/my-models/v1/model.safetensors ./local-copy.safetensors
# Recursive sync
s3 mirror ./training-data/ hanzo/datasets/run-042/
# Remove an object
s3 rm hanzo/my-models/v1/model.safetensors
# Set bucket policy (public read)
s3 anonymous set download hanzo/public-assets
Python SDK
from hanzos3 import S3Client
client = S3Client(
endpoint="s3.hanzo.ai",
access_key="your-access-key",
secret_key="your-secret-key",
secure=True,
)
# Create a bucket
client.make_bucket("my-models")
# Upload a file
client.fput_object("my-models", "v1/model.safetensors", "./model.safetensors")
# Download a file
client.fget_object("my-models", "v1/model.safetensors", "./downloaded.safetensors")
# List objects
for obj in client.list_objects("my-models", prefix="v1/", recursive=True):
    print(f"{obj.object_name} {obj.size} bytes")
# Presigned URL (7 days)
from datetime import timedelta
url = client.presigned_get_object("my-models", "v1/model.safetensors", expires=timedelta(days=7))
print(url)
Install:
pip install hanzos3
Go SDK
package main
import (
"context"
"fmt"
"log"
"github.com/hanzos3/go-sdk"
)
func main() {
client, err := hanzos3.New("s3.hanzo.ai", &hanzos3.Options{
Creds: hanzos3.NewStaticV4("access-key", "secret-key", ""),
Secure: true,
})
if err != nil {
log.Fatal(err)
}
ctx := context.Background()
// Create bucket
err = client.MakeBucket(ctx, "my-models", hanzos3.MakeBucketOptions{})
if err != nil {
log.Fatal(err)
}
// Upload
_, err = client.FPutObject(ctx, "my-models", "v1/model.safetensors",
"./model.safetensors", hanzos3.PutObjectOptions{})
if err != nil {
log.Fatal(err)
}
// List
for obj := range client.ListObjects(ctx, "my-models", hanzos3.ListObjectsOptions{
Prefix: "v1/",
Recursive: true,
}) {
if obj.Err != nil {
log.Fatal(obj.Err)
}
fmt.Printf("%s %d bytes\n", obj.Key, obj.Size)
}
}
Install:
go get github.com/hanzos3/go-sdk
JavaScript / TypeScript SDK
import * as HanzoS3 from '@hanzos3/js-sdk'
const client = new HanzoS3.Client({
endPoint: 's3.hanzo.ai',
port: 443,
useSSL: true,
accessKey: 'your-access-key',
secretKey: 'your-secret-key',
})
// Create bucket
await client.makeBucket('my-models')
// Upload
await client.fPutObject('my-models', 'v1/model.safetensors', './model.safetensors')
// List
const stream = client.listObjects('my-models', 'v1/', true)
stream.on('data', (obj) => {
console.log(`${obj.name} ${obj.size} bytes`)
})
stream.on('error', (err) => console.error(err))
// Presigned download URL
const url = await client.presignedGetObject('my-models', 'v1/model.safetensors', 7 * 24 * 60 * 60)
console.log(url)
Install:
npm install @hanzos3/js-sdk
AWS SDK Compatibility
Any standard AWS S3 SDK works with Hanzo S3. Override the endpoint:
# Python (boto3)
import boto3
s3 = boto3.client(
"s3",
endpoint_url="https://s3.hanzo.ai",
aws_access_key_id="your-access-key",
aws_secret_access_key="your-secret-key",
)
s3.put_object(Bucket="my-models", Key="v1/model.safetensors", Body=open("model.safetensors", "rb"))
// TypeScript (@aws-sdk/client-s3)
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3'
import { readFileSync } from 'node:fs'
const client = new S3Client({
endpoint: 'https://s3.hanzo.ai',
region: 'us-east-1',
credentials: {
accessKeyId: 'your-access-key',
secretAccessKey: 'your-secret-key',
},
forcePathStyle: true,
})
await client.send(new PutObjectCommand({
Bucket: 'my-models',
Key: 'v1/model.safetensors',
Body: readFileSync('model.safetensors'),
}))
Platform Buckets
The Hanzo platform uses the following preconfigured buckets for internal services:
| Bucket | Purpose |
|---|---|
| chat | Chat message history and attachments |
| commerce | Product media, invoices, and transaction artifacts |
| deploy | Deployment artifacts and build outputs |
| hanzo | Core platform assets and configuration |
| hanzo-events | Event stream uploads (analytics, audit logs) |
| hanzo-media | User-uploaded media (images, video, documents) |
| lux | Lux blockchain snapshots and state data |
| pars | Pars network artifacts |
| zen | Zen model weights and checkpoints |
| zoo | Zoo research datasets and experiment outputs |
You can create additional buckets for your own projects at any time.
Console
The Hanzo S3 web console is available at hanzo.space:
┌─────────────────────────────────────────────────────────────┐
│ Hanzo S3 [email protected] ▾ │
├─────────────────────────────────────────────────────────────┤
│ │
│ Buckets (10) + Create Bucket │
│ ───────────────────────────────────────────────────────── │
│ Name Objects Size Created │
│ ───────────────────────────────────────────────────────── │
│ chat 12,847 3.2 GB 2025-11-01 │
│ commerce 4,231 1.8 GB 2025-11-01 │
│ deploy 8,912 24.1 GB 2025-11-01 │
│ hanzo 15,003 8.7 GB 2025-11-01 │
│ hanzo-events 142,500 12.4 GB 2025-11-01 │
│ hanzo-media 28,744 45.2 GB 2025-11-01 │
│ lux 2,100 67.3 GB 2025-12-15 │
│ pars 450 2.1 GB 2025-12-15 │
│ zen 1,205 312.8 GB 2026-01-10 │
│ zoo 3,891 89.4 GB 2026-01-10 │
│ │
│ Total: 219,883 objects, 567.0 GB │
│ │
└─────────────────────────────────────────────────────────────┘
Sign in with your Hanzo account via OIDC SSO (powered by hanzo.id).
OIDC Integration
Hanzo S3 console authentication is integrated with hanzo.id via OpenID Connect:
- Provider: hanzo.id (Hanzo IAM)
- IAM App: app-storage
- Client ID: Configured in K8s secret
- Scopes: openid, profile, email
- Flow: Authorization Code with PKCE
Users with a Hanzo account can sign in to the console at hanzo.space without creating separate S3 credentials. Access policies are managed through bucket policies and IAM service accounts.
Service Accounts
For programmatic access (CI/CD, applications, agents), create service accounts:
# Create a service account with read/write to specific buckets
s3 admin user svcacct add hanzo my-user \
--access-key "svc-myapp-key" \
--secret-key "svc-myapp-secret" \
--policy /path/to/policy.json
Example policy (policy.json):
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": ["s3:GetObject", "s3:PutObject", "s3:ListBucket"],
"Resource": [
"arn:aws:s3:::my-models",
"arn:aws:s3:::my-models/*"
]
}
]
}
Bucket Policies
Apply fine-grained access control with S3-compatible policy documents:
# Set a bucket policy
s3 anonymous set-json /path/to/policy.json hanzo/my-models
# View current policy
s3 anonymous get-json hanzo/my-models
Common Policies
Public read-only:
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Principal": "*",
"Action": ["s3:GetObject"],
"Resource": ["arn:aws:s3:::public-assets/*"]
}
]
}
Restrict to specific IP range:
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Principal": "*",
"Action": ["s3:GetObject", "s3:PutObject"],
"Resource": ["arn:aws:s3:::internal-data/*"],
"Condition": {
"IpAddress": {"aws:SourceIp": "10.0.0.0/8"}
}
}
]
}
Object Lifecycle
Automate object management with lifecycle rules:
# Set lifecycle rule: expire objects after 90 days
s3 ilm rule add hanzo/logs --expiry-days 90
# Transition to lower tier after 30 days
s3 ilm rule add hanzo/archives --transition-days 30 --storage-class GLACIER
# List lifecycle rules
s3 ilm rule ls hanzo/logs
Versioning
Enable versioning to preserve every revision of an object:
# Enable versioning
s3 version enable hanzo/my-models
# List object versions
s3 ls --versions hanzo/my-models/v1/model.safetensors
# Restore a previous version
s3 cp --version-id "abc123" hanzo/my-models/v1/model.safetensors ./restored.safetensors
Encryption
All data at rest is encrypted by default. Choose the encryption method:
| Method | Description |
|---|---|
| SSE-S3 | Server-managed keys (default) |
| SSE-KMS | Keys managed by Hanzo KMS (kms.hanzo.ai) |
# Enable SSE-KMS on a bucket
s3 encrypt set sse-kms hanzo-kms-key hanzo/sensitive-data
Event Notifications
Receive notifications when objects are created, accessed, or deleted:
# Add webhook notification for object creation
s3 event add hanzo/uploads arn:hanzo:sqs::my-queue \
--event put --suffix .safetensors
# List configured events
s3 event ls hanzo/uploads
SDKs
| Language | Package | Repository |
|---|---|---|
| Python | hanzos3 | github.com/hanzos3/py-sdk |
| Go | github.com/hanzos3/go-sdk | github.com/hanzos3/go-sdk |
| JavaScript | @hanzos3/js-sdk | github.com/hanzos3/js-sdk |
| Java | io.hanzo.s3 | github.com/hanzos3/java-sdk |
| Rust | hanzos3 | github.com/hanzos3/rust-sdk |
| .NET | HanzoS3 | github.com/hanzos3/dotnet-sdk |
| C++ | hanzos3-cpp | github.com/hanzos3/cpp-sdk |
All standard AWS S3 SDKs (boto3, @aws-sdk/client-s3, aws-sdk-go) also work by setting the endpoint to https://s3.hanzo.ai.
Docker
Run a standalone Hanzo S3 server:
docker run -p 9000:9000 -p 9001:9001 \
-e "HANZO_S3_ROOT_USER=admin" \
-e "HANZO_S3_ROOT_PASSWORD=changeme-strong-password" \
-v /data:/data \
ghcr.io/hanzoai/storage:latest server /data --console-address :9001
- Port 9000: S3 API
- Port 9001: Web console