
Commit 122bf4d

h4x3rotab and claude committed
feat: add confidential AI examples
- Add confidential-ai/ with inference, training, and agents examples
- Update README with Confidential AI section

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
1 parent 668295a commit 122bf4d

16 files changed

Lines changed: 990 additions & 16 deletions

README.md

Lines changed: 23 additions & 16 deletions
@@ -9,7 +9,7 @@

 **Example applications for [dstack](https://github.com/Dstack-TEE/dstack) - Deploy containerized apps to TEEs with end-to-end security in minutes**

-[Getting Started](#getting-started)[Use Cases](#use-cases)[Core Patterns](#core-patterns)[Dev Tools](#dev-scaffolding)[Starter Packs](#starter-packs)[Other Use Cases](#other-use-cases)
+[Getting Started](#getting-started)[Confidential AI](#confidential-ai)[Tutorials](#tutorials)[Use Cases](#use-cases)[Core Patterns](#core-patterns)[Dev Tools](#dev-scaffolding)[Starter Packs](#starter-packs)

 </div>

@@ -44,7 +44,7 @@ phala simulator start
 ### Run an Example Locally

 ```bash
-cd tutorial/01-attestation-oracle
+cd tutorial/01-attestation
 docker compose run --rm \
   -v ~/.phala-cloud/simulator/0.5.3/dstack.sock:/var/run/dstack.sock \
   app
@@ -57,7 +57,23 @@ phala auth login
 phala deploy -n my-app -c docker-compose.yaml
 ```

-See [Phala Cloud](https://cloud.phala.network) for production TEE deployment.
+See [Phala Cloud](https://cloud.phala.com) for production TEE deployment.
+
+---
+
+## Confidential AI
+
+Run AI workloads where prompts, model weights, and inference stay encrypted in hardware.
+
+| Example | Description |
+|---------|-------------|
+| [confidential-ai/inference](./confidential-ai/inference) | Private LLM inference with vLLM on Confidential GPU |
+| [confidential-ai/training](./confidential-ai/training) | Confidential fine-tuning on sensitive data using Unsloth |
+| [confidential-ai/agents](./confidential-ai/agents) | Secure AI agent with TEE-derived wallet keys using LangChain and Confidential AI models |
+
+GPU deployments require: `--instance-type h200.small --region US-EAST-1 --image dstack-nvidia-dev-0.5.4.1`
+
+See [Confidential AI Guide](https://github.com/Dstack-TEE/dstack/blob/master/docs/confidential-ai.md) for concepts and security model.

 ---

@@ -67,10 +83,10 @@ Step-by-step guides covering core dstack concepts.

 | Tutorial | Description |
 |----------|-------------|
-| [01-attestation-oracle](./tutorial/01-attestation-oracle) | Use the guest SDK to work with attestations directly — build an oracle, bind data to TDX quotes via `report_data`, verify with local scripts |
-| [02-persistence-and-kms](./tutorial/02-persistence-and-kms) | Use `getKey()` for deterministic key derivation from a KMS — persistent wallets, same key across restarts |
-| [03-gateway-and-ingress](./tutorial/03-gateway-and-ingress) | Custom domains with automatic SSL, certificate evidence chain |
-| [04-upgrades](./tutorial/04-upgrades) | Extend `AppAuth.sol` with custom authorization logic — NFT-gated clusters, on-chain governance |
+| [01-attestation](./tutorial/01-attestation) | Build an oracle, bind data to TDX quotes via `report_data`, verify with local scripts |
+| [02-kms-and-signing](./tutorial/02-kms-and-signing) | Deterministic key derivation from KMS — persistent wallets, same key across restarts |
+| [03-gateway-and-tls](./tutorial/03-gateway-and-tls) | Custom domains with automatic SSL, certificate evidence chain |
+| [04-onchain-oracle](./tutorial/04-onchain-oracle) | AppAuth contracts, on-chain signature verification, multi-device deployment |

 ---

@@ -120,15 +136,6 @@ TLS termination, custom domains, external connectivity.
 | Example | Description |
 |---------|-------------|
 | [dstack-ingress](./custom-domain/dstack-ingress) | **Complete ingress solution** — auto SSL via Let's Encrypt, multi-domain, DNS validation, evidence generation with TDX quote chain |
-| [custom-domain](./custom-domain/custom-domain) | Simpler custom domain setup via zt-https |
-
-### Keys & Persistence
-
-Persistent keys across deployments via KMS.
-
-| Example | Description | Status |
-|---------|-------------|--------|
-| [get-key-basic](./get-key-basic) | `dstack.get_key()` — same key identity across machines | Coming Soon |

 ### On-Chain Interaction
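Taken together, the Quick Start deploy command and the GPU flags that the new Confidential AI section lists imply an invocation like the following sketch. The app name and compose file here are placeholders, not values from this commit; only the three GPU flags come from the README diff.

```shell
# Hypothetical full GPU deployment: the Quick Start `phala deploy` command
# combined with the flags the Confidential AI section says GPU deployments
# require. App name and compose file are placeholders.
phala deploy -n my-gpu-app -c docker-compose.yaml \
  --instance-type h200.small \
  --region US-EAST-1 \
  --image dstack-nvidia-dev-0.5.4.1
```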

confidential-ai/README.md

Lines changed: 23 additions & 0 deletions
# Confidential AI Examples

Run AI workloads with hardware-enforced privacy. Your prompts, model weights, and computations stay encrypted in memory.

| Example | Description | Status |
|---------|-------------|--------|
| [inference](./inference) | Private LLM with response signing | Ready to deploy |
| [training](./training) | Fine-tuning on sensitive data | Requires local build |
| [agents](./agents) | AI agent with TEE-derived keys | Requires local build |

Start with inference—it deploys in one command and shows the full attestation flow.

```bash
cd inference
phala auth login
phala deploy -n my-llm -c docker-compose.yaml \
  --instance-type h200.small \
  -e TOKEN=your-secret-token
```

First deployment takes 10-15 minutes (large images + model loading). Check progress with `phala cvms serial-logs <app_id> --tail 100`.

See the [Confidential AI Guide](https://github.com/Dstack-TEE/dstack/blob/master/docs/confidential-ai.md) for how the security model works.
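Per the table above, the inference example serves a private LLM behind the `TOKEN` secret. Assuming it exposes the standard OpenAI-compatible `/v1/chat/completions` route that vLLM normally serves (an assumption, not something this commit states), a client request could be sketched like this; the endpoint URL and model name are placeholders:

```python
import json
import urllib.request

def build_chat_request(endpoint: str, token: str, prompt: str) -> urllib.request.Request:
    """Build (but do not send) an OpenAI-style chat completion request
    authenticated with the TOKEN value passed at deploy time."""
    payload = {
        "model": "placeholder-model",  # replace with the model the CVM serves
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{endpoint}/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",  # the -e TOKEN=... secret
        },
        method="POST",
    )

# Hypothetical usage once the CVM is live: pass the Request to urlopen.
req = build_chat_request("https://my-llm.example", "your-secret-token", "Hello")
```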

confidential-ai/agents/Dockerfile

Lines changed: 12 additions & 0 deletions
FROM python:3.11-slim

WORKDIR /app

COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY agent.py .

EXPOSE 8080

CMD ["python", "agent.py"]

confidential-ai/agents/README.md

Lines changed: 91 additions & 0 deletions
# Secure AI Agent

Run AI agents with TEE-derived wallet keys. The agent calls a confidential LLM (redpill.ai), so prompts never leave encrypted memory.

## Quick Start

```bash
phala auth login
phala deploy -n my-agent -c docker-compose.yaml \
  -e LLM_API_KEY=your-redpill-key
```

Your API key is encrypted client-side and only decrypted inside the TEE.

Test it:

```bash
# Get agent info and wallet address
curl https://<endpoint>/

# Chat with the agent
curl -X POST https://<endpoint>/chat \
  -H "Content-Type: application/json" \
  -d '{"message": "What is your wallet address?"}'

# Sign a message
curl -X POST https://<endpoint>/sign \
  -H "Content-Type: application/json" \
  -d '{"message": "Hello from TEE"}'
```

## How It Works

```mermaid
graph TB
    User -->|TLS| Agent
    subgraph TEE1[Agent CVM]
        Agent[Agent Code]
        Agent --> Wallet[TEE-derived wallet]
    end
    Agent -->|TLS| LLM
    subgraph TEE2[LLM CVM]
        LLM[redpill.ai]
    end
```

The agent derives an Ethereum wallet from TEE keys:

```python
from dstack_sdk import DstackClient
from dstack_sdk.ethereum import to_account

client = DstackClient()
eth_key = client.get_key("agent/wallet", "mainnet")
account = to_account(eth_key)
# Same path = same key, even across restarts
```

Both the agent and the LLM run in separate TEEs. User queries stay encrypted from browser to agent to LLM and back.

## API

| Endpoint | Method | Description |
|----------|--------|-------------|
| `/` | GET | Agent info, wallet address, TCB info |
| `/attestation` | GET | TEE attestation quote |
| `/chat` | POST | Chat with the agent |
| `/sign` | POST | Sign a message with agent's wallet |

## Using a Different LLM

The agent uses redpill.ai by default for end-to-end confidentiality. To use a different OpenAI-compatible endpoint:

```bash
phala deploy -n my-agent -c docker-compose.yaml \
  -e LLM_BASE_URL=https://api.openai.com/v1 \
  -e LLM_API_KEY=sk-xxxxx
```

Note: Using a non-confidential LLM means prompts leave the encrypted environment.

## Cleanup

```bash
phala cvms delete my-agent --force
```

## Further Reading

- [Confidential AI Guide](https://github.com/Dstack-TEE/dstack/blob/master/docs/confidential-ai.md)
- [dstack Python SDK](https://github.com/Dstack-TEE/dstack/tree/master/sdk/python)
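The property the agent README's wallet snippet relies on ("same path = same key, even across restarts") can be pictured as deterministic derivation from a root secret held by the KMS. The sketch below mimics that property with a stdlib HMAC construction over a stand-in root secret; it is an illustration only, not dstack's actual derivation scheme, and the `derive_key` helper and its arguments are hypothetical.

```python
import hashlib
import hmac

def derive_key(root_secret: bytes, path: str, purpose: str) -> bytes:
    """Deterministically derive a 32-byte key for (path, purpose).
    Stand-in for the KMS-backed derivation behind get_key()."""
    info = f"{path}|{purpose}".encode()
    return hmac.new(root_secret, info, hashlib.sha256).digest()

root = b"tee-root-secret-stand-in"  # in dstack, this never leaves the TEE/KMS
k1 = derive_key(root, "agent/wallet", "mainnet")
k2 = derive_key(root, "agent/wallet", "mainnet")
k3 = derive_key(root, "agent/other", "mainnet")
# k1 == k2: the same path and purpose reproduce the key across "restarts";
# k3 differs because the derivation path differs.
```

Because the derivation is a pure function of (root secret, path, purpose), any CVM with access to the same KMS root recovers the same wallet, which is what makes the agent's address stable across redeployments.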
