# Stockyard Trace Protocol v1
An open standard for recording LLM request traces. Adopt it for interoperable observability.
## Required Fields
| Field | Type | Description |
|---|---|---|
| id | string | Unique trace identifier |
| service | string | Service name (default: "proxy") |
| operation | string | Operation type (e.g. "chat_completion", "embedding") |
| status | string | "ok" or "error" |
| created_at | string | RFC 3339 timestamp |
## Optional Fields
| Field | Type | Description |
|---|---|---|
| provider | string | LLM provider (openai, anthropic, etc.) |
| model | string | Model identifier |
| duration_ms | integer | Request duration in milliseconds |
| tokens_in | integer | Input token count |
| tokens_out | integer | Output token count |
| cost_usd | number | Estimated cost in US dollars |
| metadata_json | object | Extension metadata |
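The two field tables above can be summarized as a typed record. The following is an illustrative sketch in Python (the `Trace` class and its defaults beyond `service` are not part of the standard):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Trace:
    # Required fields
    id: str                  # unique trace identifier
    operation: str           # e.g. "chat_completion" or "embedding"
    status: str              # "ok" or "error"
    created_at: str          # RFC 3339 timestamp
    service: str = "proxy"   # required, but the spec defines a default
    # Optional fields
    provider: Optional[str] = None
    model: Optional[str] = None
    duration_ms: Optional[int] = None
    tokens_in: Optional[int] = None
    tokens_out: Optional[int] = None
    cost_usd: Optional[float] = None
    metadata_json: Optional[dict] = None
```

A producer only needs to populate the five required fields; everything else may be omitted.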
## Example Trace
```json
{
  "id": "tr_abc123def456",
  "service": "proxy",
  "operation": "chat_completion",
  "provider": "openai",
  "model": "gpt-4o",
  "status": "ok",
  "duration_ms": 432,
  "tokens_in": 150,
  "tokens_out": 523,
  "cost_usd": 0.0089,
  "metadata_json": {
    "_cache_hit": false,
    "_smart_route_rule": "short-to-mini"
  },
  "created_at": "2026-03-21T14:23:01Z"
}
```
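A consumer can check a trace against the required-field table with a few lines of plain Python. This is a minimal sketch (the `validate_trace` helper is illustrative, not part of the standard):

```python
import json

# Required fields and their expected JSON types, per the table above.
REQUIRED = {"id": str, "service": str, "operation": str, "status": str, "created_at": str}

def validate_trace(raw: str) -> dict:
    """Parse a trace document and verify the required fields are present and typed."""
    trace = json.loads(raw)
    for name, typ in REQUIRED.items():
        if not isinstance(trace.get(name), typ):
            raise ValueError(f"missing or mistyped required field: {name}")
    if trace["status"] not in ("ok", "error"):
        raise ValueError('status must be "ok" or "error"')
    return trace

example = """{"id": "tr_abc123def456", "service": "proxy",
 "operation": "chat_completion", "status": "ok",
 "created_at": "2026-03-21T14:23:01Z"}"""
trace = validate_trace(example)
```

For production use, validating against the published JSON Schema is preferable to hand-rolled checks like this one.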
## JSON Schema
Download the formal schema: trace-v1.schema.json
## Adopt the Protocol
If you build LLM tooling, adopt the Stockyard Trace Protocol for interoperable observability. The schema is permissively licensed and designed for extension via the metadata_json field.
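As one way a tool might extend the protocol, the sketch below builds a conforming trace and stashes tool-specific keys in `metadata_json`. The `make_trace` helper and the underscore-prefixed extension keys (seen in the example trace, e.g. `_cache_hit`) are assumptions for illustration, not requirements of the standard:

```python
import time
import uuid

def make_trace(operation: str, status: str, **extensions) -> dict:
    """Build a minimal conforming trace, placing any tool-specific
    keyword arguments under metadata_json (illustrative helper only)."""
    return {
        "id": f"tr_{uuid.uuid4().hex[:12]}",
        "service": "proxy",
        "operation": operation,
        "status": status,
        "created_at": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "metadata_json": dict(extensions),
    }

trace = make_trace("chat_completion", "ok", _cache_hit=False)
```

Keeping extensions inside `metadata_json` means consumers that only understand the core fields can still process the trace unchanged.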