Stockyard Trace Protocol v1

An open standard for recording LLM request traces. Adopt it for interoperable observability.

Required Fields

| Field      | Type   | Description                                  |
|------------|--------|----------------------------------------------|
| id         | string | Unique trace identifier                      |
| service    | string | Service name (default: "proxy")              |
| operation  | string | Operation type (chat_completion, embedding)  |
| status     | string | "ok" or "error"                              |
| created_at | string | RFC 3339 timestamp                           |

Optional Fields

| Field         | Type    | Description                          |
|---------------|---------|--------------------------------------|
| provider      | string  | LLM provider (openai, anthropic, etc.) |
| model         | string  | Model identifier                     |
| duration_ms   | integer | Request duration in milliseconds     |
| tokens_in     | integer | Input token count                    |
| tokens_out    | integer | Output token count                   |
| cost_usd      | number  | Estimated cost in USD                |
| metadata_json | object  | Extension metadata                   |
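The field tables above translate directly into a structural check. The sketch below validates a candidate record against the required and optional fields; the helper name validate_trace is illustrative and not part of the protocol, and the formal JSON Schema (linked below) remains the authoritative definition.

```python
# Minimal structural check for a Stockyard Trace Protocol v1 record.
# Field names and types follow the tables above; `validate_trace` is
# an illustrative helper, not part of the protocol itself.

REQUIRED = {
    "id": str,
    "service": str,
    "operation": str,
    "status": str,
    "created_at": str,
}

OPTIONAL = {
    "provider": str,
    "model": str,
    "duration_ms": int,
    "tokens_in": int,
    "tokens_out": int,
    "cost_usd": (int, float),
    "metadata_json": dict,
}

def validate_trace(trace: dict) -> list:
    """Return a list of problems; an empty list means the record passes."""
    errors = []
    for field, typ in REQUIRED.items():
        if field not in trace:
            errors.append(f"missing required field: {field}")
        elif not isinstance(trace[field], typ):
            errors.append(f"wrong type for {field}")
    for field, typ in OPTIONAL.items():
        if field in trace and not isinstance(trace[field], typ):
            errors.append(f"wrong type for {field}")
    # `status` is constrained to two values; None means it was already
    # reported as missing above.
    if trace.get("status") not in (None, "ok", "error"):
        errors.append('status must be "ok" or "error"')
    return errors
```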

Example Trace

{
  "id": "tr_abc123def456",
  "service": "proxy",
  "operation": "chat_completion",
  "provider": "openai",
  "model": "gpt-4o",
  "status": "ok",
  "duration_ms": 432,
  "tokens_in": 150,
  "tokens_out": 523,
  "cost_usd": 0.0089,
  "metadata_json": {
    "_cache_hit": false,
    "_smart_route_rule": "short-to-mini"
  },
  "created_at": "2026-03-21T14:23:01Z"
}
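A producer might assemble a record like the one above by wrapping an LLM call. This is a sketch only: call_llm is a stand-in for your client, and the assumption that its result exposes tokens_in, tokens_out, and cost_usd attributes is ours, not the protocol's.

```python
import time
import uuid
from datetime import datetime, timezone

def traced_chat_completion(call_llm, **kwargs) -> dict:
    """Wrap an LLM call and return a Stockyard v1 trace record.

    `call_llm` is a placeholder for a real client; it is assumed to
    return an object with tokens_in / tokens_out / cost_usd attributes.
    """
    start = time.monotonic()
    status, result = "ok", None
    try:
        result = call_llm(**kwargs)
    except Exception:
        status = "error"
    trace = {
        "id": f"tr_{uuid.uuid4().hex[:12]}",
        "service": "proxy",
        "operation": "chat_completion",
        "status": status,
        "duration_ms": int((time.monotonic() - start) * 1000),
        # RFC 3339 timestamp in UTC, as required by created_at
        "created_at": datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ"),
    }
    if result is not None:
        trace["tokens_in"] = result.tokens_in
        trace["tokens_out"] = result.tokens_out
        trace["cost_usd"] = result.cost_usd
    return trace
```

Optional fields are only emitted when the call succeeds, so an error trace still satisfies the required-field set.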

JSON Schema

Download the formal schema: trace-v1.schema.json

Adopt the Protocol

If you build LLM tooling, adopt the Stockyard Trace Protocol for interoperable observability. The schema is permissively licensed and designed for extension via the metadata_json field.
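Extension data belongs under metadata_json rather than as new top-level fields; the underscore-prefixed keys in the example trace above show one way to namespace tool-specific entries. A minimal sketch of merging extensions without disturbing the core fields (with_metadata is an illustrative helper, not part of the protocol):

```python
def with_metadata(trace: dict, **extensions) -> dict:
    """Return a copy of `trace` with extra keys merged into
    metadata_json, leaving all core protocol fields untouched.
    (Illustrative helper, not part of the protocol.)"""
    merged = dict(trace)
    merged["metadata_json"] = {**trace.get("metadata_json", {}), **extensions}
    return merged
```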