SDK Wrappers
Get started in 30 seconds. One import change. Zero config.
Overview
Stockyard SDK wrappers are thin drop-in replacements for official LLM client libraries. Change one import line and all traffic automatically routes through your Stockyard proxy — no base URL configuration, no HTTP client swapping, no middleware setup.
Python
pip install stockyard-openai
Node
npm install @stockyard/openai
Go
go get github.com/stockyard-dev/stockyard-openai-go
Python
Install
pip install stockyard-openai
Before (vanilla OpenAI)
```python
from openai import OpenAI

client = OpenAI(api_key="sk-...")
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello"}],
)
```
After (with Stockyard)
```python
from stockyard_openai import OpenAI

client = OpenAI(api_key="sk-...")
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello"}],
)
```
One line changed. Everything else stays the same. The wrapper automatically routes through http://localhost:4200/v1 (or your STOCKYARD_URL).
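Conceptually, the URL resolution the wrapper performs amounts to the sketch below. `resolve_base_url` is a hypothetical helper for illustration, not part of the package's public API; it only mirrors the documented default of `http://localhost:4200` with the OpenAI-compatible `/v1` path appended.

```python
import os

def resolve_base_url() -> str:
    """Resolve the proxy base URL the way the wrapper is documented to:
    STOCKYARD_URL if set, otherwise the local default, with /v1 appended."""
    base = os.environ.get("STOCKYARD_URL", "http://localhost:4200")
    return base.rstrip("/") + "/v1"
```

Because the URL comes from the environment, the same code runs unchanged against a local proxy in development and a cloud endpoint in production.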
Node
Install
npm install @stockyard/openai
Before (vanilla OpenAI)
```javascript
import OpenAI from 'openai';

const client = new OpenAI();
const response = await client.chat.completions.create({
  model: 'gpt-4o',
  messages: [{ role: 'user', content: 'Hello' }],
});
```
After (with Stockyard)
```javascript
import OpenAI from '@stockyard/openai';

const client = new OpenAI();
const response = await client.chat.completions.create({
  model: 'gpt-4o',
  messages: [{ role: 'user', content: 'Hello' }],
});
```
Go
Install
go get github.com/stockyard-dev/stockyard-openai-go
Before (vanilla OpenAI)
```go
import "github.com/openai/openai-go"

client := openai.NewClient()
```
After (with Stockyard)
```go
import (
    "net/http"

    "github.com/openai/openai-go"
    "github.com/openai/openai-go/option"
    stockyard "github.com/stockyard-dev/stockyard-openai-go"
)

client := openai.NewClient(
    option.WithBaseURL(stockyard.BaseURL()),
    option.WithHTTPClient(&http.Client{Transport: stockyard.Transport()}),
)
```
The Go wrapper provides helper functions for base URL and transport configuration. All requests flow through Stockyard with full middleware support.
Environment Variables
All SDK wrappers respect these environment variables:
| Variable | Default | Description |
|---|---|---|
| `STOCKYARD_URL` | `http://localhost:4200` | Stockyard proxy base URL |
| `STOCKYARD_API_KEY` | — | API key for authenticated Stockyard instances |
| `OPENAI_API_KEY` | — | Your OpenAI API key (passed through to the provider) |
| `STOCKYARD_TIMEOUT` | `30s` | Request timeout for proxy calls |
| `STOCKYARD_RETRY` | `true` | Enable client-side retries on transient errors |
| `STOCKYARD_VERIFY_SSL` | `true` | Verify SSL certificates (disable for local dev) |
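As a rough sketch of how a wrapper might interpret these variables, the helpers below parse the timeout and retry settings. Both helpers are illustrative assumptions, not the wrappers' actual internals; in particular, the accepted duration formats (`30s`, `500ms`, bare seconds) are guesses based on the `30s` default shown in the table.

```python
import os

def env_timeout_seconds(default: float = 30.0) -> float:
    """Parse STOCKYARD_TIMEOUT values such as '30s' or '500ms' into seconds.
    Illustrative only; the real wrappers may accept other formats."""
    raw = os.environ.get("STOCKYARD_TIMEOUT", "").strip().lower()
    if not raw:
        return default
    if raw.endswith("ms"):          # check 'ms' before 's' -- 'ms' also ends in 's'
        return float(raw[:-2]) / 1000.0
    if raw.endswith("s"):
        return float(raw[:-1])
    return float(raw)               # bare number: treat as seconds

def env_retry_enabled() -> bool:
    """STOCKYARD_RETRY defaults to true; common falsy spellings disable it."""
    return os.environ.get("STOCKYARD_RETRY", "true").strip().lower() not in (
        "0", "false", "no", "off",
    )
```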
Cloud users: Set `STOCKYARD_URL` to your cloud endpoint (e.g., https://your-team.stockyard.dev) and the SDK handles everything else.

FAQ
- Do the wrappers support streaming?
  Yes. Streaming works identically to the official SDKs. The wrapper passes through SSE streams from the proxy without buffering.
- What if Stockyard is down?
  By default, the wrappers fail open: if the proxy is unreachable and `STOCKYARD_RETRY` is enabled, the SDK retries up to 3 times. You can also configure a fallback to call the provider directly.
- Can I use multiple providers through the same SDK?
  Yes. Stockyard routes based on the model name. Use `model: "claude-sonnet-4-20250514"` through the OpenAI SDK wrapper and Stockyard routes it to Anthropic automatically.
- Do I need to change my tests?
  No. The wrapper has the same interface as the official SDK. Your existing mocks and test fixtures work unchanged.
- What about Anthropic and other provider SDKs?
  Wrappers for the Anthropic SDK, Google Vertex SDK, and others are in development. In the meantime, set `base_url` manually to point at your Stockyard instance.
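The model-based routing described in the FAQ can be sketched as a prefix lookup. The prefix table below is an illustrative assumption, not Stockyard's actual routing configuration, and `route_provider` is a hypothetical function for exposition only.

```python
def route_provider(model: str) -> str:
    """Guess the upstream provider from a model name, in the spirit of the
    model-based routing described above. The prefix table is illustrative."""
    prefixes = {
        "gpt-": "openai",
        "claude-": "anthropic",
        "gemini-": "google",
    }
    for prefix, provider in prefixes.items():
        if model.startswith(prefix):
            return provider
    return "openai"  # fall back to the default provider
```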