# Cookbook
Integration recipes for popular AI frameworks. Point them at Stockyard for observability, caching, and cost control.
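All of these recipes work the same way: Stockyard exposes an OpenAI-compatible API under `/v1`, so any client that speaks the OpenAI chat-completions wire format can be pointed at it. A minimal stdlib-only sketch of the underlying request (the actual network call is commented out because it assumes a Stockyard instance listening on `localhost:4200`):

```python
import json
import urllib.request

# Stockyard's OpenAI-compatible chat completions endpoint.
STOCKYARD_URL = "http://localhost:4200/v1/chat/completions"

# Standard OpenAI-style request body.
payload = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "What is Stockyard?"}],
}

req = urllib.request.Request(
    STOCKYARD_URL,
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": "Bearer your-stockyard-key",
        "Content-Type": "application/json",
    },
)

# Uncomment with Stockyard running:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Every framework below is just a different way of setting the same two values: the base URL and the API key.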
## LangChain (Python)
Use Stockyard as the OpenAI backend for LangChain:
```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="gpt-4o",
    openai_api_base="http://localhost:4200/v1",
    openai_api_key="your-stockyard-key",
)

response = llm.invoke("What is Stockyard?")
```
## CrewAI
Route CrewAI agent calls through Stockyard:
```python
import os

# Set these before constructing agents so CrewAI's OpenAI client picks them up.
os.environ["OPENAI_API_BASE"] = "http://localhost:4200/v1"
os.environ["OPENAI_API_KEY"] = "your-stockyard-key"

from crewai import Agent, Task, Crew

agent = Agent(role="Researcher", goal="Find info", llm="gpt-4o")
task = Task(description="Research topic", agent=agent)
crew = Crew(agents=[agent], tasks=[task])
result = crew.kickoff()
```
## AutoGen
Configure AutoGen agents to use Stockyard:
```python
from autogen import AssistantAgent, UserProxyAgent

config_list = [{
    "model": "gpt-4o",
    "base_url": "http://localhost:4200/v1",
    "api_key": "your-stockyard-key",
}]

assistant = AssistantAgent("assistant", llm_config={"config_list": config_list})
user = UserProxyAgent("user")
user.initiate_chat(assistant, message="Hello")
```
## Vercel AI SDK (Next.js)
Use Stockyard with the Vercel AI SDK:
```typescript
import { createOpenAI } from '@ai-sdk/openai';
import { generateText } from 'ai';

const stockyard = createOpenAI({
  baseURL: 'http://localhost:4200/v1',
  apiKey: 'your-stockyard-key',
});

const { text } = await generateText({
  model: stockyard('gpt-4o'),
  prompt: 'Hello from Vercel AI SDK!',
});
```
## Mastra
Configure Mastra to route through Stockyard:
```typescript
import { Mastra } from '@mastra/core';

const mastra = new Mastra({
  llm: {
    provider: 'openai',
    config: {
      baseURL: 'http://localhost:4200/v1',
      apiKey: 'your-stockyard-key',
    },
  },
});
```