Migration Guide
OpenAI SDK · Python · Node.js · TypeScript
To migrate, point your client's baseURL at api.maig.dev and use your MAIG API key instead of your OpenAI key. Everything else stays the same.
Python (openai SDK)
Install the standard OpenAI Python SDK if you haven't already (pip install openai), then change the client initialization:
Before:

from openai import OpenAI

client = OpenAI(
    api_key="sk-your-openai-key-here"
)

After:

from openai import OpenAI

client = OpenAI(
    base_url="https://api.maig.dev/v1",
    api_key="maig_your_key_here"
)
# The rest of your code is unchanged:
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello!"}]
)
Node.js / TypeScript (openai SDK)
Install the standard OpenAI Node.js SDK if you haven't already (npm install openai), then change the client initialization:
Before:

import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
});

After:

import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://api.maig.dev/v1",
  apiKey: process.env.MAIG_API_KEY,
});
// The rest of your code is unchanged:
const response = await client.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "Hello!" }],
});
What stays the same
- Message format — role and content fields are identical
- Response format — the same choices, message, finish_reason, and usage fields
- Streaming with SSE — stream: true works exactly as before, with the same delta chunks and [DONE] terminator
- Error types — 401, 422, 429, and other standard HTTP status codes
- Model names — pass "gpt-4o", "gpt-4", or any OpenAI model name as-is
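Because the delta chunks are unchanged, the usual pattern of accumulating streamed content still works. A minimal sketch of that pattern, where the chunk dictionaries are hand-built stand-ins for the objects the SDK yields with stream=True:

```python
def accumulate(chunks):
    """Join the content fragments from streamed delta chunks
    into the full assistant message, OpenAI-style."""
    parts = []
    for chunk in chunks:
        delta = chunk["choices"][0]["delta"]
        # Role-only and final chunks carry no "content" key.
        if delta.get("content") is not None:
            parts.append(delta["content"])
    return "".join(parts)

# Stand-in chunks mimicking what a stream yields before [DONE]:
chunks = [
    {"choices": [{"delta": {"role": "assistant"}}]},
    {"choices": [{"delta": {"content": "Hel"}}]},
    {"choices": [{"delta": {"content": "lo!"}}]},
    {"choices": [{"delta": {}, "finish_reason": "stop"}]},
]
print(accumulate(chunks))  # → Hello!
```

With the real SDK you would iterate the stream object the same way, reading chunk.choices[0].delta.content.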
What's different
- API key format — MAIG keys start with maig_ instead of sk-
- Model access — you can use Claude and Gemini models without changing any other code; just pass a claude- or gemini- model name
- Usage tracking — usage is tracked per-project in the MAIG dashboard, not in your OpenAI account
Using MAIG routing
Instead of passing a model name, you can pass a route name you've configured in the MAIG dashboard. MAIG handles provider selection, fallback, and retries automatically.
# Pass the route name as the model field
response = client.chat.completions.create(
    model="my-production-route",
    messages=[{"role": "user", "content": "Hello!"}]
)
This decouples your code from specific model names or providers. You can change the underlying model, swap providers, or configure fallback behaviour in the dashboard without touching your application code.
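What the gateway does for a route can be pictured as ordinary fallback-with-retry logic, just executed server-side. A rough local sketch, where the provider list and retry count stand in for your dashboard configuration:

```python
def call_route(providers, request, retries_per_provider=2):
    """Try each configured provider in order, retrying transient
    failures before falling back to the next one."""
    last_error = None
    for call in providers:
        for _ in range(retries_per_provider):
            try:
                return call(request)
            except Exception as exc:  # e.g. timeouts, 429s, 5xx
                last_error = exc
    raise RuntimeError("all providers failed") from last_error

# Stand-in providers: the first always fails, the second succeeds.
def flaky(request):
    raise TimeoutError("upstream timeout")

def healthy(request):
    return {"content": "Hello!"}

result = call_route([flaky, healthy], {"prompt": "Hello!"})
print(result["content"])  # → Hello!
```

Your application only ever sees the final result; which provider answered is a dashboard concern.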
Framework compatibility
MAIG's gateway is OpenAI-compatible. Any library or framework that supports a custom baseURL works out of the box — no special integration required.
- LangChain — set openai_api_base or base_url on the ChatOpenAI object
- LlamaIndex — pass api_base to OpenAI() in your LLM settings
- Vercel AI SDK — use the openai provider with baseURL in createOpenAI()
- LiteLLM — set api_base="https://api.maig.dev/v1" in your completion call
- aisuite — configure the OpenAI provider with a custom base URL
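All of these frameworks ultimately send the same HTTP request, which is why only the base URL needs to change. A stdlib sketch of the request shape an OpenAI-compatible client builds (the key is a placeholder, and the request is constructed but never sent):

```python
import json
import urllib.request

payload = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello!"}],
}

req = urllib.request.Request(
    "https://api.maig.dev/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer maig_your_key_here",  # placeholder key
    },
    method="POST",
)

# urllib.request.urlopen(req) would send it; every framework above
# produces this same request, just with its own base-URL setting.
print(req.full_url)  # → https://api.maig.dev/v1/chat/completions
```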