GabForge AI Documentation
The GabForge AI API is fully compatible with the OpenAI SDK. Swap your base URL, keep your existing code, and gain access to a local-first, privacy-respecting inference backend — with seamless cloud fallback when you need it.
OpenAI Compatible
Use any OpenAI SDK — Python, Node.js, Go, or cURL. Change one line of configuration, nothing else.
Local-First
Primary inference runs on our gabforge-coder model (Qwen2.5-Coder-32B). Your data stays on our infrastructure.
Cloud Fallback
Under high load, requests automatically fall back to claude-sonnet-4-6. Transparent to your code.
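Because fallback is transparent, the response itself is the place to check which backend served a request. A minimal sketch, assuming the standard OpenAI response schema, where the `model` field reports the model that actually handled the request; the helper name `served_by_fallback` is ours, not part of the API:

```python
def served_by_fallback(response_model: str) -> bool:
    """Return True if the response came from the cloud fallback model.

    Assumes the response's `model` field (standard OpenAI schema) names
    the model that actually served the request.
    """
    return response_model.startswith("claude")
```

After a chat completion, `served_by_fallback(response.model)` tells you whether the request was handled locally or fell back to the cloud.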
Get started in 3 steps
Get your API key
Sign in to your dashboard, navigate to API Keys, and click Create new key. Copy the key — it is only shown once.
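Since the key is shown only once, store it outside your source code, for example in an environment variable. A minimal sketch; the variable name `GABFORGE_API_KEY` is a convention we assume here, not one mandated by the platform:

```python
import os

def load_api_key(env_var: str = "GABFORGE_API_KEY") -> str:
    """Read the API key from the environment instead of hard-coding it.

    `GABFORGE_API_KEY` is a hypothetical variable name; use whatever
    fits your deployment setup.
    """
    key = os.environ.get(env_var)
    if not key:
        raise RuntimeError(f"Set {env_var} before running")
    return key
```

You can then pass `api_key=load_api_key()` when constructing the client.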
Go to Dashboard
Install the OpenAI SDK
The GabForge API is fully compatible with the official OpenAI Python package.
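Install the package with pip:

```shell
pip install openai
```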
Make your first request
Point the client at the GabForge base URL and use your API key.
from openai import OpenAI

client = OpenAI(
    base_url="https://ai.gabforge.ai/v1",
    api_key="YOUR_API_KEY",
)

response = client.chat.completions.create(
    model="gabforge-coder",
    messages=[{"role": "user", "content": "Hello!"}],
)

print(response.choices[0].message.content)
Explore the documentation
API Reference
Full endpoint documentation with request/response schemas and error codes.
Code Examples
Copy-ready code in Python, JavaScript, and cURL for common use cases.
Streaming Responses
Process tokens as they arrive for real-time chat and autocomplete experiences.
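Since the API is OpenAI-compatible, streaming works by passing `stream=True` and iterating the returned chunks. A minimal sketch; the helper name `stream_completion` is ours, and it accepts any OpenAI-style client, such as one pointed at the GabForge base URL:

```python
def stream_completion(client, model, messages):
    """Yield text deltas from a chat-completions stream.

    `client` is any OpenAI-style client. Role-only and final chunks
    carry no content, so empty deltas are skipped.
    """
    stream = client.chat.completions.create(
        model=model,
        messages=messages,
        stream=True,
    )
    for chunk in stream:
        delta = chunk.choices[0].delta.content
        if delta:
            yield delta
```

Typical usage prints tokens as they arrive: `for text in stream_completion(client, "gabforge-coder", messages): print(text, end="", flush=True)`.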
Available Models
Compare model capabilities and latency, and choose the right one for your workload.