Integrate models in 5 minutes

Start with your endpoint, API key, and model name; finish the OpenAI-compatible setup inside your tool; then review the model catalog and the quota guide.

OpenAI-compatible request
Set base_url, API key, and model:
from openai import OpenAI

client = OpenAI(
    api_key="sk-coding-key",
    base_url="https://www.prysmapi.cn",
)

response = client.chat.completions.create(
    model="coding-pro",
    messages=[{"role": "user", "content": "Review this patch"}],
)
print(response.choices[0].message.content)

Tool setup

Each tool below supports an OpenAI-compatible setup. The fields usually include base_url, API key, and model.

Cursor

Set the OpenAI-compatible base URL and model name.

Codex

Configure a custom provider with your endpoint and key.
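
As a sketch, Codex CLI reads provider settings from a TOML config file. The provider id, env var name, config path, and the "/v1" suffix on the base URL below are all assumptions; confirm each value against the API Access page in your console.

```toml
# ~/.codex/config.toml — provider id, env var name, and "/v1" suffix are assumptions
model = "coding-pro"
model_provider = "prysm"

[model_providers.prysm]
name = "Prysm"
base_url = "https://www.prysmapi.cn/v1"
env_key = "PRYSM_API_KEY"
```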

Cline

Add the endpoint as a custom OpenAI-compatible provider.

OpenCode

Route coding requests through one API key.
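
A minimal sketch of an OpenCode provider entry in its JSON config; the loader package name, env-var placeholder syntax, and "/v1" suffix are assumptions, not confirmed settings.

```json
{
  "provider": {
    "prysm": {
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "baseURL": "https://www.prysmapi.cn/v1",
        "apiKey": "{env:PRYSM_API_KEY}"
      },
      "models": {
        "coding-pro": {}
      }
    }
  }
}
```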

OpenClaw

Use shared model aliases for agent workflows.

Quota and console

After activation, use the console to confirm API Access, Models, Usage, Logs, and Current Plan.

API Access

Review endpoint, API key, base_url, and sample model calls.

Models

Browse the standard pool, advanced pool, and recommended models by plan.

Usage

Track monthly quota, request count, usage trends, and the active plan.
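
If you also want a client-side tally, each chat completion response carries token counts in its `usage` field. The helper below is a hypothetical sketch; the console stays the source of truth for quota.

```python
# Hypothetical running tally of token usage across requests.
def record_usage(prompt_tokens, completion_tokens, totals):
    """Add one response's token counts to a running total."""
    totals["prompt_tokens"] += prompt_tokens
    totals["completion_tokens"] += completion_tokens
    totals["total_tokens"] = totals["prompt_tokens"] + totals["completion_tokens"]
    return totals

totals = {"prompt_tokens": 0, "completion_tokens": 0}
record_usage(120, 45, totals)  # in practice, pass response.usage.prompt_tokens etc.
```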

Logs

Inspect integration errors, rate limits, model responses, and request traces.

Frequently asked questions

Questions about access, models, plans, quotas, and support.

What do I need before I integrate?

You need an endpoint, an API key, a model name, remaining quota on your current plan, and your tool's OpenAI-compatible configuration entry.

How should I choose a model?

Use the standard pool for everyday work, then move to the Pro pool for deeper reasoning and large-repository tasks.
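
That rule of thumb can be encoded as a small router. Only `coding-pro` appears in this guide; the standard-pool model name and the file-count threshold below are placeholders to replace with your plan's values.

```python
# Hypothetical router between the standard and Pro pools.
STANDARD_MODEL = "coding-standard"  # placeholder; check the Models page
PRO_MODEL = "coding-pro"

def pick_model(task_kind, repo_file_count):
    """Route deep-reasoning or large-repository work to the Pro pool."""
    if task_kind in {"architecture", "refactor"} or repo_file_count > 500:
        return PRO_MODEL
    return STANDARD_MODEL
```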

How do I debug failed requests?

Check base_url, API key, model name, plan coverage, monthly quota, and rate limits (RPM / TPM) first, then review the Logs page.
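
Most of those checks can run before the first request leaves your machine. The helper below is a sketch with hypothetical names, not part of the API; RPM / TPM limits still have to be read from the console or from rate-limit errors in the logs.

```python
# Hypothetical preflight check for common misconfigurations.
def preflight(base_url, api_key, model, plan_models):
    """Return a list of likely problems before sending the first request."""
    problems = []
    if not base_url.startswith("https://"):
        problems.append("base_url is not an https URL")
    if not api_key:
        problems.append("API key is empty")
    if model not in plan_models:
        problems.append(f"model {model!r} is not covered by the current plan")
    return problems
```

An empty list means the basics look right; whatever remains is usually a quota or rate-limit issue for the Logs page.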