Cursor
Set the OpenAI-compatible base URL and model name.
Start with an endpoint, API key, and model name, then finish the OpenAI-compatible setup inside your tool and review the model catalog and quota guide.
from openai import OpenAI

# Point the official OpenAI SDK at the compatible endpoint.
client = OpenAI(
    api_key="sk-coding-key",
    base_url="https://www.prysmapi.cn",
)
response = client.chat.completions.create(
    model="coding-pro",
    messages=[{"role": "user", "content": "Review this patch"}],
)

Each tool below supports an OpenAI-compatible setup. The fields usually include base_url, API key, and model.
Set the OpenAI-compatible base URL and model name.
Configure a custom provider with your endpoint and key.
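A custom-provider entry usually boils down to the same three fields. As a sketch (the function and field names here are illustrative, not any specific tool's schema):

```python
def make_provider_config(base_url: str, api_key: str, model: str) -> dict:
    """Assemble the three fields an OpenAI-compatible tool typically asks for."""
    fields = {"base_url": base_url, "api_key": api_key, "model": model}
    for name, value in fields.items():
        if not value:
            raise ValueError(f"missing required field: {name}")
    return fields

# Endpoint from the docs above; the key and model name are placeholders.
config = make_provider_config(
    "https://www.prysmapi.cn",
    "sk-coding-key",
    "coding-pro",
)
```

Validating the entry up front makes a misconfigured tool fail with a clear message instead of an opaque connection error later.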

Connect through custom OpenAI-compatible providers.
Route coding requests through one API key.
Use shared model aliases for agent workflows.
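One way to picture shared aliases: agents refer to a stable alias, and an alias table decides which concrete model the request reaches. The alias and model names below are hypothetical:

```python
# Hypothetical alias table: agents use the stable names on the left,
# and the right-hand side can be swapped without touching agent code.
MODEL_ALIASES = {
    "fast": "coding-lite",
    "default": "coding-pro",
}

def resolve_model(alias: str) -> str:
    """Map an agent-facing alias to a concrete model name."""
    try:
        return MODEL_ALIASES[alias]
    except KeyError:
        raise ValueError(f"unknown model alias: {alias}") from None
```

Swapping the underlying model then means editing one table, not every agent workflow.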
After activation, use the console to confirm API Access, Models, Usage, Logs, and Current Plan.
Review endpoint, API key, base_url, and sample model calls.
Browse the standard pool, advanced pool, and recommended models by plan.
Track monthly quota, request count, usage trends, and the active plan.
Inspect integration errors, rate limits, model responses, and request traces.
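When the logs show rate-limit errors, a common client-side mitigation is retry with exponential backoff. The sketch below uses a generic stand-in exception and a stubbed call rather than any specific SDK's error type:

```python
import time

class RateLimited(Exception):
    """Stand-in for a provider's rate-limit error."""

def call_with_retry(fn, attempts=3, base_delay=0.01):
    """Retry fn() on RateLimited, doubling the delay after each failure."""
    for attempt in range(attempts):
        try:
            return fn()
        except RateLimited:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))

# Stub that fails once, then succeeds -- imitates a transient 429.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] == 1:
        raise RateLimited()
    return "ok"
```

In real use, `fn` would wrap the chat-completions call and `RateLimited` would be replaced by the SDK's own rate-limit exception.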
Questions about access, models, plans, quotas, and support.
You need an endpoint, an API key, a model name, remaining quota on your current plan, and an OpenAI-compatible configuration entry in your tool.
Use the standard pool for everyday work, then move to the advanced pool for deeper reasoning and large-repository tasks.
Check base_url, API key, model, plan coverage, monthly quota, and the requests-per-minute (RPM) and tokens-per-minute (TPM) limits first, then review the logs.
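The checklist above can be sketched as a preflight function run before digging into the logs; the field names and quota shape are illustrative:

```python
def preflight(config: dict, quota_used: int, quota_limit: int) -> list:
    """Return a list of problems to fix before inspecting request logs."""
    problems = []
    for field in ("base_url", "api_key", "model"):
        if not config.get(field):
            problems.append(f"missing {field}")
    if quota_used >= quota_limit:
        problems.append("monthly quota exhausted")
    return problems
```

An empty list means the basic configuration and quota look sound, so the next stop is the Logs page.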