Set `base_url` to `https://api.pioneer.ai/v1`, authenticate with your Pioneer API key, and use your training job ID as the model name. No other code changes are required.
## Endpoints
| Method | Path | Description |
|---|---|---|
| POST | /v1/messages | Create a message (Anthropic-compatible) |
## Configure the Anthropic SDK

Pass your Pioneer API key and base URL when constructing the client:

## Create a message

`POST /v1/messages` accepts the same request shape as the Anthropic Messages API. Set `model` to your training job ID and include your `messages` array. You can pass Pioneer-specific fields such as `schema` alongside the standard Anthropic fields.
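A minimal sketch using the Anthropic Python SDK. The base URL comes from this page; the job ID is a placeholder, and the `PIONEER_API_KEY` environment variable name is an assumption, not part of the documented API:

```python
import os

# Request parameters; the job ID below is a placeholder for your own.
params = {
    "model": "job_abc123",
    "max_tokens": 1024,
    "messages": [{"role": "user", "content": "Acme Corp hired Jane Doe as CTO."}],
}

# Only call the API when a key is actually configured (assumed env var name).
if os.environ.get("PIONEER_API_KEY"):
    from anthropic import Anthropic

    client = Anthropic(
        api_key=os.environ["PIONEER_API_KEY"],
        base_url="https://api.pioneer.ai/v1",
    )
    message = client.messages.create(**params)
    print(message.content)
```

Because the request shape matches the Anthropic Messages API, any code already written against that SDK works unchanged once the client is pointed at the Pioneer base URL.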
## Request parameters
- `model`: Training job ID (e.g. `job_abc123`) or a base model ID. This is the model that processes your request.
- `max_tokens`: Maximum number of tokens to generate in the response.
- `messages`: Conversation messages. Each object has a `role` (`"user"` or `"assistant"`) and a `content` string.
- `schema`: Pioneer-specific extraction schema. Define entities, classifications, structures, or relations to control what the model extracts. See Pioneer inference for full schema documentation.

## Examples
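A sketch of a request that includes the Pioneer-specific `schema` field. The schema shape shown here (an entity list) is an illustrative guess; the authoritative shape is on the Pioneer native inference page. The `extra_body` mechanism is the Anthropic SDK's standard way to pass non-standard fields, and `PIONEER_API_KEY` is an assumed environment variable name:

```python
import os

payload = {
    "model": "job_abc123",  # placeholder training job ID
    "max_tokens": 512,
    "messages": [
        {"role": "user", "content": "Acme Corp appointed Jane Doe as CTO in 2023."}
    ],
    # Hypothetical schema: entity types to extract (shape is an assumption).
    "schema": {"entities": ["company", "person", "role"]},
}

# Only send the request when a key is configured (assumed env var name).
if os.environ.get("PIONEER_API_KEY"):
    from anthropic import Anthropic

    client = Anthropic(
        api_key=os.environ["PIONEER_API_KEY"],
        base_url="https://api.pioneer.ai/v1",
    )
    # `schema` is not a standard Anthropic field, so pass it via extra_body.
    message = client.messages.create(
        model=payload["model"],
        max_tokens=payload["max_tokens"],
        messages=payload["messages"],
        extra_body={"schema": payload["schema"]},
    )
    print(message.content)
```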
## Streaming
The `/v1/messages` endpoint supports streaming. Set `stream=True` in the SDK or `"stream": true` in the raw request body.
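A streaming sketch using `stream=True`, which makes the SDK return an iterator of server-sent events; text arrives in `content_block_delta` events. The job ID is a placeholder and `PIONEER_API_KEY` is an assumed environment variable name:

```python
import os

params = {
    "model": "job_abc123",  # placeholder training job ID
    "max_tokens": 512,
    "messages": [{"role": "user", "content": "Summarize this document."}],
    "stream": True,  # request incremental events instead of one response
}

# Only stream when a key is configured (assumed env var name).
if os.environ.get("PIONEER_API_KEY"):
    from anthropic import Anthropic

    client = Anthropic(
        api_key=os.environ["PIONEER_API_KEY"],
        base_url="https://api.pioneer.ai/v1",
    )
    for event in client.messages.create(**params):
        # Text deltas carry the generated tokens as they are produced.
        if event.type == "content_block_delta":
            print(getattr(event.delta, "text", ""), end="", flush=True)
```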
## Related
- Pioneer native inference — direct Pioneer endpoint with full schema documentation
- OpenAI-compatible inference — use the OpenAI SDK instead
- Inference history and feedback — retrieve past results and submit corrections

