Authentication is done via API keys, which you can create from the keys page.
API keys must be sent as an Authorization header with the value Bearer <api_key>.
If the API key is invalid, the request fails with a 403 error.
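The OpenAI client below sets this header for you via the api_key parameter, but if you call the API with a plain HTTP client you need to build the header yourself. A minimal sketch (the helper name is ours, not part of the API):

```python
def auth_headers(api_key: str) -> dict:
    """Build the Authorization header the API expects: 'Bearer <api_key>'."""
    return {"Authorization": f"Bearer {api_key}"}

# Pass these headers with every request, e.g. via requests or httpx:
headers = auth_headers("<api_key>")
```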
```python
from openai import OpenAI

client = OpenAI(
    base_url="https://api.thesys.dev/v1/embed",
    api_key="<api_key>",
)

completion = client.chat.completions.create(
    model="c1/anthropic/claude-sonnet-4/v-20250930",
    messages=[
        {
            "role": "user",
            "content": "How did the population of the world grow from 1950 to 2020?",
        }
    ],
)
```
Stream this response to the React client via the language-specific helper functions.
```python
from openai import OpenAI

client = OpenAI(
    base_url="https://api.thesys.dev/v1/embed",
    api_key="<api_key>",
)

response = client.responses.create(
    model="c1/anthropic/claude-sonnet-4/v-20250930",
    input="How did the population of the world grow from 1950 to 2020?",
)

print(response)
```
All C1 models are supported via the Responses API, including OpenAI and
non-OpenAI models. Our Responses API implementation is Open
Responses compliant.
There are three ways to maintain chat history in Responses API:
Pass the full conversation history — Include all previous messages in the input array with each request.
Use previous_response_id — Reference a prior response by its ID to automatically chain conversations (requires store: true).
Use conversation — Group related responses into a named conversation for persistent multi-turn context (requires store: true).
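The first option needs no server-side storage: you simply resend every prior turn in the input array. A minimal sketch of how the input might be assembled (the helper function is ours for illustration; the message shape follows the Responses API message format):

```python
def build_input(history: list, user_message: str) -> list:
    """Return the input array for the next request: prior turns plus the new user turn."""
    return history + [{"role": "user", "content": user_message}]

# Prior turns, as accumulated by your application:
history = [
    {"role": "user", "content": "What is the capital of France?"},
    {"role": "assistant", "content": "Paris."},
]

# Pass this as `input=` to client.responses.create(...) so the model
# sees the full conversation context on every request.
next_input = build_input(history, "And what is its population?")
```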
To chain responses using previous_response_id:
```python
from openai import OpenAI

client = OpenAI(
    base_url="https://api.thesys.dev/v1/embed",
    api_key="<api_key>",
)

response = client.responses.create(
    model="c1/anthropic/claude-sonnet-4/v-20250930",
    input="What is the capital of France?",
    store=True,  # chaining via previous_response_id requires store: true
)

follow_up = client.responses.create(
    model="c1/anthropic/claude-sonnet-4/v-20250930",
    input="And what is its population?",
    previous_response_id=response.id,
    store=True,
)

print(follow_up)
```
To create a conversation and use it across multiple requests:
```python
from openai import OpenAI

# Client for conversation management
conv_client = OpenAI(
    base_url="https://api.thesys.dev",
    api_key="<api_key>",
)

# Client for generation
embed_client = OpenAI(
    base_url="https://api.thesys.dev/v1/embed",
    api_key="<api_key>",
)

conversation = conv_client.conversations.create()

response = embed_client.responses.create(
    model="c1/anthropic/claude-sonnet-4/v-20250930",
    input="What is the capital of France?",
    store=True,
    conversation={"id": conversation.id},
)

follow_up = embed_client.responses.create(
    model="c1/anthropic/claude-sonnet-4/v-20250930",
    input="And what is its population?",
    store=True,
    conversation={"id": conversation.id},
)

print(follow_up)
```
Built-in tools (web_search, file_search, code_interpreter,
computer_use, mcp) will be supported soon.