Create a chat API endpoint that handles streaming responses from the C1 model.

1. Install required dependencies

Install the necessary packages for your backend API:

npm install openai @crayonai/stream

2. Create the message store

First, create a simple in-memory message store to manage conversation history. This store keeps the full list of messages for a given threadId, including messages that are never sent to the client, such as tool call messages.

app/api/chat/messageStore.ts
import OpenAI from "openai";

export type DBMessage = OpenAI.Chat.ChatCompletionMessageParam & {
  id?: string;
};

const messagesStore: {
  [threadId: string]: DBMessage[];
} = {};

export const getMessageStore = (threadId: string) => {
  if (!messagesStore[threadId]) {
    messagesStore[threadId] = [];
  }
  const messageList = messagesStore[threadId];
  return {
    addMessage: (message: DBMessage) => {
      messageList.push(message);
    },
    // Return the messages without the local `id` field so the
    // payload matches the OpenAI chat completion schema.
    getOpenAICompatibleMessageList: () => {
      return messageList.map((m) => {
        const message = { ...m };
        delete message.id;
        return message;
      });
    },
  };
};
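
To illustrate how the store behaves, here is a minimal usage sketch (the thread id and message values are hypothetical):

// Hypothetical usage: messages accumulate per thread, and the
// OpenAI-compatible list omits the local `id` field.
const store = getMessageStore("thread-123");
store.addMessage({ role: "user", content: "Hello!", id: "msg-1" });
console.log(store.getOpenAICompatibleMessageList());
// => [ { role: "user", content: "Hello!" } ]

Because the store is in-memory only, the history is lost on server restart; a database-backed implementation exposing the same interface can be swapped in for production.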

3. Create the API route file

Create a new file at app/api/chat/route.ts and add the necessary imports:

app/api/chat/route.ts
import { NextRequest, NextResponse } from "next/server";
import OpenAI from "openai";
import { transformStream } from "@crayonai/stream";
import { DBMessage, getMessageStore } from "./messageStore";

4. Set up the POST handler

Add the main POST function that will handle incoming chat requests:

app/api/chat/route.ts
// ... previous imports ...

export async function POST(req: NextRequest) {
  const { prompt, threadId, responseId } = (await req.json()) as {
    prompt: DBMessage;
    threadId: string;
    responseId: string;
  };

  // More code will go here...
}
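
As a sketch, the handler expects the client to send a JSON body shaped like this (the field values are hypothetical):

// Hypothetical request payload:
const exampleBody = {
  prompt: { role: "user", content: "Hello!" }, // the new user message
  threadId: "thread-123", // identifies the conversation
  responseId: "resp-456", // id to assign to the assistant's reply
};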

5. Initialize the OpenAI client

Configure the OpenAI client to use the Thesys API:

app/api/chat/route.ts
// ... inside the POST function ...

const client = new OpenAI({
  baseURL: "https://api.thesys.dev/v1/embed/",
  apiKey: process.env.THESYS_API_KEY,
});
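
The apiKey field is read from the THESYS_API_KEY environment variable (set in step 10). Optionally, you can fail fast when it is missing; a minimal sketch:

// Optional: surface a missing key as a clear configuration error
// instead of a failed upstream request.
if (!process.env.THESYS_API_KEY) {
  throw new Error("THESYS_API_KEY is not set");
}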

6. Handle message storage

Add the user’s message to the conversation history:

app/api/chat/route.ts
// ... after client initialization ...

const messageStore = getMessageStore(threadId);
messageStore.addMessage(prompt);

7. Create streaming chat completion

Call the C1 model with the conversation history:

app/api/chat/route.ts
// ... after message storage ...

const llmStream = await client.chat.completions.create({
  model: "c1-nightly",
  messages: messageStore.getOpenAICompatibleMessageList(),
  stream: true,
});
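
If you want to steer the model's behavior, one optional variation (not required by this guide) is to prepend a system message to the list; a sketch with a hypothetical prompt:

// Optional variation: prepend a system message before the history.
const llmStream = await client.chat.completions.create({
  model: "c1-nightly",
  messages: [
    { role: "system", content: "You are a helpful assistant." }, // hypothetical
    ...messageStore.getOpenAICompatibleMessageList(),
  ],
  stream: true,
});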

8. Transform the response stream

Convert the OpenAI stream to a format suitable for your frontend:

app/api/chat/route.ts
// ... after llmStream creation ...

const responseStream = transformStream(
  llmStream,
  (chunk) => {
    // Extract the text delta from each streamed chunk.
    return chunk.choices[0].delta.content;
  },
  {
    onEnd: ({ accumulated }) => {
      // `accumulated` holds every value returned by the transform
      // callback; drop empty chunks and join the rest into the full
      // assistant message, then persist it to the thread history.
      const message = accumulated.filter((message) => message).join("");
      messageStore.addMessage({
        role: "assistant",
        content: message,
        id: responseId,
      });
    },
  }
) as ReadableStream;

9. Return the streaming response

Return the response with proper headers for server-sent events:

app/api/chat/route.ts
// ... after responseStream creation ...

return new NextResponse(responseStream, {
  headers: {
    "Content-Type": "text/event-stream",
    "Cache-Control": "no-cache, no-transform",
    Connection: "keep-alive",
  },
});
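
On the client, the stream can be consumed with a plain fetch call; here is a minimal sketch (run inside an async function; the payload values are hypothetical):

// Hypothetical client-side consumer: POST the prompt, then read the
// streamed response body chunk by chunk as it arrives.
const res = await fetch("/api/chat", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    prompt: { role: "user", content: "Hello!" },
    threadId: "thread-123",
    responseId: "resp-456",
  }),
});

const reader = res.body!.getReader();
const decoder = new TextDecoder();
let text = "";
while (true) {
  const { done, value } = await reader.read();
  if (done) break;
  text += decoder.decode(value, { stream: true });
  // Render `text` incrementally in your UI here.
}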

10. Set your API key

Make sure to set your Thesys API key as an environment variable:

export THESYS_API_KEY=<your-api-key>

Or add it to your .env.local file:

THESYS_API_KEY=<your-api-key>

Your API endpoint is now ready to handle streaming chat conversations with the C1 model!