This guide walks you through setting up a new project from scratch. If you’d just like to get started quickly, check out the Quickstart guide which gets you up and running with a basic template.

1. Create a project

npx create-next-app c1-app
cd c1-app

2. Install the latest dependencies

First, install the latest packages for C1 and Crayon.

npm install @thesysai/genui-sdk @crayonai/react-core @crayonai/stream

Next, install the OpenAI SDK.

npm install openai

3. Create an API key

Navigate to C1 Console and create a new API key.
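The backend endpoint in the next step reads the key from the THESYS_API_KEY environment variable. Add it to a `.env.local` file at the project root (the value below is a placeholder; substitute your actual key):

```
# .env.local — keep this file out of version control
THESYS_API_KEY=sk-your-api-key
```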

4. Set up a backend endpoint

First, create a message history store. This is a simple in-memory store that keeps the messages for each chat thread, keyed by thread ID.

app/api/chat/messageStore.ts

import OpenAI from "openai";

export type DBMessage = OpenAI.Chat.ChatCompletionMessageParam & {
  id?: string;
};

const messagesStore: {
  [threadId: string]: DBMessage[];
} = {};

export const getMessageStore = (id: string) => {
  if (!messagesStore[id]) {
    messagesStore[id] = [];
  }
  const messageList = messagesStore[id];
  return {
    addMessage: (message: DBMessage) => {
      messageList.push(message);
    },
    messageList,
    getOpenAICompatibleMessageList: () => {
      return messageList.map((m) => {
        const message = {
          ...m,
        };
        delete message.id;
        return message;
      });
    },
  };
};
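The id field is only used locally to correlate messages with the frontend; getOpenAICompatibleMessageList strips it before the list is sent to the API. A quick sketch of that behavior (with a simplified message type so the snippet stands alone):

```typescript
// Simplified stand-in for DBMessage, for illustration only.
type DBMessage = { id?: string; role: "user" | "assistant"; content: string };

const messageList: DBMessage[] = [
  { id: "m1", role: "user", content: "Hello" },
  { id: "m2", role: "assistant", content: "Hi there!" },
];

// Same mapping as getOpenAICompatibleMessageList: copy each message
// and drop the local `id` before handing the list to the API.
const openAICompatible = messageList.map((m) => {
  const message = { ...m };
  delete message.id;
  return message;
});

console.log(openAICompatible);
// → [ { role: 'user', content: 'Hello' }, { role: 'assistant', content: 'Hi there!' } ]
```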

Then, set up the backend endpoint using the message history store. In a typical Next.js application using the App Router, a simple endpoint for an agent may look like this:

app/api/chat/route.ts

import { NextRequest, NextResponse } from "next/server";
import OpenAI from "openai";
import type { ChatCompletionMessageParam } from "openai/resources.mjs";
import { transformStream } from "@crayonai/stream";
import { getMessageStore } from "./messageStore";

export async function POST(req: NextRequest) {
  const { prompt, threadId, responseId } = (await req.json()) as {
    prompt: ChatCompletionMessageParam;
    threadId: string;
    responseId: string;
  };

  const client = new OpenAI({
    baseURL: "https://api.thesys.dev/v1/embed",
    apiKey: process.env.THESYS_API_KEY, // Use the API key you created in the previous step
  });

  const messageStore = getMessageStore(threadId);
  messageStore.addMessage(prompt);

  const llmStream = await client.chat.completions.create({
    model: "c1-nightly",
    messages: messageStore.getOpenAICompatibleMessageList(),
    stream: true,
  });

  // Transform the OpenAI stream into a C1-compatible response stream
  const responseStream = transformStream(
    llmStream,
    (chunk) => {
      return chunk.choices[0].delta.content;
    },
    {
      onEnd: ({ accumulated }) => {
        const message = accumulated.filter((chunk) => chunk).join("");
        messageStore.addMessage({
          id: responseId,
          role: "assistant",
          content: message,
        });
      },
    }
  ) as ReadableStream<string>;

  return new NextResponse(responseStream, {
    headers: {
      "Content-Type": "text/event-stream",
      "Cache-Control": "no-cache, no-transform",
      Connection: "keep-alive",
    },
  });
}
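Assuming the route above lives at app/api/chat/route.ts (so it is served at /api/chat), you can smoke-test it with curl once the dev server is running. The field names match the request body the handler destructures; `-N` disables buffering so the streamed response prints as it arrives:

```
curl -N http://localhost:3000/api/chat \
  -H "Content-Type: application/json" \
  -d '{
    "prompt": { "role": "user", "content": "Hello!" },
    "threadId": "thread-1",
    "responseId": "resp-1"
  }'
```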

5. Set up the frontend

To integrate the frontend, use the C1Chat component. This component includes a ready-to-use chat interface that manages its own state. All you need to do is pass the apiUrl prop pointing to your backend endpoint.

Example implementation:

app/page.tsx
"use client";

import { C1Chat } from "@thesysai/genui-sdk";
import "@crayonai/react-ui/styles/index.css";

export default function Home() {
  return <C1Chat apiUrl="/api/chat" />;
}

If you would like to customize the UI instead of using the pre-built C1Chat component, see Customizing Crayon UI.