In many cases, a form is a much more efficient way to gather structured information from a user than a back-and-forth text conversation. For example, if a user wants to plan a trip, the C1 API can present a form asking for their destination, dates, and accommodation type all at once. C1 supports a variety of input types that the LLM can use to build a form:
  • Text and Number inputs
  • Date input and Textarea
  • Dropdowns, Checkboxes, and Radio buttons
  • Sliders and Switches

Generating Forms

There are two primary methods to instruct the C1 API to generate a form: automatically from a tool schema (recommended), or manually via a system prompt.

Automatic Generation from a Tool Schema (Recommended)

The most robust way to generate a form is to provide the LLM with the JSON schema of a tool you want it to use (see Integrating data for more details). C1 automatically renders the required input fields based on the tool’s parameters. For example, to create a Jira copilot, you can provide the schema for a create_jira_issue tool:
const tools = [
  {
    type: "function",
    function: {
      name: "create_jira_issue",
      description: "Create a Jira issue",
      parameters: {
        type: "object",
        properties: {
          title: {
            type: "string",
            description: "The title of the Jira issue"
          },
          priority: {
            type: "string",
            enum: ["Low", "Medium", "High"],
            description: "The priority of the Jira issue"
          },
          description: {
            type: "string",
            description: "A multiline description for the issue"
          }
        },
        required: ["title", "priority", "description"]
      }
    }
  }
];

const response = await client.chat.completions.create({
  model: "c1-model-name",
  messages: [
    { role: "system", content: "You are a helpful Jira copilot." },
    { role: "user", content: "Create a Jira issue with the title 'New Feature'" }
  ],
  tools: tools,
});
In this scenario, C1 will automatically render a form with fields for title, priority, and description. It will even pre-fill the title field, since that value was already provided in the user’s prompt.
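Once the user fills in and submits the form, the model emits a tool call whose arguments arrive as a JSON string, following the OpenAI-compatible tool-calling convention. A minimal sketch of extracting those arguments on your backend (the `ToolCall` shape and the helper name are illustrative, not part of the C1 SDK):

```typescript
// Sketch (assumed shapes): pull the create_jira_issue arguments out of a
// tool call returned by the model after the user submits the form.
interface JiraIssueArgs {
  title: string;
  priority: "Low" | "Medium" | "High";
  description: string;
}

interface ToolCall {
  function: { name: string; arguments: string };
}

// The model returns tool arguments as a JSON string; parse and narrow them.
function parseJiraIssueArgs(call: ToolCall): JiraIssueArgs | null {
  if (call.function.name !== "create_jira_issue") return null;
  return JSON.parse(call.function.arguments) as JiraIssueArgs;
}
```

From here you would run your actual Jira client with the parsed arguments and send the result back as a `tool` role message so C1 can render a confirmation.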

Manual Generation via System Prompt

For simpler cases or more direct control, you can instruct the LLM to generate a specific form using a system prompt. For example:
You are a helpful travel planner assistant.
When a user wants to plan a trip, you should generate a form with the following fields:
- Destination: A text input field.
- Dates: A date input field.
- Accommodation Type: A dropdown with options for Hotel, Apartment, or Hostel.
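A sketch of wiring this prompt into a request (the client and model name follow the earlier tool-schema example; only the messages array is shown here):

```typescript
// Build the form-generating system prompt from the instructions above.
const systemPrompt = [
  "You are a helpful travel planner assistant.",
  "When a user wants to plan a trip, you should generate a form with the following fields:",
  "- Destination: A text input field.",
  "- Dates: A date input field.",
  "- Accommodation Type: A dropdown with options for Hotel, Apartment, or Hostel.",
].join("\n");

// Pass these messages to client.chat.completions.create({ model, messages }),
// as in the earlier example.
const messages = [
  { role: "system", content: systemPrompt },
  { role: "user", content: "I want to plan a trip" },
];
```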

How Forms Work on the Frontend

Once a form is rendered, the C1 SDK handles most of the complexity for you.

Automatic State Management

As a user types into a C1-generated form, the state of each input field is managed automatically within the component. You do not need to write useState or onChange handlers to track these values. This internal state is the same state that is captured and persisted via the updateMessage callback, as described in the Managing State guide.
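If you do want to persist that state yourself (for example, so a half-filled form survives a page reload), you can wrap your state setter in the callback you pass as `updateMessage`. A minimal sketch, where the storage mechanism is an illustrative assumption:

```typescript
// Sketch: build an `updateMessage` callback that both updates React state and
// persists the draft. `persist` stands in for any storage you choose
// (e.g. localStorage.setItem or a call to your backend).
type Persist = (key: string, value: string) => void;

function makeUpdateMessage(
  setC1Response: (response: string) => void,
  persist: Persist,
): (updated: string) => void {
  return (updated) => {
    setC1Response(updated); // re-render the component with the fresh state
    persist("c1-draft", updated); // persist the draft for later restoration
  };
}
```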

Handling Form Submissions

When a user clicks the submit button on a C1-generated form, it triggers the onAction callback. The action object passed to your handler will contain all the data from the form. Here is how you would handle a form submission on the frontend:
import { useState } from "react";
import { C1Component, ThemeProvider } from "@thesysai/genui-sdk";

function MyFormComponent() {
  const [c1Response, setC1Response] = useState<string>(/* initial C1 DSL with a form */);

  const handleFormSubmit = async (action) => {

    // The `action.payload` contains the form data as a JSON object
    // e.g., { destination: "Paris", accommodation_type: "Hotel" }
    console.log("Form submitted with payload:", action.payload);

    // This is the message that will be shown to the user
    // e.g., "Plan my trip"
    console.log("Message to be shown to user:", action.userFriendlyMessage);

    // The `action.llmFriendlyMessage` is a string pre-formatted for the LLM
    // e.g., "The user submitted the trip planning form with the following details..."
    console.log("Message for LLM:", action.llmFriendlyMessage);

    // Send this message to your backend to generate the next UI
    const response = await fetch("/api/chat", {
      method: "POST",
      body: JSON.stringify({ prompt: action.llmFriendlyMessage }),
    });

    const newC1Response = await response.text();
    setC1Response(newC1Response);
  };

  return (
    <ThemeProvider>
      <C1Component
        c1Response={c1Response}
        onAction={handleFormSubmit}
      />
    </ThemeProvider>
  );
}
On your backend, send action.llmFriendlyMessage to the C1 API as the user prompt.
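A framework-agnostic sketch of that backend step (the `createCompletion` parameter stands in for your client's `chat.completions.create` call, and the system prompt is illustrative):

```typescript
// Sketch: forward the llmFriendlyMessage from the frontend to the C1 API
// as the user prompt, and return the resulting C1 DSL string.
type Message = { role: "system" | "user"; content: string };
type CreateCompletion = (messages: Message[]) => Promise<string>;

function buildMessages(llmFriendlyMessage: string): Message[] {
  return [
    { role: "system", content: "You are a helpful travel planner assistant." },
    { role: "user", content: llmFriendlyMessage },
  ];
}

async function handleChat(
  prompt: string,
  createCompletion: CreateCompletion,
): Promise<string> {
  // The returned string is the C1 DSL your frontend passes to <C1Component>.
  return createCompletion(buildMessages(prompt));
}
```

Inside your `/api/chat` handler you would call `handleChat(req.body.prompt, ...)` with your real C1 client and return the result as the response body.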