A key feature of conversational artifacts is the ability for users to modify them with follow-up prompts. This guide explains how to handle a user request like “add a slide about our competitors” by using a tool to edit an existing artifact.
This guide assumes you have already completed the Generating an Artifact in a Conversation guide.

Step 1: Define the edit_presentation Tool

First, define the schema for a tool that allows the assistant to edit an artifact. The schema must include parameters to identify which artifact to edit (artifactId and version) and what changes to make (instructions).
import { z } from "zod";
import { zodToJsonSchema } from "zod-to-json-schema";

const editPresentationTool = {
  type: "function",
  function: {
    name: "edit_presentation",
    description: "Edits an existing slide presentation.",
    parameters: zodToJsonSchema(
      z.object({
        artifactId: z.string().describe("The ID of the artifact to edit."),
        // `version` corresponds to the messageId of the assistant response that contains the artifact.
        version: z.string().describe("The version of the artifact to edit."),
        instructions: z.string().describe("The user's instructions for what to change."),
      })
    ),
  },
};
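With the schema defined, register the tool on your main conversational LLM call alongside the creation tool from the previous guide. The sketch below is illustrative only: llmClient is assumed to be an OpenAI-compatible client, conversationHistory your stored chat messages, and createPresentationTool the creation tool defined in the previous guide.

// (Inside your Next.js API route)
// Assumed names: llmClient, conversationHistory, and createPresentationTool
// come from your existing setup and the previous guide.
const llmStream = await llmClient.chat.completions.create({
  model: "gpt-4o", // any tool-calling model that drives the conversation
  messages: conversationHistory,
  tools: [createPresentationTool, editPresentationTool],
  stream: true,
});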

Step 2: The Role of the version Parameter

In our workflow, the version is the unique messageId of the assistant’s response that contains the artifact. Using the messageId as a version provides a stable reference to a specific, point-in-time snapshot of the artifact as it exists in the conversation history. When the LLM requests an edit, your backend can use this version to reliably retrieve the exact content that needs to be modified.
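In practice, this means persisting the content of every assistant message that carries an artifact under its messageId. Below is a minimal sketch of the getMessageFromDb helper used in Step 3, assuming an in-memory map and a hypothetical saveMessageToDb counterpart; swap in your real database.

// In-memory stand-in for your database, keyed by messageId (the artifact version).
const messageStore = new Map<string, string>();

// Call this after an assistant response containing an artifact finishes streaming.
export async function saveMessageToDb(messageId: string, content: string): Promise<void> {
  messageStore.set(messageId, content);
}

// The edit handler looks content up by the version received in the tool call.
export async function getMessageFromDb(version: string): Promise<string> {
  const content = messageStore.get(version);
  if (content === undefined) {
    throw new Error(`No artifact content stored for version ${version}`);
  }
  return content;
}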

Step 3: Handle the Tool Call in Your Backend

When a user asks to make a change, the LLM will use your edit_presentation tool. Your backend must handle this tool call by retrieving the old content, calling the C1 Artifacts API in “edit mode,” and streaming back the updated result. The key steps are:
  1. Use the version from the tool call to fetch the previous assistant message content from your database.
  2. Generate a new messageId, which will serve as the new version of the artifact.
  3. Call the C1 Artifacts API, providing the old content and the new editing instructions.
  4. Stream the updated artifact into a C1 Response object.
  5. Return a tool_result to the main LLM with the new version.
  6. Stream the LLM’s final text confirmation into the same C1 Response object.
// (Inside your Next.js API route)
import { makeC1Response } from "@thesysai/genui-sdk/server";

// When your edit tool handler is invoked...
async function handleEditPresentation(
  { artifactId, version, instructions }: { artifactId: string, version: string, instructions: string },
  { messageId, c1Response }: { messageId: string, c1Response: ReturnType<typeof makeC1Response> }
) {
  // 1. Retrieve the old artifact content from your database using the version (messageId).
  const oldMessageContent = await getMessageFromDb(version);

  // 2. The new messageId passed into this handler serves as the new version of the artifact.

  // 3. Call the Artifacts API in "edit mode", providing the old content and the new instructions.
  const updatedArtifactStream = await c1ArtifactsClient.chat.completions.create({
    model: "c1/artifact/v-20251030",
    messages: [
      { role: "assistant", content: oldMessageContent }, // Old content
      { role: "user", content: instructions },           // New instructions
    ],
    metadata: { thesys: JSON.stringify({ c1_artifact_type: "slides", id: artifactId }) },
    stream: true,
  });

  // 4. Pipe the updated artifact stream into the C1 Response object.
  for await (const delta of updatedArtifactStream) {
    const content = delta.choices[0]?.delta?.content;
    if (content) {
      c1Response.writeContent(content);
    }
  }

  // 5. Return the result to the main LLM with the new version.
  return `Presentation edited successfully. New version: ${messageId}`;
}

Step 4: The Final Updated Response

As in the creation step, the main LLM will receive the successful tool_result and generate a final confirmation (e.g., “I’ve updated the presentation with the new slide.”). Your backend streams this text into the same C1 Response object, completing the request. The frontend then receives and renders the new assistant message containing the fully updated artifact.
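The sketch below shows one way to wire that up, assuming the same OpenAI-compatible llmClient along with the messages array and toolCall object from your tool-handling loop (illustrative names, not part of the SDK).

// Append the tool result so the main LLM knows the edit succeeded.
messages.push({
  role: "tool",
  tool_call_id: toolCall.id,
  content: toolResult, // e.g. "Presentation edited successfully. New version: <messageId>"
});

// Ask the main LLM for its confirmation text and stream it into the same C1 Response.
const confirmationStream = await llmClient.chat.completions.create({
  model: "gpt-4o",
  messages,
  stream: true,
});

for await (const chunk of confirmationStream) {
  const text = chunk.choices[0]?.delta?.content;
  if (text) {
    c1Response.writeContent(text); // appended after the updated artifact content
  }
}

Once the confirmation stream ends, close out the C1 Response object the same way you did in the creation guide.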

View the code

Find more examples and complete code on our GitHub repository.