A guide to creating a basic LangGraph agent using Python and the Thesys C1 API for a generative UI (GenUI) chatbot.
## Add Dependencies

Add the required packages to `requirements.txt`. You'll need `langchain-openai` to interact with the Thesys C1 API endpoint.

## Define the Graph (`graph.py`)
Create `graph.py` to define the agent's structure and state management.

### Agent State (`AgentState`)

The `AgentState` includes `messages` annotated with `add_messages` for automatic history updates, and a `response_id` string. This `response_id` is crucial: it is passed from the `@thesysai/genui-sdk` frontend with each user message and needs to be assigned to the corresponding AI response message for the UI to track it correctly.
### Calling the Model (`call_model`)

Next, initialize the `ChatOpenAI` model, pointing it at your Thesys C1 API endpoint and binding any necessary tools. The `call_model` function invokes the model and, importantly, checks whether the response is the final `AIMessage` that will be sent to the user. If it is, it updates the assistant message's ID with the `response_id` present in the state.
The rest of the `graph.py` file involves setting up the `ToolNode`, defining the `should_continue` logic for routing between the agent and tools, building the `StateGraph`, adding nodes and edges, and finally compiling the graph with a `MemorySaver` checkpointer.

Full `graph.py` Code
## Implement Thread Service (`thread_service.py`)

Create `thread_service.py` to handle thread metadata and message retrieval/updates for the UI.

### Thread Metadata Storage

This service uses an in-memory dictionary (`_thread_metadata_store`) to store thread metadata. The `ThreadMetadata` model holds the title and creation timestamp, while `ThreadInfo` is used for sending thread details to the client.
The service interacts with the compiled LangGraph `app` to fetch and format messages for the UI (`get_formatted_ui_messages`) using `app.aget_state`, and to update messages within the graph's state (`update_message`) using `app.update_state`.

Full `thread_service.py` Code
## Define API Endpoints (`main.py`)

Create `main.py` to expose the LangGraph agent and thread management via a FastAPI application.

### Core Chat Streaming Endpoint (`/chat`)

The primary endpoint is `/chat`. It receives the user's `prompt`, the `threadId`, and the `responseId` (generated by GenUI). It uses an async generator, `stream_langgraph_events`, to interact with the compiled LangGraph `app`. This function constructs the input message, includes the `response_id` in the graph input, streams events using `app.astream_events` with the correct `thread_id` config, and yields only the content chunks from `on_chat_model_stream` events. The `/chat` endpoint returns a `StreamingResponse`.
### Thread Management Endpoints

Additional endpoints (`/threads`, `/threads/{thread_id}`, `/threads/{thread_id}/messages`, `/threads/{thread_id}/message`) are defined to handle thread metadata (create, list, update, delete) and message operations (get history, update message content). These endpoints primarily call the corresponding functions implemented in `thread_service.py`.

Full `main.py` Code
## Build the Frontend (`App.tsx`)

Set up the frontend using `@thesysai/genui-sdk` components and hooks. In `App.tsx`, use the `useThreadListManager` and `useThreadManager` hooks from the SDK to manage threads and messages.
The example below shows how to initialize these hooks and pass them to the `<C1Chat />` component. It also includes a `useEffect` to handle loading a specific thread based on a URL query parameter (`threadId`).
Full `App.tsx` Code

## Run the Application

Start the backend server so the API is available at http://localhost:8000. Then start the frontend (from the directory where `App.tsx` and `package.json` are) in a separate terminal.