Overview
Streaming involves your entire application stack: the C1 API sends the response in chunks, your backend forwards those chunks, and your UI progressively renders them as they arrive.

Backend: Enabling the Stream
To enable streaming, first set `stream: true` in your call to the C1 API. Your backend's primary role is then to efficiently forward this stream to the UI with the correct `Content-Type: text/event-stream` header.
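As a concrete illustration, a chat-completions-style request body with streaming enabled might look like the sketch below. The field values and message content are illustrative, not taken from the C1 API reference; only the `stream` flag is the point here.

```python
# Illustrative request body for a chat-completions-style call to the C1 API.
# The endpoint URL and model id are omitted; consult the C1 API reference
# for the exact request shape.
payload = {
    "messages": [{"role": "user", "content": "Build me a signup form"}],
    "stream": True,  # ask the API to return the response in chunks
}
```

With `stream: True`, the API responds with a sequence of chunks rather than a single completed body, which is what the rest of this page is about forwarding and rendering.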
Our server-side SDKs for Python and Node.js provide helpers to simplify this process.
To simplify streaming in FastAPI, we provide the `thesys-genui-sdk` Python library, installable via `pip install thesys-genui-sdk`. The library provides the `@with_c1_response` decorator, which automatically sets the correct response headers and creates a streaming context. Inside the decorated function, you can use the `write_content` helper to yield each chunk from the LLM stream. For framework-independent streaming, see the `thesys-genui-sdk` package on PyPI.
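Under the hood, forwarding amounts to wrapping each chunk in Server-Sent Events framing and serving the result with the `text/event-stream` content type. The framework-agnostic sketch below only illustrates the idea the decorator handles for you; the exact framing and the `[DONE]` sentinel are assumptions for this sketch, not the SDK's actual wire format.

```python
from typing import Iterable, Iterator

# Headers a streaming response must carry; in FastAPI, the
# @with_c1_response decorator sets these for you.
STREAM_HEADERS = {
    "Content-Type": "text/event-stream",
    "Cache-Control": "no-cache",
}

def sse_frames(chunks: Iterable[str]) -> Iterator[str]:
    """Wrap raw LLM chunks in Server-Sent Events framing."""
    for chunk in chunks:
        # Each SSE event is a "data:" line terminated by a blank line.
        yield f"data: {chunk}\n\n"
    yield "data: [DONE]\n\n"  # illustrative end-of-stream marker

frames = list(sse_frames(["<card>", "</card>"]))
```

A generator like this is what a FastAPI endpoint would hand to a streaming response object, so chunks reach the browser as soon as the LLM produces them.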
UI: Rendering the Stream
Handling a streaming response on the UI requires manually fetching the data, reading the stream chunk by chunk, and updating your component's state as new data arrives. While this involves more code than a standard fetch request, it gives you full control over the user experience. This section breaks down a complete, working example.

Manual Stream Handling with <C1Component>
Here is a full React component that fetches a streaming C1 DSL response from a backend endpoint and renders it progressively.
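The original `app/page.tsx` listing is not reproduced here; the sketch below captures its core stream-reading logic as a framework-free TypeScript helper. The `/api/generate` endpoint name and the `onChunk` callback are illustrative assumptions — in the React component, `onChunk` would be the `setC1Response` state setter.

```typescript
// Minimal structural types so the sketch does not depend on DOM lib types;
// in the browser, `response.body` satisfies ByteStream.
type ByteStreamReader = {
  read(): Promise<{ done: boolean; value?: Uint8Array }>;
};
type ByteStream = { getReader(): ByteStreamReader };

// Read a response body chunk by chunk, invoking onChunk with the
// accumulated text after every chunk so the caller can re-render.
async function readStream(
  body: ByteStream,
  onChunk: (accumulated: string) => void,
): Promise<string> {
  const reader = body.getReader();
  const decoder = new TextDecoder();
  let accumulated = "";
  while (true) {
    // read() resolves with { done: true } once the stream is exhausted.
    const { done, value } = await reader.read();
    if (done) break;
    if (value) {
      accumulated += decoder.decode(value, { stream: true });
      onChunk(accumulated); // in the React component: setC1Response(accumulated)
    }
  }
  return accumulated;
}

// Usage inside a component handler (endpoint name is an assumption):
// const res = await fetch("/api/generate", { method: "POST", body: ... });
// if (res.body) await readStream(res.body, setC1Response);
```

Because `onChunk` fires after every chunk with the full accumulated string, wiring it to a state setter is all it takes to make the UI render progressively.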
Code Breakdown
Let's walk through the key parts of the code above.

1. State Management: We use two state variables:
   - `c1Response`: An accumulating string that holds the C1 DSL as it arrives from the stream. It starts empty.
   - `isLoading`: A boolean to track the request status, which is passed to the `isStreaming` prop.
2. The Fetch Request: Inside the `handleGenerate` function, we initiate a standard `fetch` call to our streaming backend endpoint.
3. Reading the Stream: This is the core of the logic.
   - We get a `reader` from the `response.body`.
   - The `while (true)` loop continuously calls `reader.read()` to get the next chunk of data.
   - A `TextDecoder` converts each raw data chunk into a string.
   - We append this string to our `accumulatedResponse` variable and update the `c1Response` state with `setC1Response()`. This state update is what causes the UI to render progressively.
   - The loop breaks when the stream sends a `done: true` signal.
4. Connecting to `<C1Component>`:
   - The `c1Response` state variable is passed directly to the `<C1Component>`. As this state updates with each new chunk, the component re-renders to display the incoming UI.
   - The `isLoading` state is passed to the `isStreaming` prop, which can be used by the component to display loading indicators.
All-in-One Solution: <C1Chat>
For conversational interfaces, `<C1Chat>` is the simplest solution. It has streaming enabled by default and encapsulates all the complex state and stream-handling logic shown above. As long as the `apiUrl` you provide points to a streaming backend endpoint, no further UI configuration is required.
For more details on `<C1Chat>`, please refer to the Conversational UI section.