LangChain provides a robust framework for building AI applications with tool calling capabilities. This guide demonstrates how to integrate LangChain with C1 to create an intelligent agent that can execute SQL queries and provide conversational interfaces.

We’ll build a complete application in two parts:

  • Backend: A FastAPI server with LangChain agents and SQL tools
  • Frontend: A React interface to interact with the agent

This guide assumes you have basic knowledge of Python, LangChain, and React. You’ll also need a Thesys API key from the C1 Console.

Part 1: Backend Implementation

The backend uses LangChain to create an intelligent agent that can execute SQL queries on a Chinook database using C1 as the underlying LLM.

Step 1: Set up the project structure

Create a new directory for your LangChain project and set up the basic structure:

mkdir backend db
cd backend

Step 2: Install dependencies

Create a requirements.txt file with the necessary dependencies:

backend/requirements.txt
fastapi==0.104.1
uvicorn==0.24.0
langchain==0.1.0
langchain-openai==0.0.2
langserve==0.0.30
pydantic==2.5.0

Install the dependencies (sqlite3 ships with Python's standard library, so it does not need to be listed in requirements.txt):

cd backend
pip install -r requirements.txt

Step 3: Create the sample database

For this example, we’ll use the Chinook database. Create it in your db folder using the SQL script:

cd db
curl -s https://raw.githubusercontent.com/lerocha/chinook-database/master/ChinookDatabase/DataSources/Chinook_Sqlite.sql | sqlite3 Chinook.db
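To confirm the import worked, you can list the tables from Python. This is a quick sanity-check sketch, assuming you run it from the project root so the relative path resolves:

```python
import os
import sqlite3

db_path = "db/Chinook.db"  # created by the curl step above
if os.path.exists(db_path):
    conn = sqlite3.connect(db_path)
    tables = [row[0] for row in conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table'")]
    conn.close()
    # The full Chinook schema defines tables such as Album, Artist, and Track.
    print(f"{len(tables)} tables found")
else:
    tables = []
    print("db/Chinook.db not found - run the curl step first")
```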

Step 4: Create the backend server

Create the main backend file with LangChain integration:

backend/main.py
#!/usr/bin/env python
import os
import sqlite3
from langchain_openai import ChatOpenAI
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.tools import tool
from langchain_core.runnables import RunnableLambda
from langchain.agents import create_openai_tools_agent, AgentExecutor
from fastapi import FastAPI
from langserve import add_routes
from pydantic import BaseModel

# Input model
class ChainInput(BaseModel):
    c1Response: str = ""  # Can be empty
    query: str

# 1. Create model
model = ChatOpenAI(
    base_url="https://api.thesys.dev/v1/embed",
    model="c1-nightly",
    api_key=os.environ.get("THESYS_API_KEY")
)

# 2. Create SQL tool
@tool
def execute_sql_query(query: str) -> str:
    """Execute a SQL query on the Chinook database and return the results.

    Args:
        query: The SQL query to execute

    Returns:
        The query results as a formatted string
    """
    try:
        # Resolve the path relative to this file so the query works no matter
        # which directory the server was started from (db/ sits next to backend/)
        db_path = os.path.join(os.path.dirname(__file__), "..", "db", "Chinook.db")
        conn = sqlite3.connect(db_path)
        try:
            cursor = conn.cursor()
            cursor.execute(query)
            results = cursor.fetchall()

            # Get column names
            column_names = [description[0] for description in cursor.description]
        finally:
            # Close the connection even if the query raises
            conn.close()

        # Format results
        if not results:
            return "No results found."

        # Create a formatted table
        header = " | ".join(column_names)
        formatted_results = [header, "-" * len(header)]

        for row in results:
            formatted_results.append(" | ".join(str(value) for value in row))

        return "\n".join(formatted_results)

    except Exception as e:
        return f"Error executing SQL query: {str(e)}"

# 4. Create agent prompt template
prompt = ChatPromptTemplate.from_messages([
    ("system", """You are a helpful assistant that can answer questions about the Chinook digital media store database.

You have access to a SQL tool that can execute queries on the database. The database contains information about:
- Artists, Albums, and Tracks
- Customers and their purchase history
- Employees and sales data
- Playlists and media types

When users ask questions about the music store data, use the SQL tool to query the database and provide accurate information.

Context from previous conversation: {context}"""),
    ("human", "{query}"),
    ("placeholder", "{agent_scratchpad}")
])

# 5. Create agent and chain
tools = [execute_sql_query]
agent = create_openai_tools_agent(model, tools, prompt)
agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True)

def format_inputs(inputs) -> str:
    """Transform the chain input into prompt variables and execute the agent."""
    # LangServe may pass either a validated ChainInput model or a plain dict
    if isinstance(inputs, BaseModel):
        inputs = inputs.dict()
    # An empty c1Response should still fall back to the default context
    context = inputs.get("c1Response") or "No previous context"

    # Execute the agent
    result = agent_executor.invoke({
        "context": context,
        "query": inputs["query"]
    })
    return result["output"]

# Create a proper runnable chain
chain = RunnableLambda(format_inputs)

# 6. App definition
app = FastAPI(
    title="LangChain + C1 Server",
    version="1.0",
    description="A simple API server using LangChain's Runnable interfaces powered by Thesys C1",
)

# 7. Adding chain route
add_routes(
    app,
    chain.with_types(input_type=ChainInput),
    path="/chain",
)

if __name__ == "__main__":
    import uvicorn
    uvicorn.run(app, host="localhost", port=4001)
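To see the pipe-separated table that execute_sql_query produces, the same formatting logic can be exercised against a throwaway in-memory database. This is an illustrative sketch; the Artist rows below are made up, not real Chinook data:

```python
import sqlite3

# Build a disposable in-memory database with one illustrative table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Artist (ArtistId INTEGER, Name TEXT)")
conn.executemany("INSERT INTO Artist VALUES (?, ?)",
                 [(1, "AC/DC"), (2, "Accept")])

cursor = conn.execute("SELECT ArtistId, Name FROM Artist ORDER BY ArtistId")
rows = cursor.fetchall()
columns = [d[0] for d in cursor.description]
conn.close()

# Same formatting as the tool: header row, separator, one line per result row
header = " | ".join(columns)
lines = [header, "-" * len(header)]
lines += [" | ".join(str(v) for v in row) for row in rows]
table = "\n".join(lines)
print(table)
```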

Step 5: Set up environment variables

Export your Thesys API key:

export THESYS_API_KEY=your_api_key_here

Step 6: Test the backend

Run the backend server:

cd backend
python main.py

You can open http://localhost:4001/chain/playground, LangServe’s interactive playground, to try out the chain in your browser.
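You can also call the endpoint programmatically. LangServe wraps the chain’s input under an `input` key and returns the result under `output`; here is a minimal client sketch (the question text is just an example):

```python
import json
import urllib.request

# The chain input matches the ChainInput model: query plus optional c1Response
payload = {"input": {"query": "List all genres", "c1Response": ""}}
req = urllib.request.Request(
    "http://localhost:4001/chain/invoke",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# With the backend running, send the request and read the agent's answer:
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["output"])
```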

Part 2: Frontend Implementation

Now let’s create a React frontend that integrates with our LangChain backend and uses C1 for the generative UI.

Step 1: Set up the frontend project

Create a new React project with Vite:

cd ..
npm create vite@latest frontend -- --template react-ts
cd frontend

Step 2: Install dependencies

Install the necessary packages for C1 integration:

npm install @crayonai/react-ui @thesysai/genui-sdk

Step 3: Create the main App component

Create the main App.tsx file that connects to your LangChain backend:

src/App.tsx
import "@crayonai/react-ui/styles/index.css";
import { ThemeProvider, C1Component } from "@thesysai/genui-sdk";
import { useState } from "react";
import "./App.css";

function App() {
  const [isLoading, setIsLoading] = useState(false);
  const [c1Response, setC1Response] = useState("");
  const [question, setQuestion] = useState("");

  const makeApiCall = async (query: string, c1Response: string) => {
    setIsLoading(true);
    setC1Response("");

    try {
      const response = await fetch("/api/chain/invoke", {
        method: "POST",
        headers: {
          "Content-Type": "application/json",
        },
        body: JSON.stringify({
          input: { query, c1Response }
        }),
      });

      const data = await response.json();
      setC1Response(data.output);
    } catch (error) {
      console.error("Error:", error);
    } finally {
      setIsLoading(false);
    }
  };

  return (
    <div className="app-container">
      <h1>Chinook Store Data Assistant</h1>

      <form
        onSubmit={(e) => {
          e.preventDefault();
          makeApiCall(question, c1Response);
        }}
      >
        <input
          type="text"
          value={question}
          onChange={(e) => setQuestion(e.target.value)}
          placeholder="Ask about the music store database..."
          className="question-input"
        />
        <button
          type="submit"
          className="submit-button"
          disabled={isLoading || !question.trim()}
        >
          {isLoading ? "Processing..." : "Ask Question"}
        </button>
      </form>

      {c1Response && (
        <div className="response-container">
          <ThemeProvider>
            <C1Component
              c1Response={c1Response}
              isStreaming={isLoading}
              updateMessage={(message) => setC1Response(message)}
              onAction={({ llmFriendlyMessage }) => {
                if (!isLoading) {
                  makeApiCall(llmFriendlyMessage, c1Response);
                }
              }}
            />
          </ThemeProvider>
        </div>
      )}
    </div>
  );
}

export default App;

Step 4: Set up the API proxy

Create a Vite configuration to proxy API calls to your backend:

vite.config.ts
import { defineConfig } from 'vite'
import react from '@vitejs/plugin-react'

export default defineConfig({
  plugins: [react()],
  server: {
    proxy: {
      '/api': {
        target: 'http://localhost:4001',
        changeOrigin: true,
        // Strip the /api prefix so /api/chain/invoke reaches the
        // backend's /chain/invoke route
        rewrite: (path) => path.replace(/^\/api/, ''),
      }
    }
  }
})

Step 5: Run the frontend

With your backend server already running, start the frontend:

npm run dev

Visit http://localhost:5173 to interact with your LangChain + C1 application!

Example Queries

Try these example queries to test your application:

  • “Show me the top 5 best-selling albums”
  • “What are the most popular genres in the store?”
  • “List all employees and their roles”
  • “Show me customers from Canada”
  • “What’s the total revenue by country?”
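Under the hood, questions like the last one translate into aggregate SQL. The sketch below runs that kind of query against a tiny, made-up subset of the Chinook schema (real totals from the full database will differ):

```python
import sqlite3

# Minimal stand-in for the Customer and Invoice tables, with fabricated rows
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Customer (CustomerId INTEGER, Country TEXT);
CREATE TABLE Invoice (InvoiceId INTEGER, CustomerId INTEGER, Total REAL);
INSERT INTO Customer VALUES (1, 'Canada'), (2, 'Canada'), (3, 'Brazil');
INSERT INTO Invoice VALUES (1, 1, 5.94), (2, 2, 1.98), (3, 3, 3.96);
""")

# The kind of query the agent might generate for "total revenue by country"
rows = conn.execute("""
    SELECT c.Country, ROUND(SUM(i.Total), 2) AS Revenue
    FROM Invoice i
    JOIN Customer c ON c.CustomerId = i.CustomerId
    GROUP BY c.Country
    ORDER BY Revenue DESC
""").fetchall()
conn.close()
print(rows)  # → [('Canada', 7.92), ('Brazil', 3.96)]
```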

Key Features

  • LangChain Integration: Uses LangChain’s agent framework for tool calling
  • SQL Tool: Executes dynamic SQL queries on the Chinook database
  • C1 Visualization: Automatically visualizes responses with rich UI components
  • Context Awareness: Maintains conversation context across interactions
  • Error Handling: Robust error handling for both database and API calls

This example demonstrates the power of combining LangChain’s agent capabilities with C1’s generative UI. You can extend this by adding more tools, different databases, or custom UI components.

Next Steps

  • Add more tools to your LangChain agent (web search, calculations, etc.)
  • Implement user authentication and session management
  • Add streaming responses for better UX
  • Deploy your application to production

View the code

Find more examples and complete code on our GitHub repository.