Build powerful AI agents with LangChain and C1 using tools and SQL database integration
LangChain provides a robust framework for building AI applications with tool calling capabilities.
This guide demonstrates how to integrate LangChain with C1 to create an intelligent agent that can execute SQL queries and provide conversational interfaces.
We’ll build a complete application in two parts:
Backend: A FastAPI server with LangChain agents and SQL tools
Frontend: A React interface to interact with the agent
This guide assumes you have basic knowledge of Python, LangChain, and React. You’ll also need a Thesys API key from the C1 Console.
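The steps that follow assume a project layout roughly like the one sketched below. Only backend/main.py and the db folder are referenced directly in this guide, and the db folder needs to sit wherever the relative path db/Chinook.db used in main.py resolves from when the server is started (here, inside backend/); adjust the layout to match your own setup:

backend/
├── main.py        # FastAPI server with the LangChain agent (created below)
└── db/
    └── Chinook.db # Chinook sample SQLite database (created next)
frontend/          # React client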
For this example, we’ll use the Chinook database. Create it in your db folder using the SQL script:
cd db
curl -s https://raw.githubusercontent.com/lerocha/chinook-database/master/ChinookDatabase/DataSources/Chinook_Sqlite.sql | sqlite3 Chinook.db
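If you want to check that the import worked, a short Python sketch like this (run from inside the db folder you just used) lists the tables it created:

import sqlite3

# Open the freshly created database and list its tables.
conn = sqlite3.connect("Chinook.db")
tables = conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name"
).fetchall()
print([name for (name,) in tables])
# Expect entries such as Album, Artist, Customer, Invoice, Track, ...
conn.close()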
4. Create the backend server
Create the main backend file with LangChain integration:
backend/main.py
#!/usr/bin/env python
import os
import sqlite3

from langchain_openai import ChatOpenAI
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.tools import tool
from langchain_core.runnables import RunnableLambda
from langchain.agents import create_openai_tools_agent, AgentExecutor
from fastapi import FastAPI
from langserve import add_routes
from pydantic import BaseModel


# Input model
class ChainInput(BaseModel):
    c1Response: str = ""  # Can be empty
    query: str


# 1. Create model
model = ChatOpenAI(
    base_url="https://api.thesys.dev/v1/embed",
    model="c1-nightly",
    api_key=os.environ.get("THESYS_API_KEY"),
)


# 2. Create SQL tool
@tool
def execute_sql_query(query: str) -> str:
    """Execute a SQL query on the Chinook database and return the results.

    Args:
        query: The SQL query to execute

    Returns:
        The query results as a formatted string
    """
    try:
        conn = sqlite3.connect('db/Chinook.db')
        cursor = conn.cursor()
        cursor.execute(query)
        results = cursor.fetchall()

        # Get column names
        column_names = [description[0] for description in cursor.description]

        # Format results
        if not results:
            return "No results found."

        # Create a formatted table
        formatted_results = []
        formatted_results.append(" | ".join(column_names))
        formatted_results.append("-" * len(" | ".join(column_names)))

        for row in results:
            formatted_results.append(" | ".join(str(value) for value in row))

        conn.close()
        return "\n".join(formatted_results)
    except Exception as e:
        return f"Error executing SQL query: {str(e)}"


# 3. Create parser
parser = StrOutputParser()

# 4. Create agent prompt template
prompt = ChatPromptTemplate.from_messages([
    ("system", """You are a helpful assistant that can answer questions about the Chinook digital media store database.

You have access to a SQL tool that can execute queries on the database. The database contains information about:
- Artists, Albums, and Tracks
- Customers and their purchase history
- Employees and sales data
- Playlists and media types

When users ask questions about the music store data, use the SQL tool to query the database and provide accurate information.

Context from previous conversation: {context}"""),
    ("human", "{query}"),
    ("placeholder", "{agent_scratchpad}"),
])

# 5. Create agent and chain
tools = [execute_sql_query]
agent = create_openai_tools_agent(model, tools, prompt)
agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True)


def format_inputs(inputs: ChainInput):
    """Transform ChainInput to prompt variables and execute agent"""
    # Depending on the LangServe version, the validated input may arrive as a
    # pydantic model rather than a dict, so normalize it first.
    if isinstance(inputs, BaseModel):
        inputs = inputs.dict()

    context = inputs.get("c1Response") or "No previous context"

    # Execute the agent
    result = agent_executor.invoke({
        "context": context,
        "query": inputs["query"],
    })

    return result["output"]


# Create a proper runnable chain
chain = RunnableLambda(format_inputs)

# 6. App definition
app = FastAPI(
    title="LangChain + C1 Server",
    version="1.0",
    description="A simple API server using LangChain's Runnable interfaces powered by Thesys C1",
)

# 7. Adding chain route
add_routes(
    app,
    chain.with_types(input_type=ChainInput),
    path="/chain",
)

if __name__ == "__main__":
    import uvicorn

    uvicorn.run(app, host="localhost", port=4001)
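Once THESYS_API_KEY is exported (next step), you can sanity-check the tool and the agent from a Python shell before exposing them over HTTP. This is a minimal sketch, assuming it runs from the same directory you start the server from (so that db/Chinook.db resolves) and that main.py is importable from there:

# Importing main does not start uvicorn (that only happens under __main__).
from main import execute_sql_query, agent_executor

# Tools created with @tool are runnables and can be invoked directly.
print(execute_sql_query.invoke("SELECT Name FROM Artist LIMIT 5"))

# The agent decides on its own when to call the SQL tool.
result = agent_executor.invoke({
    "context": "No previous context",
    "query": "How many customers are in the database?",
})
print(result["output"])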
5. Set up environment variables
Export your Thesys API key:
export THESYS_API_KEY=your_api_key_here
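If you'd rather not export the key in every shell, python-dotenv is a common alternative. This is an optional sketch, assuming you install the python-dotenv package and create a .env file next to main.py containing THESYS_API_KEY=...:

# Optional: add near the top of backend/main.py, before ChatOpenAI is constructed.
from dotenv import load_dotenv

load_dotenv()  # loads THESYS_API_KEY from the local .env file into os.environ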
6. Test the backend
Run the backend server:
cd backend
python main.py
You can open http://localhost:4001/chain/playground in a browser to try the chain interactively through LangServe's built-in playground.
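Besides the playground, you can call the route directly. LangServe exposes /chain/invoke (plus /batch and /stream), and the input shape matches the ChainInput model defined above. A quick sketch using the requests library:

import requests

# Invoke the chain over HTTP; the payload mirrors the ChainInput model.
response = requests.post(
    "http://localhost:4001/chain/invoke",
    json={"input": {"query": "Who are the top 5 selling artists?", "c1Response": ""}},
)
print(response.json()["output"])

Equivalently, langserve.RemoteRunnable("http://localhost:4001/chain") gives you a client-side runnable with the same .invoke() interface.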
Key features of this setup:
LangChain Integration: Uses LangChain’s agent framework for tool calling
SQL Tool: Executes dynamic SQL queries on the Chinook database
C1 Visualization: Automatically visualizes responses with rich UI components
Context Awareness: Maintains conversation context across interactions (see the sketch after this list)
Error Handling: Robust error handling for both database and API calls
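To exercise the context handling, a follow-up call can feed the previous C1 response back in as c1Response so the agent can refer to it. A sketch using LangServe's RemoteRunnable client (the example queries are illustrative):

from langserve import RemoteRunnable

chain = RemoteRunnable("http://localhost:4001/chain")

# First turn: no prior context.
first = chain.invoke({"query": "List the albums by AC/DC", "c1Response": ""})

# Second turn: pass the previous C1 response back so the agent can refer to it.
second = chain.invoke({"query": "Which of those albums has the most tracks?", "c1Response": first})
print(second)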
This example demonstrates the power of combining LangChain’s agent capabilities with C1’s generative UI. You can extend this by adding more tools, different databases, or custom UI components.
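For instance, a second tool that lists the available tables keeps the agent from guessing at the schema. A sketch following the same pattern as execute_sql_query (the tool name and docstring here are illustrative, not part of the original example):

# Add to backend/main.py; sqlite3 and tool are already imported there.
import sqlite3
from langchain_core.tools import tool

@tool
def list_tables() -> str:
    """List all tables in the Chinook database."""
    try:
        conn = sqlite3.connect('db/Chinook.db')
        rows = conn.execute(
            "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name"
        ).fetchall()
        conn.close()
        return "\n".join(name for (name,) in rows)
    except Exception as e:
        return f"Error listing tables: {str(e)}"

# Register it alongside the existing tool before the agent is built:
# tools = [execute_sql_query, list_tables]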