
Project 3: Explore MCP Servers with LibSQL

Overview

This project explores Model Context Protocol (MCP) servers, specifically focusing on integrating persistent memory capabilities using LibSQL. It demonstrates how AI agents can be augmented with MCP tools to maintain stateful conversations and persistent data storage.

What is MCP?

The Model Context Protocol (MCP) is an open standard that enables:

  • Connecting AI agents with tools - Provides a standardized way for agents to access external tools and capabilities
  • Resource sharing - Agents can access and share resources like databases, APIs, and local files
  • Prompt templates - Reusable prompt patterns for common tasks
  • Tool discovery - Agents can dynamically discover available tools from MCP servers

Project Purpose

This project demonstrates:

  1. Setting up and connecting to an MCP server (mcp-memory-libsql)
  2. Using LibSQL as a persistent memory backend for AI agents
  3. Creating agents that maintain context across conversations
  4. Implementing multi-turn conversations with persistent state

Project Structure

Project_3_MCP_LibSQL/
├── Src/
│   └── Lab3.py              # Main demonstration script
├── memory/
│   └── kamal.db             # LibSQL database for persistent memory
├── pyproject.toml           # Project dependencies and metadata
├── .env                     # Environment configuration (secrets)
├── .gitignore               # Git ignore rules
├── .venv/                   # Python virtual environment
└── README.md                # This file

Setup Instructions

Prerequisites

  • Python 3.12+
  • Node.js (for running the MCP server)
  • pip or uv package manager

Installation

  1. Clone or navigate to the project directory

    cd /home/dell/projects/Project_3_MCP_LibSQL
  2. Activate the virtual environment

    source .venv/bin/activate
  3. Install dependencies

    pip install -e .
  4. Set up environment variables Create or update .env file with:

    OPENAI_API_KEY=your_api_key_here
    LIBSQL_URL=file:./memory/kamal.db

Usage

Running the Main Script

python Src/Lab3.py

This script demonstrates:

  1. Connecting to an MCP server (mcp-memory-libsql) running locally
  2. Creating an AI agent with the MCP server attached
  3. Running multi-turn conversations where the agent can:
    • Store information about the user ("Kamal is an AI engineer")
    • Recall stored information in subsequent queries
    • Provide context-aware responses based on persistent memory

Key Components

MCP Server Configuration:

  • Server: mcp-memory-libsql (runs via npx)
  • Database: Local LibSQL file-based database
  • Environment: LIBSQL_URL=file:./memory/kamal.db

Agent Configuration:

  • Model: GPT-4.1-mini
  • Instructions: Configured to use entity tools for persistent memory
  • MCP Servers: Attached LibSQL memory server
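The configuration above can be sketched as follows, assuming the OpenAI Agents SDK (the openai-agents package, which provides Agent, Runner, and MCPServerStdio). Class and parameter names here are a plausible reconstruction, not a copy of the project's actual Lab3.py.

```python
# Hedged sketch: an agent with the mcp-memory-libsql server attached,
# assuming the openai-agents SDK. Run with: asyncio.run(main())
import asyncio

async def main() -> None:
    # Imports deferred so the sketch reads top to bottom; both names
    # come from the openai-agents package.
    from agents import Agent, Runner
    from agents.mcp import MCPServerStdio

    async with MCPServerStdio(
        params={
            "command": "npx",
            "args": ["-y", "mcp-memory-libsql"],
            "env": {"LIBSQL_URL": "file:./memory/kamal.db"},
        }
    ) as memory_server:
        agent = Agent(
            name="memory_agent",
            model="gpt-4.1-mini",
            instructions="Use the entity tools to store and recall user facts.",
            mcp_servers=[memory_server],
        )
        # First turn stores a fact; second turn recalls it from LibSQL.
        await Runner.run(agent, "My name's Kamal. I'm an AI engineer.")
        recalled = await Runner.run(agent, "What do you know about me?")
        print(recalled.final_output)
```

Attaching the server via an async context manager ensures the npx subprocess is cleanly shut down when the conversation ends.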

Dependency Highlights

  • anthropic - Anthropic API client
  • autogen-agentchat - Agent orchestration framework
  • langchain - LLM framework with multiple integrations
  • langgraph - Graph-based agent workflows
  • mcp - Model Context Protocol implementation
  • mcp-server-fetch - MCP server for web fetching
  • openai - OpenAI API client
  • langsmith - LLM tracing and debugging

Key Concepts

LibSQL Integration

LibSQL is a lightweight SQL database that runs locally. In this project:

  • Stores entity information (e.g., user details)
  • Provides persistent memory across agent sessions
  • Allows agents to query and update information
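Because LibSQL file databases use the SQLite storage format, the store-and-query pattern can be illustrated with Python's built-in sqlite3 module. The table layout below is hypothetical, not mcp-memory-libsql's actual schema.

```python
# Illustrative entity storage in a local SQLite-compatible database.
import sqlite3

conn = sqlite3.connect(":memory:")  # swap in "memory/kamal.db" for a real file
conn.execute(
    "CREATE TABLE IF NOT EXISTS entities (name TEXT PRIMARY KEY, observation TEXT)"
)
# Store a fact about the user (hypothetical schema, for illustration).
conn.execute(
    "INSERT OR REPLACE INTO entities VALUES (?, ?)", ("Kamal", "AI engineer")
)
conn.commit()

# Recall it later, as the MCP server would on the agent's behalf.
row = conn.execute(
    "SELECT observation FROM entities WHERE name = ?", ("Kamal",)
).fetchone()
print(row[0])  # -> AI engineer
```

In the project, the MCP server performs these reads and writes itself; the agent only sees high-level entity tools.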

Storage Capacity for Personal Use:

  • Local database (file-based): limited only by your available disk space
  • Turso Cloud (hosted): up to 9 GB on the free tier
  • This project: uses local file-based storage, so capacity is effectively unbounded for personal use
  • No cost: running locally means zero subscription fees

Agent Execution

The code uses:

  • Agents: AI entities with role-specific instructions
  • Runners: Execute agents with given prompts
  • Tracing: Track conversation flows to aid debugging

Example Workflow

  1. Agent receives initial input: "My name's Kamal. I'm an AI engineer..."
  2. Agent uses MCP tools to store this information in LibSQL
  3. In a second query, agent asks: "What do you know about me?"
  4. Agent retrieves stored information from memory using MCP tools
  5. Agent provides context-aware response based on persisted data

Notes

  • The project uses async/await patterns for non-blocking MCP server communication
  • Rich markdown formatting is used for console output
  • The memory database persists between runs (stored in memory/kamal.db)
  • Node options are configured to suppress deprecation warnings
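The Node options note above amounts to setting NODE_OPTIONS in the environment passed to the subprocess that runs the MCP server. The helper below is a hypothetical illustration of that tweak; --no-deprecation is a real Node.js flag.

```python
# Hypothetical helper: build the environment for the npx-launched MCP server.
import os

def mcp_server_env(libsql_url: str = "file:./memory/kamal.db") -> dict[str, str]:
    """Return a copy of the current environment with the server's settings."""
    env = dict(os.environ)
    env["LIBSQL_URL"] = libsql_url
    env["NODE_OPTIONS"] = "--no-deprecation"  # silence Node deprecation warnings
    return env

print(mcp_server_env()["NODE_OPTIONS"])
```

Copying os.environ rather than replacing it keeps PATH and other variables intact so npx can still be found.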

Development

To extend this project:

  1. Add more MCP tools: Modify the agent configuration
  2. Enhance memory schema: Update LibSQL database structure
  3. Multi-agent coordination: Add multiple agents with shared MCP servers
  4. API integration: Use mcp-server-fetch or create custom MCP servers

Troubleshooting

Issue: MCP server connection timeout

  • Ensure Node.js is installed: node --version
  • Check npx availability: npx --version

Issue: LibSQL database locked

  • Delete memory/kamal.db and let it regenerate
  • Ensure only one process accesses the database

Issue: API key errors

  • Verify .env file has correct OPENAI_API_KEY
  • Use python -c "from dotenv import load_dotenv; load_dotenv(); import os; print(os.getenv('OPENAI_API_KEY'))" to debug

Author Notes

This project is part of learning AI Agents and the MCP protocol. It serves as a practical guide to:

  • Understanding MCP architecture
  • Building stateful AI agents
  • Implementing persistent memory
  • Orchestrating multi-turn conversations with tools

Last Updated: April 2026
Status: Active Development
