
The Pitch: LLMs are incredibly smart, but they are essentially "brains in a jar." They have no idea what is happening inside your company's private network, your local file system, or your internal databases.
Until now, giving an LLM access to your database meant writing messy, custom REST APIs and hoping the LLM formatted the JSON payload correctly. Enter the Model Context Protocol (MCP): an open standard that acts as a universal "USB-C cable" between AI models and your local data.
Today, we are going deep. We are building a local TypeScript MCP server that allows Claude (or Cursor) to natively query a local SQLite database, securely passing arguments back and forth.
🧠 How the Architecture Works
Before we write code, you need to understand the flow:
The Host (Claude Desktop / Cursor): The app you are chatting with.
The MCP Client: Built into the host, it negotiates with your local servers.
Your MCP Server: The bridge we are building today. It defines Tools and Prompts.
The Local Resource: Your SQLite DB, local API, or file system.
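Under the hood, every hop in that chain is plain JSON-RPC 2.0. The SDK builds and parses these messages for you, but it helps to know what a tool invocation actually looks like on the wire. Here is a rough sketch (the `tools/call` method and `params` shape follow the MCP spec; the `buildToolCall` helper is just for illustration):

```typescript
// Sketch of the JSON-RPC 2.0 message an MCP client sends to invoke a tool.
// The SDK handles serialization for you; this only illustrates the shape.
interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: {
    name: string;                       // which tool to run
    arguments: Record<string, unknown>; // must satisfy the tool's inputSchema
  };
}

function buildToolCall(
  id: number,
  name: string,
  args: Record<string, unknown>
): ToolCallRequest {
  return { jsonrpc: "2.0", id, method: "tools/call", params: { name, arguments: args } };
}

const msg = buildToolCall(1, "query_users_by_role", { role: "Admin" });
console.log(JSON.stringify(msg));
```

The key insight: the model never touches your database directly. It only ever emits a structured message like this one, and your server decides what to do with it.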
🛠️ The Stack
Node.js & TypeScript
The Official MCP SDK (@modelcontextprotocol/sdk)
SQLite3 (to simulate a real local database)
Step 1: Initialize & Mock the Database
First, let's set up the project and create a fake database of "users" so the AI has something to read.
mkdir deep-mcp-server && cd deep-mcp-server
npm init -y
npm install @modelcontextprotocol/sdk sqlite3
npm install -D typescript @types/node @types/sqlite3
npx tsc --init
Create a quick setup-db.js file and run it (node setup-db.js) to generate your local data:
const sqlite3 = require('sqlite3').verbose();
const db = new sqlite3.Database('./local_users.db');
db.serialize(() => {
db.run("CREATE TABLE IF NOT EXISTS users (id INT, name TEXT, role TEXT, active BOOLEAN)");
db.run("INSERT INTO users VALUES (1, 'Alice Cyber', 'Admin', true)");
db.run("INSERT INTO users VALUES (2, 'Bob The Builder', 'Developer', false)");
db.run("INSERT INTO users VALUES (3, 'Charlie Root', 'DevOps', true)");
});
db.close();
console.log("Database created!");
Step 2: The Server Code (index.ts)
This is where the magic happens. We are going to define a tool called query_users_by_role. Notice how we are strictly defining the inputSchema so the LLM knows exactly what parameters it is allowed to pass.
import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { CallToolRequestSchema, ListToolsRequestSchema } from "@modelcontextprotocol/sdk/types.js";
import sqlite3 from "sqlite3";
// Connect to our local SQLite DB
const db = new sqlite3.Database('./local_users.db');
// 1. Initialize the MCP Server
const server = new Server(
{ name: "sqlite-mcp-bridge", version: "2.0.0" },
{ capabilities: { tools: {} } }
);
// 2. Define the Tools and their strict JSON Schemas
server.setRequestHandler(ListToolsRequestSchema, async () => ({
tools: [{
name: "query_users_by_role",
description: "Queries the local user database by their job role.",
inputSchema: {
type: "object",
properties: {
role: {
type: "string",
description: "The job role to search for (e.g., 'Admin', 'Developer', 'DevOps')"
}
},
required: ["role"]
}
}]
}));
// 3. Handle the execution logic and interact with the DB
server.setRequestHandler(CallToolRequestSchema, async (request) => {
if (request.params.name === "query_users_by_role") {
// Extract the argument passed by the LLM
const requestedRole = request.params.arguments?.role;
if (!requestedRole || typeof requestedRole !== "string") {
throw new Error("Invalid or missing 'role' argument.");
}
// Wrap the SQLite query in a Promise
const data = await new Promise((resolve, reject) => {
db.all("SELECT * FROM users WHERE role = ?", [requestedRole], (err, rows) => {
if (err) reject(err);
else resolve(rows);
});
});
// Return the data to the LLM in a readable format
return {
content: [{
type: "text",
text: JSON.stringify(data, null, 2)
}]
};
}
throw new Error("Tool not found");
});
// 4. Boot the server using Standard I/O Transport
const transport = new StdioServerTransport();
await server.connect(transport);
console.error("MCP SQLite Server is running on stdio");
Step 3: Connect to your AI Client
Compile the TypeScript code (npx tsc). Since the SDK's imports are ES modules, make sure your tsconfig.json uses a compatible setting (e.g. "module": "node16") with "outDir": "dist", and set "type": "module" in your package.json. Now we need to tell our AI client where this server lives. If you are using Claude Desktop, open your config file (~/Library/Application Support/Claude/claude_desktop_config.json on Mac, or %APPDATA%\Claude\claude_desktop_config.json on Windows):
{
"mcpServers": {
"local-db-bridge": {
"command": "node",
"args": ["/absolute/path/to/deep-mcp-server/dist/index.js"]
}
}
}
Step 4: The "Aha!" Moment
Restart Claude Desktop and open a new chat. Try this prompt:
"Can you check our local database and tell me the names of all the active Admins?"
What happens behind the scenes:
Claude reads the prompt and realizes it doesn't know the answer.
It checks its available MCP tools and spots query_users_by_role.
It intelligently extracts the concept of "Admin" from your prompt.
It calls your local Node script, passing {"role": "Admin"} as the argument.
Your script queries the local SQLite DB, formats the JSON, and sends it back.
Claude reads the JSON and responds: "Found it. Alice Cyber is your active Admin."
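That strict inputSchema is what lets Claude construct {"role": "Admin"} correctly in the first place, and the guard in the CallTool handler is your last line of defense if the model gets it wrong. Here is a minimal sketch of that kind of check, generalized into a standalone helper (hypothetical, not part of the SDK):

```typescript
// Hypothetical helper mirroring the guard in the CallTool handler: verify that
// every required property is present and that string-typed properties are strings.
type Schema = {
  properties: Record<string, { type: string }>;
  required?: string[];
};

function validateArgs(schema: Schema, args: Record<string, unknown>): string[] {
  const errors: string[] = [];
  for (const key of schema.required ?? []) {
    if (args[key] === undefined) errors.push(`missing required argument '${key}'`);
  }
  for (const [key, value] of Object.entries(args)) {
    const prop = schema.properties[key];
    if (prop?.type === "string" && typeof value !== "string") {
      errors.push(`argument '${key}' must be a string`);
    }
  }
  return errors;
}

const schema: Schema = { properties: { role: { type: "string" } }, required: ["role"] };
console.log(validateArgs(schema, { role: "Admin" })); // []
console.log(validateArgs(schema, {}));                // ["missing required argument 'role'"]
```

In production you would reach for a full JSON Schema validator (or a library like Zod) rather than hand-rolling this, but the principle is the same: never trust the arguments the model sends you.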
The Takeaway: You've just created a secure, local RAG (Retrieval-Augmented Generation) pipeline without uploading a single byte of your proprietary database to the cloud. By mastering inputSchema, you can build tools that execute highly complex, parameterized scripts natively from your IDE or chat client.
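To see how far inputSchema can take you, here is a sketch of a stricter schema for a hypothetical second tool (search_users is made up for illustration). JSON Schema keywords like enum, minimum, and maximum constrain the model to legal values before your handler ever runs:

```typescript
// Sketch of a stricter schema for a hypothetical "search_users" tool.
// The enum and numeric bounds tell the model exactly which values are legal.
const searchUsersTool = {
  name: "search_users",
  description: "Search users by role and activity status, with a result limit.",
  inputSchema: {
    type: "object",
    properties: {
      role: {
        type: "string",
        enum: ["Admin", "Developer", "DevOps"], // only roles present in our mock DB
        description: "The job role to filter on.",
      },
      activeOnly: {
        type: "boolean",
        description: "If true, return only active users.",
      },
      limit: {
        type: "number",
        minimum: 1,
        maximum: 50,
        description: "Maximum number of rows to return.",
      },
    },
    required: ["role"],
  },
};
console.log(searchUsersTool.inputSchema.required);
```

The tighter the schema, the less room the model has to hallucinate parameters, and the simpler your handler's validation becomes.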
