Simple MCP with LangGraph
Navas Herbert


Publish Date: Jun 23

It's wild how far we've come. Not too long ago, many of us were still figuring out how to scrape public data or call basic APIs.

But now we are entering a whole new level: building MCP servers and clients, which are basically custom tools that LLMs can talk to directly.

MCP is a protocol that lets you expose any custom logic or service (like a weather API, a calculator, or even your own database) and plug it into an AI agent. It's clean and fast.

You can build your own MCP server in Python (for example with FastMCP), then connect it to an LLM via a client. The LLM can then ask your tool for answers in real time.
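To make that concrete, here is a minimal sketch of a server using the FastMCP class from the official MCP Python SDK. The server name, the tool functions, and the stdio transport are just illustrative choices, not the only way to wire it up.

```python
# math_server.py - a minimal MCP server sketch using FastMCP
# (from the official MCP Python SDK). Tool names and logic are illustrative.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Math")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b

@mcp.tool()
def multiply(a: int, b: int) -> int:
    """Multiply two numbers."""
    return a * b

if __name__ == "__main__":
    # Expose the tools over stdio so a client can launch this as a subprocess.
    mcp.run(transport="stdio")
```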

Real protocol. Real structure.

MCP is an official spec: the Model Context Protocol.

Now that the official spec is live ☝️, you can explore and share your MCP-compatible tools on Smithery. Think of it as the Hugging Face for MCPs.

**🚨 Security Note 🚨**

Of course, MCP setups involve subprocesses, API calls, and server logic, which means code is running on your machine.

  • Be cautious when connecting them to your GitHub or other accounts.
  • If you are testing stuff, I would recommend a fresh email, a new GitHub account, and maybe even an isolated virtual environment.

✍️ Not every example out there is hardened for safety.

The GitHub MCP especially: I have used it for a while, and it has access to everything in your GitHub account.

📌 Want to see a working example? Here is a repo where I have been experimenting with building an MCP weather server and a maths server (it only has addition and multiplication for now; you can add more maths tools and even more servers) and connecting them to an AI agent using LangGraph.

Here is the link to the project on GitHub: mcp-langchain

You will find:

  • a weather server
  • a math server
  • a client setup that lets an LLM use those tools intelligently (see the sketch after this list)
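As a rough idea of what that client setup can look like (not the repo's exact code), here is a sketch using the langchain-mcp-adapters package together with LangGraph's prebuilt ReAct agent. The server file paths, the model string, and the exact MultiServerMCPClient / get_tools API are assumptions that depend on the versions you install, so check the repo's README for the real wiring.

```python
# client.py - sketch of a LangGraph client that loads MCP tools over stdio.
# File paths, the model string, and the adapter API are assumptions; recent
# versions of langchain-mcp-adapters expose MultiServerMCPClient.get_tools().
import asyncio

from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent

async def main():
    client = MultiServerMCPClient(
        {
            "math": {
                "command": "python",
                "args": ["math_server.py"],     # hypothetical path
                "transport": "stdio",
            },
            "weather": {
                "command": "python",
                "args": ["weather_server.py"],  # hypothetical path
                "transport": "stdio",
            },
        }
    )
    tools = await client.get_tools()            # MCP tools become LangChain tools
    agent = create_react_agent("openai:gpt-4o-mini", tools)  # any chat model works
    result = await agent.ainvoke(
        {"messages": [{"role": "user", "content": "What is (3 + 5) * 12?"}]}
    )
    print(result["messages"][-1].content)

if __name__ == "__main__":
    asyncio.run(main())
```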

Go through the README for setup.

It's a simple one, but it can give you a roadmap and more insight.

Explore it, fork it, run it. 🤝
