
Connect any LLM to any MCP server without any MCP client.

Unlocking Direct MCP Server Communication with the New Python Client Library

Communicating with Model Context Protocol (MCP) servers has traditionally required a dedicated MCP client such as Windsurf, Cursor, or Claude Desktop. But what if you could streamline this process directly within your code, using any large language model (LLM) you prefer? This is now possible thanks to a new MCP client library that lets developers integrate MCP servers seamlessly with LLMs, all within a flexible, modular Python framework.

In this blog post, we’ll walk you through what this library offers, how to set it up, and some creative ways to use it to build powerful autonomous agents.


What Is the New MCP Client Library?

This new MCP client library is a Python-based tool designed to simplify communication between your code and MCP servers. Unlike previous solutions that required a dedicated client, this library lets you bind any LLM (OpenAI, Anthropic, Groq, Llama, and more) directly to MCP servers using an agent-based architecture.

Key Features:

  • LLM Agnostic: Use any preferred LLM provider with simple configuration.
  • Agent-Based: The agent handles communication logic, including prompt management and step limits.
  • Multi-Server Support: Manage multiple MCP servers in a single file and intelligently route requests.
  • Modular & Extensible: Easily modify code to build unique, autonomous applications.
  • HTTP Support: Connect over HTTP to MCP servers running on localhost or on remote hosts.
  • Easy Installation: Python package with straightforward setup steps.

How to Get Started: Installation and Setup

Step 1: Verify Python Installation

Make sure you have Python installed on your system. You can check this by running:

```bash
python --version
```

or for Python 3:

```bash
python3 --version
```

Step 2: Create and Activate a Virtual Environment

Creating a virtual environment isolates your project dependencies. Use the following commands:

  • Windows:

```bash
python -m venv env
.\env\Scripts\activate
```

  • macOS/Linux:

```bash
python3 -m venv env
source env/bin/activate
```

If you’re unfamiliar with these commands, you can easily get them from ChatGPT or online resources.

Step 3: Install the MCP Client Library and LLM Dependencies

Use the package manager pip3 to install the MCP client library and any LLM-specific packages you need:

```bash
pip3 install mcp-client-library
pip3 install langchain-openai     # For OpenAI
pip3 install langchain-anthropic  # For Anthropic
```

Check the GitHub repository’s documentation for other providers like Groq or Llama.


Configuring Your Project with API Keys

After setup, open your project folder in an IDE or editor like Cursor:

```bash
cursor .
```

Create a new .env file and add your API key for the LLM provider you intend to use. For example, if you’re using OpenAI:

```
OPENAI_API_KEY=your_openai_api_key_here
```

Only add the key for the provider you plan to use.
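
To confirm the key is actually being picked up, you can run a quick check with python-dotenv, a common pattern for loading .env files. Note this is an optional sanity check; the MCP client library may load the .env file for you, so consult its README:

```python
# pip3 install python-dotenv
import os

from dotenv import load_dotenv

load_dotenv()  # reads the .env file in the current directory

# Confirm the key is present without printing the secret itself
print("OPENAI_API_KEY loaded:", os.getenv("OPENAI_API_KEY") is not None)
```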


How the Code Works: A Quick Overview

Here’s a simplified explanation of the setup:

  1. Load Environment Variables: Your .env file is loaded to access API keys.
  2. Create MCP Client: The client connects to the MCP server using configurations in a separate file.
  3. Define LLM: Set which LLM model you want to use (e.g., OpenAI GPT-4).
  4. Create Agent: The agent binds the LLM and MCP client, sets max steps, and provides prompts.
  5. Execute and Receive Output: The agent communicates with the MCP server and returns results.

This architecture eliminates the need for separate MCP clients and allows you to build modular applications, such as autonomous WhatsApp agents or real estate listing filters.
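
Here is a minimal sketch of those five steps in code. The class and method names used here (MCPClient, MCPAgent, from_config_file) are illustrative assumptions based on the description above; check the library's README for the exact API of the version you install:

```python
import asyncio

from dotenv import load_dotenv
from langchain_openai import ChatOpenAI

# Hypothetical import: the module and class names may differ in the
# actual library; see its README.
from mcp_client_library import MCPAgent, MCPClient


async def main():
    # 1. Load environment variables (API keys) from .env
    load_dotenv()

    # 2. Create the MCP client from a separate config file listing your server(s)
    client = MCPClient.from_config_file("mcp_config.json")

    # 3. Define which LLM to use
    llm = ChatOpenAI(model="gpt-4o")

    # 4. Create the agent that binds the LLM and the MCP client
    agent = MCPAgent(llm=llm, client=client, max_steps=30)

    # 5. Execute a prompt and receive the output
    result = await agent.run("List the tools available on the connected server.")
    print(result)


asyncio.run(main())
```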


Real-World Example: Airbnb Listing Filter

Using the Airbnb MCP server, you can create an agent that fetches listings filtered by your preferences—like properties with pools and high ratings. The agent queries the MCP server, processes the data through the LLM, and returns curated results. This showcases the power and flexibility of the framework.
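
Under the same assumed API, the Airbnb example might look like the sketch below. The @openbnb/mcp-server-airbnb package name and the config shape are illustrative; use whatever the repository's Airbnb example actually specifies:

```python
import asyncio

from dotenv import load_dotenv
from langchain_openai import ChatOpenAI

from mcp_client_library import MCPAgent, MCPClient  # hypothetical names

# Illustrative config: an Airbnb MCP server launched via npx.
config = {
    "mcpServers": {
        "airbnb": {
            "command": "npx",
            "args": ["-y", "@openbnb/mcp-server-airbnb"],
        }
    }
}


async def main():
    load_dotenv()
    client = MCPClient.from_dict(config)  # assumed constructor
    agent = MCPAgent(llm=ChatOpenAI(model="gpt-4o"), client=client, max_steps=30)

    result = await agent.run(
        "Find Airbnb listings in Barcelona for 2 guests with a pool "
        "and a rating above 4.8, and summarize the top 5."
    )
    print(result)


asyncio.run(main())
```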


Enhancing Your Development with Cursor and Docs Integration

If you use Cursor as your code editor, you might notice it lacks context about the MCP framework by default. To fix this:

  1. Go to Cursor’s features section and open the docs.
  2. Add a new doc and paste the link to the MCP library’s README file from the GitHub repository.
  3. Cursor will index this documentation, enabling it to generate context-aware code snippets.

Additionally, you can convert the entire repository into an LLM-friendly text dump by editing the repository URL: replace hub in github.com with ingest, so github.com/user/repo becomes gitingest.com/user/repo. You can then paste the resulting text into an LLM and query the repo content directly for clarifications or help.


Expanding Possibilities: Multi-Server and Service Manager Support

The library supports defining multiple MCP servers in a single configuration file. You can either:

  • Specify which server should handle each request, or
  • Enable the service manager, which intelligently routes requests to the appropriate server.

You can also control which tools the agent can access, opening up possibilities for complex, multi-faceted applications.
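
A rough sketch of what a multi-server setup might look like follows. The use_server_manager flag is an assumed parameter name for the service manager described above, and the server entries are illustrative:

```python
from langchain_openai import ChatOpenAI

from mcp_client_library import MCPAgent, MCPClient  # hypothetical names

# One config, multiple servers; the service manager decides which
# server handles a given request.
config = {
    "mcpServers": {
        "airbnb": {
            "command": "npx",
            "args": ["-y", "@openbnb/mcp-server-airbnb"],
        },
        "playwright": {
            "command": "npx",
            "args": ["@playwright/mcp@latest"],
        },
    }
}

client = MCPClient.from_dict(config)
agent = MCPAgent(
    llm=ChatOpenAI(model="gpt-4o"),
    client=client,
    use_server_manager=True,  # assumed flag: route requests automatically
)
```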


Why You Should Try This Framework

This MCP client library is a game-changer for developers looking to leverage MCP servers and LLMs together. It offers:

  • Flexibility to use your favorite LLM.
  • Simplified architecture without needing separate MCP clients.
  • Modular design for creative application development.
  • Support for local and remote servers.
  • Intelligent multi-server management.

The code is open-source and available on GitHub, with example projects like using Playwright with Airbnb or running the Blender MCP server.


Final Thoughts

Whether you’re an experienced developer or new to coding, this library provides the tools and flexibility to build powerful, autonomous agents that interact with MCP servers directly. Use ChatGPT or Cursor to help write or modify your code, and explore the vast possibilities this framework unlocks.

Ready to dive in? Check out the GitHub repo, try the examples, and start building your own MCP-powered applications today!


If you found this post helpful, consider subscribing and supporting the ongoing development through donations linked in the repository. Happy coding!
