CodexSpot

LangChain MCP Server

Integrates AI assistants with LangChain through MCP, enabling execution of chains and querying of vector stores to build retrieval-augmented generation workflows from your AI environment.

stdio · Local · API Key · pip

What This Server Exposes

Tools

  • run_chain
  • search_vectorstore

Resources

  • chain outputs
  • vector store results
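Once the server is connected, an MCP client invokes these tools with the protocol's standard tools/call request. The argument names below (query and k for search_vectorstore) are illustrative assumptions, not documented parameters; check the server's tools/list response for the actual input schema.

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_vectorstore",
    "arguments": {
      "query": "how do refunds work?",
      "k": 4
    }
  }
}
```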

Setup Instructions

  1. Ensure Python 3.10+ and pip are installed
  2. Install the server with: pip install mcp-server-langchain
  3. Set the OPENAI_API_KEY environment variable if your chains use OpenAI
  4. Add the LangChain server config to your Claude Desktop configuration
  5. Restart Claude Desktop
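A small preflight script can verify steps 1–3 before you touch the Claude Desktop config. This is a stdlib-only sketch; the module name mcp_server_langchain comes from the install step above.

```python
import importlib.util
import os
import sys


def preflight() -> list[str]:
    """Return a list of human-readable problems; an empty list means ready."""
    problems = []
    # Step 1: the server requires Python 3.10 or newer.
    if sys.version_info < (3, 10):
        problems.append(f"Python 3.10+ required, found {sys.version.split()[0]}")
    # Step 2: check that the installed package's module is importable.
    if importlib.util.find_spec("mcp_server_langchain") is None:
        problems.append("mcp_server_langchain not found (pip install mcp-server-langchain)")
    # Step 3: only needed if your chains call OpenAI models.
    if not os.environ.get("OPENAI_API_KEY"):
        problems.append("OPENAI_API_KEY is not set (required only for OpenAI-backed chains)")
    return problems


for problem in preflight():
    print("WARN:", problem)
```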

Configuration

Claude Desktop Configuration

claude_desktop_config.json
{
  "mcpServers": {
    "langchain": {
      "command": "python",
      "args": [
        "-m",
        "mcp_server_langchain"
      ],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key"
      }
    }
  }
}
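If you already have other servers configured, merging the entry programmatically avoids hand-editing JSON. The sketch below assumes the default macOS config path (on Windows the file lives under %APPDATA%\Claude); back up the file before running anything like this.

```python
import json
from pathlib import Path

# Default Claude Desktop config location on macOS; adjust for your platform.
CONFIG_PATH = Path.home() / "Library/Application Support/Claude/claude_desktop_config.json"

# The entry from the configuration block above.
LANGCHAIN_ENTRY = {
    "command": "python",
    "args": ["-m", "mcp_server_langchain"],
    "env": {"OPENAI_API_KEY": "your-openai-api-key"},
}


def add_server(path: Path, name: str, entry: dict) -> dict:
    """Merge one server entry into the config file, preserving existing servers."""
    config = json.loads(path.read_text()) if path.exists() else {}
    config.setdefault("mcpServers", {})[name] = entry
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(config, indent=2))
    return config


# Usage: add_server(CONFIG_PATH, "langchain", LANGCHAIN_ENTRY)
```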

Environment Variables

OPENAI_API_KEY
OpenAI API key used by LangChain chains and embeddings. Required if your chains use OpenAI models.
🔒 Secret

Server Manifest

No official server.json manifest found.

Compatibility

Claude Desktop · Claude Code

Safety Notes

LangChain chains can make outbound calls to external APIs and databases. Audit your chain definitions to understand what data they access. Protect any API keys used within chains.

This listing is for informational purposes only. CodexSpot is not affiliated with LangChain MCP Server.