MCP Fundamentals

Apr 10, 2026 · Python 3, JavaScript, macOS, Windows, VS Code

Lesson 03: Building MCP Clients & LLM Integration

Connecting Claude to Local Tools

Transcript

In this video, you will run the scripts you just wrote to verify that the remote Anthropic API can successfully talk to your local machine.

Because the Anthropic API runs in the cloud, it cannot see localhost. Therefore, this demo involves three distinct steps: starting the server, creating a tunnel, and running the client.

Step 1: Start the Local Server

First, open a terminal window and run your HTTP-based MCP server:

uv run --with mcp mcp_server_http.py

Wait for the “Application startup complete” message. Your server is now listening on port 8000.

Step 2: Create a Public Tunnel

Open a second terminal window. You need to expose port 8000 to the public internet so Claude can reach it. You will use ngrok for this.

Run the following command:

ngrok http 8000 --host-header "localhost:8000"

Ngrok provides you with a forwarding URL. Look for the line that starts with https and ends with .ngrok-free.app or .ngrok-free.dev. Copy this URL to your clipboard.

Step 3: Run the LLM Client

Now, open a third terminal window. This is where you will run the client script.

First, you must provide the server URL to your script using an environment variable. If you are using PowerShell, set it like this, pasting the URL you just copied:

$env:MCP_SERVER_URL = "https://your-url.ngrok-free.app/mcp"

If you are on Mac or Linux, you can use the export command:

export MCP_SERVER_URL="https://your-url.ngrok-free.app/mcp"

Be sure to add /mcp to the end of your ngrok URL.
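Because a missing or doubled /mcp path is an easy mistake here, a tiny helper (hypothetical, not part of the course scripts) shows the transformation you are doing by hand:

```python
def mcp_endpoint(ngrok_url: str) -> str:
    """Build the MCP endpoint from a bare ngrok forwarding URL
    by appending the /mcp path the HTTP server serves from."""
    return ngrok_url.rstrip("/") + "/mcp"
```

For example, `mcp_endpoint("https://abc123.ngrok-free.app")` yields `https://abc123.ngrok-free.app/mcp`, which is the value `MCP_SERVER_URL` should hold.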

Also, set your Anthropic API key as an environment variable, just as before:

Reminder: Ensure your account has active credits (via top-up or plan inclusion) before running the script.

export ANTHROPIC_API_KEY="your-key-here"

Now, run the client script. Make sure to include the anthropic dependency.

uv run --with mcp --with anthropic anthropic_llm.py
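Before looking at the output, it helps to picture the control flow inside the client. The real anthropic_llm.py uses the Anthropic SDK and an MCP client session; the sketch below is a hedged approximation in which `call_model` and `call_mcp_tool` are stand-in callables (all names are illustrative assumptions, not the course's actual code):

```python
def run_tool_loop(call_model, call_mcp_tool, prompt: str) -> str:
    """Drive one prompt through a model/tool round trip.

    `call_model` stands in for a Messages API request; `call_mcp_tool`
    stands in for invoking a tool over the MCP session.
    """
    messages = [{"role": "user", "content": prompt}]
    while True:
        reply = call_model(messages)  # one round trip to the model
        if reply["stop_reason"] != "tool_use":
            return reply["text"]  # model answered directly; we are done
        print(f"[System] Calling MCP Tool: {reply['tool_name']}")
        result = call_mcp_tool(reply["tool_name"], reply["tool_input"])
        print(f"[System] Tool Result: {result}")
        # Feed the tool result back so the model can compose the final answer
        messages.append({"role": "assistant", "content": reply["text"]})
        messages.append({"role": "user", "content": f"Tool result: {result}"})
```

The key idea is the loop: the script keeps calling the model until it stops asking for tools, relaying each tool result back as a new message.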

Watch the output closely.

First, you see [System] Calling MCP Tool. This confirms that Claude received your prompt (“How old is my dog?”), realized it didn’t know the answer, and decided to call your calculate_dog_in_human_year tool.

Next, you see [System] Tool Result: 28. Your local Python server performed the math and sent the number 28 back to the cloud.
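The video doesn't show the tool's formula, but a common rule of thumb (15 human years for the first dog year, 9 for the second, 4 for each year after) reproduces the 28 seen here for a 3-year-old dog. One plausible implementation of the tool's body (an assumption; the course's actual formula may differ):

```python
def dog_to_human_years(dog_years: int) -> int:
    """Possible body of calculate_dog_in_human_year (assumed formula:
    15 for year one, +9 for year two, +4 per year after that)."""
    if dog_years <= 0:
        return 0
    if dog_years == 1:
        return 15
    return 24 + 4 * (dog_years - 2)
```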

Finally, Claude uses that raw data to generate the final response: “If your 3-year-old dog were a human, she would be 28 years old!”

You have successfully bridged the gap between a cloud-based LLM and your local code execution using MCP.
