Guided: FastMCP Foundations

Move beyond basic API calls and learn to build intelligent, tool-aware AI systems. In this lab, you’ll construct a functional Model Context Protocol (MCP) server from scratch in Python. You will learn how to expose custom tools, handle client interactions, and adhere to a growing industry standard for AI agent communication. By the end, you’ll have a running server capable of securely providing tools to any MCP-compliant client—a crucial skill for building sophisticated, extensible AI applications.

Lab info
Level: Intermediate
Last updated: Nov 18, 2025
Duration: 28m

Table of Contents
  1. Challenge

    Introduction

    Welcome to the FastMCP Foundations Code Lab!

    In this lab, you'll build a production-ready MCP (Model Context Protocol) server for a "Corporate Assistant" application using FastMCP. The MCP is a standardized way for LLMs to securely access external data and tools—think of it as a bridge between AI models and your organization's systems.

    What you'll learn:

    • Structure an MCP server using FastMCP
    • Expose data via Resources (static and dynamic)
    • Provide reusable Prompts
    • Enable AI actions through Tools with proper error handling

    What you'll build:

    Your Corporate Assistant will help with common workplace tasks like:

    • Checking company holidays and employee information
    • Generating standardized email templates
    • Listing meeting rooms and sending emails

    By the end of this lab, you'll have a fully functional MCP server that can be connected to any MCP-compatible LLM client.

    Project files:

    • server.py: server entry point
    • resources.py: define resources
    • prompts.py: define prompts
    • tools.py: define tools
  2. Challenge

    Step 2: Creating the Core MCP Server

    Now that you have your files, you'll build the core of your server. A FastMCP server is the central hub that manages and exposes your components to an LLM client.

    First, you will instantiate the FastMCP class from the fastmcp library. This object will act as a registry for all your components.

    Then, you’ll make the server runnable by calling its run() method, which starts the MCP service and makes it available to connected clients.
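
    A minimal sketch of this step might look like the following (the display name string is just a placeholder, not the lab's required value):

    # server.py -- minimal sketch; the display name is a placeholder
    from fastmcp import FastMCP

    # The FastMCP instance acts as the registry for resources, prompts, and tools.
    mcp = FastMCP("Corporate Assistant")

    if __name__ == "__main__":
        # run() starts the MCP service; with no arguments FastMCP typically
        # uses the stdio transport (the HTTP settings come later in the lab).
        mcp.run()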

  3. Challenge

    Step 3: Implementing MCP Resources

    Resources let you provide contextual information to an LLM. They are functions that return data when queried.

    FastMCP supports two main types:

    • Static Resources: Return fixed, unchanging data. Ideal for information like company policies or lists.
    • Dynamic Resources: Accept parameters and return data based on those inputs. Perfect for looking up information in a database or an external API.

    In this step, you'll create one of each. You'll use the @mcp.resource() decorator to register them.

    Note: The @mcp.resource() decorator requires a URI as its first argument—a unique identifier for the resource.

    For dynamic resources, use placeholders in the URI (like {employee_id}) that match function parameters.
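
    For example, the two flavors might look roughly like this (the URIs, function names, and sample data are illustrative guesses based on the lab description, not the lab's actual solution):

    # resources.py -- sketch; URIs, names, and data are placeholders
    from fastmcp import FastMCP

    mcp = FastMCP("Resources")

    # Static resource: always returns the same data.
    @mcp.resource("resource://company_holidays")
    def company_holidays() -> list[str]:
        """Return the fixed list of company holidays."""
        return ["2025-01-01", "2025-07-04", "2025-12-25"]

    # Dynamic resource: {employee_id} in the URI maps to the function parameter.
    @mcp.resource("employee://{employee_id}/details")
    def get_employee_details(employee_id: str) -> dict:
        """Look up a single employee by ID."""
        employees = {"E001": {"name": "Alice Smith", "department": "Engineering"}}
        return employees.get(employee_id, {"error": f"Unknown employee: {employee_id}"})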

  4. Challenge

    Step 4: Implementing MCP Prompts

    Prompts are reusable text templates that structure queries for an LLM. They help ensure consistent and high-quality output.

    Using the @mcp.prompt() decorator, you can define prompts with placeholders for dynamic content.

    FastMCP supports two types of prompts:

    • Prompt Templates: Simple, static strings that serve as a base for an LLM query.
    • Parameterized Prompts: Functions that accept arguments and insert them into a template, creating customized prompts on the fly.
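
    For instance (the function names come from the client tests later in the lab; the parameters and wording are placeholders):

    # prompts.py -- sketch; parameters and template text are placeholders
    from fastmcp import FastMCP

    mcp = FastMCP("Prompts")

    # Prompt template: a fixed base string for an LLM query.
    @mcp.prompt()
    def project_status_update() -> str:
        return "Summarize the current project status in three short bullet points."

    # Parameterized prompt: arguments are inserted into the template on the fly.
    @mcp.prompt()
    def generate_welcome_email(employee_name: str, team: str) -> str:
        return (
            f"Write a friendly welcome email for {employee_name}, "
            f"who is joining the {team} team this week."
        )
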
  5. Challenge

    Step 5: Implementing MCP Tools

    Tools are functions that an LLM can decide to call to perform real-world actions, like sending an email or querying a database. This is a powerful feature that extends an LLM's capabilities beyond just generating text.

    You'll use the @tools_server.tool() decorator and Python type hints to define the tool's signature, which the LLM uses to understand how to call it.

    You will also implement error handling to make your tools more robust.
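
    A rough sketch, using the tool names the client exercises later (the parameters, data, and validation rule are placeholders):

    # tools.py -- sketch; parameters, data, and validation are placeholders
    from fastmcp import FastMCP

    tools_server = FastMCP("Tools")

    @tools_server.tool()
    def list_meeting_rooms() -> list[str]:
        """Return the names of bookable meeting rooms."""
        return ["Aurora", "Borealis", "Cascade"]

    @tools_server.tool()
    def send_email(to: str, subject: str, body: str) -> str:
        """Simulate sending an email; reject obviously invalid addresses."""
        if "@" not in to:
            # Raising an exception lets the client observe a proper tool error.
            raise ValueError(f"Invalid email address: {to}")
        return f"Email sent to {to} with subject '{subject}'"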

  6. Challenge

    Step 6: Final Validation

    Your final step is to ensure all the components you've built — resources, prompts, and tools — are correctly registered with the main application instance.

    In FastMCP, you'll import the server instances from each module and combine them with the main mcp instance: use import_server() to register all the components defined in resources.py, prompts.py, and tools.py with the main server in server.py.
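
    A sketch of how server.py might come together (the module-level variable names, prefix handling, and HTTP settings are assumptions; import_server()'s exact signature varies between FastMCP versions, so check the documentation for the version you have installed):

    # server.py -- sketch; names and transport settings are assumptions
    import asyncio

    from fastmcp import FastMCP
    from resources import mcp as resources_server
    from prompts import mcp as prompts_server
    from tools import tools_server

    mcp = FastMCP("Corporate Assistant")

    async def register_components() -> None:
        # import_server() is a coroutine that copies a sub-server's resources,
        # prompts, and tools into the main instance.
        await mcp.import_server(resources_server)
        await mcp.import_server(prompts_server)
        await mcp.import_server(tools_server)

    if __name__ == "__main__":
        asyncio.run(register_components())
        # The provided client expects the server at http://localhost:8000/mcp.
        mcp.run(transport="http", port=8000)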

    Once this is done, your server is complete!

    Running the Client

    Your FastMCP server is fully configured and ready to accept connections. Here's how to test it with the provided client:

    Step 1: Start the Server

    In your first terminal, navigate to the workspace root directory and start the server:

    python3 server.py
    

    The server will start and listen on http://localhost:8000/mcp by default.

    Step 2: Run the Client

    In the second terminal, navigate to the same workspace root directory and run the client:

    python3 client.py
    

    The client will connect to your server and test all the components you've built:

    • Resources: Tests the company_holidays and get_employee_details resources
    • Prompts: Tests the generate_welcome_email and project_status_update prompts
    • Tools: Tests the list_meeting_rooms and send_email tools
    • Error Handling: Verifies that the send_email tool properly handles invalid email addresses
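
    The provided client.py handles all of this for you, but a stripped-down client along these lines (the argument values are made up) shows the shape of the interaction:

    # hypothetical client sketch using fastmcp.Client; values are illustrative
    import asyncio

    from fastmcp import Client

    async def main() -> None:
        async with Client("http://localhost:8000/mcp") as client:
            print(await client.list_tools())
            print(await client.read_resource("resource://company_holidays"))
            print(await client.get_prompt("generate_welcome_email",
                                          {"employee_name": "Alice Smith", "team": "Engineering"}))
            print(await client.call_tool("send_email",
                                         {"to": "alice@example.com",
                                          "subject": "Hello",
                                          "body": "Welcome aboard!"}))

    asyncio.run(main())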

    Expected Output

    You should see output like:

    • Lists of available resources, prompts, and tools
    • Company holidays data
    • Employee details
    • Email templates
    • Meeting room lists
    • Email sending confirmations
    • Error handling verification for invalid emails

    If everything is working correctly, you'll see "All tests completed successfully!" at the end.

    Troubleshooting

    If you encounter connection errors:

    • Make sure the server is running in the first terminal
    • Verify the server is listening on port 8000 (check the server output)
    • Ensure both terminals are in the same workspace directory

Real skill practice before real-world application

Hands-on Labs are real environments created by industry experts to help you learn. These environments let you gain knowledge and experience, practice without compromising your system, test without risk, destroy without fear, and learn from your mistakes. Hands-on Labs: practice your skills before delivering in the real world.

Learn by doing

Engage hands-on with the tools and technologies you’re learning. You pick the skill, we provide the credentials and environment.

Follow your guide

All labs have detailed instructions and objectives, guiding you through the learning process and ensuring you understand every step.

Turn time into mastery

On average, you retain 75% more of your learning if you take time to practice. Hands-on labs set you up for success to make those skills stick.

Get started with Pluralsight