Overview and setup

The system you will build

We start with a course FAQ and turn it into an agent tool. The tool is small on purpose: search(query) looks up relevant FAQ entries and returns a list of matching documents.
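To make the shape of that tool concrete, here is a toy sketch: a keyword-overlap matcher standing in for the real index, with illustrative document fields (question, text). It is not the workshop's implementation, just the same interface.

```python
# Toy stand-in for the workshop's search tool: score FAQ entries by
# word overlap with the query and return the best matches.
# The document fields (question, text) are illustrative.

def search(query, docs, num_results=3):
    query_words = set(query.lower().split())
    scored = []
    for doc in docs:
        text = f"{doc['question']} {doc['text']}".lower()
        score = sum(1 for word in query_words if word in text)
        if score > 0:
            scored.append((score, doc))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [doc for _, doc in scored[:num_results]]

docs = [
    {"question": "How do I install Kafka?", "text": "Use pip install kafka-python."},
    {"question": "Can I still join the course?", "text": "Yes, registration stays open."},
]

print(search("install kafka", docs))
```

The agent never needs to know how the lookup works internally; it only sees a function that takes a query string and returns matching documents.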

That small tool lets us focus on how the agent works under the hood. We first call the OpenAI Responses API directly, then use a helper library for the same loop, then compare production-oriented frameworks, and finally expose the tool through MCP so other clients can use it.

The workshop uses the Data Engineering Zoomcamp FAQ as the example data. The source Google Doc is formatted as question and answer pairs, but the notebook uses a parsed JSON version so we can spend the workshop on agents rather than document parsing. You can look at the parser in the FAQ parsing notebook.
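For orientation, a single parsed record might look like the dict below. The field names and the answer text are illustrative assumptions about the parsed format; inspect the workshop's JSON for the actual schema.

```python
# One parsed FAQ record (field names are illustrative; check the
# workshop's JSON for the real schema):
record = {
    "course": "data-engineering-zoomcamp",
    "section": "General course-related questions",
    "question": "Can I still join the course after the start date?",
    "text": "Yes, you can still join and submit homework.",
}
```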

Prerequisites

You can run the whole workshop locally with Jupyter and uv. The workshop code pins Python >=3.13.

Accounts and keys:

  • OpenAI API key
  • Anthropic API key if you want to run the PydanticAI Anthropic example
  • Groq API key if you want to try the optional chat-completions notebook

Local tools:

  • Python 3.13 or newer
  • uv for isolated environments
  • Jupyter Notebook
  • Cursor or VS Code if you want to test the MCP server from an IDE

Set the OpenAI key in your shell before starting Jupyter:

export OPENAI_API_KEY="sk-..."

You can also pass the key directly to OpenAI(api_key="...") while experimenting, but do not commit a notebook or script that contains the key.
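A small guard at the top of the notebook can fail fast with a readable message instead of a confusing authentication error several cells later. The helper name here is ours, not part of the workshop code.

```python
import os

def require_env(name="OPENAI_API_KEY"):
    """Return the named environment variable, or fail with a clear hint."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(
            f"{name} is not set; run `export {name}=...` before starting Jupyter"
        )
    return value
```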

Project setup

Create the main workshop project:

mkdir agents-mcp
cd agents-mcp
uv init

Add the dependencies for the first half of the workshop:

uv add jupyter openai minsearch requests toyaikit

Start Jupyter with uv run so the notebook kernel sees the packages you installed. In the live workshop, starting Jupyter outside the uv environment left the kernel unable to import them.

uv run jupyter notebook

Create a notebook named notebook.ipynb. The first check is that the OpenAI client can read your environment variable:

from openai import OpenAI

openai_client = OpenAI()

If this cell runs without an authentication error, the Python process can see OPENAI_API_KEY.

We add the framework dependencies later, when we reach those parts:

uv add openai-agents pydantic-ai

The MCP server is a separate uv project under mcp_faq/, because it has a smaller dependency set and can run independently of the notebook:

mkdir mcp_faq
cd mcp_faq
uv init
uv add fastmcp minsearch requests toyaikit
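As a preview of where this project is headed, here is a minimal sketch of an MCP server, assuming FastMCP's decorator API. The server name and the canned tool body are placeholders; the workshop version wraps the real FAQ index.

```python
# faq_server.py -- minimal MCP server sketch (illustrative; the
# workshop's server searches the parsed FAQ documents instead).
from fastmcp import FastMCP

mcp = FastMCP("faq")

@mcp.tool()
def search(query: str) -> list[dict]:
    """Look up relevant FAQ entries for the query."""
    # Placeholder result; replace with a real index lookup.
    return [{"question": "example question", "text": "example answer"}]

if __name__ == "__main__":
    mcp.run()  # serves over stdio by default
```

Exposing the same search(query) function through MCP is what lets IDE clients like Cursor or VS Code call it later in the workshop.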

You now have the same project shape used in the workshop code. Next we will load the FAQ data and create the first search function.
