
Part 1: OpenAI function calling recap

Before building the Django coding agent, we need the basic agent loop. A plain chatbot takes messages and returns text. An agent can also ask your program to run a tool, usually a normal Python function.

Plain chatbot call

Start with the OpenAI client:

from openai import OpenAI

openai_client = OpenAI()

If this fails because the API key is missing, go back to Overview and setup and set OPENAI_API_KEY before starting Jupyter.
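To fail fast before the first API call, you can check for the key up front. This is a small sketch, not part of the workshop code; `has_api_key` is a hypothetical helper name:

```python
import os

def has_api_key(env=os.environ):
    """Return True if the OpenAI API key is available in the environment."""
    return bool(env.get("OPENAI_API_KEY"))

if not has_api_key():
    print("OPENAI_API_KEY is not set; OpenAI() calls will fail with an auth error")
```
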

Use a developer message for the behavior and a user message for the actual request:

system_prompt = "You can make funny and original jokes."
user_prompt = "Tell me a joke about Alexey."

chat_messages = [
    {"role": "developer", "content": system_prompt},
    {"role": "user", "content": user_prompt},
]

Send the messages to the Responses API:

response = openai_client.responses.create(
    model="gpt-4o-mini",
    input=chat_messages,
)

print(response.output_text)

This is still a chatbot. The model answers directly because we have not given it any functions it can call.

A Python function as a tool

Now create a normal Python function. The workshop uses jokes because the behavior is easy to see, but the same shape works for writing files, booking calendar events, searching a database, or calling an internal API.

import random

def make_joke(name):
    jokes = [
        f"Why did {name} bring a pencil to the party? Because he wanted to draw some attention!",
        f"Did you hear about {name}'s bakery? Business is on a roll!",
        f"{name} walked into a library and asked for a burger. The librarian said, 'This is a library.' So {name} whispered, 'Can I get a burger?'",
        f"When {name} does push-ups, the Earth moves down.",
        f"{name} told a chemistry joke... but there was no reaction.",
    ]
    return random.choice(jokes)

Call it once directly so you know what it returns:

print(make_joke("Alexey"))

The model cannot call this function just because it exists in the notebook; it never sees your Python code at all. We need to describe the function in the schema format the OpenAI API expects.

Tool schema

The tool schema names the function, describes what it does, and defines the arguments. The LLM sees this schema, not the Python source code.

make_joke_description = {
    "type": "function",
    "name": "make_joke",
    "description": "Generates a random personalized joke using the provided name.",
    "parameters": {
        "type": "object",
        "properties": {
            "name": {
                "type": "string",
                "description": "The name to insert into the joke, personalizing the output.",
            }
        },
        "required": ["name"],
        "additionalProperties": False,
    },
}

The name must match what your tool dispatcher will use later. The parameters object is JSON Schema. Here the tool needs exactly one string: name.

Send the same messages again, this time with the tool:

response = openai_client.responses.create(
    model="gpt-4o-mini",
    input=chat_messages,
    tools=[make_joke_description],
)

Look at the raw output:

response.output

Instead of answering immediately, the model can return a function_call item. The important part is that the model says, effectively, "I want to call make_joke, and these are the arguments."
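One detail worth seeing up close: the arguments arrive as a JSON string, not a parsed dict. Here is a sketch of pulling them out, using a stand-in object with the same attributes a function_call item exposes (type, name, arguments, call_id); the call_id value is made up:

```python
import json
from types import SimpleNamespace

# Stand-in for a function_call item from response.output; real items
# expose the same attributes: type, name, arguments (a JSON string), call_id.
item = SimpleNamespace(
    type="function_call",
    name="make_joke",
    arguments='{"name": "Alexey"}',
    call_id="call_abc123",
)

if item.type == "function_call":
    args = json.loads(item.arguments)  # arguments must be parsed from JSON
    print(item.name, args)  # make_joke {'name': 'Alexey'}
```
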

The loop the API does not run

The API does not execute Python for you. Your code has to do the next steps:

  1. Detect that the response contains a function call.
  2. Parse the arguments.
  3. Execute the Python function.
  4. Send the function result back to the model.
  5. Ask the model for the final answer.
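Steps 1 through 4 can be sketched as a small helper. This is not the workshop's code (ToyAIKit handles the loop for us); `run_tools` is a hypothetical name, and the `function_call_output` message shape follows the Responses API convention for sending tool results back to the model:

```python
import json
from types import SimpleNamespace

def run_tools(output_items, tools):
    """Execute every function_call item in a response and build the
    messages that carry the results back to the model (steps 1-4)."""
    results = []
    for item in output_items:
        if getattr(item, "type", None) != "function_call":
            continue                                # step 1: detect
        arguments = json.loads(item.arguments)      # step 2: parse
        output = tools[item.name](**arguments)      # step 3: execute
        results.append({                            # step 4: result message
            "type": "function_call_output",
            "call_id": item.call_id,
            "output": str(output),
        })
    return results

# Demo with a fake function_call item instead of a live API response:
fake_item = SimpleNamespace(
    type="function_call",
    name="make_joke",
    arguments='{"name": "Alexey"}',
    call_id="call_1",
)
messages = run_tools([fake_item], {"make_joke": lambda name: f"A joke about {name}"})
print(messages[0]["output"])  # A joke about Alexey
```

For step 5, you would append the model's function_call items and these result messages to chat_messages, then call responses.create again so the model can produce its final answer.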

That loop is technical and repetitive. A separate agent workshop builds it by hand; here we use ToyAIKit to keep this session focused on the coding-agent design.

Note: this is the practical difference between a chatbot and an agent in this workshop. A chatbot only returns text. An agent can choose a tool, your program executes that tool, and the model continues with the result.

Continue with Part 2: ToyAIKit runner.
