Function calling (also known as “tool calling” or “tool use”) lets you connect LLMs to external data and systems. You can define a set of functions as tools that the LLM has access to, and it can use them when appropriate.
This guide will explain how to integrate function calls into your agent.
## Add a `get_weather` function

This section will walk you through the process of adding a `get_weather` function as a tool. Here’s how we’ll do it:

1. Define the `get_weather` function.
2. Add a `tools` definition to your LLM response handler.
3. Add the `get_weather` function to the `tools` array in your `Agent`.

### Define the `get_weather` function

Open the file containing your agent (e.g. `agent/main.py`). Then, copy and paste the `get_weather` function below into it:
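For example, a minimal placeholder implementation might look like the sketch below (the canned responses are invented for illustration; a production version would query a real weather API instead):

```python
def get_weather(location: str) -> str:
    """Return a short weather report for the given location."""
    # Placeholder data for illustration; swap in a real weather API call
    # in production.
    canned = {
        "New York City": "72°F and sunny",
        "London": "58°F with light rain",
    }
    report = canned.get(location, "65°F and partly cloudy")
    return f"The weather in {location} is {report}."
```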
We’ll make the LLM aware of the `get_weather` function by adding the `tools` definition below to your LLM response handler:
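A `tools` definition for `get_weather` in OpenAI’s function-calling format might look like this (the structural fields — `type`, `function`, `name`, `parameters` — follow OpenAI’s schema; the description strings are illustrative):

```python
# Tool definition in OpenAI's function-calling schema. How you attach this
# list to your LLM response handler depends on your agent setup.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",  # must match the Python function's name
            "description": "Get the current weather for a given location.",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city, e.g. 'New York City'.",
                    }
                },
                "required": ["location"],
            },
        },
    }
]
```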
You can learn more about the fields in the `tools` definition above by following OpenAI’s Function Calling guide.
Important: The function definition you include in your LLM response handler must exactly match the signature of the corresponding function. In other words, the function name (e.g. `get_weather` above) and the parameter name(s) (e.g. `location`) must match. Any mismatch in the function name or parameter names can cause the tool call to fail or be ignored by the LLM.
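As a safeguard, you could verify the match programmatically at startup. This helper is hypothetical — not part of the framework — and simply compares the schema against the Python function’s signature:

```python
import inspect

def check_tool_schema(func, tool_def):
    """Raise ValueError if the schema's name or parameters drift from func."""
    fn = tool_def["function"]
    if fn["name"] != func.__name__:
        raise ValueError(f"name mismatch: {fn['name']} vs {func.__name__}")
    schema_params = set(fn["parameters"]["properties"])
    sig_params = set(inspect.signature(func).parameters)
    if schema_params != sig_params:
        raise ValueError(f"parameter mismatch: {schema_params} vs {sig_params}")

def get_weather(location: str) -> str:
    return f"Weather for {location}"

check_tool_schema(get_weather, {
    "function": {
        "name": "get_weather",
        "parameters": {"properties": {"location": {"type": "string"}}},
    },
})  # passes silently; a mismatch would raise ValueError
```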
Add the `get_weather` function to the `tools` array in your `Agent`:
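Assuming the `Agent` constructor takes a `tools` keyword (an assumption based on the guide’s wording — check the framework’s reference for the real signature), the wiring might look like:

```python
# Stand-in Agent class for illustration only: the real Agent's constructor
# and behavior come from your framework, not from this sketch.
class Agent:
    def __init__(self, tools=None):
        self.tools = list(tools or [])

def get_weather(location: str) -> str:
    return f"The weather in {location} is 72°F and sunny."

agent = Agent(
    tools=[get_weather],  # register the function so the agent can execute it
)
```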
Here’s what happens if your LLM decides to call a function during a session:

1. The LLM responds with the name of the function to call (e.g. `get_weather`) and its parameters (e.g. `"location": "New York City"`).
2. Jay collects the complete set of requested function calls from the response and executes them.

You can define event handlers that respond to events emitted during the function calling lifecycle. These event handlers are useful for sending data to external systems, like your database or a third-party analytics service.
- `on_function_calls_collected` (Reference): Called when Jay receives the complete set of functions to execute from the LLM.
- `on_function_calls_executed` (Reference): Called after Jay has executed all the functions the model requested.
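To illustrate where these two events fire, here is a toy stand-in agent. The `MiniAgent` class, its hook mechanism, and the call format are all invented for illustration — Jay’s actual handler registration and payload shapes may differ, so consult its reference docs:

```python
import json

def get_weather(location: str) -> str:
    return f"The weather in {location} is 72°F and sunny."

class MiniAgent:
    """Toy agent showing when the two lifecycle events fire."""

    def __init__(self, tools):
        self.tools = {f.__name__: f for f in tools}
        self.events = []  # stand-in for sending data to a database/analytics

    def on_function_calls_collected(self, calls):
        # Fires once the complete set of requested function calls is known.
        self.events.append(("collected", [c["name"] for c in calls]))

    def on_function_calls_executed(self, results):
        # Fires after all requested functions have run.
        self.events.append(("executed", results))

    def handle(self, calls):
        self.on_function_calls_collected(calls)
        results = [
            self.tools[c["name"]](**json.loads(c["arguments"]))
            for c in calls
        ]
        self.on_function_calls_executed(results)
        return results

agent = MiniAgent([get_weather])
agent.handle([{"name": "get_weather",
               "arguments": '{"location": "New York City"}'}])
```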