Example: Add a `get_weather` function
This section will walk you through the process of adding a `get_weather` function as a tool.
Here’s how we’ll do it:

- Define the `get_weather` function.
- Make the LLM aware of the tool.
- Make Jay aware of the tool.
1. Define the `get_weather` function
Open the file containing your agent (e.g. `agent/main.py`). Then, copy and paste the `get_weather` function below into it:
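The version below is a minimal sketch: it returns a hard-coded result so the example stays self-contained. Swap the body for a call to a real weather API if you need live data.

```python
def get_weather(location: str) -> str:
    """Return the current weather for the given location.

    Placeholder implementation: replace the body with a real weather
    API call if you need live data.
    """
    # A canned response keeps the example self-contained.
    return f"The weather in {location} is sunny with a high of 75°F."
```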
2. Make the LLM aware of the tool
We’ll make the LLM aware of the `get_weather` function by adding the `tools` definition below to your LLM response handler:
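Here’s a sketch of what that definition could look like, using OpenAI’s function-calling schema; the description strings are illustrative and worth tailoring to your use case:

```python
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a given location.",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city to look up, e.g. New York City.",
                    },
                },
                "required": ["location"],
            },
        },
    },
]
```

If your response handler calls the Chat Completions API directly, you would typically pass this list through the request’s `tools` parameter.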
You can learn more about the `tools` definition above by following OpenAI’s Function Calling guide.
Important: The function definition you include in your LLM response handler must exactly match the signature of the corresponding function. In other words, the function name (e.g. `get_weather` above) and the parameter name(s) (e.g. `location`) must match. Any mismatch in the function name or parameter names can cause the tool call to fail or be ignored by the LLM.
3. Make Jay aware of the tool
Add the `get_weather` function to the `tools` array in your `Agent`:
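As a sketch, assuming your `Agent` constructor accepts a `tools` list of Python callables (check Jay’s `Agent` documentation for the exact signature and any other required arguments):

```python
# Keep your existing Agent configuration as-is and add get_weather
# to the tools list.
agent = Agent(
    # ...your existing Agent configuration...
    tools=[get_weather],
)
```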
Function Calling Lifecycle
Here’s what happens if your LLM decides to call a function during a session:

- The LLM returns a tool call message containing the function call (e.g. `get_weather`) and its parameters (e.g. `"location": "New York City"`).
- Jay detects the tool call message returned by the LLM and calls the function with the specified parameters.
- Jay adds the result of the function call to the conversation history and then calls the LLM again with the full conversation history so that it can give a response to the user.
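For reference, the tool call message in the first step follows OpenAI’s function-calling format and looks roughly like this (the `id` is illustrative, and the arguments arrive as a JSON-encoded string):

```python
tool_call_message = {
    "role": "assistant",
    "content": None,
    "tool_calls": [
        {
            "id": "call_abc123",  # illustrative ID
            "type": "function",
            "function": {
                "name": "get_weather",
                # Arguments are a JSON-encoded string, not a dict.
                "arguments": '{"location": "New York City"}',
            },
        }
    ],
}
```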