
A Coding Guide to Different Function Calling Methods to Create Real-Time, Tool-Enabled Conversational AI Agents


Function calling lets an LLM act as a bridge between natural-language prompts and real-world code or APIs. Instead of simply producing text, the model decides when to invoke a predefined function, emits a structured JSON call with the function name and arguments, and then waits for your application to execute that call and return the results. This back-and-forth can loop, potentially invoking several functions in sequence, enabling rich, multi-step interactions entirely under conversational control. In this tutorial, we implement a weather assistant with Gemini 2.0 Flash to demonstrate how to set up and manage that function-calling cycle, and we walk through several different variants of function calling. By integrating function calls, we transform a chat interface into a dynamic tool for real-time tasks, whether fetching live weather data, checking order statuses, scheduling appointments, or updating databases. Users no longer fill out complex forms or navigate multiple screens; they simply describe what they need, and the LLM orchestrates the underlying actions seamlessly. This natural-language automation enables the straightforward construction of AI agents that can access external data sources, perform transactions, or trigger workflows, all within a single conversation.
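To make that cycle concrete, here is a minimal sketch of one turn, with payload shapes simplified and values purely illustrative (Gemini's actual wire format carries additional fields):

# Illustrative shape of one function-calling turn (simplified; real Gemini
# payloads contain additional fields):

# 1. Instead of plain text, the model emits a structured call:
model_turn = {
    "function_call": {
        "name": "get_weather_forecast",
        "args": {"location": "Berlin", "date": "2025-03-04"},
    }
}

# 2. Your application runs the function and returns the result to the model:
tool_turn = {
    "function_response": {
        "name": "get_weather_forecast",
        "response": {"result": {"2025-03-04T12:00": 7.5}},
    }
}

# 3. The model reads the result and writes the final natural-language answer.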

Function Calling with Google Gemini 2.0 Flash

!pip install "google-genai>=1.0.0" geopy requests

We install the Gemini Python SDK (google-genai ≥ 1.0.0), together with geopy for converting location names to coordinates and requests for making HTTP calls, ensuring all the core dependencies for our Colab weather assistant are in place.

import os
from google import genai


# Read the key from the environment if available, otherwise fall back to a placeholder.
GEMINI_API_KEY = os.getenv("GEMINI_API_KEY", "Use_Your_API_Key")


client = genai.Client(api_key=GEMINI_API_KEY)


model_id = "gemini-2.0-flash"

We import the Gemini SDK, set your API key, and create a genai.Client instance configured to use the "gemini-2.0-flash" model, establishing the foundation for all subsequent function-calling requests.

res = client.models.generate_content(
    model=model_id,
    contents=["Tell me 1 good fact about Nuremberg."]
)
print(res.text)

We send a user prompt ("Tell me 1 good fact about Nuremberg.") to the Gemini 2.0 Flash model via generate_content, then print out the model's text answer, demonstrating a basic, end-to-end text-generation call using the SDK.

Function Calling with a JSON Schema

weather_function = {
    "name": "get_weather_forecast",
    "description": "Retrieves the weather using the Open-Meteo API for a given location (city) and a date (yyyy-mm-dd). Returns a dictionary with the time and temperature for each hour.",
    "parameters": {
        "type": "object",
        "properties": {
            "location": {
                "type": "string",
                "description": "The city and state, e.g., San Francisco, CA"
            },
            "date": {
                "type": "string",
                "description": "The forecast date, in yyyy-mm-dd format"
            }
        },
        "required": ["location", "date"]
    }
}

Here, we define a JSON schema for our get_weather_forecast tool, specifying its name, a descriptive prompt to guide Gemini on when to use it, and the exact input parameters (location and date) with their types, descriptions, and required fields, so the model can emit valid function calls.

from google.genai.types import GenerateContentConfig


config = GenerateContentConfig(
    system_instruction="You are a helpful assistant that uses tools to access and retrieve information from a weather API. Today is 2025-03-04.",
    tools=[{"function_declarations": [weather_function]}],
)

We create a GenerateContentConfig that tells Gemini it is acting as a weather-retrieval assistant and registers the weather function under tools, so the model knows how to generate structured calls when asked for forecast data.

response = client.models.generate_content(
    model=model_id,
    contents="Whats the weather in Berlin today?"
)
print(response.text)

This call sends the bare prompt ("What's the weather in Berlin today?") without including your config (and thus no function definitions), so Gemini falls back to plain text completion, offering generic advice instead of invoking your weather-forecast tool.

response = client.models.generate_content(
    model=model_id,
    config=config,
    contents="Whats the weather in Berlin today?"
)


for part in response.candidates[0].content.parts:
    print(part.function_call)

By passing in config (which includes your JSON-schema tool), Gemini recognizes that it should call get_weather_forecast rather than answer in plain text. The loop over response.candidates[0].content.parts then prints each part's .function_call object, showing you exactly which function the model decided to invoke (with its name and arguments).
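Beyond printing the repr, you can read the structured fields directly; assuming the first part carries the call, something along these lines works:

# Access the structured call fields directly (the values in the comments are
# what we would expect for this prompt, not guaranteed output):
call = response.candidates[0].content.parts[0].function_call
if call:
    print(call.name)   # e.g., "get_weather_forecast"
    print(call.args)   # e.g., {"location": "Berlin", "date": "2025-03-04"}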

from google.genai import types
from geopy.geocoders import Nominatim
import requests


geolocator = Nominatim(user_agent="weather-app")
def get_weather_forecast(location, date):
    location = geolocator.geocode(location)
    if location:
        try:
            response = requests.get(f"https://api.open-meteo.com/v1/forecast?latitude={location.latitude}&longitude={location.longitude}&hourly=temperature_2m&start_date={date}&end_date={date}")
            data = response.json()
            return {time: temp for time, temp in zip(data["hourly"]["time"], data["hourly"]["temperature_2m"])}
        except Exception as e:
            return {"error": str(e)}
    else:
        return {"error": "Location not found"}


functions = {
    "get_weather_forecast": get_weather_forecast
}


def call_function(function_name, **kwargs):
    return functions[function_name](**kwargs)


def function_call_loop(prompt):
    contents = [types.Content(role="user", parts=[types.Part(text=prompt)])]
    response = client.models.generate_content(
        model=model_id,
        config=config,
        contents=contents
    )
    for part in response.candidates[0].content.parts:
        contents.append(types.Content(role="model", parts=[part]))
        if part.function_call:
            print("Tool call detected")
            function_call = part.function_call
            print(f"Calling tool: {function_call.name} with args: {function_call.args}")
            tool_result = call_function(function_call.name, **function_call.args)
            function_response_part = types.Part.from_function_response(
                name=function_call.name,
                response={"result": tool_result},
            )
            contents.append(types.Content(role="user", parts=[function_response_part]))
            print("Calling LLM with tool results")
            func_gen_response = client.models.generate_content(
                model=model_id, config=config, contents=contents
            )
            # Append the model's follow-up content (not the raw response object).
            contents.append(func_gen_response.candidates[0].content)
    return contents[-1].parts[0].text.strip()

result = function_call_loop("Whats the weather in Berlin today?")
print(result)

We implement a full "agentic" loop: it sends your prompt to Gemini, inspects the response for a function call, executes get_weather_forecast (using Geopy plus an Open-Meteo HTTP request), and then feeds the tool's result back into the model to produce and return the final conversational answer.
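The loop above handles a single round of tool use. To support the multi-step sequences mentioned in the introduction, a natural extension is to keep generating until the model stops emitting function calls. A minimal sketch, reusing the client, config, and call_function defined above (the max_turns cap is an added safety guard, not part of the original loop):

def agent_loop(prompt, max_turns=5):
    """Keep executing tool calls until the model returns plain text (a sketch)."""
    contents = [types.Content(role="user", parts=[types.Part(text=prompt)])]
    for _ in range(max_turns):
        response = client.models.generate_content(
            model=model_id, config=config, contents=contents
        )
        contents.append(response.candidates[0].content)
        calls = [p.function_call for p in response.candidates[0].content.parts
                 if p.function_call]
        if not calls:  # no more tool calls: the model produced the final answer
            return response.text.strip()
        for call in calls:
            tool_result = call_function(call.name, **call.args)
            contents.append(types.Content(
                role="user",
                parts=[types.Part.from_function_response(
                    name=call.name, response={"result": tool_result}
                )],
            ))
    return "Stopped after reaching max_turns."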

Function Calling using Python functions

from geopy.geocoders import Nominatim
import requests


geolocator = Nominatim(user_agent="weather-app")


def get_weather_forecast(location: str, date: str) -> dict:
    """
    Retrieves the weather using the Open-Meteo API for a given location (city) and a date (yyyy-mm-dd). Returns a dictionary with the time and temperature for each hour.

    Args:
        location (str): The city and state, e.g., San Francisco, CA
        date (str): The forecast date, in yyyy-mm-dd format
    Returns:
        Dict[str, float]: A dictionary with the time as key and the temperature as value
    """
    location = geolocator.geocode(location)
    if location:
        try:
            response = requests.get(f"https://api.open-meteo.com/v1/forecast?latitude={location.latitude}&longitude={location.longitude}&hourly=temperature_2m&start_date={date}&end_date={date}")
            data = response.json()
            return {time: temp for time, temp in zip(data["hourly"]["time"], data["hourly"]["temperature_2m"])}
        except Exception as e:
            return {"error": str(e)}
    else:
        return {"error": "Location not found"}

The get_weather_forecast function first uses Geopy's Nominatim to convert a city-and-state string into coordinates, then sends an HTTP request to the Open-Meteo API to retrieve hourly temperature data for the given date, returning a dictionary that maps each timestamp to its corresponding temperature. It also handles errors gracefully, returning an error message if the location isn't found or the API call fails.
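Before handing the function to Gemini, it is worth calling it directly to confirm that geocoding and the API request behave as expected; the temperatures shown below are illustrative:

# Quick standalone check of the tool before the model ever sees it:
print(get_weather_forecast("Berlin", "2025-03-04"))
# Expected shape: {"2025-03-04T00:00": 4.2, "2025-03-04T01:00": 3.9, ...}
# (values are illustrative; the API returns real hourly temperatures)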

from google.genai.types import GenerateContentConfig


config = GenerateContentConfig(
    system_instruction="You are a helpful assistant that can help with weather related questions. Today is 2025-03-04.", # gives the LLM context on the current date
    tools=[get_weather_forecast],
    automatic_function_calling={"disable": True}
)

This config registers your Python get_weather_forecast function as a callable tool. It sets a clear system prompt (including the date) for context, while disabling automatic function calling so that Gemini emits the function-call payload instead of invoking the tool internally.

r = client.models.generate_content(
    model=model_id,
    config=config,
    contents="Whats the weather in Berlin today?"
)
for part in r.candidates[0].content.parts:
    print(part.function_call)

By sending the prompt along with your custom config (including the Python tool but with automatic calling disabled), this snippet captures Gemini's raw function-call decision. It then loops over each response part to print the .function_call object, letting you inspect exactly which tool the model wants to invoke and with what arguments.
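Because automatic calling is disabled here, nothing actually executes; one simple way to close the loop by hand, reusing the Python function defined above, is:

# Execute the model's requested call ourselves, since automatic calling is off:
for part in r.candidates[0].content.parts:
    if part.function_call:
        tool_result = get_weather_forecast(**part.function_call.args)
        print(tool_result)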

from google.genai.types import GenerateContentConfig


config = GenerateContentConfig(
    system_instruction="You are a helpful assistant that uses tools to access and retrieve information from a weather API. Today is 2025-03-04.", # gives the LLM context on the current date
    tools=[get_weather_forecast],
)


r = client.models.generate_content(
    model=model_id,
    config=config,
    contents="Whats the weather in Berlin today?"
)


print(r.text)

With this config (which includes your get_weather_forecast function and leaves automatic calling enabled by default), calling generate_content has Gemini invoke your weather tool behind the scenes and then return a natural-language answer. Printing r.text outputs that final response, including the actual temperature forecast for Berlin on the specified date.
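Recent google-genai releases also record the intermediate tool exchange on the response object; assuming your SDK version exposes the automatic_function_calling_history attribute, you can inspect what was called behind the scenes:

# Inspect the tool calls the SDK executed automatically
# (attribute name per recent google-genai releases; check your version):
for content in r.automatic_function_calling_history or []:
    print(content.role, content.parts)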

from google.genai.types import GenerateContentConfig


config = GenerateContentConfig(
    system_instruction="You are a helpful assistant that uses tools to access and retrieve information from a weather API.",
    tools=[get_weather_forecast],
)


prompt = f"""
Today is 2025-03-04. You are chatting with Andrew, you have access to additional information about him.


User Context:
- name: Andrew
- location: Nuremberg


User: Can I wear a T-shirt later today?"""


r = client.models.generate_content(
    model=model_id,
    config=config,
    contents=prompt
)


print(r.text)

We extend the assistant with personal context, telling Gemini Andrew's name and location (Nuremberg) and asking whether it's T-shirt weather, while still using the get_weather_forecast tool under the hood. It then prints the model's natural-language recommendation based on the actual forecast for that day.

In conclusion, we now know how to define functions (via JSON schema or Python signatures), configure Gemini 2.0 Flash to detect and emit function calls, and implement the "agentic" loop that executes those calls and composes the final response. With these building blocks, we can extend any LLM into a capable, tool-enabled assistant that automates workflows, retrieves live data, and interacts with your code or APIs as effortlessly as chatting with a colleague.


Here is the Colab Notebook.



Asif Razzaq is the CEO of Marktechpost Media Inc. As a visionary entrepreneur and engineer, Asif is committed to harnessing the potential of Artificial Intelligence for social good. His most recent endeavor is the launch of an Artificial Intelligence Media Platform, Marktechpost, which stands out for its in-depth coverage of machine learning and deep learning news that is both technically sound and easily understandable by a wide audience. The platform boasts over 2 million monthly views, illustrating its popularity among audiences.
