Function calling with Google Gemini chat AI
Published at
12/24/2024
Categories
genai
datascience
gemini
ai
Author
harshit_kedia
Function calling enables developers to incorporate custom functions or APIs into a generative AI chat application.
The AI model decides whether any of the functions or APIs are useful for answering the user's query, and it responds with the functions to call and the arguments to pass to them.
The functions need to be well described, including their arguments, the argument data types, and the return values.
In this tutorial we will explore parallel function calling in Python with Google Gemini AI, using external APIs that provide a city's population and a country's currency to answer user questions.
import requests
import json
import google.generativeai as genai
Add your API key and select the model
genai.configure(api_key="YOUR API KEY")
model = genai.GenerativeModel("gemini-1.5-flash")
response = model.generate_content("What is the date today?")
print(response.text)
Today is October 26, 2023.
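Instead of hard-coding the key, here is a minimal sketch that reads it from an environment variable (assuming you have stored it as GOOGLE_API_KEY):
import os
import google.generativeai as genai
#### Assumption: the API key is stored in the GOOGLE_API_KEY environment variable
genai.configure(api_key=os.environ["GOOGLE_API_KEY"])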
Define the function to be used in function calling
#### Ensure you specify the data type of each argument used in the function,
#### otherwise function calling will raise an error
def set_light_values(brightness: int, color_temp: str):
    """Set the brightness and color temperature of a room light (mock API).

    Args:
        brightness: Light level from 0 to 100. Zero is off and 100 is full brightness.
        color_temp: Color temperature of the light fixture, which can be `daylight`, `cool` or `warm`.

    Returns:
        A dictionary containing the set brightness and color temperature.
    """
    return {
        "brightness": brightness,
        "colorTemperature": color_temp
    }
set_light_values(brightness=30,color_temp="daylight")
{'brightness': 30, 'colorTemperature': 'daylight'}
Select the model and specify the tool to be used in function calling
model = genai.GenerativeModel(model_name='gemini-1.5-flash',
                              tools=[set_light_values])
The response generated for the input will contain the functions and arguments that the chat model suggests
chat = model.start_chat()
response = chat.send_message('Dim the lights so the room feels cozy and warm.')
response
response:
GenerateContentResponse(
    done=True,
    iterator=None,
    result=protos.GenerateContentResponse({
      "candidates": [
        {
          "content": {
            "parts": [
              {
                "function_call": {
                  "name": "set_light_values",
                  "args": {
                    "brightness": 30.0,
                    "color_temp": "warm"
                  }
                }
              }
            ],
            "role": "model"
          },
          "finish_reason": "STOP",
          "avg_logprobs": -0.0008751733228564262
        }
      ],
      "usage_metadata": {
        "prompt_token_count": 150,
        "candidates_token_count": 10,
        "total_token_count": 160
      }
    }),
)
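For reference, a minimal sketch of pulling the suggested call out of this response object (response.parts is a shortcut for the parts of the first candidate, the same accessor the helper function below uses):
#### Inspect the first suggested function call in the response
fn = response.parts[0].function_call
print(fn.name)        #### e.g. set_light_values
print(dict(fn.args))  #### e.g. {'brightness': 30.0, 'color_temp': 'warm'}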
Evaluate the functions with the arguments the chat model suggests, and store each function name with its returned value in a dictionary (JSON-like object)
def function_used_and_their_outputs(functions, response):
    for part in response.parts:
        #### in every iteration, fn is assigned a function call suggested by the AI
        if fn := part.function_call:
            #### args gets assigned a string value like "brightness=30.0, color_temp='warm'";
            #### repr() quotes string values like 'warm' so they are treated as strings, not variables
            args = ", ".join(f"{key}={repr(val)}" for key, val in fn.args.items())
            print("arguments are: ", args)
            #### The function call expression is built using an f-string
            function_to_be_called = f"{fn.name}({args})"
            print("function being called using eval, with arguments: ", function_to_be_called)
            output = eval(function_to_be_called)
            functions[fn.name] = output
house_functions={}
function_used_and_their_outputs(house_functions,response)
house_functions
arguments are: brightness=30.0, color_temp='warm'
function being called using eval, with arguments: set_light_values(brightness=30.0, color_temp='warm')
{'set_light_values': {'brightness': 30.0, 'colorTemperature': 'warm'}}
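As an alternative to eval, here is a minimal sketch (with a hypothetical call_suggested_functions helper) that dispatches through a dictionary of callables instead of building a string to evaluate; it assumes every suggested function is registered in that dictionary:
#### Assumption: all callable tools are registered here by name
available_functions = {"set_light_values": set_light_values}

def call_suggested_functions(functions, response):
    for part in response.parts:
        if fn := part.function_call:
            #### fn.args behaves like a mapping, so it can be unpacked as keyword arguments
            functions[fn.name] = available_functions[fn.name](**dict(fn.args))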
Provide this object back to the chat to get an AI-generated response that processes the outputs of the functions
def response_based_on_used_functions(functions):
    response_parts = [
        genai.protos.Part(function_response=genai.protos.FunctionResponse(name=fn, response={"result": val}))
        for fn, val in functions.items()
    ]
    response = chat.send_message(response_parts)
    print(response.text)
response_based_on_used_functions(house_functions)
OK. I've dimmed the lights to 30% brightness and set the color temperature to warm. Is there anything else?
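As an aside, the SDK can also run this call-and-respond loop for you; a minimal sketch assuming automatic function calling is enabled on the chat:
#### Assumption: automatic function calling executes the tool and feeds its result back for us
auto_chat = model.start_chat(enable_automatic_function_calling=True)
auto_response = auto_chat.send_message('Dim the lights so the room feels cozy and warm.')
print(auto_response.text)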
Defining functions to get the population of a given city and the currency of a given country
def get_population(city: str):
    """Gets the population of a city and the year of the count.

    Args:
        city: The name of the city.

    Returns:
        A JSON object encapsulating the population of the city and the year of the count if the function executed successfully, else an error message.
    """
    url = "https://countriesnow.space/api/v0.1/countries/population/cities"
    payload = {"city": city}
    headers = {}
    response = requests.request("POST", url, headers=headers, data=payload)
    json_data = json.loads(response.text)
    if json_data['error']:
        return json_data['msg']
    else:
        return {'population': json_data['data']['populationCounts'][0]['value'],
                'year of counting': json_data['data']['populationCounts'][0]['year']}
get_population('amsterdam')
{'population': '779808', 'year of counting': '2011'}
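A slightly more defensive variant of the same request, shown here as a sketch with a hypothetical get_population_safe name, assuming a 10-second timeout is acceptable and that HTTP errors should be caught:
def get_population_safe(city: str):
    """Like get_population, but with a timeout and explicit HTTP error handling (sketch)."""
    url = "https://countriesnow.space/api/v0.1/countries/population/cities"
    try:
        response = requests.post(url, data={"city": city}, timeout=10)
        response.raise_for_status()
    except requests.RequestException as exc:
        return f"Request failed: {exc}"
    json_data = response.json()
    if json_data['error']:
        return json_data['msg']
    counts = json_data['data']['populationCounts'][0]
    return {'population': counts['value'], 'year of counting': counts['year']}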
def get_currency(country_code: str):
    """Gets information about the currency used in the country.

    Args:
        country_code (str): The two-letter (ISO2) country code of the country, e.g. if referring to Nigeria, the argument passed will be its country code NG.

    Returns:
        A JSON object encapsulating the currency used in the country if the function executed successfully, else an error message.
    """
    url = "https://countriesnow.space/api/v0.1/countries/currency"
    payload = {"iso2": country_code}
    headers = {}
    response = requests.request("POST", url, headers=headers, data=payload)
    json_data = json.loads(response.text)
    if json_data['error']:
        return json_data['msg']
    else:
        return json_data['data']['currency']
get_currency('pk')
'PKR'
demography_functions=[get_population, get_currency]
model = genai.GenerativeModel(model_name="gemini-1.5-flash", tools=demography_functions)
chat = model.start_chat()
In the following example, notice that both the population and currency functions are called by the AI chat when the user asks about population and currency. Also notice that the AI uses its own knowledge to identify the country in which the city mentioned by the user is located, and sends the corresponding country code as the argument to the currency function.
response = chat.send_message('Tell about the population and currency used in delhi')
demography_functions_used={}
function_used_and_their_outputs(demography_functions_used,response)
demography_functions_used
arguments are: city='Delhi'
function being called using eval, with arguments: get_population(city='Delhi')
arguments are: country_code='IN'
function being called using eval, with arguments: get_currency(country_code='IN')
{'get_population': {'population': '9879172', 'year of counting': '2001'},
'get_currency': 'INR'}
response_based_on_used_functions(demography_functions_used)
The population of Delhi was 9,879,172 in 2001. The currency used in India, where Delhi is located, is the Indian Rupee (INR).
In the following two examples, notice that only the population function is called by the AI chat when the user asks only about the population
response = chat.send_message('Tell about the population of Tokyo')
demography_functions_used={}
function_used_and_their_outputs(demography_functions_used,response)
response_based_on_used_functions(demography_functions_used)
arguments are: city='Tokyo'
function being called using eval, with arguments: get_population(city='Tokyo')
The population of Tokyo was 8,945,695 in 2010.
For some reason, the API isn't able to fetch the population of New York, which the AI chat acknowledges
response = chat.send_message('Tell about the population of new york')
demography_functions_used={}
function_used_and_their_outputs(demography_functions_used,response)
response_based_on_used_functions(demography_functions_used)
arguments are: city='New York'
function being called using eval, with arguments: get_population(city='New York')
I'm sorry, I couldn't find information about the population of New York.
In the following example, notice that only the currency function is called by the AI chat when the user asks only about the currency
response = chat.send_message('Tell about the currency of london')
demography_functions_used={}
function_used_and_their_outputs(demography_functions_used,response)
response_based_on_used_functions(demography_functions_used)
arguments are: country_code='GB'
function being called using eval, with arguments: get_currency(country_code='GB')
The currency used in London, United Kingdom is the British Pound (GBP).
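Putting the pieces together, here is a small convenience wrapper (a hypothetical ask helper) around the two functions defined above, sketched under the assumption that the model suggests at least one function call for the question:
def ask(question: str):
    #### One round trip: get the suggested calls, run them, and send the results back to the chat
    response = chat.send_message(question)
    functions_used = {}
    function_used_and_their_outputs(functions_used, response)
    response_based_on_used_functions(functions_used)

#### example usage: ask('Tell about the population and currency used in Mumbai')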
– I am Harshit from India. I am exploring generative AI, data science and machine learning. To collaborate, let's connect on LinkedIn and Twitter 😊