Introduction
In this blog post, we’ll take a deep dive into function calling in Gemini. More
specifically, you’ll see how to handle multiple and parallel function
call requests from the generate_content and chat interfaces, and take a look at
the new automatic function calling feature through a sample weather application.
What is function calling?
Function calling lets you augment LLMs with more up-to-date data via external API calls.
You can define custom functions and provide these to an LLM. While processing a prompt, the LLM can choose to delegate tasks to the functions that you identify. The model does not call the functions directly but rather makes function call requests with parameters to your application. In turn, your application code responds to function call requests by calling external APIs and providing the responses back to the model, allowing the LLM to complete its response to the prompt.
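This request-and-respond loop can be sketched in plain Python without calling the real API. The function name, request shape, and dispatch table below are illustrative assumptions, not the SDK's actual types; the point is only to show how application code receives a structured function call request from the model, runs the matching function, and packages the result to send back:

```python
# Hypothetical sketch of the function-calling loop (names and shapes assumed).
def get_current_weather(city: str) -> dict:
    """Stand-in for a real external weather API call."""
    return {"city": city, "temperature_c": 21, "condition": "sunny"}

# The functions your application exposes to the model.
TOOLS = {"get_current_weather": get_current_weather}

def handle_function_call(request: dict) -> dict:
    """Dispatch a model-issued function call request and build the response
    payload that would be sent back to the model."""
    fn = TOOLS[request["name"]]
    result = fn(**request["args"])
    return {"name": request["name"], "response": result}

# Simulated request, in the spirit of what the model might emit:
reply = handle_function_call(
    {"name": "get_current_weather", "args": {"city": "Paris"}}
)
```

Parallel function calling then amounts to the model emitting several such requests at once, with your application dispatching each one and returning all the responses together.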