Function/Tool Calling

Let AI models interact with your APIs, databases, and external tools

What is Function Calling?

Function calling allows AI models to intelligently choose to call functions you define, passing appropriate arguments based on the conversation. This enables AI to interact with external systems, fetch real-time data, and perform actions.

Smart Selection

AI automatically chooses when and which functions to call

Type Safe

Functions have defined schemas for parameters

Multi-Step

Chain multiple function calls to complete complex tasks

Basic Example

Here's how to enable function calling in your requests:

Python
from openai import OpenAI
import json

client = OpenAI(
    base_url="https://api.parrotrouter.com/v1",
    api_key="your-api-key"
)

# Define available functions
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather in a location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA"
                    },
                    "unit": {
                        "type": "string",
                        "enum": ["celsius", "fahrenheit"],
                        "description": "The unit for temperature"
                    }
                },
                "required": ["location"]
            }
        }
    }
]

# Make the request
response = client.chat.completions.create(
    model="gpt-4-turbo-preview",
    messages=[
        {"role": "user", "content": "What's the weather like in New York?"}
    ],
    tools=tools,
    tool_choice="auto"  # Let the model decide when to use tools
)

# Check if the model wants to call a function
message = response.choices[0].message
if message.tool_calls:
    for tool_call in message.tool_calls:
        function_name = tool_call.function.name
        function_args = json.loads(tool_call.function.arguments)
        
        print(f"Model wants to call: {function_name}")
        print(f"With arguments: {function_args}")
        
        # Execute your function here
        if function_name == "get_weather":
            weather_result = get_weather_data(
                function_args["location"],
                function_args.get("unit", "fahrenheit")
            )
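
In the snippet above, get_weather_data is a helper you supply yourself; it is not part of the SDK. A minimal stub, assuming a dict return shape of your choosing (swap the body for a call to whatever weather service you actually use):

def get_weather_data(location: str, unit: str = "fahrenheit") -> dict:
    # Hypothetical helper: replace this canned response with a real
    # call to your weather provider of choice
    return {
        "location": location,
        "temperature": 72,
        "unit": unit,
        "condition": "sunny"
    }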

Complete Flow

Here's the full flow including executing the function and getting a final response:

Full Implementation (Python)
# Step 1: Initial request with tools
messages = [{"role": "user", "content": "What's the weather in Tokyo and NYC?"}]

response = client.chat.completions.create(
    model="gpt-4-turbo-preview",
    messages=messages,
    tools=tools
)

# Step 2: Execute function calls
assistant_message = response.choices[0].message
messages.append(assistant_message)  # Add assistant's message to history

if assistant_message.tool_calls:
    for tool_call in assistant_message.tool_calls:
        function_name = tool_call.function.name
        function_args = json.loads(tool_call.function.arguments)
        
        # Execute the function (your implementation)
        if function_name == "get_weather":
            result = {
                "temperature": 72,
                "condition": "sunny",
                "humidity": 45
            }
        
        # Add function result to messages
        messages.append({
            "role": "tool",
            "tool_call_id": tool_call.id,
            "content": json.dumps(result)
        })

# Step 3: Get final response with function results
final_response = client.chat.completions.create(
    model="gpt-4-turbo-preview",
    messages=messages
)

print(final_response.choices[0].message.content)
# Output: "The weather in Tokyo is sunny with a temperature of 72°F..."
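
For multi-step tasks the model may want to call more tools after it sees the first results. A common pattern is to wrap the three steps above in a loop that runs until a response contains no tool calls. A minimal sketch, assuming an execute_function(name, args) dispatcher like the one sketched under Parallel Function Calls below:

# Repeat until the model answers without requesting any tools
while True:
    response = client.chat.completions.create(
        model="gpt-4-turbo-preview",
        messages=messages,
        tools=tools
    )
    message = response.choices[0].message
    messages.append(message)

    if not message.tool_calls:
        break  # Final answer reached

    for tool_call in message.tool_calls:
        result = execute_function(
            tool_call.function.name,
            json.loads(tool_call.function.arguments)
        )
        messages.append({
            "role": "tool",
            "tool_call_id": tool_call.id,
            "content": json.dumps(result)
        })

print(message.content)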

Common Use Cases

Database Queries

Let AI query your database based on natural language requests

tools = [{
    "type": "function",
    "function": {
        "name": "query_database",
        "description": "Execute a database query",
        "parameters": {
            "type": "object",
            "properties": {
                "table": {"type": "string"},
                "filters": {"type": "object"},
                "limit": {"type": "integer"}
            }
        }
    }
}]

Web Search & APIs

Fetch real-time data from the web or external APIs

tools = [{
    "type": "function",
    "function": {
        "name": "web_search",
        "description": "Search the web for information",
        "parameters": {
            "type": "object",
            "properties": {
                "query": {"type": "string"},
                "num_results": {"type": "integer"}
            },
            "required": ["query"]
        }
    }
}]

Calculations & Processing

Perform complex calculations or data processing

tools = [{
    "type": "function",
    "function": {
        "name": "calculate",
        "description": "Perform mathematical calculations",
        "parameters": {
            "type": "object",
            "properties": {
                "expression": {"type": "string"},
                "variables": {"type": "object"}
            }
        }
    }
}]

System Actions

Execute actions like sending emails or creating tasks

tools = [{
    "type": "function",
    "function": {
        "name": "send_email",
        "description": "Send an email",
        "parameters": {
            "type": "object",
            "properties": {
                "to": {"type": "string"},
                "subject": {"type": "string"},
                "body": {"type": "string"}
            },
            "required": ["to", "subject", "body"]
        }
    }
}]

Advanced Features

Parallel Function Calls

Models can call multiple functions in parallel for efficiency:

# Model may return multiple tool calls
if message.tool_calls:
    # Execute all functions in parallel
    import concurrent.futures
    
    with concurrent.futures.ThreadPoolExecutor() as executor:
        futures = []
        for tool_call in message.tool_calls:
            future = executor.submit(
                execute_function,
                tool_call.function.name,
                json.loads(tool_call.function.arguments)
            )
            futures.append((tool_call.id, future))
        
        # Collect results
        for tool_call_id, future in futures:
            result = future.result()
            messages.append({
                "role": "tool",
                "tool_call_id": tool_call_id,
                "content": json.dumps(result)
            })
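
The snippet above assumes an execute_function helper that routes a tool name to your own implementation; it is not provided by the SDK. One possible sketch (the handler names are placeholders):

# Map tool names to the Python callables that implement them
FUNCTION_REGISTRY = {
    "get_weather": get_weather_data,  # stub defined earlier
    # register additional handlers here
}

def execute_function(name, args):
    handler = FUNCTION_REGISTRY.get(name)
    if handler is None:
        # Report unknown tools back to the model instead of raising
        return {"error": f"Unknown function: {name}"}
    try:
        return handler(**args)
    except Exception as exc:
        # Let the model see the failure and decide how to proceed
        return {"error": str(exc)}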

Force Function Use

You can force the model to use a specific function:

response = client.chat.completions.create(
    model="gpt-4-turbo-preview",
    messages=messages,
    tools=tools,
    tool_choice={
        "type": "function",
        "function": {"name": "get_weather"}
    }
)
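
tool_choice also accepts "auto" (the default when tools are provided, as used earlier) and "none", which tells the model to answer without calling any tool:

response = client.chat.completions.create(
    model="gpt-4-turbo-preview",
    messages=messages,
    tools=tools,
    tool_choice="none"  # Answer directly; do not call any tool
)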

Supported Models

  • GPT-4 & GPT-4 Turbo: Full support, parallel calls (Recommended)

  • GPT-3.5 Turbo: Full support, single calls (Supported)

  • Claude 3 Models: Native tool use support (Supported)

  • Gemini Pro: Function calling support (Supported)

Best Practices

  1. Clear Descriptions: Write detailed function and parameter descriptions for better results.

  2. Error Handling: Always validate function arguments and handle execution errors gracefully (see the sketch after this list).

  3. Security First: Never expose sensitive operations without proper authentication and validation.

  4. Idempotent Functions: Design functions to be safe to retry in case of failures.
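
To make point 2 concrete, here is one possible sketch of wrapping a tool call with argument validation and graceful error handling (the checks mirror the get_weather schema defined earlier):

def safe_get_weather(args):
    # Validate arguments against the schema before executing
    if not isinstance(args.get("location"), str):
        return {"error": "Missing or invalid 'location' argument"}
    unit = args.get("unit", "fahrenheit")
    if unit not in ("celsius", "fahrenheit"):
        return {"error": f"Unsupported unit: {unit}"}
    try:
        return get_weather_data(args["location"], unit)
    except Exception as exc:
        # Return the error to the model rather than crashing the loop
        return {"error": str(exc)}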

Related Features