Function calling in SambaStudio
Function calling is a beta feature of SambaStudio, included as an early access feature in this release.
Function calling in SambaStudio enables dynamic workflows by allowing the model to select and suggest function calls based on user input, which helps in building agentic workflows. By defining a set of functions, or tools, you provide context that lets the model recommend and fill in function arguments as needed.
The API itself does not execute functions; it only returns a list of function calls, which you then execute externally.
How function calling works
Function calling enables adaptive workflows that leverage real-time data and structured outputs, creating more dynamic and responsive model interactions.
- Submit a query with tools: Start by submitting a user query along with the available tools defined in a JSON schema. This schema specifies the parameters for each function.
- The model processes and suggests: The model interprets the query, assesses intent, and decides whether to respond conversationally or suggest function calls. If a function is called, it fills in the arguments based on the schema.
- Receive a model response: You'll get a response from the model, which may include a function call suggestion (see the sketch after this list). Execute the function with the provided arguments and return the result to the model for further interaction.
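For illustration only, a function call suggestion in the OpenAI-compatible response arrives as an assistant message with a tool_calls list; the values below are hypothetical, and the exact fields are shown in the examples later in this section:
{
    "role": "assistant",
    "content": null,
    "tool_calls": [
        {
            "id": "call_abc123",
            "type": "function",
            "function": {
                "name": "solve_quadratic",
                "arguments": "{\"a\": 3, \"b\": -11, \"c\": -4}"
            }
        }
    ]
}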
Supported models
- Meta-Llama-3.1-8B-Instruct
- Meta-Llama-3.1-70B-Instruct
- Meta-Llama-3.1-405B-Instruct
Meta recommends using Llama 70B-Instruct or Llama 405B-Instruct for applications that combine conversation and tool calling. Llama 8B-Instruct cannot reliably maintain a conversation alongside tool-calling definitions. It can be used for zero-shot tool calling, but tool instructions should be removed for regular conversations.
Example usage
The steps below describe how to use function calling, followed by an end-to-end example after the last step.
Step 1: Define the function schema
Define a JSON schema for your function. You will need to specify:
- The name of the function.
- A description of what the function does.
- The parameters, their data types, and their descriptions.
{
    "type": "function",
    "function": {
        "name": "solve_quadratic",
        "description": "Solves a quadratic equation given coefficients a, b, and c.",
        "parameters": {
            "type": "object",
            "properties": {
                "a": {"type": "integer", "description": "Coefficient of the squared term"},
                "b": {"type": "integer", "description": "Coefficient of the linear term"},
                "c": {"type": "integer", "description": "Constant term"},
                "root_type": {"type": "string", "description": "Type of roots to return: 'real' or 'all'"}
            },
            "required": ["a", "b", "c"]
        }
    }
}
Step 2: Configure function calling in your request
When sending a request, include the function definition in the tools parameter and set tool_choice to one of the following:
- auto: Allows the model to choose between generating a message or calling a function. This is the default tool choice when the field is not specified.
- required: Forces the model to generate a function call. The model will always select one or more functions to call.
- To enforce a specific function call, set tool_choice = {"type": "function", "function": {"name": "solve_quadratic"}}. This ensures the model will only use the specified function.
The sketch below summarizes the three settings; a complete request example follows it.
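A minimal sketch of the three settings as Python values (pass whichever one fits your use case to the chat completions call shown in the example below):
tool_choice = "auto"        # Let the model decide whether to answer or call a function (default)
tool_choice = "required"    # Force the model to call at least one function
tool_choice = {"type": "function", "function": {"name": "solve_quadratic"}}  # Force this specific function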
import openai
import cmath
import json

# Initialize the client with your SambaStudio endpoint URL and API key
client = openai.OpenAI(
    base_url="https://<your-sambastudio-domain>/v1/<project-id>/<endpoint-id>/chat/completions",
    api_key=API_KEY
)

def solve_quadratic(a, b, c, root_type="real"):
    """
    Solve a quadratic equation of the form ax^2 + bx + c = 0.
    """
    discriminant = b**2 - 4*a*c
    if root_type == "real":
        if discriminant < 0:
            return []  # No real roots
        else:
            root1 = (-b + discriminant**0.5) / (2 * a)
            root2 = (-b - discriminant**0.5) / (2 * a)
            return [root1, root2]
    else:
        root1 = (-b + cmath.sqrt(discriminant)) / (2 * a)
        root2 = (-b - cmath.sqrt(discriminant)) / (2 * a)
        return [
            {"real": root1.real, "imag": root1.imag},
            {"real": root2.real, "imag": root2.imag}
        ]

# Define user input and function schema
user_prompt = "Find all the roots of a quadratic equation given coefficients a = 3, b = -11, and c = -4."

messages = [
    {
        "role": "user",
        "content": user_prompt,
    }
]

tools = [
    {
        "type": "function",
        "function": {
            "name": "solve_quadratic",
            "description": "Solves a quadratic equation given coefficients a, b, and c.",
            "parameters": {
                "type": "object",
                "properties": {
                    "a": {"type": "integer", "description": "Coefficient of the squared term"},
                    "b": {"type": "integer", "description": "Coefficient of the linear term"},
                    "c": {"type": "integer", "description": "Constant term"},
                    "root_type": {"type": "string", "description": "Type of roots: 'real' or 'all'"}
                },
                "required": ["a", "b", "c"]
            }
        }
    }
]

response = client.chat.completions.create(
    model="Meta-Llama-3.1-70B-Instruct",
    messages=messages,
    tools=tools,
    tool_choice="required"
)
print(response)
Step 3: Handle tool calls
If the model chooses to call a function, the response will include tool_calls. Extract the function call details, parse the arguments (returned as a JSON string), and execute the corresponding function with the provided parameters. A sketch for handling multiple tool calls follows the example below.
response_message = response.choices[0].message
tool_calls = response_message.tool_calls

# If a tool call is present
if tool_calls:
    tool_call = tool_calls[0]
    function_name = tool_call.function.name
    # The arguments are returned as a JSON string, so parse them first
    arguments = json.loads(tool_call.function.arguments)

    # Call the appropriate function with the parsed arguments
    if function_name == "solve_quadratic":
        result = solve_quadratic(
            a=arguments["a"],
            b=arguments["b"],
            c=arguments["c"],
            root_type=arguments.get("root_type", "real")
        )
        print(result)
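If you define several tools, the model may suggest more than one entry in tool_calls. A minimal sketch of dispatching every suggested call through a lookup table; the available_functions mapping is a hypothetical helper for this example, not part of the API:
# Hypothetical dispatch table mapping tool names to local Python functions
available_functions = {
    "solve_quadratic": solve_quadratic,
}

results = []
for tool_call in tool_calls or []:
    func = available_functions.get(tool_call.function.name)
    if func is None:
        continue  # Unknown tool name; skip or raise, depending on your application
    args = json.loads(tool_call.function.arguments)  # Arguments arrive as a JSON string
    results.append((tool_call, func(**args)))
Each (tool_call, result) pair can then be returned to the model as its own tool message, as shown in the next step.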
Step 4: Provide function results back to the model
Once you have computed the result, pass it back to the model to continue the conversation or confirm the output.
# Include the assistant's tool call message in the conversation history
messages.append(response_message)

# Convert the result to a JSON string to return to the model
function_response = json.dumps({"result": result})

# Provide the function response back to the model as a tool message
messages.append(
    {
        "tool_call_id": tool_call.id,
        "role": "tool",
        "name": function_name,
        "content": function_response
    }
)

# Second API call to incorporate the function result into the conversation
second_response = client.chat.completions.create(
    model="Meta-Llama-3.1-70B-Instruct",
    messages=messages,
)

# Print the final response from the model
print(second_response.choices[0].message.content)
Step 5: Example output
An example output is shown below.
The roots of the quadratic equation with coefficients a = 3, b = -11, and c = -4 are 4 and -1/3.
Example using OpenAI compatibility
The following is an end-to-end example using the OpenAI-compatible API.
import openai
import cmath
import json

# Define the OpenAI client
client = openai.OpenAI(
    base_url="https://<your-sambastudio-domain>/v1/<project-id>/<endpoint-id>/chat/completions",
    api_key=API_KEY
)

MODEL = 'Meta-Llama-3.1-70B-Instruct'

# Function to solve the quadratic equation
def solve_quadratic(a, b, c, root_type="real"):
    """
    Solve a quadratic equation of the form ax^2 + bx + c = 0.
    """
    discriminant = b**2 - 4*a*c
    if root_type == "real":
        if discriminant < 0:
            return []  # No real roots
        else:
            root1 = (-b + discriminant**0.5) / (2 * a)
            root2 = (-b - discriminant**0.5) / (2 * a)
            return [root1, root2]
    else:
        root1 = (-b + cmath.sqrt(discriminant)) / (2 * a)
        root2 = (-b - cmath.sqrt(discriminant)) / (2 * a)
        return [
            {"real": root1.real, "imag": root1.imag},
            {"real": root2.real, "imag": root2.imag}
        ]

# Function to run the conversation and provide the tool result back to the model
def run_conversation(user_prompt):
    # Initial conversation with user input
    messages = [
        {
            "role": "system",
            "content": "You are an assistant that can solve quadratic equations given coefficients a, b, and c."
        },
        {
            "role": "user",
            "content": user_prompt,
        }
    ]

    # Define the tool
    tools = [
        {
            "type": "function",
            "function": {
                "name": "solve_quadratic",
                "description": "Solve a quadratic equation given coefficients a, b, and c.",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "a": {"type": "integer", "description": "Coefficient of the squared term."},
                        "b": {"type": "integer", "description": "Coefficient of the linear term."},
                        "c": {"type": "integer", "description": "Constant term."},
                        "root_type": {"type": "string", "description": "Type of roots: 'real' or 'all'."}
                    },
                    "required": ["a", "b", "c"],
                }
            }
        }
    ]

    # First API call to get the model's response
    response = client.chat.completions.create(
        model=MODEL,
        messages=messages,
        tools=tools,
        tool_choice="auto",
        max_tokens=500
    )

    response_message = response.choices[0].message
    tool_calls = response_message.tool_calls

    # If a tool call is present
    if tool_calls:
        tool_call = tool_calls[0]
        function_name = tool_call.function.name
        # The arguments are returned as a JSON string, so parse them first
        arguments = json.loads(tool_call.function.arguments)

        # Call the appropriate function with the parsed arguments
        if function_name == "solve_quadratic":
            result = solve_quadratic(
                a=arguments["a"],
                b=arguments["b"],
                c=arguments["c"],
                root_type=arguments.get("root_type", "real")
            )

            # Convert the result to a JSON string to return to the model
            function_response = json.dumps({"result": result})

            # Include the assistant's tool call in the history, then provide
            # the function response back to the model as a tool message
            messages.append(response_message)
            messages.append(
                {
                    "tool_call_id": tool_call.id,
                    "role": "tool",
                    "name": function_name,
                    "content": function_response
                }
            )

            # Second API call to incorporate the function result into the conversation
            second_response = client.chat.completions.create(
                model=MODEL,
                messages=messages,
                max_tokens=500
            )

            # Return the final response from the model
            return second_response.choices[0].message.content

    # No tool call was made; return the model's direct answer
    return response_message.content

# Example user prompt
user_prompt = "Find all the roots of a quadratic equation given coefficients a = 3, b = -11, and c = -4."
print(run_conversation(user_prompt))
JSON mode
You can set the response_format parameter to json_object in your request to ensure that the model outputs valid JSON. If the model is not able to generate valid JSON, an error is returned. A sketch of parsing the returned content follows the example code.
Example code
import openai

# Define the OpenAI client
client = openai.OpenAI(
    base_url="https://<your-sambastudio-domain>/v1/<project-id>/<endpoint-id>/chat/completions",
    api_key=API_KEY
)

MODEL = 'Meta-Llama-3.1-70B-Instruct'

def run_conversation(user_prompt):
    # Initial conversation with user input
    messages = [
        {
            "role": "system",
            "content": "Always provide the response in this JSON format: {\"country\": \"name\", \"capital\": \"xx\"}"
        },
        {
            "role": "user",
            "content": user_prompt,
        }
    ]

    # First API call to get the model's response
    response = client.chat.completions.create(
        model=MODEL,
        messages=messages,
        max_tokens=500,
        response_format={"type": "json_object"},
        # stream=True
    )

    response_message = response.choices[0].message
    print(response_message)

run_conversation('what is the capital of Austria')
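The content returned in JSON mode is still a string, so it usually needs to be parsed before use. A minimal sketch, assuming run_conversation is changed to return response_message.content (a hypothetical modification of the example above):
import json

raw = run_conversation('what is the capital of Austria')
try:
    data = json.loads(raw)
    print(data["capital"])
except json.JSONDecodeError:
    # JSON mode returns an API error when valid JSON cannot be generated,
    # but defensive parsing keeps downstream code robust
    print("Response was not valid JSON:", raw)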
Additional usage commands
The examples below demonstrate additional function calling requests using curl and the SambaNova API (snapi).
API V1 example curl command
Reference document: API V1
curl -X POST -H 'Content-Type: application/json' -H 'key: <api-key>' --data '{ "instance": "{\"conversation_id\": \"simple_0\", \"messages\": [{\"message_id\": 0, \"role\": \"user\", \"content\": \"Find the area of a triangle with a base of 10 units and height of 5 units.\"}], \"tools\": [{\"type\": \"function\", \"function\": {\"name\": \"calculate_triangle_area\", \"description\": \"Calculate the area of a triangle given its base and height.\", \"parameters\": {\"type\": \"object\", \"properties\": {\"base\": {\"type\": \"integer\", \"description\": \"The base of the triangle.\"}, \"height\": {\"type\": \"integer\", \"description\": \"The height of the triangle.\"}, \"unit\": {\"type\": \"string\", \"description\": \"The unit of measure (defaults to '\''units'\'' if not specified)\"}}, \"required\": [\"base\", \"height\"]}}}]}", "params": { "do_sample": { "type": "bool", "value": "false" }, "max_tokens_to_generate": { "type": "int", "value": "2048" }, "process_prompt": { "type": "bool", "value": "true" }, "repetition_penalty": { "type": "float", "value": "1" }, "select_expert": { "type": "str", "value": "Meta-Llama-3.1-8B-Instruct" }, "temperature": { "type": "float", "value": "1" }, "top_k": { "type": "int", "value": "50" }, "top_p": { "type": "float", "value": "1" } } }' 'https://<your-sambastudio-domain>/api/predict/generic/stream/<project-id>/<endpoint-id>'
API V2 example curl command
Reference document: API V2
curl -X POST -H 'Content-Type: application/json' -H 'key: <api-key>' --data '{ "items": [ { "id": "1", "value": "{\"conversation_id\": \"simple_0\", \"messages\": [{\"message_id\": 0, \"role\": \"user\", \"content\": \"Find the area of a triangle with a base of 10 units and height of 5 units.\"}], \"tools\": [{\"type\": \"function\", \"function\": {\"name\": \"calculate_triangle_area\", \"description\": \"Calculate the area of a triangle given its base and height.\", \"parameters\": {\"type\": \"object\", \"properties\": {\"base\": {\"type\": \"integer\", \"description\": \"The base of the triangle.\"}, \"height\": {\"type\": \"integer\", \"description\": \"The height of the triangle.\"}, \"unit\": {\"type\": \"string\", \"description\": \"The unit of measure (defaults to '\''units'\'' if not specified)\"}}, \"required\": [\"base\", \"height\"]}}}]}" }], "params": { "do_sample": false, "max_tokens_to_generate": 2048, "process_prompt": true, "repetition_penalty": 1, "select_expert": "Meta-Llama-3.1-8B-Instruct", "temperature": 1, "top_k": 50, "top_p": 1 } }' 'https://<your-sambastudio-domain>/api/v2/predict/generic/stream/<project-id>/<endpoint-id>'
OpenAI API example curl command
Reference document: OpenAI compatible API
curl --location 'https://<your-sambastudio-domain>/v1/<project-id>/<endpoint-id>/chat/completions' --header 'Content-Type: application/json' --header 'key: <api-key>' --data '{"model":"Meta-Llama-3.1-8B-Instruct","messages":[ { "role": "user", "content": "Find the area of a triangle with a base of 10 units and height of 5 units." } ], "tools": [ { "type": "function", "function": { "name": "calculate_triangle_area", "description": "Calculate the area of a triangle given its base and height.", "parameters": { "type": "object", "properties": { "base": { "type": "integer", "description": "The base of the triangle." }, "height": { "type": "integer", "description": "The height of the triangle." }, "unit": { "type": "string", "description": "The unit of measure (defaults to '\''units'\'' if not specified)" } }, "required": [ "base", "height" ] } } } ],"max_tokens":3000,"temperature":1,"top_p":1,"top_k":50,"frequency_penalty":1,"stream":false,"stream_options":{"include_usage":true}}'
SambaNova API (snapi) example command
snapi predict chat-completions --model Meta-Llama-3.1-8B-Instruct --project-id <project-id> --endpoint-id <endpoint-id> --key <api-key> --messages '[{"role":"user","content":"Find the area of a triangle with a base of 10 units and height of 5 units."}]' --tools '[{"type":"function","function":{"name":"calculate_triangle_area","description":"Calculate the area of a triangle given its base and height.","parameters":{"type":"object","properties":{"base":{"type":"integer","description":"The base of the triangle."},"height":{"type":"integer","description":"The height of the triangle."},"unit":{"type":"string","description":"The unit of measure (defaults to 'units' if not specified)"}},"required":["base","height"]}}}]' --tool-choice "auto"