LiteLLM is an open-source Python library that provides a unified, OpenAI-compatible interface for calling many LLM providers: it translates inputs to each provider's endpoint and maps provider errors to a consistent set of exception types.

Example code

from litellm import completion
import os

os.environ["SAMBANOVA_API_KEY"] = ""  # your SambaNova Cloud API key
response = completion(
    model="sambanova/Meta-Llama-3.1-8B-Instruct",
    messages=[
        {
            "role": "user",
            "content": "What do you know about sambanova.ai. Give your response in json format",
        }
    ],
    max_tokens=10,  # cap on generated tokens; increase this for a complete JSON answer
    response_format={"type": "json_object"},  # ask the model to respond with JSON
    stop=["\n\n"],  # stop generating at the first blank line
    temperature=0.2,
    top_p=0.9,
    tool_choice="auto",  # optional: let the model decide whether to call a tool
    tools=[],  # no tools are registered in this example
    user="user",  # optional end-user identifier
)
print(response)
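
Because LiteLLM maps provider errors to a common set of exception types, one error handler can cover multiple providers. The following is a minimal sketch, assuming the top-level exception classes exported by litellm (AuthenticationError, RateLimitError, APIConnectionError); the error paths shown are illustrative, not an exhaustive list.

import litellm
from litellm import completion

# Illustrative only: which exception is raised depends on the provider and the failure.
try:
    response = completion(
        model="sambanova/Meta-Llama-3.1-8B-Instruct",
        messages=[{"role": "user", "content": "Hello"}],
    )
    print(response.choices[0].message.content)
except litellm.AuthenticationError as e:
    print(f"Invalid or missing API key: {e}")
except litellm.RateLimitError as e:
    print(f"Rate limited by the provider: {e}")
except litellm.APIConnectionError as e:
    print(f"Could not reach the provider: {e}")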

See the LiteLLM documentation for more examples and for usage with the LiteLLM Proxy Server.
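
For instance, once a LiteLLM Proxy Server is running (by default on http://0.0.0.0:4000) with a SambaNova model registered in its config, any OpenAI-compatible client can call it. The sketch below assumes a locally running proxy and a model alias of "sambanova-llama"; both the address and the alias are placeholders, not values from this page.

from openai import OpenAI

# Point the OpenAI SDK at the local LiteLLM proxy instead of api.openai.com.
client = OpenAI(
    api_key="sk-anything",           # the proxy holds the real SambaNova key
    base_url="http://0.0.0.0:4000",  # default LiteLLM proxy address (assumed)
)

response = client.chat.completions.create(
    model="sambanova-llama",  # hypothetical model alias from the proxy config
    messages=[{"role": "user", "content": "What do you know about sambanova.ai?"}],
)
print(response.choices[0].message.content)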