SambaNova offers SDKs for both Python and JavaScript/TypeScript, making it easy to interact with the SambaNova REST API. The SambaNova Python client works with Python 3.8 and above, while the JavaScript/TypeScript version is designed for server-side environments. Both libraries come with built-in type definitions for request parameters and response fields, and support both synchronous and asynchronous usage.

Usage example

To get started, select your preferred programming language. Then, open a terminal window and install the SambaNova SDK.
# Ensure you have Node.js installed before running this command.
npm install sambanova
Next, copy the following code into a new file named hello-world.js.
import SambaNova from "sambanova";

const client = new SambaNova({
  baseURL: "your-sambanova-base-url",
  apiKey: "your-sambanova-api-key",
});

const chatCompletion = await client.chat.completions.create({
  messages: [
    { role: "system", content: "Answer the question in a couple sentences." },
    { role: "user", content: "Share a happy story with me" },
  ],
  model: "Meta-Llama-3.3-70B-Instruct",
});

console.log(chatCompletion.choices[0].message.content);
Once the code is in the file, replace the placeholder strings "your-sambanova-base-url" and "your-sambanova-api-key" with your base URL and API key values. Then run the file with the command below in a terminal window.
node hello-world.js
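If you prefer not to hardcode credentials, you can read them from environment variables instead. The following is a minimal sketch: the variable names SAMBANOVA_BASE_URL and SAMBANOVA_API_KEY are illustrative, and the rest of hello-world.js stays the same.
// Sketch: read credentials from environment variables instead of hardcoding them.
// The variable names below are illustrative; export them in your shell first.
import SambaNova from "sambanova";

const client = new SambaNova({
  baseURL: process.env.SAMBANOVA_BASE_URL,
  apiKey: process.env.SAMBANOVA_API_KEY,
});

// ...the rest of hello-world.js is unchanged.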
When you run the program, the assistant's reply is printed to the console. The full response object returned by the API looks similar to the one below.
{
  "id": "d89243f2-de68-416f-85c6-27c244cf9c7f",
  "choices": [
    {
      "finish_reason": "stop",
      "index": 0,
      "message": {
        "content": "One day, a little girl named Sophie found a lost puppy in her neighborhood and decided to take care of it until she could find its owner. As she cared for the puppy, named Max, they formed an unbreakable bond, and when the owner was finally found, they were so grateful to Sophie that they asked her to be Max's permanent dog-sitter, bringing joy and companionship to Sophie's life.",
        "role": "assistant"
      },
      "logprobs": null
    }
  ],
  "created": 1759518870.892972,
  "model": "Meta-Llama-3.3-70B-Instruct",
  "object": "chat.completion",
  "system_fingerprint": "fastcoe",
  "usage": {
    "completion_tokens": 84,
    "prompt_tokens": 49,
    "total_tokens": 133,
    ...
    "stop_reason": "stop"
  }
}
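Beyond the message content, the completion object exposes the metadata shown above, such as the finish reason and token usage. Here is a small sketch that extends the earlier example, assuming the object's field names match the response shown above.
// Inspect metadata on the completion returned by the earlier example.
// Field names follow the response shown above.
const choice = chatCompletion.choices[0];
console.log("Finish reason:", choice.finish_reason);
console.log("Model:", chatCompletion.model);
console.log("Total tokens:", chatCompletion.usage.total_tokens);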