Get started using the SambaNova Cloud API in just a few minutes.

1. Get your API key.

Visit the API section of the SambaNova Cloud portal to generate your API key, and store it securely.

2. Pick a model.

SambaNova Cloud is frequently updated with the latest open-source models, served at high tokens-per-second rates. View the full list on our supported models page, select one, and refer to it by its model ID.

We’ll use Meta-Llama-3.1-405B-Instruct as an example for the remainder of this guide.

3. Make an API request.

You can make an inference request in multiple ways. See two examples below:

  • OpenAI client library – Use JavaScript or Python for a more flexible integration.

  • cURL command – Send a request directly from the command line.

OpenAI client library

To get started, decide which coding language you want to use. Then, install the OpenAI library in a terminal window.

Next, copy the following code into a new file.

Once copied into the file, replace the "<YOUR API KEY>" placeholder with your API key. Then run the file from a terminal window (for example, `python app.py` if you saved the code as `app.py`).

After you run the program, you should see output similar to the following.

Here’s a happy story: One day, a little girl named Sophie found a lost puppy in her neighborhood and decided to take it home to care for it. As she nursed the puppy back to health, she named it Max and the two became inseparable best friends, going on adventures and playing together every day.

cURL command

In a terminal window, run the following cURL command to make your first request to the API.

export API_KEY=<YOUR API KEY>
export URL=https://api.sambanova.ai/v1/chat/completions

curl -X POST "$URL" \
  -H "Authorization: Bearer $API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [
      {"role": "system", "content": "Answer the question in a couple sentences."},
      {"role": "user", "content": "Share a happy story with me"}
    ],
    "stop": ["<|eot_id|>"],
    "model": "Meta-Llama-3.1-405B-Instruct",
    "stream": true,
    "stream_options": {"include_usage": true}
  }'
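Because `stream` is set to `true`, the response arrives as server-sent events, one `data:` line per chunk. The sketch below shows one way to reassemble the reply text from such lines, assuming the OpenAI-compatible chunk format; the sample payloads are illustrative, not real API output.

```python
import json

# Illustrative SSE lines in the OpenAI-compatible streaming chunk format.
sample_stream = [
    'data: {"choices": [{"delta": {"content": "Here\'s a happy story: "}}]}',
    'data: {"choices": [{"delta": {"content": "One day..."}}]}',
    "data: [DONE]",
]


def collect_text(lines):
    """Concatenate the content deltas from SSE 'data:' lines."""
    parts = []
    for line in lines:
        payload = line.removeprefix("data: ").strip()
        if payload == "[DONE]":  # sentinel marking the end of the stream
            break
        chunk = json.loads(payload)
        for choice in chunk.get("choices", []):
            delta = choice.get("delta", {})
            if delta.get("content"):
                parts.append(delta["content"])
    return "".join(parts)


print(collect_text(sample_stream))  # → Here's a happy story: One day...
```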

Next Steps

Now that you can make requests to a model, you are ready to build AI-powered applications. Get inspired by exploring our AI Starter Kits, a collection of open-source Python projects.