Instructor enables structured output generation with SambaNova models. It allows large language models (LLMs) to produce responses in predefined formats, such as JSON, XML, or custom data schemas, ensuring consistency and making the output easier to parse and integrate into downstream systems. This functionality is particularly valuable for APIs, automation pipelines, and AI-driven applications that require reliable and predictable outputs.
Prerequisites
- A SambaCloud account and API key
Installation
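A typical setup installs Instructor from PyPI along with the openai package, whose OpenAI-compatible client is used in the examples below to reach SambaCloud (package choice is an assumption for this sketch, not confirmed by this page):

```shell
pip install instructor openai
```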
Basic usage
The following code demonstrates how to use the SambaCloud API with Instructor to generate structured output from the Meta-Llama-3.3-70B-Instruct model. A User schema is defined with Pydantic, requiring the model to return a response with a name (string) and an age (integer). Instructor handles response validation and parsing, returning a structured Python object.
Async usage
This code uses the SambaCloud API with Instructor to enforce the same structured output asynchronously. The result is awaited and printed, producing User(name='Ivan', age=28).

