Oumi is an open-source platform that simplifies the entire lifecycle of foundation models, from data preparation and training to evaluation and deployment. To learn more about using the SambaNova inference engine through Oumi, check out the Oumi SambaNova inference engine API reference. This guide provides detailed instructions on how to integrate and use the SambaNova engine within the Oumi platform.

Prerequisites

Before getting started, ensure you have:

- A SambaNova Cloud API key, exposed through the `SAMBANOVA_API_KEY` environment variable (the code example below reads the key from this variable)
- The Oumi Python package installed
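A minimal setup sketch for the prerequisites above. The package name `oumi` is inferred from the imports in the example below, and the placeholder key value is an assumption you must replace with your own:

```shell
# Install Oumi, which provides oumi.inference and oumi.core.configs
pip install oumi

# Export your SambaNova Cloud API key so the engine can read it at runtime
export SAMBANOVA_API_KEY="your-api-key-here"
```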

Example

from oumi.inference import SambanovaInferenceEngine
from oumi.core.configs import ModelParams, RemoteParams

# Create an inference engine that targets a SambaNova-hosted model.
engine = SambanovaInferenceEngine(
    model_params=ModelParams(
        model_name="Meta-Llama-3.3-70B-Instruct"
    ),
    remote_params=RemoteParams(
        # The API key is read from the SAMBANOVA_API_KEY environment variable.
        api_key_env_varname="SAMBANOVA_API_KEY",
    )
)
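The example above constructs the engine but does not run a request. The sketch below shows one plausible way to send a prompt, assuming Oumi's `Conversation`, `Message`, and `Role` types and the engine's `infer` method; it requires a valid `SAMBANOVA_API_KEY` and network access, and the exact import path and call signature should be checked against the Oumi API reference:

```python
# Hedged usage sketch: the Conversation/Message/Role import path and the
# engine.infer call are assumptions based on Oumi's public API, not taken
# from this guide. Requires a live SambaNova API key to actually run.
from oumi.core.types.conversation import Conversation, Message, Role

conversation = Conversation(
    messages=[
        Message(role=Role.USER, content="What is the capital of France?")
    ]
)

# Sends the conversation to the SambaNova-hosted model and returns the
# conversations with the model's replies appended.
results = engine.infer([conversation])
for result in results:
    print(result.messages[-1].content)
```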