SambaStack v0.4.8 Release
Release Date: March 10, 2026

This release introduces air-gapped deployment support, custom checkpoint management with NFS storage, swappable model configurations, and multiple API enhancements for improved OpenAI compatibility.

New Features and Enhancements
SambaStack Air-gapped Support
Added support for air-gapped mode of operation, enabling secure, isolated deployments.
- Install, upgrade, and setup for air-gapped configurations are performed in conjunction with SambaNova support.
- Ongoing administration (Auth, User Management, Custom DB) is designed for self-service and follows the same workflows as on-prem deployments.
Install, setup, port forwarding to access the Keycloak UI, and upgrade steps are not documented for air-gapped deployments because customer network configurations vary. Please work with SambaNova support for these workflows.
Custom Checkpoints with NFS Storage
Added the ability to reference custom checkpoints from customer-provided NFS storage in deployments.

Swappable Models in Bundles
Added configurable model swapping behavior to optimize high-bandwidth memory (HBM) utilization.
- By default, all models in bundles can be swapped out of HBM and replaced with other models in DDR memory.
- Use the `swappable: <boolean>` field in the bundle YAML definition to enable or disable this behavior.
- Default value is `true`. When set to `false`, the model remains in HBM and cannot be swapped out, ensuring zero switching time for requests to that model.
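A minimal sketch of how the field might appear in a bundle YAML definition. Only `swappable` is documented here; the surrounding keys and model names are illustrative placeholders, not the actual bundle schema.

```yaml
# Hypothetical bundle definition -- only the `swappable` field is from the docs.
models:
  - name: Llama-4-Maverick-17B-128E-Instruct
    swappable: false   # pinned in HBM: zero switching time for its requests
  - name: Qwen3-32B
    swappable: true    # default: may be swapped out to DDR to free HBM
```

Pinning a latency-sensitive model with `swappable: false` trades HBM capacity for predictable response times on that model.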
API Improvements and Fixes
Enhancements to improve OpenAI compatibility across the `chat/completions` endpoint.
Text Object Support in User Message Content
- Added support for text objects in content arrays, matching the OpenAI `ChatCompletionsContentPartText` specification.
- Enabled for: gpt-oss-120b, DeepSeek-V3.1, DeepSeek-V3.1-Terminus, DeepSeek-V3.2, DeepSeek-V3-0324, Qwen3-32B, Qwen3-235B.
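A sketch of a request body using text objects in the content array, following the OpenAI shape where `content` is a list of `{"type": "text", "text": ...}` parts rather than a plain string. The model name is one from the enabled list; the prompt text is illustrative.

```python
import json

# User message whose content is an array of text parts instead of a string.
payload = {
    "model": "DeepSeek-V3.1",
    "messages": [
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Summarize the release notes"},
                {"type": "text", "text": "in two sentences."},
            ],
        }
    ],
}

body = json.dumps(payload)  # serialized request body for chat/completions
```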
Response Format Text Option
- Fixed an issue where `response_format=text` would throw an error.
- The endpoint now supports all OpenAI formats: `text`, `json_object`, and `json_schema`.
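The three `response_format` variants, shaped per the OpenAI specification (where `json_schema` additionally carries the schema itself). The model name and schema contents are illustrative assumptions.

```python
# The three OpenAI response_format types now accepted by chat/completions.
formats = [
    {"type": "text"},
    {"type": "json_object"},
    {
        "type": "json_schema",
        "json_schema": {
            "name": "colors",
            "schema": {
                "type": "object",
                "properties": {
                    "colors": {"type": "array", "items": {"type": "string"}}
                },
                "required": ["colors"],
            },
        },
    },
]

request = {
    "model": "Qwen3-32B",  # illustrative model name
    "messages": [{"role": "user", "content": "List three colors."}],
    "response_format": formats[1],  # ask for a JSON object response
}
```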
Extended Temperature Range
- Expanded the `temperature` range from 0.0–1.0 to 0.0–2.0, matching the OpenAI specification.
Tool Calling Number Type Fix
- Fixed an issue where number-type tool arguments were always returned as floats.
- Integer values are now preserved as integers, matching the JSON Schema number specification.
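The distinction in a nutshell: JSON Schema's `number` type admits both integers and floats, so an integer-valued argument should round-trip as an integer. A small sketch with hypothetical tool arguments:

```python
import json

# JSON Schema "number" covers both 3 and 0.5; the fix keeps each as typed.
args = json.loads('{"count": 3, "threshold": 0.5}')
```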
Parallel Tool Calls Support
- Added support for the `parallel_tool_calls` parameter.
- When set to `false`, the model makes at most one tool call per response, matching the OpenAI specification.
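A sketch of a request that declares a tool and disables parallel tool calls; the model and tool definition are illustrative, and only `parallel_tool_calls` is the feature being shown.

```python
# Request that limits the model to at most one tool call per response.
request = {
    "model": "DeepSeek-V3.1",  # illustrative model name
    "messages": [
        {"role": "user", "content": "What's the weather in Paris and Tokyo?"}
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",  # hypothetical tool
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
    "parallel_tool_calls": False,
}
```

With `parallel_tool_calls: false`, a prompt that would otherwise trigger two calls (Paris and Tokyo) yields one call, and the second city requires a follow-up turn.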
Streaming Token Usage Reporting
- Added support for token usage reporting in each chunk of a streamed response.
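A sketch of consuming streamed chunks while reading the per-chunk usage. The chunk dictionaries below are stand-ins for parsed server-sent events; their exact field layout is an assumption modeled on the OpenAI chunk shape.

```python
# Hypothetical parsed stream chunks, each carrying a "usage" object.
chunks = [
    {"choices": [{"delta": {"content": "Hel"}}], "usage": {"completion_tokens": 1}},
    {"choices": [{"delta": {"content": "lo"}}], "usage": {"completion_tokens": 2}},
]

text = ""
latest_usage = None
for chunk in chunks:
    for choice in chunk["choices"]:
        text += choice["delta"].get("content", "")
    latest_usage = chunk.get("usage")  # usage is reported in every chunk
```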
Known Issues
- Parallel Tool Calls with Constrained Decoding
- The following models return `null` for `logprobs` even when `logprobs=true` or `top_logprobs` is set. The parameters are accepted without error but have no effect:
  - Llama-4-Maverick-17B-128E-Instruct
  - Whisper-Large-v3
SambaStack Initial Release
Release Date: September 19, 2025

This release introduces the comprehensive SambaStack documentation suite.

New Features and Enhancements
SambaStack Guide
Added the SambaStack Guide, providing step-by-step instructions for deploying, configuring, and managing SambaStack.
- Setup, installation, and environment configuration
- User and authentication management (Keycloak, OIDC)
- Monitoring, logging, and artifact management
- Bundle and model deployment workflows
- Common command reference
SambaStack Models
Added the SambaStack models and bundles page to help customers understand which models are available on SambaStack and how to configure them.
- Lists all supported models (e.g., Llama 3.3, Llama 4 Maverick, DeepSeek).
- Shows context length, batch size options, and supported features.
- Instructions for using the Model list API to check availability in your environment.
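A sketch of checking availability from a Model list API response. The response body below is hypothetical, modeled on the OpenAI-style list shape; consult the models and bundles page for the actual endpoint and fields.

```python
import json

# Hypothetical Model list API response (OpenAI-style list object).
response_body = json.dumps({
    "object": "list",
    "data": [
        {"id": "Llama-4-Maverick-17B-128E-Instruct", "object": "model"},
        {"id": "DeepSeek-V3.1", "object": "model"},
    ],
})

# Extract the model IDs available in this environment.
available = [m["id"] for m in json.loads(response_body)["data"]]
```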
