SambaFlow learning map

Welcome! This doc page is a learning map for users new to SambaNova. It helps you see the big picture and find the information you need quickly. Here’s an overview:

SambaFlow in the software stack

Get the big picture: Architecture and concepts

Many of us learn best by understanding the big picture first — having a look at a map before exploring unknown territory. The doc set includes several pages that help you get oriented (or dig deep after initial exploration with the code).

Learn by doing: Tutorials and Model Zoo

Many of us learn best by doing. You can start with the new Model Zoo offering or choose from our older (and simpler) tutorials.

Model Zoo

The Model Zoo initiative includes a new public GitHub repo with these components:

  • The code for several open source LLMs that have been customized to run efficiently on RDU.

  • A set of example apps for compile/train and compile/inference workflows.

  • For each model, a commented YAML configuration file that allows you to run the model using the example apps (with minor modifications).

See the Model Zoo documentation and the SambaNova Model Zoo GitHub repo for details.

The typical workflow in the Model Zoo scenario is like this:

  1. Clone the public GitHub repo.

  2. Select a model, and download a checkpoint for that model from Hugging Face.

  3. Prepare the data that you want to use to fine-tune the model.

  4. Use the training example app with the compile option to compile the model and generate a PEF file. Both the example app and the customized model were downloaded from the SambaNova public GitHub repo.

  5. Use the training example app with the train option to fine-tune the model. You pass in the generated PEF file and the data you prepared for fine tuning.

  6. After you’ve completed training the model with your custom data, you can use the resulting custom checkpoint for generation (inference).
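As a sketch, the workflow above might look like the following shell session. The repository URL, script names, and flags are placeholders, not the actual Model Zoo commands; check the Model Zoo documentation and the repo README for the real invocations.

```shell
# Illustrative sketch only: the repo URL, script names, and flags below are
# placeholders, not the exact Model Zoo commands.

# 1. Clone the public GitHub repo.
git clone <modelzoo-repo-url>
cd <modelzoo-repo-dir>

# 2. Download a checkpoint for your chosen model from Hugging Face.

# 3. Prepare the fine-tuning data in the format the example app expects.

# 4. Compile: generates a PEF file that runs on the RDU.
python <training_example_app>.py compile --config <model>.yaml ...

# 5. Train: pass in the generated PEF file and your prepared data.
python <training_example_app>.py train --pef <generated>.pef ...
```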

Tutorials: GitHub and doc

Choose one of our tutorials to come up to speed quickly. For the tutorials, you clone the repo, then compile and train directly on your SambaNova system. The tutorials use an earlier architecture, so some aspects of running on RDU are not shown in the code examples. On the other hand, the tutorials are simple and include extensive code discussions, with special emphasis on comparing code on RDU with code on CPU.

  • Find tutorial code and a README with instructions in our sambanova/tutorials public GitHub repo.

  • For each tutorial, explore the code discussion in this doc set, which has a special focus on how code for running on RDU is different from code in other environments.

  • The learning map above points to some additional materials. For example, even if you’re trying out a simple model, the API Reference is useful.

The tutorials in this doc set use different code than the tutorials included in /opt/sambaflow/apps. The tutorial examples have been updated and streamlined.
Table 1. Tutorial code, README, and code discussion

| Tutorial | Description | Code and README | Code discussion |
| --- | --- | --- | --- |
| Hello SambaFlow (logreg) | Learn how to compile and run training. The tutorial code downloads the dataset. | hello_world on GitHub | Examine logreg model code |
| Intermediate (lenet) | Step through a complete machine learning workflow: data preparation, compilation and training, and compilation and inference. | lenet on GitHub | Examine LeNet model code |
| Conversion 101 | Learn how to convert a PyTorch model to run on RDU by looking at a simple CNN model. Includes two solutions: one uses an integrated loss function, the other an external loss function. | Basics in Convert a simple model to SambaFlow | Examine functions and changes; Examine model code with external loss function |
| Transformers on RDU | Use a pretrained Hugging Face GPT-2 model on RDU. Discusses data preparation, compilation and training, and compilation and inference. The code is in two separate files. | Generative NLP on GitHub | Code elements of the training program; Code elements of the inference program |

Reference

We include the following reference documentation for SambaFlow:

Data preparation, SambaNova Runtime, and SambaTune

The following resources in this doc set or elsewhere might help you learn more:

  • Data preparation scripts. We have a public GitHub repository with two scripts for pretraining data preparation, pipeline.py and data_prep.py.

  • SambaNova Runtime documentation. Information on logs, fault management, and other lower-level procedures.

  • SambaTune documentation. SambaNova tool for performance optimization (advanced).
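The data preparation scripts mentioned above implement SambaNova-specific pipelines; as a generic, stdlib-only illustration of the kind of transformation such scripts perform (raw text in, fixed-size records in JSON Lines out), here is a minimal sketch. The function names and the character-based chunking scheme are assumptions for illustration, not code from pipeline.py or data_prep.py.

```python
# Generic illustration of pretraining data preparation: read raw text,
# split it into fixed-length chunks, and emit one JSON record per chunk.
# This is NOT the code in pipeline.py or data_prep.py.
import json


def chunk_text(text: str, chunk_size: int = 64) -> list[str]:
    """Split text into consecutive chunks of at most chunk_size characters."""
    return [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]


def to_jsonl(chunks: list[str]) -> str:
    """Serialize chunks as JSON Lines, one {"text": ...} record per line."""
    return "\n".join(json.dumps({"text": c}) for c in chunks)


if __name__ == "__main__":
    raw = "SambaFlow is the software stack for SambaNova RDU systems. " * 4
    records = chunk_text(raw)
    print(to_jsonl(records).splitlines()[0])
```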