Cerebras Model Zoo

Introduction

This repository contains examples of common deep learning models that can be trained on Cerebras hardware. These models demonstrate best practices for coding a model targeted at Cerebras hardware so that you can take full advantage of this powerful compute engine.

To get started with running your models on a Cerebras system, please refer to the Developer Documentation along with this README.

NOTE: If you are interested in trying out the Cerebras Model Zoo on Cerebras hardware (CS-2 systems), we offer the following options:

  • Academics - Please fill out our Partner Hardware Access Request form here and we will contact you about gaining access to a system from one of our partners.
  • Commercial - Please fill out our Get Demo form here so that our team can provide you with a demo and discuss access to our system.
  • For all others - Please contact us at [email protected].

For a list of all supported models, please see the Models in this repository section below.

Installation

To install the Cerebras Model Zoo on the CSX system, please follow the instructions in PYTHON-SETUP.md.

Supported frameworks

We support models developed in PyTorch.
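
To give a flavor of what a Model Zoo implementation looks like, below is a minimal, self-contained PyTorch sketch in the spirit of the MNIST (fully connected) model listed further down. The class name, layer sizes, and structure are illustrative assumptions, not the repository's actual code.

```python
import torch
import torch.nn as nn


class MNISTFullyConnected(nn.Module):
    """Illustrative fully connected classifier for 28x28 MNIST images.

    A sketch in the spirit of the Model Zoo's MNIST example; not the
    repository's actual implementation.
    """

    def __init__(self, hidden_size: int = 256, num_classes: int = 10):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),                     # (N, 1, 28, 28) -> (N, 784)
            nn.Linear(28 * 28, hidden_size),
            nn.ReLU(),
            nn.Linear(hidden_size, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


if __name__ == "__main__":
    model = MNISTFullyConnected()
    dummy_batch = torch.randn(8, 1, 28, 28)   # batch of 8 fake images
    logits = model(dummy_batch)
    print(logits.shape)                        # torch.Size([8, 10])
```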

Basic workflow

When you are targeting the Cerebras Wafer-Scale Cluster for your neural network jobs, follow the quick start guide in the developer docs to compile, validate, and train the models in this Model Zoo for the framework of your choice.

For advanced use cases and for porting your existing code, please refer to the developer docs.
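
For orientation, the sketch below suggests the general shape of compiling and training a model against the Cerebras CSX backend. The import path and calls (cerebras.pytorch, cstorch.backend, cstorch.compile, cstorch.trace) are assumptions drawn from the developer docs rather than verified against this repository; the quick start guide and the run scripts shipped with each model remain the authoritative workflow.

```python
# Sketch only -- the cerebras.pytorch names below (cstorch.backend,
# cstorch.compile, cstorch.trace) are assumptions based on the developer
# docs; follow the quick start guide for the exact, supported workflow.
import torch
import cerebras.pytorch as cstorch  # assumed import path for the Cerebras PyTorch API

from mnist_sketch import MNISTFullyConnected  # hypothetical module holding the sketch above

backend = cstorch.backend("CSX")                  # target a Cerebras Wafer-Scale Cluster
model = MNISTFullyConnected()
compiled_model = cstorch.compile(model, backend)  # stage the model for compilation on CSX

loss_fn = torch.nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)


@cstorch.trace  # capture the step so it can be compiled and executed on the system
def training_step(images, labels):
    logits = compiled_model(images)
    loss = loss_fn(logits, labels)
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    return loss
```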

Models in this repository

| Model | Code pointer |
| --- | --- |
| BERT | Code |
| BERT (fine-tuning) Classifier | Code |
| BERT (fine-tuning) Named Entity Recognition | Code |
| BERT (fine-tuning) Summarization | Code |
| BLOOM | Code |
| BTLM | Code |
| DiT | Code |
| DPO | Code |
| DPR | Code |
| ESM-2 | Code |
| Falcon | Code |
| GPT-2 | Code |
| GPT-3 | Code |
| GPT-J | Code |
| GPT-NeoX | Code |
| GPT-J (fine-tuning) Summarization | Code |
| JAIS | Code |
| LLaMA, LLaMA-2 and LLaMA-3 | Code |
| LLaVA | Code |
| Mistral | Code |
| Mixtral of Experts | Code |
| MNIST (fully connected) | Code |
| Multimodal Simple | Code |
| MPT | Code |
| RoBERTa | Code |
| SantaCoder | Code |
| StarCoder | Code |
| Transformer | Code |
| T5 | Code |

License

Apache License 2.0
