Lightning-Universe/lightning-bolts

Deep Learning components for extending PyTorch Lightning


Installation · Latest Docs · Stable Docs · About · Community · Website · License



Getting Started

Pip / Conda

pip install lightning-bolts
Other installations

Install bleeding-edge (no guarantees)

pip install https://github.com/Lightning-Universe/lightning-bolts/archive/refs/heads/master.zip

To install all optional dependencies

pip install lightning-bolts["extra"]

What is Bolts?

The Bolts package provides a variety of components to extend PyTorch Lightning, such as callbacks and datasets, for applied research and production.
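
As a quick illustration, here is a minimal sketch of dropping Bolts components into a standard Lightning Trainer, assuming the CIFAR10DataModule and PrintTableMetricsCallback components that ship with Bolts:

from pytorch_lightning import Trainer

from pl_bolts.callbacks import PrintTableMetricsCallback
from pl_bolts.datamodules import CIFAR10DataModule

# Bolts datamodule: downloads CIFAR-10 and builds train/val/test dataloaders
datamodule = CIFAR10DataModule(data_dir=".")

# Bolts callback: prints logged metrics as a table after every epoch
trainer = Trainer(max_epochs=1, callbacks=[PrintTableMetricsCallback()])

# `model` can be any LightningModule, such as the VisionModel in the examples below
# trainer.fit(model, datamodule=datamodule)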

Example 1: Accelerate Lightning Training with the Torch ORT Callback

Torch ORT converts your model into an optimized ONNX graph, speeding up training & inference when using NVIDIA or AMD GPUs. See the documentation for more details.

from pytorch_lightning import LightningModule, Trainer
import torchvision.models as models
from pl_bolts.callbacks import ORTCallback


class VisionModel(LightningModule):
    def __init__(self):
        super().__init__()
        self.model = models.vgg19_bn(pretrained=True)

    ...


model = VisionModel()
trainer = Trainer(gpus=1, callbacks=ORTCallback())
trainer.fit(model)
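
Note that ORTCallback relies on the separate torch-ort package being installed and configured in your environment; at the time of writing the usual setup (verify against the Torch ORT documentation) is:

pip install torch-ort
python -m torch_ort.configure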

Example 2: Introduce Sparsity with the SparseMLCallback to Accelerate Inference

We can introduce sparsity during fine-tuning with SparseML, which lets us leverage the DeepSparse engine for faster inference.

from pytorch_lightning import LightningModule, Trainer
import torchvision.models as models
from pl_bolts.callbacks import SparseMLCallback


class VisionModel(LightningModule):
    def __init__(self):
        super().__init__()
        self.model = models.vgg19_bn(pretrained=True)

    ...


model = VisionModel()
trainer = Trainer(gpus=1, callbacks=SparseMLCallback(recipe_path="recipe.yaml"))
trainer.fit(model)
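
Once fine-tuning finishes, the sparsified model still needs to be exported to ONNX before the DeepSparse engine can run it. A minimal sketch, assuming the SparseMLCallback.export_to_sparse_onnx helper described in the Bolts documentation (check the current signature before relying on it):

import torch

from pl_bolts.callbacks import SparseMLCallback

model = VisionModel()  # the module fine-tuned with the SparseML recipe above

# Export the pruned weights to an ONNX file that DeepSparse can load
SparseMLCallback.export_to_sparse_onnx(
    model,
    "onnx_export/",
    sample_batch=torch.randn(1, 3, 224, 224),
)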

Are specific research implementations supported?

We'd like to encourage users to contribute general components that help a broad range of problems; however, components that help specific domains are also welcome!

For example, a callback to help train SSL models would be a great contribution; however, the next state-of-the-art SSL model from your latest paper would be a better contribution to Lightning Flash.

Use Lightning Flash to train, predict and serve state-of-the-art models for applied research. We suggest looking at our VISSL Flash integration for SSL-based tasks.

Contribute!

Bolts is supported by the PyTorch Lightning team and the PyTorch Lightning community!

Join our Slack and/or read our CONTRIBUTING guidelines to get help becoming a contributor!


License

Please observe the Apache 2.0 license that is listed in this repository. In addition, the Lightning framework is Patent Pending.