leaf, the minimal deep learning framework


At its core, leaf is a NumPy-only implementation of the well-known PyTorch framework. Its convolutional operations are accelerated with Rust binaries, enabling a robust deep learning API without a GPU. GPU support is provided through pyopencl, which gives Python access to parallel compute devices.

Setup and install

The first thing you are going to need is a Rust compiler. The easiest way to install one is with the toolchain manager rustup; follow the installation instructions on the rustup website.

The second step is to create and activate a virtual environment. This can be done with the venv module like this:

$ python3 -m venv my-venv
$ source my-venv/bin/activate
(my-venv) $ 

The next step is to install the required Python libraries. The easiest way to do this is with pip:

$ python3 -m pip install -r requirements.txt

The last step is to compile the Rust binary so that the Rust modules can be accessed from Python. A Makefile builds everything; simply run make. (To remove all compiled binaries and clean up the Rust directory, run make clean instead.)

Now you can verify that everything has been set up correctly by running the pyvsrust.py script, which runs a minimal benchmark test on the Rust binaries.
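The pyvsrust.py script times the pure-NumPy convolution against the Rust-accelerated one. Purely as an illustration of what such a benchmark measures (this is a hedged sketch, not the actual script), here is a minimal pure-NumPy timing of a naive 2D convolution:

```python
import time
import numpy as np

def naive_conv2d(image, kernel):
    """Direct 2D 'valid' convolution with Python loops (slow reference)."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

rng = np.random.default_rng(0)
image = rng.standard_normal((64, 64))
kernel = rng.standard_normal((3, 3))

start = time.perf_counter()
result = naive_conv2d(image, kernel)
elapsed = time.perf_counter() - start
print(f"naive conv2d on 64x64: {elapsed:.4f}s, output shape {result.shape}")
```

A compiled Rust kernel performs the same sliding-window arithmetic without Python's per-iteration interpreter overhead, which is where the speedup comes from.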

Example

Below is a brief example that creates Tensors, performs an operation on them, and aggregates the result with a reduce operation, which lets the autograd framework calculate their respective gradients.

from leaf import Tensor

x = Tensor([[-1.4, 2.3, 5.9]], requires_grad=True)
w = Tensor.eye(3, requires_grad=True)

y = x.dot(w).mean()
y.backward()

print(x.grad)  # dy/dx
print(w.grad)  # dy/dw
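As a sanity check, the same gradients can be derived by hand with plain NumPy, assuming leaf follows PyTorch's matmul and mean backward rules (this reimplements the math, not leaf's internals):

```python
import numpy as np

x = np.array([[-1.4, 2.3, 5.9]])
w = np.eye(3)

# y = mean(x @ w): the mean over 3 outputs backpropagates 1/3 to each element.
dy = np.full((1, 3), 1.0 / 3.0)
dx = dy @ w.T   # dy/dx: matmul backward w.r.t. the left operand
dw = x.T @ dy   # dy/dw: matmul backward w.r.t. the right operand

print(dx)  # every entry is 1/3, since w is the identity
print(dw)  # row k holds x[0, k] / 3 in every column
```

These arrays are what x.grad and w.grad should contain after y.backward().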

Neural networks

This is a work in progress. Neural network layers will be implemented under the leaf.nn module.
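Since leaf.nn does not exist yet, here is a purely hypothetical NumPy sketch of the kind of PyTorch-style layer such a module might contain (the class name and API are assumptions, not part of leaf):

```python
import numpy as np

class Linear:
    """Hypothetical fully connected layer, sketched with NumPy only."""

    def __init__(self, in_features, out_features):
        # Small random weights and a zero bias vector.
        rng = np.random.default_rng(0)
        self.weight = rng.standard_normal((in_features, out_features)) * 0.01
        self.bias = np.zeros(out_features)

    def __call__(self, x):
        # Affine transform: x @ W + b
        return x @ self.weight + self.bias

layer = Linear(3, 2)
out = layer(np.array([[-1.4, 2.3, 5.9]]))
print(out.shape)  # (1, 2)
```

In a real leaf.nn, the parameters would presumably be leaf Tensors with requires_grad=True so that backward() could compute their gradients.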

License

All code is held under a general MIT license; please see LICENSE for specific information.
