This repository contains scripts to set up an environment for the exercises of the Machine Learning course held by Prof. Iocchi.
The repo has been developed with the contributions of the ML tutors Ermanno Bartoli and Francesco Frattolillo.
To have a ready environment without manually installing all the libraries and dependencies, we use Docker.
To install Docker on your PC, you can follow the following guide:
NB: It's important that you add your user to the docker group, and log out and back in, before proceeding.
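On most Linux distributions this is done with the standard Docker post-install command (assuming you have sudo rights):

sudo usermod -aG docker $USER

After logging out and back in, you can verify that Docker works without sudo:

docker run hello-world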
Note: skip this section if you do not have an NVIDIA GPU.
To run the tensorflow-gpu container you need an NVIDIA GPU, and the host machine requires the NVIDIA driver (you don't need the NVIDIA CUDA Toolkit). Follow the remaining steps described on the TensorFlow website; in particular, install the NVIDIA Container Toolkit by following its installation guide.
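Once the driver and the toolkit are installed, a quick sanity check is to run nvidia-smi on the host and then inside a throwaway container (the CUDA image tag below is only an example; any available nvidia/cuda base tag works):

nvidia-smi
docker run --rm --gpus all nvidia/cuda:12.2.0-base-ubuntu22.04 nvidia-smi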
The standard Docker image used in this course is tensorflow 2.13.0-gpu-jupyter, with both GPU support and Jupyter Notebook preinstalled.
This image also works on CPU-only machines.
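If you want to fetch the base image directly, it is published on Docker Hub (the build step below will pull it automatically anyway, so this is optional):

docker pull tensorflow/tensorflow:2.13.0-gpu-jupyter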
You'll need to follow these few steps:
- Clone the repository with the following command:
git clone https://github.com/iocchi/MLexercises.git
- Go inside the repository and create a folder called notebooks:
cd MLexercises
mkdir notebooks
- Build the Docker image by running the script:
bash build.bash
Once you're ready, if in the previous step you built the image with the build.bash script, you can run it:
with GPU support:
bash rungpu.bash
or without GPU support:
bash run.bash
You can also build and run the image with direct commands instead of using the scripts.
Build an image:
docker build -t NAME_OF_IMAGE .
Run the image with GPU support:
nvidia-docker run --name NAME_OF_IMAGE --rm -p 8888:8888 NAME_OF_IMAGE
or without GPU support:
docker run --name NAME_OF_IMAGE --rm -p 8888:8888 NAME_OF_IMAGE
NB: -p 8888:8888 maps the container's Jupyter port 8888 to the same port on the host; always keep it as it is, so the notebook is reachable at localhost:8888.
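Note that the standalone nvidia-docker command is deprecated; with a recent Docker Engine and the NVIDIA Container Toolkit installed, the equivalent of the GPU command above uses the --gpus flag:

docker run --gpus all --name NAME_OF_IMAGE --rm -p 8888:8888 NAME_OF_IMAGE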
Since Google Colab has some limitations but a well-structured interface, you can connect Colab to a local runtime and use the computational power of your machine.
- First of all, upload your local .ipynb file to Colab.
- Once you have uploaded your file the first time, Colab will automatically save and update it (CTRL+S) on your Drive. The next time you want to work on this file, just open it from Google Drive.
- Connect Google Colab to the local runtime (after running the Docker container):
- Write http://localhost:8888/?token= as shown below:
- Add the token shown when executing the command in the Usage section.
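For example, if the container logs reported the token abc123 (a hypothetical value; yours will be a long hexadecimal string), the backend URL to paste into Colab would be:

http://localhost:8888/?token=abc123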
To test your image, use the first_notebook.ipynb available in the test directory.
To stop the container, you can press CTRL-C in the terminal where you launched it, or issue the following command in another terminal:
docker stop mlnotebook
If you want to develop and run code locally (without Colab), you should mount a local folder into the container, write your Python code there, and run the scripts from inside the container.
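A minimal sketch of such a command, assuming the image's default working directory is /tf (as in the official TensorFlow Jupyter images) and reusing the container name mlnotebook and the notebooks folder created earlier:

docker run --name mlnotebook --rm -p 8888:8888 -v "$(pwd)/notebooks:/tf/notebooks" NAME_OF_IMAGE

Anything you save under /tf/notebooks inside the container then persists in your local notebooks folder.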
Although not recommended, if you change something inside the running container and you want to keep the changes, don't forget to commit them to an image with the following command:
docker commit CONTAINER_ID NAME_OF_IMAGE
NB: you can see the ID of the running container with the command:
docker ps
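For example, assuming docker ps reported the (hypothetical) container ID a1b2c3d4e5f6:

docker commit a1b2c3d4e5f6 NAME_OF_IMAGE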