Reorganize the whole files and folders #51

Open
wants to merge 10 commits into master
39 changes: 35 additions & 4 deletions README.md
@@ -1,11 +1,42 @@
# MXNet Notebooks
# Python Notebooks for MXNet

This repo contains various notebooks, ranging from basic usage of MXNet to
state-of-the-art deep learning applications.

## How to use
## Outline

### Basic Concepts

#### Section-1

* NDArray: manipulating multi-dimensional arrays
* Symbol: symbolic expressions for neural networks
* Module: intermediate-level and high-level interfaces for neural network training and inference
* Loading data: feeding data into training/inference programs
* Mixed programming: developing training algorithms by using NDArray and Symbol together (a minimal sketch follows this list)
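
For orientation, here is a minimal sketch of how these pieces fit together (it assumes MXNet is installed and importable as `mxnet`; the notebooks cover each concept in depth):

```python
import mxnet as mx

# NDArray: imperative operations on multi-dimensional arrays
a = mx.nd.ones((2, 3))
b = a * 2 + 1

# Symbol: declare a small network symbolically
data = mx.sym.Variable('data')
net = mx.sym.FullyConnected(data=data, num_hidden=10, name='fc1')
net = mx.sym.SoftmaxOutput(data=net, name='softmax')

# Module: bind the symbol to concrete shapes for training and inference
mod = mx.mod.Module(symbol=net, data_names=['data'], label_names=['softmax_label'])
mod.bind(data_shapes=[('data', (32, 100))], label_shapes=[('softmax_label', (32,))])
mod.init_params()
```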

#### Section-2
* MNIST: basic use of MXNet on the [MNIST](http://yann.lecun.com/exdb/mnist/) dataset
* Optimizer: in gradient-based optimization algorithms, the parameters are updated with the gradients at each iteration; this updating function is called the optimizer (a toy update is sketched after this list)
* Image Data IO: how to prepare, load, and train with image data in MXNet
* Record IO: the Python interface for reading and writing RecordIO files
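
As a toy illustration of the optimizer item above, here is a plain SGD update written with NDArray (the numbers are made up):

```python
import mxnet as mx

lr = 0.1                               # learning rate
weight = mx.nd.array([1.0, 2.0, 3.0])  # current parameters
grad = mx.nd.array([0.5, -0.5, 1.0])   # gradients from the current iteration

# The "updating function": step the parameters against the gradient.
weight -= lr * grad
print(weight.asnumpy())                # weight is now [0.95, 2.05, 2.9]
```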


### Python
### Tutorials

* MNIST: Recognize handwritten digits with multilayer perceptrons and convolutional neural networks
* Recognize image objects with a model pre-trained on the full ImageNet dataset, which contains more than 10M images and over 10K classes
* Char-LSTM: Generate Obama-style speeches with a character-level LSTM.
* Matrix Factorization: Recommend movies to users (a minimal sketch of the model follows this list).
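
As an example of what the last tutorial builds, a minimal matrix factorization model can be written symbolically as a dot product of user and item embeddings. This is only a sketch; `max_user`, `max_item`, and `k` are assumed hyperparameters:

```python
import mxnet as mx

max_user, max_item, k = 1000, 2000, 64  # assumed sizes: users, items, latent factors

user = mx.sym.Variable('user')
item = mx.sym.Variable('item')
score = mx.sym.Variable('score')

# Look up a k-dimensional latent vector for each user and item
user_weight = mx.sym.Embedding(data=user, input_dim=max_user, output_dim=k, name='user_embed')
item_weight = mx.sym.Embedding(data=item, input_dim=max_item, output_dim=k, name='item_embed')

# The predicted rating is the dot product of the two latent vectors
pred = mx.sym.sum_axis(data=user_weight * item_weight, axis=1)
pred = mx.sym.Flatten(data=pred)
pred = mx.sym.LinearRegressionOutput(data=pred, label=score, name='pred')
```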


### How Tos
* Use a pretrained 50-layer [Deep Residual Learning](https://arxiv.org/abs/1512.03385) (ResNet) model for prediction and feature extraction (see the sketch after this list)
* Fine-tune the [Deep Residual Learning](https://arxiv.org/abs/1512.03385) (ResNet) model.
* Use a pretrained [Inception-BatchNorm Network](https://arxiv.org/abs/1502.03167).
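
A rough sketch of the first item above: load a ResNet-50 checkpoint and run one forward pass. The checkpoint prefix `resnet-50`, epoch `0`, and the 224x224 input size are assumptions; use whatever files the notebook actually downloads.

```python
import numpy as np
import mxnet as mx

# Load the symbol and parameters saved as resnet-50-symbol.json / resnet-50-0000.params
sym, arg_params, aux_params = mx.model.load_checkpoint('resnet-50', 0)

# Bind for inference only, with a single 3x224x224 image per batch
mod = mx.mod.Module(symbol=sym, label_names=None)
mod.bind(for_training=False, data_shapes=[('data', (1, 3, 224, 224))])
mod.set_params(arg_params, aux_params, allow_missing=True)

# Forward one (dummy) image in NCHW layout and read back the class probabilities
batch = mx.io.DataBatch([mx.nd.array(np.zeros((1, 3, 224, 224)))], [])
mod.forward(batch)
prob = mod.get_outputs()[0].asnumpy()
print(prob.shape)  # (1, num_classes)
```

Feature extraction follows the same pattern, except the loaded symbol is truncated at an internal layer before binding.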


## How to use

The Python notebooks are written in [Jupyter](http://jupyter.org/).

@@ -52,7 +83,7 @@ The python notebooks are written in [Jupyter](http://jupyter.org/).

## How to develop

Some general guidelines
Some general guidelines:

- A notebook covers a single concept or application
- Try to be as basic as possible. Put advanced usage at the end, and allow readers to skip it.
14 changes: 14 additions & 0 deletions basic/README.md
@@ -0,0 +1,14 @@
# Outline

## Section-1
* NDArray: manipulating multi-dimensional arrays
* Symbol: symbolic expressions for neural networks
* Module: intermediate-level and high-level interfaces for neural network training and inference
* Loading data: feeding data into training/inference programs
* Mixed programming: developing training algorithms by using NDArray and Symbol together

## Section-2
* MNIST: basic use of MXNet on the [MNIST](http://yann.lecun.com/exdb/mnist/) dataset
* Optimizer: in gradient-based optimization algorithms, the parameters are updated with the gradients at each iteration; this updating function is called the optimizer
* Image Data IO: how to prepare, load, and train with image data in MXNet
* Record IO: the Python interface for reading and writing RecordIO files
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
8 changes: 8 additions & 0 deletions how_to/README.md
@@ -0,0 +1,8 @@
# Outline

## Section-1
* Use a pretrained 50-layer [Deep Residual Learning](https://arxiv.org/abs/1512.03385) (ResNet) model for prediction and feature extraction
* Fine-tune the [Deep Residual Learning](https://arxiv.org/abs/1512.03385) (ResNet) model.

## Section-2
* Use a pretrained [Inception-BatchNorm Network](https://arxiv.org/abs/1502.03167).
@@ -6,7 +6,7 @@
"source": [
"# Fine-tune with Pre-trained Models\n",
"\n",
"In practice the dataset we use is relative small, so that we do not train an neural network from scratch, namely staring from random initialized parameters. Instead, it is common to train a neural network on a large-scale dataset and then use it either as an initialization or a fixed feature extractor. On [predict.ipynb](./predict.ipynb) we explained how to do the feature extraction, this tutorial will focus on how to use pre-trained model to fine tune a new network.\n",
"In practice the dataset we use is relative small, so that we do not train an neural network from scratch, namely staring from random initialized parameters. Instead, it is common to train a neural network on a large-scale dataset and then use it either as an initialization or a fixed feature extractor. On [predict-with-resnet-model.ipynb](./predict-with-resnet-model.ipynb) we explained how to do the feature extraction, this tutorial will focus on how to use pre-trained model to fine tune a new network.\n",
"\n",
"The idea of fine-tune is that, we take a pre-trained model, replace the last fully-connected layer with new one, which outputs the desired number of classes and initializes with random values. Then we train as normal except that we may often use a smaller learning rate since we may already very close the final result. \n",
"\n",
@@ -19,7 +19,7 @@
"| Resnet-50 | 77.4% | \n",
"| Resnet-152 | 86.4% | \n",
"\n",
"## Prepare data\n",
"## Prepare data from scratch\n",
"\n",
"We follow the standard protocol to sample 60 images from each class as the training set, and the rest for the validation set. We resize images into 256x256 size and pack them into the rec file. The scripts to prepare the data is as following. \n",
"\n",
@@ -41,9 +41,22 @@
"python ~/mxnet/tools/im2rec.py --list True --recursive True caltech-256-60-val 256_ObjectCategories/\n",
"python ~/mxnet/tools/im2rec.py --resize 256 --quality 90 --num-thread 16 caltech-256-60-val 256_ObjectCategories/\n",
"python ~/mxnet/tools/im2rec.py --resize 256 --quality 90 --num-thread 16 caltech-256-60-train caltech_256_train_60/\n",
"```\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Use pre-genrated data\n",
"The following codes download the pre-generated rec files. It may take a few minutes.\n",
"\n",
"For convenience, we save the last part of url, e.g., caltech-256-60-train as the filename by the following code. \n",
"```\n",
"filename = url.split(\"/\")[-1]\n",
"```\n",
"\n",
"The following codes download the pre-generated rec files. It may take a few minutes."
"For detailed usage, you can reference [Python's slice notation](http://stackoverflow.com/questions/509211/explain-pythons-slice-notation) "
]
},
{
@@ -69,7 +82,8 @@
"collapsed": true
},
"source": [
"Next we define the function which returns the data iterators."
"## Data loading\n",
"Next we define the function which returns the data iterators. For detailed usage, you can reference [Data Loading API](http://mxnet.io/api/python/io.html) of MXNet."
]
},
{
@@ -107,6 +121,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"## Get pre-trained model\n",
"We then download a pretrained 50-layer ResNet model and load into memory. \n",
"\n",
"Note. If `load_checkpoint` reports error, we can remove the downloaded files and try `get_model` again."
@@ -507,8 +522,9 @@
}
],
"metadata": {
"celltoolbar": "Raw Cell Format",
"kernelspec": {
"display_name": "Python 2",
"display_name": "Python [default]",
"language": "python",
"name": "python2"
},
@@ -522,7 +538,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython2",
"version": "2.7.11"
"version": "2.7.13"
}
},
"nbformat": 4,
@@ -215,7 +215,7 @@
],
"metadata": {
"kernelspec": {
"display_name": "Python 2",
"display_name": "Python [default]",
"language": "python",
"name": "python2"
},
Expand All @@ -229,7 +229,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython2",
"version": "2.7.6"
"version": "2.7.13"
}
},
"nbformat": 4,
@@ -3769,7 +3769,7 @@
],
"metadata": {
"kernelspec": {
"display_name": "Python 2",
"display_name": "Python [default]",
"language": "python",
"name": "python2"
},
Expand All @@ -3783,7 +3783,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython2",
"version": "2.7.6"
"version": "2.7.13"
}
},
"nbformat": 4,
1 change: 0 additions & 1 deletion python/README.md

This file was deleted.

1 change: 0 additions & 1 deletion python/moved-from-mxnet/README.md

This file was deleted.

65 changes: 0 additions & 65 deletions python/outline.ipynb

This file was deleted.

File renamed without changes.
File renamed without changes.
281 changes: 281 additions & 0 deletions tutorials/advanced_img_io.ipynb

Large diffs are not rendered by default.

File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.