This example shows how to perform random search hyperparameter tuning with CNTK, training a convolutional neural network (CNN) on the MNIST dataset on a GPU cluster.
- We provide a CNTK example script, ConvMNIST.py, that accepts command line arguments for the dataset location, the model output location, a model file suffix, and two hyperparameters to tune: the hidden layer dimension and the feedforward constant (see the sketch after this list);
- For demonstration purposes, the MNIST dataset and the CNTK training script will be deployed to an Azure File Share;
- The standard output of the job and the trained model will be stored on the Azure File Share;
- The MNIST dataset (http://yann.lecun.com/exdb/mnist/) has been preprocessed using the install_mnist.py script available here.
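
As a rough illustration of the random search step, the sketch below samples a few hyperparameter configurations and prints a ConvMNIST.py command line for each. The search ranges, the argument names (`--datadir`, `--outputdir`, `--suffix`, `--hidden_layers_dim`, `--feedforward_const`), and the environment variable names are assumptions made for illustration only; check ConvMNIST.py and your job configuration for the actual names.

```python
import math
import random

# Assumed search space for the two tuned hyperparameters;
# adjust the choices and ranges to match your experiment.
HIDDEN_DIM_CHOICES = [64, 128, 256, 512, 1024]
FEEDFORWARD_CONST_RANGE = (1e-3, 1e-1)  # sampled on a log scale


def sample_configs(n_configs, seed=42):
    """Draw n_configs random hyperparameter configurations."""
    rng = random.Random(seed)
    lo, hi = FEEDFORWARD_CONST_RANGE
    return [
        {
            "hidden_layers_dim": rng.choice(HIDDEN_DIM_CHOICES),
            "feedforward_const": 10 ** rng.uniform(math.log10(lo), math.log10(hi)),
        }
        for _ in range(n_configs)
    ]


if __name__ == "__main__":
    # Print one training command line per sampled configuration.
    # Argument names and the mounted file share paths below are placeholders;
    # substitute the actual ConvMNIST.py arguments for your setup.
    for i, cfg in enumerate(sample_configs(4)):
        print(
            "python ConvMNIST.py"
            " --datadir $AZ_BATCHAI_INPUT_DATASET"
            " --outputdir $AZ_BATCHAI_OUTPUT_MODEL"
            f" --suffix run{i}"
            f" --hidden_layers_dim {cfg['hidden_layers_dim']}"
            f" --feedforward_const {cfg['feedforward_const']:.5f}"
        )
```

Each printed command line corresponds to one trial: random search simply runs these trials as independent jobs and keeps the model with the best validation error.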
You can find the Jupyter Notebook for this recipe in RandomSearch.ipynb.
If you have any problems or questions, you can reach the Batch AI team at [email protected] or you can create an issue on GitHub.
We also welcome your contributions of additional sample notebooks, scripts, or other examples of working with Batch AI.