LSTM RNN Notes
Daniel Shiffman edited this page Nov 22, 2016 · 13 revisions
- Nature of Code Neural Networks
- The Unreasonable Effectiveness of Recurrent Neural Networks
- Understanding LSTM Networks
- RecurrentJS (JavaScript)
- TensorFlow -- Google's open source machine learning framework (C++/Python)
- Keras -- a high-level wrapper for machine learning (runs on top of TensorFlow)
- Torch-RNN (Lua)
- RNN: Recurrent Neural Network
- LSTM: Long Short-Term Memory
- Supervised Learning: training with "known" data, i.e. examples where the correct output is already labeled
- Epoch: a single pass through the entire training set
- Model: The results of a training process (can be saved for later use).
- Prediction: The output of a trained neural network given arbitrary inputs.
- Learning Rate: A value that tells the neural network how quickly to change its weights in response to errors. Early in training it should learn fast, but as the network improves, that learning should "slow down" (decay).
- Perplexity: A measure of how much the model is guessing. A perplexity of 1 means no guessing (the model predicts with certainty); a perplexity of 10 means the model is as uncertain as if it were choosing uniformly among 10 options.
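Several of the terms above can be seen working together in a toy example. The sketch below (illustrative, not from these notes; the corpus and all names are made up) trains a tiny bigram character model by gradient descent: each epoch is one pass through the data, the learning rate decays each epoch so learning "slows down," and accuracy is reported as perplexity, computed as `exp(mean cross-entropy)`.

```python
import math

# Toy training text and vocabulary (illustrative only)
text = "abababababababab"
vocab = sorted(set(text))
idx = {c: i for i, c in enumerate(vocab)}
pairs = [(idx[a], idx[b]) for a, b in zip(text, text[1:])]

# "Model": one logit per (previous char, next char) pair
W = [[0.0] * len(vocab) for _ in vocab]

def softmax(row):
    # Turn a row of logits into a probability distribution
    m = max(row)
    exps = [math.exp(x - m) for x in row]
    total = sum(exps)
    return [e / total for e in exps]

def mean_cross_entropy():
    # Average negative log-probability of the correct next char
    total = 0.0
    for a, b in pairs:
        total += -math.log(softmax(W[a])[b])
    return total / len(pairs)

lr = 1.0  # learning rate: how fast the weights change
for epoch in range(20):  # each epoch = one full pass over the data
    for a, b in pairs:
        p = softmax(W[a])
        for j in range(len(vocab)):
            # Gradient of cross-entropy w.r.t. logits: p - one_hot(target)
            W[a][j] -= lr * (p[j] - (1.0 if j == b else 0.0))
    lr *= 0.9  # decay: learn fast at first, then slow down

# Perplexity of 1.0 would mean the model never guesses
perplexity = math.exp(mean_cross_entropy())
print(round(perplexity, 3))
```

Because the toy corpus is perfectly predictable (`a` is always followed by `b` and vice versa), the perplexity converges toward 1; on real text, a character-level model's perplexity stays well above 1.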