MarcBS/keras Multimodal Learning fork


This fork of Keras offers the following contributions:

  • Caffe to Keras conversion module
  • Layer-specific learning rates
  • New layers for multimodal data

Contact email: [email protected]

GitHub page: https://github.com/MarcBS

MarcBS/keras has been tested with Python 2.7 and Python 3.6, and with the Theano and TensorFlow backends.
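
The backend can be selected per run through the standard KERAS_BACKEND environment variable; a minimal sketch:

```python
import os

# Choose the backend before keras is first imported
# (KERAS_BACKEND is the standard Keras selection mechanism).
os.environ['KERAS_BACKEND'] = 'theano'  # or 'tensorflow'

import keras  # reports the active backend on import
```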

Caffe to Keras conversion module

This module allows converting Caffe models to Keras for later training or testing. See this README for further information.

Please be aware that this feature is not regularly maintained; some layers or parameter definitions introduced in newer versions of either Keras or Caffe might not be compatible with the converter.

For this reason, any pull requests with updated versions of the caffe2keras converter are highly welcome!
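
For orientation, a converted model can be loaded back with the standard Keras serialization API. A minimal sketch, assuming the converter produced a JSON architecture file and an HDF5 weights file (both file names below are hypothetical placeholders):

```python
from keras.models import model_from_json

# Rebuild the architecture written by the converter
# (file names are hypothetical placeholders).
with open('converted_model_structure.json') as f:
    model = model_from_json(f.read())

# Load the converted Caffe weights and compile for training or testing.
model.load_weights('converted_model_weights.h5')
model.compile(optimizer='sgd', loss='categorical_crossentropy')
```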

Layer-specific learning rates

This functionality allows adding learning rate multipliers to each of the learnable layers in a network. During training, each layer's multiplier scales the global learning rate, so the magnitude of the updates can be adjusted for each layer independently. Here is a simple example of usage:

x = Dense(100, W_learning_rate_multiplier=10.0, b_learning_rate_multiplier=10.0)(x)
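
A slightly fuller sketch of the same idea (the model and all layer sizes are illustrative, not from this repository): only the output layer is trained with a boosted learning rate, as is common when fine-tuning a pretrained network.

```python
from keras.layers import Input, Dense
from keras.models import Model

# Illustrative sketch: the output layer's weights and biases are updated
# with 10x the global learning rate; the other layers use the global rate.
inputs = Input(shape=(784,))
x = Dense(256, activation='relu')(inputs)
outputs = Dense(10, activation='softmax',
                W_learning_rate_multiplier=10.0,
                b_learning_rate_multiplier=10.0)(x)

model = Model(input=inputs, output=outputs)  # Keras 1.x-style arguments
model.compile(optimizer='sgd', loss='categorical_crossentropy')
```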

New layers for sequence-to-sequence learning and multimodal data

LSTM layers:

  • LSTMCond: LSTM conditioned on the previously generated word (an additional input carries the previous word).
  • AttLSTM: LSTM with an attention mechanism.
  • AttLSTMCond: LSTM with an attention mechanism, conditioned on the previously generated word.
  • AttConditionalLSTMCond: conditional LSTM, similar to the one used in Nematus, with an attention mechanism and conditioned on the previously generated word.
  • AttLSTMCond2Inputs: LSTM with a double attention mechanism (one per input), conditioned on the previously generated word.
  • AttLSTMCond3Inputs: LSTM with a triple attention mechanism (one per input), conditioned on the previously generated word.
  • and others

Corresponding GRU versions of these layers are also available.
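
For orientation, the pattern that LSTMCond encapsulates can be approximated with standard Keras layers: at each timestep the decoder receives the embedding of the previously generated word. The fork's layer fuses this into a single recurrent unit; the sketch below (vocabulary size and dimensions are hypothetical) only illustrates the concept.

```python
from keras.layers import Input, Embedding, LSTM, TimeDistributed, Dense
from keras.models import Model

# Conceptual sketch of conditioning on the previously generated word:
# the decoder LSTM reads, at every timestep, the embedding of the word
# produced at the previous step (all sizes are hypothetical).
prev_words = Input(shape=(None,), dtype='int32')
prev_emb = Embedding(input_dim=10000, output_dim=300)(prev_words)
states = LSTM(512, return_sequences=True)(prev_emb)
word_probs = TimeDistributed(Dense(10000, activation='softmax'))(states)

model = Model(input=prev_words, output=word_probs)
```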

The fork also includes new convolutional and attentional layers.

Projects

You can see more practical examples in projects which use this library:

  • ABiViRNet for Video Description
  • Egocentric Video Description based on Temporally-Linked Sequences
  • NMT-Keras: Neural Machine Translation

Installation

In order to install the library you just have to follow these steps:

  1. Clone this repository:
git clone https://github.com/MarcBS/keras.git
  2. Add the repository path to your PYTHONPATH:
export PYTHONPATH=$PYTHONPATH:/path/to/keras
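
After both steps, you can check that the fork is the copy actually being imported (a quick sanity check, not part of the installation itself):

```python
# If PYTHONPATH is set correctly, this imports the fork and shows
# which copy of the library was loaded.
import keras
print(keras.__version__, keras.__file__)
```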

Keras

For additional information on the Deep Learning library, visit the official web page www.keras.io or the GitHub repository https://github.com/keras-team/keras.
