Making Keras flexible without too many specific edge cases #883
Comments
I think lambda layers are a great idea; in fact it's surprising we don't have them already. They're in the spirit of our existing "freeform" elements (activations, losses, callbacks). They can definitely help reduce the number of layers in Keras, which might be starting to reach its limit. By the way, one way to control that number is to start deleting unnecessary layers. The only technical issue I could see with lambda layers is the output shape computation. Here's a suggestion: the layer could take the output shape (or a function that computes it from the input shape) as an argument.
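For reference, the `Lambda` layer that eventually shipped in Keras follows roughly this idea; a minimal sketch of passing the output shape as a function alongside the wrapped function (later-Keras API, not code proposed in this comment):

```python
from keras.models import Sequential
from keras.layers import Dense, Lambda

model = Sequential()
model.add(Dense(32, input_dim=64))
# Element-wise square: the shape is unchanged, so the shape function
# simply returns the input shape it is given.
model.add(Lambda(lambda x: x ** 2,
                 output_shape=lambda input_shape: input_shape))
```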
About the samples dimension: in the general case it only contains `None`.
Note that one issue that would be raised with lambda layers is serialization. How do you serialize/deserialize the lambda?
It would not be possible to serialize lambdas, as @fchollet mentioned, at least when using the HDF5 or JSON serialization backends. We could keep a stub for lambda layers in the serialization format and have the user fill in the lambda when loading.
If the user uses a Python function instead of a lambda it will work, but the function still needs to be available when the model is loaded.
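A sketch of that named-function route, assuming a current Keras version with HDF5 saving; the function body is not stored in the file, so the user re-supplies the callable by name at load time (`double` is purely illustrative):

```python
from keras.models import Sequential, load_model
from keras.layers import Dense, Lambda

# A module-level function (unlike an inline lambda) can be re-supplied by name.
def double(x):
    return 2 * x

model = Sequential([Dense(16, input_dim=8), Lambda(double)])
model.compile(optimizer='sgd', loss='mse')
model.save('lambda_model.h5')

# The user "fills in" the stub when loading the model back.
restored = load_model('lambda_model.h5', custom_objects={'double': double})
```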
@EderSantana Can you help me understand the difference between merge_modes?
Thanks!
Is it possible to pass a function that takes additional parameters? I tried to implement a custom activation function with a Lambda layer but couldn't really figure out how to do it properly (see #1061). Are there any alternatives?
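One possible route, assuming a Keras version where `Lambda` accepts an `arguments` dict of extra keyword arguments; `scaled_relu` and `alpha` are illustrative names, not part of the Keras API:

```python
import keras.backend as K
from keras.models import Sequential
from keras.layers import Dense, Lambda

# A parameterized "activation": the extra keyword arguments in `arguments`
# are forwarded to the wrapped function on every call.
def scaled_relu(x, alpha=1.0):
    return alpha * K.relu(x)

model = Sequential()
model.add(Dense(32, input_dim=64))
model.add(Lambda(scaled_relu, arguments={'alpha': 0.5}))
```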
Can someone please add a backend function to get a tensor's shape? Otherwise Lambda still can't handle things like a custom loss function.
There is `K.int_shape()`.
Hmm, I tried but got an error with `K.init_shape`; only `K.shape` seems to exist.
@joetigger it's `K.int_shape()` and not `K.init_shape()`.
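For anyone hitting the same confusion, a small sketch of the difference between the two backend calls (standard Keras backend API; exact printed values depend on the backend):

```python
import keras.backend as K
from keras.layers import Input

x = Input(shape=(10, 32))   # batch dimension is left unspecified

# int_shape: the static shape as a plain Python tuple (None where unknown).
print(K.int_shape(x))       # (None, 10, 32)

# shape: a symbolic tensor, resolved at run time -- usable inside a loss.
print(K.shape(x))           # e.g. a symbolic int32 tensor of length 3
```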
As we have passed the 3000 stars and attract more and more users, more and more edge cases are being added to the main repo. While this is awesome for making the library grow fast and helping as many people as possible, it makes me wonder whether we can continue to scale at that speed.

To make sure Keras can continue to grow fast and help as many people as possible without breaking things, I'd like to propose some functional options. The first two things I thought about were a `LambdaLayer` and a `LambdaDataset`. I'm already using `LambdaLayer` in Seya pretty successfully.

As an example of how powerful this is, we can bootstrap all `Merge` modes with a combination of `merge_mode="join"` and a `LambdaLayer`. For example, say we want to merge things with a kernel projection (a reconstruction is sketched after this post). We did it pretty quickly and nothing new had to be added to the main repo, although now we are merging `layer1` and `layer2` in a huge dimensional space and making Jordan happy.

The same flexibility could be added to datasets, and I'm sure you guys have ideas of how to do that elsewhere. This is the same philosophy as the one behind our beloved `Callbacks`. Let me know what you think.
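The original Seya snippet did not survive extraction, so the following is only a hypothetical reconstruction of the kernel-projection merge using the later functional API; the layer sizes, the RBF-style kernel, and `gamma` are all illustrative:

```python
import keras.backend as K
from keras.layers import Input, Dense, Lambda
from keras.models import Model

inputs = Input(shape=(64,))
layer1 = Dense(128)(inputs)
layer2 = Dense(128)(inputs)

def kernel_merge(tensors, gamma=0.1):
    a, b = tensors
    # An RBF-style projection: combining the branches through a kernel
    # implicitly works in a very high-dimensional feature space.
    return K.exp(-gamma * K.square(a - b))

# Instead of a built-in merge mode, the two branches are combined by an
# arbitrary function wrapped in a Lambda layer.
merged = Lambda(kernel_merge,
                output_shape=lambda shapes: shapes[0],
                arguments={'gamma': 0.1})([layer1, layer2])
outputs = Dense(1, activation='sigmoid')(merged)
model = Model(inputs, outputs)
```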