
Add operators to combine different layers #8

Open
simleo opened this issue Dec 4, 2019 · 1 comment

simleo commented Dec 4, 2019

What follows is a request made on Slack by @RParedesPalacios:

PyEDDL should allow things that are currently impossible with EDDL, since C++ does not allow operator overloading on basic types (Layer *, for instance).
So, PyEDDL should allow things like this:

x = x * eddl.sqrt(y + eddl.abs(eps))

Internally, this would build a graph of layers like this in C++:

return LMult(x, LSqrt(LSum(y, LAbs(eps))))

which is clearly difficult to read and follow.

To this end we may have to define new Layers that can provide any potential operation between tensors, even tensors with different shapes. When shapes differ, it is possible to perform reductions, as TensorFlow does for instance.
Check this TF example:

def FRNLayer(x, tau, beta, gamma, eps=1e-6):
    # x: Input tensor of shape [BxHxWxC].
    # tau, beta, gamma: Variables of shape [1, 1, 1, C].
    # eps: A scalar constant or learnable variable.
    # Compute the mean norm of activations per channel.
    nu2 = tf.reduce_mean(tf.square(x), axis=[1, 2], keepdims=True)
    # Perform FRN (Filter Response Normalization).
    x = x * tf.rsqrt(nu2 + tf.abs(eps))
    # Return after applying the Offset-ReLU non-linearity.
    return tf.maximum(gamma * x + beta, tau)

In this case, if you check the sizes, the expression “gamma * x” entails a reduction operation, or rather the inverse of a reduction (a broadcast), since gamma has to be applied to several parts of the tensor x:
x: input tensor of shape [BxHxWxC]
gamma: variable of shape [1, 1, 1, C]
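
For reference, here is a minimal NumPy sketch (an illustration only, not EDDL or PyEDDL code) of the broadcasting behavior that “gamma * x” relies on:

import numpy as np

B, H, W, C = 2, 4, 4, 3
x = np.random.rand(B, H, W, C)      # input tensor, shape [B, H, W, C]
gamma = np.random.rand(1, 1, 1, C)  # per-channel scale, shape [1, 1, 1, C]

# NumPy stretches gamma's singleton axes across B, H and W, so each
# channel c of x is multiplied by the scalar gamma[0, 0, 0, c].
y = gamma * x
assert y.shape == (B, H, W, C)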

simleo commented Dec 5, 2019

Something along the lines of:

def __add__(self, other):
    return eddl.Sum(self, other)
def __mul__(self, other):
    return eddl.Mult(self, other)
[...]

for layer objects, but defined at the bindings level (the above is just a pure Python example).
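
To make the idea concrete, here is a self-contained pure-Python sketch of how such overloads would build a layer graph. The Layer class and node names below (Sum, Mult, Sqrt, Abs) are hypothetical stand-ins; the actual bindings would dispatch to EDDL's layer constructors instead:

class Layer:
    """Hypothetical graph node; real bindings would wrap EDDL layers."""
    def __init__(self, op, *inputs):
        self.op = op
        self.inputs = inputs

    def __add__(self, other):
        return Layer("Sum", self, other)

    def __mul__(self, other):
        return Layer("Mult", self, other)

    def __repr__(self):
        if not self.inputs:
            return self.op
        return "%s(%s)" % (self.op, ", ".join(map(repr, self.inputs)))

def sqrt(l):
    return Layer("Sqrt", l)

def abs_(l):
    return Layer("Abs", l)

x, y, eps = Layer("x"), Layer("y"), Layer("eps")
z = x * sqrt(y + abs_(eps))
print(z)  # prints: Mult(x, Sqrt(Sum(y, Abs(eps))))

With overloads like these, the expression from the original request reads as ordinary Python while still producing the nested layer graph underneath.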

Reductions to match different shapes, on the other hand, would be handled by EDDL.
