Add operators to combine different layers #8

@simleo

Description

What follows is a request made on Slack by @RParedesPalacios:

PyEDDL should allow things that are currently impossible in EDDL, since C++ doesn't allow overloading operators on a basic type (Layer *, for instance). So, PyEDDL should allow things like this:

x = x * eddl.sqrt(y + eddl.abs(eps))

Internally, this would build a graph of layers like the following in C++:

return LMult(x, LSqrt(LSum(y, LAbs(eps))))

which is clearly difficult to read and follow.
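One way to support this in the bindings, sketched below, is a thin Python wrapper around the Layer * handles that overloads the arithmetic dunder methods and delegates to the corresponding layer constructors. This is a minimal sketch only: OpLayer, _unwrap, sqrt and abs_ are hypothetical names, and the eddl.Sum / eddl.Mult / eddl.Sqrt / eddl.Abs calls are placeholders for whatever the bindings end up exposing for LSum, LMult, LSqrt and LAbs.

import pyeddl.eddl as eddl  # hypothetical module path for the bindings

class OpLayer:
    """Thin wrapper around a Layer* handle that supports Python operators."""

    def __init__(self, layer):
        self.layer = layer  # the underlying C++ Layer handle

    def __add__(self, other):
        # x + y builds an element-wise sum layer (LSum in C++).
        return OpLayer(eddl.Sum(self.layer, _unwrap(other)))

    def __mul__(self, other):
        # x * y builds an element-wise product layer (LMult in C++).
        return OpLayer(eddl.Mult(self.layer, _unwrap(other)))

def _unwrap(x):
    """Accept both wrapped and raw layer handles."""
    return x.layer if isinstance(x, OpLayer) else x

def sqrt(x):
    return OpLayer(eddl.Sqrt(_unwrap(x)))

def abs_(x):
    return OpLayer(eddl.Abs(_unwrap(x)))

With such a wrapper, the Slack example becomes out = x * sqrt(y + abs_(eps)), which builds the same LMult(x, LSqrt(LSum(y, LAbs(eps)))) graph without nesting constructor calls by hand.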

To this end we may have to define new layers that support arbitrary operations between tensors, even with different shapes. When the shapes differ, it is possible to perform reductions, as TF does, for instance. Check this TF example:

def FRNLayer(x, tau, beta, gamma, eps=1e-6):
  # x: Input tensor of shape [BxHxWxC].
  # tau, beta, gamma: Variables of shape [1, 1, 1, C].
  # eps: A scalar constant or learnable variable.
  # Compute the mean norm of activations per channel.
  nu2 = tf.reduce_mean(tf.square(x), axis=[1, 2], keepdims=True)
  # Perform FRN.
  x = x * tf.rsqrt(nu2 + tf.abs(eps))
  # Return after applying the Offset-ReLU non-linearity.
  return tf.maximum(gamma * x + beta, tau)

In this case, if you check the shapes, the expression "gamma * x" entails a reduction operation, or rather its inverse (a broadcast), since gamma has to be applied to several parts of the tensor x:

x: input tensor of shape [BxHxWxC]
gamma: variable of shape [1, 1, 1, C]
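To make the broadcast concrete, here is a minimal NumPy sketch of the same shape arithmetic; the concrete B, H, W, C values are only illustrative:

import numpy as np

B, H, W, C = 2, 4, 4, 8
x = np.random.rand(B, H, W, C)      # activations, shape [B, H, W, C]
gamma = np.random.rand(1, 1, 1, C)  # per-channel scale, shape [1, 1, 1, C]

# Broadcasting implicitly replicates gamma across the B, H and W axes,
# so each channel c of x is scaled by gamma[0, 0, 0, c].
y = gamma * x
assert y.shape == (B, H, W, C)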
