TensorFlow.js tf.layers.prelu() Function
The tf.layers.prelu() function is used to apply the parameterized version of a leaky rectified linear unit (PReLU) activation function to data: positive inputs pass through unchanged, while negative inputs are multiplied by a slope coefficient alpha that is learned during training.
Input Shape: Arbitrary. When using this layer as the first layer in a model, specify the inputShape configuration (see the sketch below).
Output Shape: The output has the same shape as the input.
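For instance, here is a minimal sketch of using prelu as the first layer of a model (the inputShape of [4] and the trailing dense layer are illustrative assumptions):

import * as tf from "@tensorflow/tfjs";

// PReLU as the first layer, so inputShape must be given;
// its output shape equals its input shape
const model = tf.sequential();
model.add(tf.layers.prelu({inputShape: [4]}));
model.add(tf.layers.dense({units: 1}));

// Prints the layer stack; the PReLU layer reports
// an output shape of [null, 4], matching its input
model.summary();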
Parameters: It accepts an args object that contains the following properties:
- alphaInitializer: The initializer for the learnable alpha.
- alphaRegularizer: The regularizer for the learnable alpha.
- alphaConstraint: The constraint on the learnable alpha.
- sharedAxes: The axes along which the activation function's learnable parameters should be shared. For example, for feature maps of shape [batch, height, width, channels], setting sharedAxes: [1, 2] shares alpha across the spatial axes so that only one alpha is learned per channel (see the sketch after this list).
- inputShape: If this property is set, it will be used to create an input layer that is inserted before this layer.
- batchInputShape: If this property is set, it will be used to create an input layer that is inserted before this layer. Unlike inputShape, it includes the batch size.
- batchSize: If inputShape is specified and batchInputShape is not, batchSize is used to construct the batchInputShape as [batchSize, ...inputShape].
- dtype: The data type of this layer. Defaults to float32. This option applies only to input layers.
- name: The name of the layer, as a string.
- trainable: Whether the weights of this layer can be updated by fit. Defaults to true.
- weights: The layer's initial weight values.
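To see how these options fit together, here is a sketch of configuring the layer (the input shape, constant initializer value, and shared axes below are illustrative assumptions):

import * as tf from "@tensorflow/tfjs";

// For feature maps of shape [batch, height, width, channels],
// sharing alpha across the spatial axes 1 and 2 leaves one
// learnable alpha per channel
const preluLayer = tf.layers.prelu({
    alphaInitializer: tf.initializers.constant({value: 0.25}),
    sharedAxes: [1, 2],
    inputShape: [8, 8, 3]
});

const model = tf.sequential();
model.add(preluLayer);

// The alpha weight has shape [1, 1, 3]: one value per channel
console.log(model.getWeights()[0].shape);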
Returns: It returns a PReLU layer object.
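Example 1: The code for this example is not preserved here, so the following is a minimal sketch of how an output like the one below could be produced. The negative input values and the alphaInitializer are assumptions; since alpha is initialized randomly, the negative entries of the result will differ from run to run, while positive entries always pass through unchanged.

import * as tf from "@tensorflow/tfjs";

// Hypothetical input; only the positive entries 11 and 12
// can be read off from the printed output
const input = tf.tensor1d([11, -13, -15, 12]);

// The random alpha is an assumption: the default
// alphaInitializer ('zeros') would map negatives to 0
const preluLayer = tf.layers.prelu({
    alphaInitializer: tf.initializers.randomUniform(
        {minval: 0.4, maxval: 0.6})
});

// Negative entries are multiplied by the learned alpha
preluLayer.apply(input).print();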
Output:
Tensor
    [11, -6.450459, -7.2567663, 12]
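Example 2: As above, the original code is not preserved; below is a sketch with a rank-2 input. The negative input value is a guess (the positive entries are taken from the printed output), and the random alpha means the exact negative value printed will vary between runs.

import * as tf from "@tensorflow/tfjs";

// Hypothetical 2-D input; the single negative entry is a guess
const input = tf.tensor2d([[1.12, 0.5329878, 1.9],
                           [0.12, 0.25, -6.2]]);

// One randomly initialized alpha per column (alpha has
// shape [3], the input shape without the batch axis)
const preluLayer = tf.layers.prelu({
    alphaInitializer: tf.initializers.randomUniform(
        {minval: 0.4, maxval: 0.6})
});

// Only the negative entry is rescaled by its alpha
preluLayer.apply(input).print();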
Output:
Tensor
    [[1.12, 0.5329878, 1.9       ],
     [0.12, 0.25     , -3.0655782]]