TensorFlow.js tf.layers.leakyReLU() Function
The tf.layers.leakyReLU() function is used to apply the leaky version of the rectified linear unit (ReLU) activation function to data: f(x) = x for x >= 0 and f(x) = alpha * x for x < 0, where alpha is the negative slope coefficient.
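As a concrete illustration of that element-wise rule, here is a minimal plain-JavaScript sketch (it mirrors the math, not the TensorFlow.js implementation; the default alpha of 0.3 matches the layer's documented default):

```javascript
// Element-wise leaky ReLU: positive inputs pass through unchanged,
// negative inputs are scaled by the slope coefficient alpha
// (default 0.3, matching tf.layers.leakyReLU()'s default).
function leakyReLU(x, alpha = 0.3) {
  return x >= 0 ? x : alpha * x;
}

console.log(leakyReLU(2));        // 2 (positive values are unchanged)
console.log(leakyReLU(-2));       // -0.6 (scaled by alpha = 0.3)
console.log(leakyReLU(-2, 0.1));  // -0.2 (custom alpha)
```

Unlike plain ReLU, which outputs exactly 0 for all negative inputs, the leaky variant keeps a small gradient on the negative side, which helps avoid "dead" units during training.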
Input Shape: Arbitrary. When utilizing this layer as the initial layer in a model, use the inputShape configuration.
Output Shape: The output has the same shape as the input.
Parameters: It accepts an args object with the following properties:
- alpha (number): This is the negative slope coefficient. The default value is 0.3.
- inputShape: If this property is set, it will be utilized to construct an input layer that will be inserted before this layer.
- batchInputShape: If this property is set, an input layer will be created and inserted before this layer.
- batchSize: If batchInputShape isn’t supplied and inputShape is, batchSize is utilized to build the batchInputShape.
- dtype: The data type of this layer. The default value is float32. This property applies only to input layers.
- name: This is the layer’s name and is of string type.
- trainable: Whether the weights of this layer can be updated by fit. The default value is true.
- weights: The layer’s initial weight values.
- inputDType: This property is kept for legacy support and should not be used in new code.
Returns: It returns a LeakyReLU layer object.
Example outputs (tensors produced by applying the layer to sample inputs):
Tensor [-0.2, 8, 19, -2.4000001]
Tensor [[1.12, -0.24, 1.9], [0.12, 0.25, -1.02]]
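The code that produced the tensor outputs above is not shown in this article. The sketch below reproduces values like them (up to float32 rounding) using the element-wise rule in plain JavaScript; the input arrays and alpha values are reverse-engineered assumptions for illustration, not the original example code:

```javascript
// Plain-JavaScript sketch of the element-wise leaky ReLU rule.
// Inputs and alpha values are assumptions chosen for illustration.
function leakyReLU(x, alpha) {
  return x >= 0 ? x : alpha * x;
}

// 1-D case with a custom alpha of 0.2: negatives are scaled by 0.2.
const out1 = [-1, 8, 19, -12].map(v => leakyReLU(v, 0.2));
console.log(out1); // ≈ [-0.2, 8, 19, -2.4]

// 2-D case with the default alpha of 0.3, applied element-wise.
const out2 = [[1.12, -0.8, 1.9], [0.12, 0.25, -3.4]]
  .map(row => row.map(v => leakyReLU(v, 0.3)));
console.log(out2); // ≈ [[1.12, -0.24, 1.9], [0.12, 0.25, -1.02]]
```

Note that positive entries pass through untouched; only the negative entries are multiplied by alpha, which is why the output shape always matches the input shape.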