TensorFlow.js tf.layers.thresholdedReLU() Function
The tf.layers.thresholdedReLU() function is used to apply the thresholded rectified linear unit activation function element-wise: input values greater than the threshold theta are passed through unchanged, and all other values are set to 0.
Input Shape: Arbitrary. When utilizing this layer as the initial layer in a model, use the inputShape configuration.
Output Shape: The output has the same shape as the input.
Parameters: It accepts the args object which can have the following properties:
- theta (number): The threshold location of the activation; defaults to 1.0. Inputs greater than theta are kept as-is, all other values become 0.
- inputShape: If this property is set, it will be utilized to construct an input layer that will be inserted before this layer.
- batchInputShape: If this property is set, an input layer will be created and inserted before this layer.
- batchSize: If inputShape is supplied and batchInputShape isn’t, batchSize is used together with inputShape to construct the batchInputShape.
- dtype: It is the kind of data type for this layer. float32 is the default value. This parameter applies exclusively to input layers.
- name: This is the layer’s name and is of string type.
- trainable: Whether the weights of this layer can be updated by fit. True is the default value.
- weights: The layer’s initial weight values.
Returns: It returns a ThresholdedReLU layer instance.
Tensor [11, 0, 0, 12]
Tensor [[1.12, 0, 1.9], [0, 0, 3.4000001]]