ReLU Layer
This function block implements a ReLU (Rectified Linear Unit) activation layer, commonly used in neural networks to introduce non-linearity into the model. It offers several configuration parameters for shaping the behavior of the activation.
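In its basic form, ReLU passes positive inputs through unchanged and zeroes out negative ones. A minimal sketch:

```python
# Basic ReLU: f(x) = max(0, x)
def basic_relu(x: float) -> float:
    return max(0.0, x)

basic_relu(-3.0)  # 0.0 -- negative inputs are suppressed
basic_relu(2.5)   # 2.5 -- positive inputs pass through
```

The controls below generalize this basic behavior with a cap, a negative-side slope, and a threshold.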
📥 Inputs
This block does not have any direct input sockets.
📤 Outputs
This block does not produce outputs directly; it is typically chained with other nodes so that data flows through the neural network.
🕹️ Controls
Maximum Value
This control sets the maximum threshold for the ReLU activation. Any activation exceeding this value is clipped to the maximum, keeping outputs bounded.
Negative Side Slope
This control specifies the slope applied to negative values. A value of zero gives standard ReLU behavior, while a small positive value (for example, 0.01) produces a Leaky ReLU.
Activation Threshold
This control defines the threshold below which activations are set to zero (or scaled by the negative slope, if one is set), giving you control over what counts as an active signal.
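The three controls combine into a single piecewise function. The sketch below follows the convention used by Keras's ReLU layer; whether this block's backend matches that convention exactly is an assumption:

```python
import numpy as np

def parameterized_relu(x, max_value=None, negative_slope=0.0, threshold=0.0):
    """Combined effect of the three controls, following the common
    Keras-style convention (an assumption about this block's backend):
    - inputs at or above `threshold` pass through unchanged,
    - inputs below `threshold` are scaled by `negative_slope`
      relative to the threshold (a slope of 0 zeroes them out),
    - outputs are clipped at `max_value` when one is set.
    """
    x = np.asarray(x, dtype=float)
    out = np.where(x >= threshold, x, negative_slope * (x - threshold))
    if max_value is not None:
        out = np.minimum(out, max_value)
    return out

# Example: cap at 2.0, leaky slope of 0.1, threshold at 0.0.
parameterized_relu([-2.0, -0.5, 0.5, 3.0], max_value=2.0, negative_slope=0.1)
# -> array([-0.2, -0.05, 0.5, 2.0])
```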
🎨 Features
Customizable Activation Behavior
Users can tailor the activation characteristics through adjustable parameters such as the maximum output value and the negative slope.
User-Friendly Interface
The layout provides easy access to configure critical parameters without needing to dive into the intricacies of the underlying code.
📝 Usage Instructions
1. Set Maximum Value: Adjust the Maximum Value field according to the desired cap on the output during activation.
2. Set Negative Slope: If needed, specify a value for the Negative Side Slope to alter the behavior of the ReLU activation in negative regions.
3. Define Activation Threshold: Use the Activation Threshold control to set a boundary for activation, ensuring only significant values pass through.
4. Integrate into Model: Connect this layer within a larger model to apply the ReLU activation after convolutional or dense layers, as in the sketch below.
📊 Evaluation
When run, this function block configures a ReLU layer with the specified settings and integrates it into the surrounding workflow of neural network layers.
💡 Tips and Tricks
🛠️ Troubleshooting