Fully Connected
This function block implements a fully connected (dense) layer for artificial neural networks. It allows users to customize the output size and activation function, facilitating the design of deep learning models.
📥 Inputs
This function block does not have any inputs, as it generates a neural network layer directly.
📤 Outputs
This function block does not produce outputs directly; instead, it contributes to the architecture of a neural network model that can be further evaluated.
🕹️ Controls
Output Size
A dropdown that allows you to set the number of output neurons in the fully connected layer. Options range from 8 to 512.
Activation Function
A dropdown menu to select the activation function used in the layer. Available options (see the sketch after this list) include:
None
relu
sigmoid
softmax
softplus
softsign
tanh
selu
elu
exponential
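These option names match Keras' built-in activation identifiers, with None leaving the layer linear (no activation). The snippet below is a minimal sketch, assuming the block passes the selected name straight to Keras:

```python
from tensorflow import keras

# Minimal sketch: resolve each dropdown option to a Keras activation.
# Assumes the block forwards the selected name unchanged; None maps
# to the linear (identity) activation.
options = [None, "relu", "sigmoid", "softmax", "softplus",
           "softsign", "tanh", "selu", "elu", "exponential"]

for name in options:
    fn = keras.activations.get(name)  # returns the activation callable
    print(name, "->", fn.__name__)
```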
🎨 Features
Customizable Layer Settings
Users can easily adjust the number of outputs and the activation function to refine their model's performance during training.
Integration with Keras
The block utilizes the Keras library to create layers seamlessly, allowing for easy integration into larger models.
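Under the hood this likely corresponds to a standard keras.layers.Dense layer. A minimal sketch, assuming an Output Size of 128 and a relu Activation Function (both illustrative values, not fixed by the block):

```python
import numpy as np
from tensorflow import keras

# Sketch of the layer this block presumably builds; the output size (128)
# and activation ("relu") stand in for the dropdown selections.
dense = keras.layers.Dense(units=128, activation="relu")

# Applying the layer computes activation(x @ W + b) and creates the
# weights on first call.
x = np.random.rand(4, 32).astype("float32")  # batch of 4 samples, 32 features
y = dense(x)
print(y.shape)  # (4, 128)
```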
📝 Usage Instructions
1. Open the Block: Add the Fully Connected block to your workflow in AugeLab Studio.
2. Set Output Size: Choose the desired number of output neurons from the Output Size dropdown.
3. Select Activation Function: Choose the preferred activation function from the Activation Function dropdown.
4. Integrate into Model: Connect the output of this block to other layers to build a complete neural network (see the sketch below).
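For example, the layer configured by this block could sit inside a larger Keras model. A minimal sketch, where the surrounding layers, sizes, and input shape are illustrative assumptions rather than values fixed by the block:

```python
from tensorflow import keras

# Illustrative assembly of fully connected layers into a small model;
# the input shape and layer sizes are assumptions made for this sketch.
model = keras.Sequential([
    keras.Input(shape=(32,)),
    keras.layers.Dense(64, activation="relu"),     # hidden fully connected layer
    keras.layers.Dense(10, activation="softmax"),  # output layer
])
model.summary()
```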
📊 Evaluation
This function block contributes a fully connected layer with specified parameters to the neural network model.
💡 Tips and Tricks
🛠️ Troubleshooting