activations
Activation Layers API
Functions:
- glu – Gated linear unit layer
- relu – ReLU activation layer
- swish – Swish activation layer
- relu6 – Hard ReLU activation layer
- mish – Mish activation layer
- gelu – GeLU activation layer
- sigmoid – Sigmoid activation layer
- hard_sigmoid – Hard sigmoid activation layer
Please check Keras Activations for additional activations.
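Each factory below returns a configured keras.Layer. A minimal usage sketch (assuming the functions are importable from neuralspot_edge.layers.activations, the source path noted in each entry):

```python
# Minimal sketch; assumes neuralspot_edge and keras are installed and that the
# factories are importable from neuralspot_edge.layers.activations.
import keras
from neuralspot_edge.layers.activations import relu, swish

inputs = keras.Input(shape=(64,))
x = keras.layers.Dense(32)(inputs)
x = relu(name="act1")(x)                    # factory returns a keras.Layer
x = keras.layers.Dense(16)(x)
outputs = swish(name="act2", hard=True)(x)  # hard-swish variant
model = keras.Model(inputs, outputs)
```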
Functions
glu
Gated linear unit layer
Parameters:
- dim (int, default: -1) – Dimension to split. Defaults to -1.
- hard (bool, default: False) – Use hard sigmoid. Defaults to False.
- name (str | None, default: None) – Layer name. Defaults to None.
Returns:
- keras.Layer – Functional GLU layer
Source code in neuralspot_edge/layers/activations.py
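A hedged sketch of glu in a functional graph (import path assumed from the source file above). In the standard GLU formulation the tensor is split in two along dim and one half gates the other, so the output is half the input size along that dimension:

```python
# Sketch only; assumes glu is importable from neuralspot_edge.layers.activations.
import keras
from neuralspot_edge.layers.activations import glu

x = keras.Input(shape=(64,))                # split dimension must be even
y = glu(dim=-1, hard=False, name="glu")(x)  # expected output shape: (None, 32)
```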
relu
ReLU activation layer with optional truncation to ReLU6
Parameters:
- name (str | None, default: None) – Layer name. Defaults to None.
- truncated (bool, default: False) – Truncate to ReLU6. Defaults to False.
Returns:
- keras.Layer – Functional ReLU layer
Source code in neuralspot_edge/layers/activations.py
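A short sketch of the truncated option (import path assumed as above); truncated=True presumably clips activations at 6, i.e. ReLU6 behavior:

```python
# Sketch; assumes relu is importable from neuralspot_edge.layers.activations.
import keras
from neuralspot_edge.layers.activations import relu

x = keras.ops.convert_to_tensor([-1.0, 3.0, 8.0])
y = relu(name="relu", truncated=True)(x)  # expected: [0., 3., 6.]
```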
swish
Swish activation layer with optional hard variant
Parameters:
- name (str | None, default: None) – Layer name. Defaults to None.
- hard (bool, default: False) – Use hard swish. Defaults to False.
Returns:
- keras.Layer – Functional Swish layer
Source code in neuralspot_edge/layers/activations.py
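A brief sketch contrasting the two variants (import path assumed). Standard swish is x * sigmoid(x); the hard variant uses a piecewise-linear approximation that is cheaper on embedded targets:

```python
# Sketch; assumes swish is importable from neuralspot_edge.layers.activations.
from neuralspot_edge.layers.activations import swish

soft_act = swish(name="swish")              # x * sigmoid(x)
hard_act = swish(name="hswish", hard=True)  # hard-swish approximation
```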
relu6
Hard ReLU activation layer
Parameters:
- name (str | None, default: None) – Layer name. Defaults to None.
Returns:
- keras.Layer – Functional ReLU6 layer
Source code in neuralspot_edge/layers/activations.py
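A one-line sketch (import path assumed); presumably equivalent in effect to relu(truncated=True):

```python
# Sketch; assumes relu6 is importable from neuralspot_edge.layers.activations.
from neuralspot_edge.layers.activations import relu6

act = relu6(name="relu6")  # ReLU clipped at 6
```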
mish
Mish activation layer
Parameters:
- name (str | None, default: None) – Layer name. Defaults to None.
Returns:
- keras.Layer – Functional Mish layer
Source code in neuralspot_edge/layers/activations.py
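A brief sketch (import path assumed); Mish computes x * tanh(softplus(x)):

```python
# Sketch; assumes mish is importable from neuralspot_edge.layers.activations.
import keras
from neuralspot_edge.layers.activations import mish

inputs = keras.Input(shape=(128,))
outputs = mish(name="mish")(inputs)  # x * tanh(softplus(x))
```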
gelu
GeLU activation layer
Parameters:
- name (str | None, default: None) – Layer name. Defaults to None.
Returns:
- keras.Layer – Functional GeLU layer
Source code in neuralspot_edge/layers/activations.py
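A brief sketch of gelu between dense layers (import path assumed):

```python
# Sketch; assumes gelu is importable from neuralspot_edge.layers.activations.
import keras
from neuralspot_edge.layers.activations import gelu

inputs = keras.Input(shape=(32,))
x = keras.layers.Dense(64)(inputs)
x = gelu(name="gelu")(x)  # smooth GELU activation
```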
sigmoid
Sigmoid activation layer
Parameters:
- name (str | None, default: None) – Layer name. Defaults to None.
- hard (bool, default: False) – Use hard sigmoid. Defaults to False.
Returns:
- keras.Layer – Functional Sigmoid layer
Source code in neuralspot_edge/layers/activations.py
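A brief sketch of both variants (import path assumed); hard=True swaps in the piecewise-linear hard sigmoid:

```python
# Sketch; assumes sigmoid is importable from neuralspot_edge.layers.activations.
from neuralspot_edge.layers.activations import sigmoid

gate = sigmoid(name="gate")                   # standard sigmoid
hard_gate = sigmoid(name="hgate", hard=True)  # hard-sigmoid approximation
```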
hard_sigmoid
Hard sigmoid activation layer
Parameters:
- name (str | None, default: None) – Layer name. Defaults to None.
Returns:
- keras.Layer – Functional Hard sigmoid layer
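A one-line sketch (import path assumed); presumably equivalent to sigmoid(hard=True):

```python
# Sketch; assumes hard_sigmoid is importable from neuralspot_edge.layers.activations.
from neuralspot_edge.layers.activations import hard_sigmoid

act = hard_sigmoid(name="hsig")  # piecewise-linear sigmoid approximation
```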