activations

Activation Layers API

Functions:

  • glu

    Gated linear unit layer

  • relu

    ReLU activation layer

  • swish

    Swish activation layer

  • relu6

    Hard ReLU activation layer

  • mish

    Mish activation layer

  • gelu

    GeLU activation layer

  • sigmoid

    Sigmoid activation layer

  • hard_sigmoid

    Hard sigmoid activation layer

See the Keras Activations documentation for additional built-in activations.

Functions

glu

glu(dim: int = -1, hard: bool = False, name: str | None = None) -> keras.Layer

Gated linear unit layer

Parameters:

  • dim (int, default: -1 ) –

    Dimension to split. Defaults to -1.

  • hard (bool, default: False ) –

    Use hard sigmoid. Defaults to False.

  • name (str | None, default: None ) –

    Layer name. Defaults to None.

Returns:

  • Layer

    keras.Layer: Functional GLU layer

Source code in neuralspot_edge/layers/activations.py
def glu(dim: int = -1, hard: bool = False, name: str | None = None) -> keras.Layer:
    """Gated linear unit layer

    Args:
        dim (int, optional): Dimension to split. Defaults to -1.
        hard (bool, optional): Use hard sigmoid. Defaults to False.
        name (str|None, optional): Layer name. Defaults to None.

    Returns:
        keras.Layer: Functional GLU layer
    """

    def layer(x: keras.KerasTensor) -> keras.KerasTensor:
        out, gate = keras.ops.split(x, indices_or_sections=2, axis=dim)
        act = keras.activations.hard_sigmoid if hard else keras.activations.sigmoid
        gate = keras.layers.Activation(act)(gate)
        x = keras.layers.Multiply()([out, gate])
        return x

    # END DEF

    return layer

relu

relu(name: str | None = None, truncated: bool = False, **kwargs) -> keras.Layer

ReLU activation layer with optional truncation to ReLU6

Parameters:

  • name (str | None, default: None ) –

    Layer name. Defaults to None.

  • truncated (bool, default: False ) –

    Truncate to ReLU6. Defaults to False.

Returns:

  • Layer

    keras.Layer: Functional ReLU layer

Source code in neuralspot_edge/layers/activations.py
def relu(name: str | None = None, truncated: bool = False, **kwargs) -> keras.Layer:
    """ReLU activation layer w/ optional truncation to ReLU6

    Args:
        name (str|None, optional): Layer name. Defaults to None.
        truncated (bool, optional): Truncate to ReLU6. Defaults to False.

    Returns:
        keras.Layer: Functional ReLU layer

    """
    name = name + ".act" if name else None
    act = keras.activations.relu6 if truncated else keras.activations.relu
    return keras.layers.Activation(activation=act, name=name, **kwargs)

swish

swish(name: str | None = None, hard: bool = False, **kwargs) -> keras.Layer

Swish activation layer with optional hard variant

Parameters:

  • name (str | None, default: None ) –

    Layer name. Defaults to None.

  • hard (bool, default: False ) –

    Use hard swish. Defaults to False.

Returns:

  • Layer

    keras.Layer: Functional Swish layer

Source code in neuralspot_edge/layers/activations.py
def swish(name: str | None = None, hard: bool = False, **kwargs) -> keras.Layer:
    """Swish activation layer w/ optional hard variant

    Args:
        name (str|None, optional): Layer name. Defaults to None.
        hard (bool, optional): Use hard swish. Defaults to False.

    Returns:
        keras.Layer: Functional Swish layer
    """
    name = name + ".act" if name else None
    act = keras.activations.hard_swish if hard else keras.activations.swish
    return keras.layers.Activation(act, name=name, **kwargs)

relu6

relu6(name: str | None = None, **kwargs) -> keras.Layer

Hard ReLU activation layer

Parameters:

  • name (str | None, default: None ) –

    Layer name. Defaults to None.

Returns:

  • Layer

    keras.Layer: Functional ReLU6 layer

Source code in neuralspot_edge/layers/activations.py
def relu6(name: str | None = None, **kwargs) -> keras.Layer:
    """Hard ReLU activation layer

    Args:
        name (str|None, optional): Layer name. Defaults to None.

    Returns:
        keras.Layer: Functional ReLU6 layer
    """
    name = name + ".act" if name else None
    return keras.layers.Activation("relu6", name=name, **kwargs)

mish

mish(name: str | None = None, **kwargs) -> keras.Layer

Mish activation layer

Parameters:

  • name (str | None, default: None ) –

    Layer name. Defaults to None.

Returns:

  • Layer

    keras.Layer: Functional Mish layer

Source code in neuralspot_edge/layers/activations.py
def mish(name: str | None = None, **kwargs) -> keras.Layer:
    """Mish activation layer

    Args:
        name (str|None, optional): Layer name. Defaults to None.

    Returns:
        keras.Layer: Functional Mish layer
    """
    name = name + ".act" if name else None
    return keras.layers.Activation(keras.activations.mish, name=name, **kwargs)

gelu

gelu(name: str | None = None, **kwargs) -> keras.Layer

GeLU activation layer

Parameters:

  • name (str | None, default: None ) –

    Layer name. Defaults to None.

Returns:

  • Layer

    keras.Layer: Functional GeLU layer

Source code in neuralspot_edge/layers/activations.py
def gelu(name: str | None = None, **kwargs) -> keras.Layer:
    """GeLU activation layer

    Args:
        name (str|None, optional): Layer name. Defaults to None.

    Returns:
        keras.Layer: Functional GeLU layer
    """
    name = name + ".act" if name else None
    return keras.layers.Activation(keras.activations.gelu, name=name, **kwargs)

sigmoid

sigmoid(name: str | None = None, hard: bool = False, **kwargs) -> keras.Layer

Sigmoid activation layer

Parameters:

  • name (str | None, default: None ) –

    Layer name. Defaults to None.

  • hard (bool, default: False ) –

    Use hard sigmoid. Defaults to False.

Returns:

  • Layer

    keras.Layer: Functional Sigmoid layer

Source code in neuralspot_edge/layers/activations.py
def sigmoid(name: str | None = None, hard: bool = False, **kwargs) -> keras.Layer:
    """Sigmoid activation layer

    Args:
        name (str|None, optional): Layer name. Defaults to None.
        hard (bool, optional): Use hard sigmoid. Defaults to False.

    Returns:
        keras.Layer: Functional Sigmoid layer
    """
    name = name + ".act" if name else None
    activation = keras.activations.hard_sigmoid if hard else keras.activations.sigmoid
    return keras.layers.Activation(activation, name=name, **kwargs)

hard_sigmoid

hard_sigmoid(name: str | None = None, **kwargs) -> keras.Layer

Hard sigmoid activation layer

Parameters:

  • name (str | None, default: None ) –

    Layer name. Defaults to None.

Returns:

  • Layer

    keras.Layer: Functional Hard sigmoid layer

Source code in neuralspot_edge/layers/activations.py
def hard_sigmoid(name: str | None = None, **kwargs) -> keras.Layer:
    """Hard sigmoid activation layer

    Args:
        name (str|None, optional): Layer name. Defaults to None.

    Returns:
        keras.Layer: Functional Hard sigmoid layer
    """
    name = name + ".act" if name else None
    return keras.layers.Activation(keras.activations.hard_sigmoid, name=name, **kwargs)