simclr
SimCLR Loss
This module implements the SimCLR loss function for contrastive self-supervised learning.
Classes:

- SimCLRLoss – Implements SimCLR cosine similarity loss.
Functions:

- l2_normalize – Normalizes a tensor along a given axis.
Classes
SimCLRLoss
Implements SimCLR Cosine Similarity loss.
SimCLR loss is used for contrastive self-supervised learning.
Parameters:

- temperature (float) – A scaling factor for the cosine similarity, in the range [0, 1].
Source code in neuralspot_edge/losses/simclr.py
Functions
call
Computes SimCLR loss for a pair of projections in a contrastive learning trainer.
Note that unlike most loss functions, this should not be called with `y_true` and `y_pred`, but with two unlabeled projections. It can otherwise be treated as a normal loss function.
Parameters:

- projections_1 (KerasTensor) – A tensor with the output of the first projection model in a contrastive learning trainer.
- projections_2 (KerasTensor) – A tensor with the output of the second projection model in a contrastive learning trainer.
Returns:

- KerasTensor – A tensor with the SimCLR loss computed from the input projections.
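As a rough illustration of what `call` computes, below is a minimal NumPy sketch of the SimCLR (NT-Xent) objective for two batches of projections. The helper names and the exact reduction (mean over the batch, averaged over both directions) are assumptions of this sketch, not the library's implementation.

```python
import numpy as np

def l2_normalize(x, axis=-1):
    # Scale each slice along `axis` to unit L2 norm; epsilon guards zero vectors.
    return x / (np.linalg.norm(x, axis=axis, keepdims=True) + 1e-12)

def log_softmax(x, axis=-1):
    # Numerically stable log-softmax: subtract the row max before exponentiating.
    m = x.max(axis=axis, keepdims=True)
    return x - m - np.log(np.exp(x - m).sum(axis=axis, keepdims=True))

def simclr_loss(projections_1, projections_2, temperature=0.1):
    # Cosine similarity between every cross-view pair, scaled by temperature.
    z1 = l2_normalize(projections_1, axis=1)
    z2 = l2_normalize(projections_2, axis=1)
    similarities = (z1 @ z2.T) / temperature  # shape (batch, batch)

    # Positives sit on the diagonal: sample i in view 1 matches sample i in
    # view 2. Cross-entropy is applied in both directions and averaged.
    idx = np.arange(similarities.shape[0])
    loss_1_2 = -log_softmax(similarities, axis=1)[idx, idx].mean()
    loss_2_1 = -log_softmax(similarities.T, axis=1)[idx, idx].mean()
    return (loss_1_2 + loss_2_1) / 2.0
```

Matching projections push the diagonal similarities up relative to the rest of each row, so the loss falls toward zero as positive pairs align; a lower temperature sharpens this contrast.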
Functions
l2_normalize
Performs L2 normalization on a tensor along a given axis.
Parameters:
Returns:

- KerasTensor – The normalized tensor.
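A minimal NumPy sketch of the operation this helper performs (the epsilon guard against zero-norm inputs is an assumption of this sketch):

```python
import numpy as np

def l2_normalize(x, axis=-1):
    # Divide by the L2 norm along `axis` so each slice has unit length.
    norm = np.linalg.norm(x, axis=axis, keepdims=True)
    return x / (norm + 1e-12)
```

After normalization, a dot product between two rows equals their cosine similarity, which is what the SimCLR loss relies on.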