Index

Operators API

The 'operators' module provides classes that represent operators in the network graph. Each of these classes inherits from the AotOperator class, which defines a common interface, and each is responsible for generating the C code for its specific operation. The OperatorMap dictionary maps LiteRT operator IDs to the corresponding operator classes, so the appropriate class can be looked up and instantiated directly from an operator ID in the LiteRT model.
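The pattern described above can be sketched as follows. This is an illustrative outline, not the module's actual code: only the AotOperator and OperatorMap names come from this documentation, and the operator IDs, method names, and generated C snippets shown here are assumptions.

```python
# Hypothetical sketch of the AotOperator / OperatorMap dispatch pattern.
# Only the names AotOperator and OperatorMap appear in the documentation;
# everything else (generate_c, the IDs, the C snippets) is illustrative.
from abc import ABC, abstractmethod


class AotOperator(ABC):
    """Common interface assumed to be shared by all operator classes."""

    @abstractmethod
    def generate_c(self) -> str:
        """Return the C code implementing this operation."""


class Add(AotOperator):
    def generate_c(self) -> str:
        return "out[i] = a[i] + b[i];"


class Relu(AotOperator):
    def generate_c(self) -> str:
        return "out[i] = a[i] > 0 ? a[i] : 0;"


# Maps an (illustrative) operator ID to the class that handles it.
OperatorMap = {0: Add, 19: Relu}


def instantiate(op_id: int) -> AotOperator:
    """Look up and instantiate the operator class for a given ID."""
    return OperatorMap[op_id]()
```

A caller would resolve each operator in the model graph with something like `instantiate(op_id).generate_c()`, collecting the emitted C fragments into the generated source file.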

Available Operators

  • Add: Element-wise addition
  • AssignVariable: Assigns a value to a variable
  • AveragePool: Average pooling operation
  • BatchMatMul: Batch matrix multiplication
  • Concatenation: Concatenates tensors along a specified axis
  • Conv: 2D convolution operation
  • DepthwiseConv: Depthwise convolution operation
  • Dequantize: Dequantizes a tensor
  • Fill: Fills a tensor with a specified value
  • FullyConnected: Fully connected layer
  • HardSwish: Hard swish activation function
  • LeakyRelu: Leaky ReLU activation function
  • Logistic: Logistic (sigmoid) activation function
  • MaxPool: Max pooling operation
  • Maximum: Element-wise maximum
  • Mean: Computes the mean of a tensor along specified axes
  • Minimum: Element-wise minimum
  • Mul: Element-wise multiplication
  • Pack: Packs a list of tensors into a single tensor
  • Pad: Pads a tensor
  • Quantize: Quantizes a tensor
  • ReadVariable: Reads a variable
  • Relu: Rectified Linear Unit activation
  • Reshape: Reshapes a tensor
  • Shape: Returns the shape of a tensor
  • Softmax: Softmax activation
  • Squeeze: Removes dimensions of size 1 from the shape of a tensor
  • StridedSlice: Slices a tensor with strides
  • Tanh: Hyperbolic tangent activation function
  • TransposeConv: Transpose convolution operation
  • Transpose: Transposes a tensor
  • ZerosLike: Generates a tensor of zeros with the same shape and type as the input tensor

Copyright 2025 Ambiq. All Rights Reserved.

Classes