
Bring blazing-fast, ultra-compact neural inference to Ambiq’s family of ultra-low-power SoCs.

📖 Overview

HeliosAOT is an ahead-of-time compiler that transforms LiteRT flatbuffers into highly optimized, standalone C inference modules tailored specifically to Ambiq’s ultra-low-power systems-on-chip (SoCs). By shifting neural network inference code generation to build time, HeliosAOT delivers compact, efficient, and readable C output with zero runtime overhead. No more guessing arena sizes or packaging dead code!
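
To give a concrete sense of what “compact, readable C with no runtime arena” means, here is a minimal sketch of the shape such generated code can take. Every name and size in it (model_arena, conv2d_0, dense_1, model_invoke) is hypothetical and chosen purely for illustration; the real output follows the layer names and layout of your own network.

#include <stdint.h>
#include <string.h>

/* Illustrative sketch only -- names, sizes, and layout are placeholders,
 * not HeliosAOT's actual output. */
static int8_t model_arena[18432];                     /* buffer size planned at build time, no runtime arena */
static int8_t *const conv2d_0_out = &model_arena[0];
static int8_t *const dense_1_out  = &model_arena[16384];

/* One function per layer, named after the graph, so the code mirrors the topology. */
static void conv2d_0(const int8_t *in) { (void)in; /* fused conv + ReLU kernel */ }
static void dense_1(const int8_t *in)  { (void)in; /* fully connected kernel   */ }

void model_invoke(const int8_t *input, int8_t *output)
{
    conv2d_0(input);           /* writes into conv2d_0_out */
    dense_1(conv2d_0_out);     /* writes into dense_1_out  */
    memcpy(output, dense_1_out, 10);
}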

🚀 Key Advantages

  • 10× smaller code size: dramatically reduce your flash footprint versus vanilla TFLM.
  • Expanded kernel library: get best-in-class performance with more operators and fused layers.
  • Ambiq-tuned efficiency: optimized code generation for Cortex-M4 and Cortex-M55 (Helium).
  • Highly customizable: fine-tune at the operator, operator-group, or graph level.
  • Zero arena-size guessing: automatic memory planning for tensors, no more over-allocating.
  • Readable, structured C: generated code mirrors your network topology for easy debugging.
  • Seamless integration: drop in as a Zephyr module or neuralSPOT plugin.
  • Advanced optimizations: layer fusion, tensor reordering, and memory-type placement out of the box.

  • Install HeliosAOT: install with pip/uv and get up and running in minutes.
  • CLI Reference: available commands, flags, and configuration options.
  • Usage Examples: real-world applications and best practices.
  • API Documentation: the core HeliosAOT classes and functions.
  • Performance Benchmarks: comparisons of HeliosAOT against other frameworks.

🔧 Install

Install helios-aot with pipx, pip, or uv.

Using pipx is the recommended way to install helios-aot, as it installs and runs Python applications in isolated environments. Once installed, helios-aot will be available on your PATH.

  • Install helios-aot: pipx install git+ssh://git@github.com/AmbiqAI/helios-aot.git@main
  • Upgrade helios-aot: pipx upgrade helios-aot
  • Invoke helios-aot: helios-aot --help
  • Remove helios-aot: pipx uninstall helios-aot
  • Install with pip: pip install git+https://github.com/AmbiqAI/helios-aot.git#egg=helios-aot
  • Install with uv: uv add "helios-aot @ git+https://github.com/AmbiqAI/helios-aot.git"

Alternatively, download a self-contained CLI binary for your OS.

🎬 First Run

$ helios-aot convert --model_path ./ad01-int8.tflite

INFO     Started AOT conversion
INFO     Step: Initializing module…
INFO     Step completed ✔
INFO     Step: Fusing and transforming operators…
INFO     Step completed ✔
INFO     Step: Planning memory allocation via greedy…
INFO     Step completed ✔
INFO     Step: Generating source code…
INFO     Step completed ✔
INFO     Step: Exporting zephyr module…
INFO     Step completed ✔
INFO     Step: Generating documentation…
INFO     Step completed ✔
INFO     AOT conversion completed

This generates a complete C inference module with the following files:

module_name
├── LICENSE.txt ← license for this module
├── README.md ← module documentation
├── includes-api/ ← API header files
├── src/ ← C operator source files
└── module.mk ← For neuralSPOT integration
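
Using the generated module from an application amounts to including the headers from includes-api/ and calling the exported functions. The sketch below is a hypothetical example: the prototypes (model_init, model_invoke) and buffer sizes are placeholders, not the actual generated API, which is documented in the module’s README.md and headers.

#include <stdint.h>

/* Placeholder prototypes -- in a real project these come from the headers in
 * includes-api/, and the actual names are defined by the generated module. */
void model_init(void);
void model_invoke(const int8_t *input, int8_t *output);

int main(void)
{
    static int8_t input[640];     /* fill with sensor samples; the size is model-specific */
    static int8_t output[10];

    model_init();                 /* one-time setup of weights and planned buffers */
    model_invoke(input, output);  /* one inference pass, no interpreter, no runtime arena */

    /* output now holds the quantized scores */
    return 0;
}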

Ready to dive in? Head over to the Getting Started guide and generate your first module in minutes.

📜 License