Bring blazing-fast, ultra-compact neural inference to Ambiq’s family of ultra-low-power SoCs.

📖 Overview

HeliosAOT is an ahead-of-time compiler that transforms LiteRT flatbuffers into highly optimized, standalone C inference modules tailored specifically for Ambiq’s ultra-low-power systems-on-chip (SoCs). By shifting neural network inference code generation to build time, HeliosAOT delivers compact, efficient, and readable C output with zero runtime overhead. No more guessing arena sizes or packaging dead code!

🚀 Key Advantages

  • 10× smaller code size: dramatically reduce your flash footprint versus vanilla TFLM.
  • Expanded kernel library: best-in-class performance with more operators and fused layers.
  • Ambiq-tuned efficiency: optimized code generation for Cortex-M4 and Cortex-M55 (Helium).
  • Highly customizable: fine-tune at the operator, operator-group, or graph level.
  • Zero arena-size guessing: automatic memory planning for tensors, with no more over-allocating.
  • Readable, structured C: your generated code mirrors your network topology for easy debugging.
  • Seamless integration: use it as a Zephyr module or neuralSPOT plugin.
  • Advanced optimizations: layer fusion, tensor reordering, and memory-type placement out of the box.
  • Install: set up HeliosAOT with pipx and get up and running in minutes.
  • How-To: examples demonstrating specific commands and features.
  • Reference: available commands, flags, and configuration options.
  • Guides: usage examples showcasing real-world applications and best practices.
  • API Documentation: reference for the core HeliosAOT classes and functions.
  • Benchmarks: performance comparisons of HeliosAOT against other frameworks.

🔧 Install

Install helios-aot with pipx.

Using pipx is the recommended way to install helios-aot, as it installs and runs Python apps in isolated environments. Once installed, helios-aot will be available on your PATH.

  • Install helios-aot: pipx install git+ssh://git@github.com/AmbiqAI/helios-aot.git@main
  • Upgrade helios-aot: pipx upgrade helios-aot
  • Invoke helios-aot: helios-aot --help
  • Remove helios-aot: pipx uninstall helios-aot
Alternatively, install helios-aot directly with pip, or add it as a uv dependency:

  • pip: pip install git+https://github.com/AmbiqAI/helios-aot.git#egg=helios-aot
  • uv: uv add "helios-aot @ git+https://github.com/AmbiqAI/helios-aot.git"

Or, download a self-contained CLI binary for your OS.

🎬 First Run

$ helios-aot convert --model.path ./ad01_int8.tflite

INFO     Started AOT conversion
INFO     Step: Loading model ad01_int8.tflite…
INFO     Step completed in 0.01s ✔
INFO     Step: Applying graph-level transforms…
INFO     Step completed in 0.00s ✔
INFO     Step: Resolving handlers…
INFO     Step completed in 0.00s ✔
INFO     Step: Memory planning via greedy…
INFO     Step completed in 0.00s ✔
INFO     Step: Emitting code as neuralspot module…
INFO     Step completed in 0.10s ✔
INFO     Step: Exporting module…
INFO     Step completed in 0.01s ✔
INFO     AOT conversion completed

This generates a complete C inference module with the following files:

module_name
├── LICENSE.txt ← license for this module
├── README.md ← module documentation
├── includes-api/ ← API header files
├── src/ ← C operator source files
└── module.mk ← For neuralSPOT integration

Ready to dive in? Head over to the Getting Started guide and generate your first module in minutes.

📜 License