Bring blazing-fast, ultra-compact neural inference to Ambiq’s family of ultra-low-power SoCs.
📖 Overview
HeliosAOT is an ahead-of-time compiler that transforms trained models (e.g., LiteRT) into highly optimized, standalone C inference modules tailored specifically for Ambiq’s ultra-low-power System-on-Chips (SoCs). By shifting neural network inference code generation to build time, HeliosAOT delivers compact, efficient, and readable C output with zero runtime overhead. No more guessing arena sizes or packaging dead code!
🚀 Key Advantages
- **10× smaller code size:** Dramatically reduce your flash footprint versus vanilla TFLM.
- **Expanded kernel library:** Get best-in-class performance with more operators and fused layers.
- **Ambiq-tuned efficiency:** Optimized code generation for Cortex-M4 and Cortex-M55 (Helium).
- **Highly customizable:** Fine-tune at the operator, tensor, or graph level.
- **Zero arena-size guessing:** Automatic build-time memory planning for tensors; no more over-allocating.
- **Readable, structured C:** The generated code mirrors your network topology for easy debugging (see the sketch after this list).
- **Seamless integration:** Use the output as a CMake library, Zephyr module, or neuralSPOT module.
- **Advanced optimizations:** Layer fusion, tensor reordering, and memory-type placement out of the box.
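To make the “readable, structured C” claim concrete, here is a purely illustrative sketch of the style such a generator can produce, with one named function per layer mirroring the graph topology. Every identifier below (`layer_0_conv2d`, `model_invoke`, `SCRATCH_BYTES`) is hypothetical, not actual HeliosAOT output:

```c
#include <stdint.h>

#define SCRATCH_BYTES 1024  /* hypothetical: sized by the build-time planner */

/* Illustrative only: real HeliosAOT output will differ. */
static void layer_0_conv2d(const int8_t *in, int8_t *out) {
    (void)in; (void)out;  /* generated convolution kernel body goes here */
}

static void layer_1_relu(const int8_t *in, int8_t *out) {
    (void)in; (void)out;  /* generated activation body goes here */
}

void model_invoke(const int8_t *input, int8_t *output) {
    /* Scratch placement and size are decided at build time, so the
       application never supplies a runtime arena. */
    static int8_t scratch[SCRATCH_BYTES];
    layer_0_conv2d(input, scratch);
    layer_1_relu(scratch, output);
}
```

Because each layer is an ordinary C function, you can step through the network in a debugger layer by layer.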
📚 Quick Links
- **Install HeliosAOT:** get up and running in minutes.
- **How-To:** examples demonstrating specific commands and features.
- **Reference:** available commands, flags, and configuration options.
- **Guides:** usage examples showcasing real-world applications and best practices.
- **API Documentation:** the core HeliosAOT classes and functions.
- **Benchmarks:** performance comparisons of HeliosAOT with other frameworks.
🔧 Install
Using pipx is the recommended way to install helios-aot, as it installs and runs Python apps in isolated environments. Once installed, the helios-aot command will be available on your PATH.
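For example, a minimal sketch of the pipx route; the PyPI package name `helios-aot` is assumed here from the command name and may differ:

```console
pipx install helios-aot
```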
Alternatively, grab the standalone app for your OS from the latest release page: no Python, no dependencies.
Supported OS: macOS – Linux – Windows
🎬 First Run
```console
$ helios-aot convert --model.path ./ad01_int8.tflite
INFO Started AOT conversion
INFO Step: Loading model ad01_int8.tflite…
INFO Step completed in 0.01s ✔
INFO Step: Applying graph-level transforms…
INFO Step completed in 0.00s ✔
INFO Step: Resolving handlers…
INFO Step completed in 0.00s ✔
INFO Step: Memory planning via greedy…
INFO Step completed in 0.00s ✔
INFO Step: Emitting code as neuralspot module…
INFO Step completed in 0.10s ✔
INFO Step: Exporting module…
INFO Step completed in 0.01s ✔
INFO AOT conversion completed
```
This generates a complete C inference module with the following files:
```text
module_name/
├── LICENSE         ← license for this module
├── README.md       ← module documentation
├── includes-api/   ← API header files
├── src/            ← C operator source files
└── module.mk       ← for neuralSPOT integration
```
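The headers in `includes-api/` define the module’s entry points. As a hypothetical illustration of wiring such a module into an application; none of these identifiers (`module_api.h`, `module_init`, `module_invoke`, the `MODULE_*` constants) are confirmed HeliosAOT names, so consult the generated headers for the real API:

```c
#include <stdint.h>
#include "module_api.h"  /* hypothetical name for a generated header */

/* Hypothetical sizes: in a real module these would come from the header. */
#define MODULE_INPUT_BYTES  640
#define MODULE_OUTPUT_BYTES 8

int main(void) {
    static int8_t input[MODULE_INPUT_BYTES];
    static int8_t output[MODULE_OUTPUT_BYTES];

    /* Fill `input` with quantized sensor data here. */

    module_init();                /* hypothetical one-time setup call */
    module_invoke(input, output); /* hypothetical inference entry point */
    return 0;
}
```

Note what is absent: there is no arena allocation step, because buffer sizes and placement were fixed at build time by the memory planner.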
📜 License
HeliosAOT generates C modules that include a license header restricting use to Ambiq hardware. See the generated module’s LICENSE for its terms and the repository’s LICENSE.md for the tooling itself.
Ready to dive in? Head over to the Getting Started guide and generate your first module in minutes.