MicroAI: Embedded AI for Smarter Devices, Even on the Smallest Silicon

MicroEJ introduces MicroAI™, a lightweight AI inference engine that brings TinyML capabilities directly to embedded systems, enabling real-time edge intelligence on microcontrollers with limited memory and compute.

Bringing Intelligence to Constrained Devices

With MicroAI, manufacturers and technology leaders can now deploy machine learning models inside MicroEJ applications running on ultra-low-power hardware. While the first available implementation is based on TensorFlow Lite (the widely adopted open-source framework for edge AI), MicroAI is designed to be model-format agnostic. Additional formats, such as ONNX, may be supported in the future to offer even more flexibility to developers.

Unlike traditional AI approaches that rely on the cloud or external accelerators, MicroAI runs entirely on-device, enabling real-time intelligence even on highly constrained embedded systems. Typical use cases include:

  • Anomaly detection using sensor-based time series
  • Event classification, such as detecting load patterns for electric vehicle charging
  • Probability-based analysis in wearables and health monitoring
  • Temporal data analysis using convolutional neural networks (CNNs) to detect patterns in time-series signals, a growing trend in embedded AI applications
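To make the last use case concrete, here is a minimal, illustrative sketch of the core operation behind CNN-based temporal pattern detection: sliding a small learned kernel along a sensor time series. It is plain Python for clarity, not MicroAI code; a real deployment would run a trained, quantized model through the inference engine instead.

```python
def conv1d(signal, kernel):
    """Valid-mode 1-D convolution (cross-correlation) of signal with kernel."""
    k = len(kernel)
    return [
        sum(signal[i + j] * kernel[j] for j in range(k))
        for i in range(len(signal) - k + 1)
    ]

# A kernel shaped like a step responds strongly to abrupt transitions.
step_kernel = [-1.0, -1.0, 1.0, 1.0]

# Simulated sensor readings: steady, then a jump (e.g., a load switching on).
readings = [0.1, 0.1, 0.1, 0.1, 1.0, 1.0, 1.0, 1.0]

activations = conv1d(readings, step_kernel)
peak = max(activations)  # the strongest response marks where the step occurs
```

In a trained CNN the kernels are learned from data rather than hand-written, and many such filters are stacked, but the per-layer arithmetic is exactly this kind of multiply-accumulate loop, which is why it maps well onto small MCUs.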

Core Capabilities

  • TensorFlow Lite model support out of the box
  • Memory-optimized execution for MCUs with as little as a few kilobytes of RAM
  • Edge-based inference with real-time performance
  • Static model embedding for security and efficiency
  • Java and C interoperability, enabling seamless integration with native code and legacy components
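The "static model embedding" capability above follows a well-established pattern for on-device AI: the trained model file is compiled into the firmware image as a constant byte array (the same approach as `xxd -i` in the TensorFlow Lite Micro workflow), so inference needs no filesystem and the model cannot be swapped out at rest. The helper below is a hypothetical illustration of that conversion step, not part of the MicroAI API:

```python
def to_c_array(model_bytes: bytes, name: str = "g_model") -> str:
    """Render raw model bytes as a C array definition for static embedding."""
    body = ",".join(f"0x{b:02x}" for b in model_bytes)
    return (
        f"const unsigned char {name}[] = {{{body}}};\n"
        f"const unsigned int {name}_len = {len(model_bytes)};\n"
    )

# Example: embed a fake 4-byte model blob (a real .tflite file would be read here).
snippet = to_c_array(b"\x1c\x00\x00\x00", "g_mnist_model")
```

The emitted C source can then be compiled and linked into the application, which is also where the Java and C interoperability noted above comes into play.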

Demo Spotlight: MNIST Digit Recognition on NXP FRDM-MCXN947

To showcase MicroAI in action, we’ve implemented a digit recognition demo using a TensorFlow Lite CNN model running on the NXP MCXN947 microcontroller.

This demo illustrates how MicroAI can execute a lightweight neural network entirely on-device, within a MicroEJ application, on resource-constrained hardware. It processes handwritten digits in real time, demonstrating efficient CNN-based inference without relying on the cloud.

While based on the MNIST dataset, the same architecture applies to real-world use cases like load detection in energy systems, gesture recognition, or health signal classification.
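For readers curious about the input side of such a demo, here is a sketch of the arithmetic a quantized TensorFlow Lite model typically uses: float pixel intensities are mapped to int8 via the model's quantization parameters (scale, zero point). The values below are typical defaults for a [0, 1] input range and are assumptions for illustration, not taken from the actual demo model.

```python
SCALE = 1.0 / 255.0   # assumed quantization scale
ZERO_POINT = -128     # assumed zero point (int8 models commonly use -128)

def quantize(pixel: float) -> int:
    """Map a float pixel in [0, 1] to the int8 domain [-128, 127]."""
    q = round(pixel / SCALE) + ZERO_POINT
    return max(-128, min(127, q))

# Black (0.0) maps to -128, white (1.0) to 127.
lo, hi = quantize(0.0), quantize(1.0)
```

Working in int8 rather than float32 shrinks both the model footprint and the per-inference compute, which is what makes CNN inference practical on kilobyte-scale MCUs.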


Unlocking New Use Cases

MicroAI is designed for edge applications where on-device decision-making matters most:

  • Smart energy devices performing load forecasting and anomaly detection – Explore VEE Energy
  • Industrial equipment requiring predictive maintenance with local inference
  • Wearables that compute probabilities for health or activity state – Explore VEE Wear
  • Consumer electronics with gesture or presence detection features

Strategic Value for Software-Defined Products

MicroAI empowers innovation leaders to add intelligence without increasing hardware complexity or cost. As part of the MicroEJ containerized software stack, it lets manufacturers rapidly deploy AI features while maintaining strict constraints on power, memory, and bill of materials (BOM) cost.


Additional Resources:

  • Product: MICROEJ VEE Software Containers for Embedded Systems
  • Product: TinyML Embedded Systems
  • Video: Integrate and Deploy Machine Learning at the Edge
  • Press Release: MicroEJ and Au-Zone Bring AI Closer to Home with Containerization at the Edge

Semir Haddad

Chief Product and Strategy Officer