Usage on microcontroller (ARM Cortex-M4) / LiteRT #139
For GradientBoostedTrees, you can (experimentally) convert the model to JAX and then convert it to LiteRT. I've posted this in another issue (and should probably add it to the documentation).
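A minimal sketch of that path, assuming the experimental `to_jax_function()` API of the `ydf` Python package and TensorFlow's experimental JAX converter (`tf.lite.TFLiteConverter.experimental_from_jax`). The dataset, feature names, and the exact calling convention of the returned JAX model (e.g. whether inputs go through an `encoder` first, or whether `predict` takes named arrays vs. a dict) are assumptions; check the current YDF and TF docs for your versions:

```python
import pandas as pd
import tensorflow as tf
import ydf  # pip install ydf

# Train a Gradient Boosted Trees model on a hypothetical CSV with a "label"
# column and two numerical features "f1" and "f2".
train_df = pd.read_csv("train.csv")
model = ydf.GradientBoostedTreesLearner(label="label").train(train_df)

# Experimental: export the model as a JAX function. The returned object is
# assumed to expose `predict` (the JAX computation) and `encoder`
# (raw feature dict -> JAX-compatible inputs).
jax_model = model.to_jax_function()

# Encode a small representative batch; the converter uses it to fix input
# names, shapes, and dtypes.
sample = jax_model.encoder({"f1": [0.1, 0.2], "f2": [1.0, 2.0]})

# Convert the JAX predict function to a LiteRT (TFLite) flatbuffer. If your
# version of `predict` expects a single dict rather than named arrays, wrap
# it in a small adapter function before passing it to the converter.
converter = tf.lite.TFLiteConverter.experimental_from_jax(
    [jax_model.predict],
    [[(name, value) for name, value in sample.items()]],
)
tflite_model = converter.convert()
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```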
For Random Forests and Isolation Forests, we haven't had time to implement it, but I don't see any blockers.

The faster and probably more elegant approach is to simply compile the YDF inference code for your architecture. People have done this for some architectures, e.g. the Raspberry Pi, but we can't make any promises about it. For reference, here are the (since deleted) instructions: https://github.com/google/yggdrasil-decision-forests/blob/4bdedd31c041706a3d022313f1edaf494dea53c1/documentation/installation.md#compilation-on-and-for-raspberry-pi

Note that we're now using Bazel 5.3.0 (and will be migrating to Bazel 6 or 7 at some point). In the compilation step, just going for the
If you're successful, please let us know; we'd be happy to include an updated guide in the repo.
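Once you have a `.tflite` flatbuffer, it can help to check it on the host with the standard `tf.lite.Interpreter` before moving to the microcontroller (where you would typically run it with LiteRT for Microcontrollers / TFLite Micro). A small sketch, assuming the `model.tflite` file produced above:

```python
import numpy as np
import tensorflow as tf

# Load the converted flatbuffer and run one inference on the host, so the
# outputs can be compared against the original ydf predictions before
# deploying on-device.
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed zeros of the right shape/dtype into every input; replace with real
# feature values to compare against model.predict() from ydf.
for detail in input_details:
    interpreter.set_tensor(
        detail["index"], np.zeros(detail["shape"], dtype=detail["dtype"])
    )

interpreter.invoke()
print(interpreter.get_tensor(output_details[0]["index"]))
```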
Hi, does anyone have experience with making predictions on a microcontroller? Is the YDF C++ library compatible with microcontroller architectures?
To my understanding, YDF models are not yet compatible with TFLite (nowadays LiteRT). Is this true, or has something changed here?
Any help is welcome!
Best regards,
/P