Introduction to the Flutter-TFLite Project
The Flutter-TFLite project, a TensorFlow-managed fork of the original tflite_flutter_plugin developed by Amish Garg, enables the Flutter community to create apps with machine learning capabilities. The project leverages the TensorFlow Lite framework so that developers can integrate machine learning models into their Flutter applications efficiently.
Overview
The TensorFlow Lite Flutter plugin provides a fast, flexible way to use the TensorFlow Lite interpreter for machine learning inference. Its API is similar to the TensorFlow Lite Java and Swift APIs, and it binds directly to the TFLite C API for low-latency, efficient performance. Acceleration is supported through NNAPI and GPU delegates on Android, Metal and CoreML delegates on iOS, and the XNNPack delegate on desktop platforms.
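As a minimal sketch of enabling hardware acceleration, an interpreter can be created with a platform-appropriate delegate attached to its options. The model path below is a placeholder; GpuDelegateV2 and GpuDelegate are the delegate classes the plugin exposes for Android and iOS GPU acceleration:

import 'dart:io' show Platform;
import 'package:tflite_flutter/tflite_flutter.dart';

Future<Interpreter> createAcceleratedInterpreter() async {
  final options = InterpreterOptions();
  if (Platform.isAndroid) {
    // GPU acceleration on Android (may not work on emulators).
    options.addDelegate(GpuDelegateV2());
  } else if (Platform.isIOS) {
    // Metal-based GPU acceleration on iOS.
    options.addDelegate(GpuDelegate());
  }
  // 'assets/your_model.tflite' is a placeholder path.
  return Interpreter.fromAsset('assets/your_model.tflite', options: options);
}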
Key Features
- Multi-platform Compatibility: Supports app development across Android and iOS devices.
- Model Flexibility: Enables using any TFLite model.
- Performance Acceleration: Incorporates multi-threading for enhanced performance (see the sketch after this list).
- API Consistency: Structural similarity to TensorFlow Lite Java API.
- Optimized Inference Speeds: Performance closely rivals that of native Android applications developed using the Java API.
- UI Responsiveness: Allows inference to run in separate isolates to prevent UI thread jank.
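To illustrate the multi-threading option, a brief sketch: the interpreter's CPU thread count can be set through InterpreterOptions. The thread count and model path here are placeholders:

// Run inference on four CPU threads (placeholder value).
final options = InterpreterOptions()..threads = 4;
final interpreter = await Interpreter.fromAsset('assets/your_model.tflite', options: options);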
Setting Up
Android & iOS
The project now supports dynamic library downloads. For iOS, you can run samples with the following commands:
flutter build ios
flutter install ios
For Android, use:
flutter build android
flutter install android
Note that these steps require a physical device, as TFLite may not work in the iOS simulator. Additionally, when creating a release archive (IPA) for iOS, Xcode strips symbols by default; changing the Strip Style build setting from "All Symbols" to "Non-Global Symbols" prevents TFLite symbol lookup errors.
MacOS, Linux, and Windows
For these platforms, you must add a TensorFlow Lite dynamic library to your project manually. Build the library following either the Bazel or CMake build guide, then include it in the project as described in the plugin's instructions.
TFLite Flutter Helper Library
Note that the TFLite Flutter Helper Library is deprecated; a replacement is under development, with wide support targeted for the end of August 2023.
Importing and Using the Library
To use the library, add tflite_flutter to the dependencies section of your pubspec.yaml file, adjusting the version to match the latest release.
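For example, the dependency entry might look like the following; the version number is illustrative, so check the latest release on pub.dev:

dependencies:
  tflite_flutter: ^0.10.4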
Creating an Interpreter
To create an interpreter from an asset:
- Place your .tflite file inside the assets directory.
- Include these assets in the pubspec.yaml.
- Create the interpreter in your code:
final interpreter = await Interpreter.fromAsset('assets/your_model.tflite');
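The corresponding assets declaration in pubspec.yaml might look like this (the file name is a placeholder):

flutter:
  assets:
    - assets/your_model.tflite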
Performing Inference
For running inference with a single input/output or with multiple inputs/outputs, the project provides the run() and runForMultipleInputs() methods, enabling developers to execute model predictions efficiently.
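As a hedged sketch of both methods, assuming an already-created interpreter; the tensor shapes and values below are illustrative, and real shapes depend on your model:

// Single input/output: an illustrative [1, 5] input and [1, 2] output.
final input = [
  [1.0, 2.0, 3.0, 4.0, 5.0]
];
final output = List.filled(1 * 2, 0.0).reshape([1, 2]);
interpreter.run(input, output);

// Multiple inputs/outputs: a list of input buffers and a map from
// output tensor index to output buffer.
final input1 = [
  [1.0, 2.0]
];
final input2 = [
  [3.0, 4.0]
];
final output1 = List.filled(1 * 2, 0.0).reshape([1, 2]);
final output2 = List.filled(1 * 3, 0.0).reshape([1, 3]);
interpreter.runForMultipleInputs([input1, input2], {0: output1, 1: output2});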
Closing the Interpreter
After inference, be sure to close the interpreter to free up resources:
interpreter.close();
Asynchronous Inference
Use IsolateInterpreter for asynchronous inference: it runs inference in a separate isolate, keeping the UI responsive.
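The pattern, following the project's documentation, looks like this; the model path and the input/output buffers are placeholders:

final interpreter = await Interpreter.fromAsset('assets/your_model.tflite');
final isolateInterpreter =
    await IsolateInterpreter.create(address: interpreter.address);

// run() is awaited here, so the UI isolate is never blocked.
await isolateInterpreter.run(input, output);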
Contribution
Those interested in contributing to the project should use melos to manage the package. Initial setup requires bootstrapping the project environment, and code-generation tasks are handled via the ffigen package.
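A typical setup, assuming melos is not yet installed globally, might be:

dart pub global activate melos
melos bootstrap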
The Flutter-TFLite project opens up vast possibilities for Flutter developers to leverage machine learning within their applications seamlessly. With ongoing improvements, it promises robust support and innovative features for future developments.