Introduction to 🤗 Exporters
The Exporters project by Hugging Face offers a convenient tool for exporting state-of-the-art machine learning models, specifically from the 🤗 Transformers library, to Apple's Core ML format. Tailored for models originally implemented in PyTorch, TensorFlow, or JAX, Exporters simplifies the process of converting these models into a format that can be efficiently executed on macOS, iOS, tvOS, and watchOS devices.
Why Use 🤗 Exporters?
Transformers models, known for their top-notch performance across a wide range of AI tasks, are typically large and not always optimized for mobile devices. Deploying them in Apple's ecosystem generally requires converting them to the Core ML format. The 🤗 Exporters package is designed to make this conversion seamless, integrating closely with the Hugging Face Hub and sparing users from writing conversion scripts by hand with coremltools.
Features and Functionality
The Exporters package ships ready-made configurations for a variety of model architectures, covering popular models such as BERT, GPT-2, and Vision Transformer (ViT), among others. These configurations give users an easy starting point for converting supported models without having to work through the intricacies of Core ML compatibility themselves.
To cater to a wider audience, Exporters also powers a no-code solution available through Hugging Face Spaces, which lets users experiment with model conversions without installing any software. If the conversion succeeds, the resulting model is uploaded directly to the Hugging Face Hub, ready for broader distribution.
Installation and Setup
Starting to use Exporters is straightforward. Users can clone the repository from GitHub and install it as a Python package via:
$ git clone https://github.com/huggingface/exporters.git
$ cd exporters
$ pip install -e .
While the Core ML exporter can be executed on Linux systems, it is highly recommended to perform these operations on macOS for a smoother experience.
Core ML Overview
Core ML is Apple's dedicated framework for optimizing and running machine learning models on its operating systems. It is built to leverage Apple's hardware, namely the CPU, GPU, and Neural Engine, to ensure fast and efficient on-device execution. The models exported through this package use the mlpackage format introduced with Core ML in 2021, which is preferred over the older mlmodel format for its forward compatibility and performance enhancements.
Exporting a Model
To convert a model from the Hugging Face Hub, or a local checkpoint, to the Core ML format, users can invoke the Exporters package with a single command-line instruction:
python -m exporters.coreml --model=distilbert-base-uncased exported/
This command processes the specified model and writes a Core ML model to the exported/ folder. The resulting .mlpackage file can then be added to an Xcode project for deployment in an app.
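For more control than the command line offers, the same conversion can also be driven from Python. The sketch below assumes the package exposes an export function together with per-model configuration classes such as DistilBertCoreMLConfig, as its documentation suggests; the checkpoint name and exact arguments are illustrative and may differ between versions.
from transformers import AutoModelForSequenceClassification, AutoTokenizer
from exporters.coreml import export
from exporters.coreml.models import DistilBertCoreMLConfig

model_ckpt = "distilbert-base-uncased-finetuned-sst-2-english"

# Load the PyTorch model (trace-friendly) and its tokenizer from the Hub.
base_model = AutoModelForSequenceClassification.from_pretrained(model_ckpt, torchscript=True)
preprocessor = AutoTokenizer.from_pretrained(model_ckpt)

# Build a Core ML export configuration for the chosen task and run the conversion.
coreml_config = DistilBertCoreMLConfig(base_model.config, task="text-classification")
mlmodel = export(preprocessor, base_model, coreml_config)

# The result behaves like a coremltools model and can be saved as an .mlpackage.
mlmodel.save("exported/DistilBERT.mlpackage")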
Customization and Advanced Configurations
Exporters offers options to fine-tune the export process. Users can select the feature associated with a given model architecture, such as sequence classification or object detection, to tailor the conversion to a particular task. Users can also customize a model's input and output configuration by subclassing the existing configuration objects and overriding their defaults to fit specific project needs.
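As a rough sketch of what such a subclass might look like, the snippet below pins a text model's sequence length to 128; the inputs property, the input_ids key, and the sequence_length field are assumptions about the package's configuration objects and may differ between versions.
from exporters.coreml.models import DistilBertCoreMLConfig

class MyDistilBertCoreMLConfig(DistilBertCoreMLConfig):
    @property
    def inputs(self):
        # Start from the default input descriptions and override only what we need.
        input_descs = super().inputs
        # Pin the tokenized input to a fixed length of 128 tokens.
        input_descs["input_ids"].sequence_length = 128
        return input_descs

The customized class is then passed to the exporter in place of the stock configuration.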
Utilizing the Exported Model
Once exported, the model can be incorporated into an application with Xcode, which auto-generates Swift classes for loading the model and making predictions from app code. Depending on the chosen configuration, additional preprocessing or postprocessing might be needed, particularly for text models that require tokenization.
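Before wiring the model into an app, it can be useful to sanity-check it from Python on a Mac with coremltools. In the sketch below, the input names input_ids and attention_mask and the 128-token window are assumptions that depend on how the model was exported.
import coremltools as ct
import numpy as np
from transformers import AutoTokenizer

# Load the exported Core ML package (running predictions requires macOS).
mlmodel = ct.models.MLModel("exported/DistilBERT.mlpackage")

# Tokenize the input text the same way the original Transformers model expects.
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
tokens = tokenizer(
    "Core ML runs this model on-device.",
    return_tensors="np",
    padding="max_length",
    max_length=128,
)

# Run a prediction; the dictionary keys must match the exported input names.
outputs = mlmodel.predict({
    "input_ids": tokens["input_ids"].astype(np.int32),
    "attention_mask": tokens["attention_mask"].astype(np.int32),
})
print(outputs)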
Moving Forward
For developers aiming to employ models not directly supported out-of-the-box by Exporters, the package provides a pathway to implement custom configuration and export pipelines. This flexibility ensures that users can leverage virtually any model within Apple's ecosystem with minimal hassle.
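A minimal sketch of that pathway is shown below, assuming the package exposes a CoreMLConfig base class with a modality attribute that can be subclassed directly; the class name and the "text" modality are illustrative rather than taken from the package.
from exporters.coreml import CoreMLConfig

class MyModelCoreMLConfig(CoreMLConfig):
    # Declare what kind of data the model consumes so the exporter can
    # generate sensible default input and output descriptions.
    modality = "text"

# The custom configuration is then used exactly like the ready-made ones, e.g.:
# coreml_config = MyModelCoreMLConfig(base_model.config, task="feature-extraction")
# mlmodel = export(preprocessor, base_model, coreml_config)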
In sum, 🤗 Exporters aspires to streamline the deployment of high-performance machine learning models within Apple's tightly integrated software and hardware environment, empowering a wide range of applications to harness the power of Transformers models with minimal effort.