Transfer Learning Library
Introduction
The Transfer Learning Library, often referred to as TLlib, is an open-source library designed to facilitate transfer learning tasks. Built on top of PyTorch, it provides a high-performance environment with a user-friendly API for developing and deploying transfer learning algorithms. Its structure is reminiscent of torchvision, making it accessible for users familiar with PyTorch. Whether you're looking to develop novel approaches or implement existing strategies, TLlib offers a convenient framework.
The library's API is categorized into several methods, including:
- Domain Alignment (tllib.alignment): Methods for aligning feature distributions across source and target domains.
- Domain Translation (tllib.translation): Techniques for translating data from one domain into the style of another.
- Self-Training (tllib.self_training): Strategies that exploit model predictions on unlabeled data as additional training signal.
- Regularization (tllib.regularization): Methods that boost transfer performance through additional regularization terms.
- Data Reweighting/Resampling (tllib.reweight): Methods that reweight or resample the training data for better transfer.
- Model Ranking/Selection (tllib.ranking): Tools for ranking pre-trained models and selecting the best one for a target task.
- Normalization-Based (tllib.normalization): Methods built on feature normalization.
The library also offers a comprehensive range of example code in its examples directory, structured according to learning setups such as Domain Adaptation (DA), Task Adaptation (TA), Out-of-Distribution Generalization (OOD), Semi-Supervised Learning (SSL), and Model Selection. These setups cover tasks such as classification, regression, object detection, segmentation, and keypoint detection.
Updates
The library is continuously updated to include new features and methods:
- March 15, 2024: An offline version of the documentation was made available.
- August 2023: Issues regarding dataset links were addressed in a specific notice.
- September 2022: Introduced installation support via pip, although it remains experimental.
- August 2022: Launched version 0.4, with new methods for domain adaptation and semi-supervised learning, alongside a repository of significant transfer learning papers.
Supported Methods
TLlib supports a diverse array of algorithms across different transfer learning tasks. Here is a brief overview:
- Domain Adaptation for Classification: Includes methods like DANN, DAN, and CDAN, among others, each providing unique strategies for unsupervised domain adaptation.
- Domain Adaptation for Object Detection: Notable methods include CycleGAN and D-Adapt.
- Domain Adaptation for Semantic Segmentation: Techniques include CycleGAN, CyCADA, and FDA.
- Domain Adaptation for Keypoint Detection: Features RegDA.
- Domain Adaptation for Person Re-identification: Methods such as IBN-Net and MMT.
- Partial Domain Adaptation: Incorporates IWAN and AFN.
- Open-set Domain Adaptation: Utilizes OSBP.
- Domain Generalization for Classification: Contains MixStyle and IRM methods.
- Task Adaptation (Fine-Tuning) for Image Classification: Encompasses methods like L2-SP and DELTA.
- Pre-trained Model Selection: Tools for selecting models include H-Score and LogME.
- Semi-Supervised Learning for Classification: Includes Mean Teacher, UDA, and FlexMatch among others.
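To illustrate the idea behind adversarial alignment methods such as DANN, the core building block is a gradient reversal layer: an identity map in the forward pass that negates (and scales) gradients in the backward pass, so the feature extractor learns to fool a domain discriminator. The following is a minimal sketch in plain PyTorch, not TLlib's own implementation; all names are illustrative.

```python
import torch
from torch.autograd import Function


class GradientReversal(Function):
    """Identity in the forward pass; scales gradients by -coeff in backward."""

    @staticmethod
    def forward(ctx, x, coeff):
        ctx.coeff = coeff
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        # Reverse (and scale) the gradient flowing back to the features;
        # the coeff argument receives no gradient.
        return -ctx.coeff * grad_output, None


features = torch.ones(2, 3, requires_grad=True)
out = GradientReversal.apply(features, 0.5)
out.sum().backward()
print(features.grad)  # every entry is -0.5
```

In a full DANN setup, this layer would sit between the feature extractor and a domain discriminator, with coeff typically ramped up over training.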
Installation
To install TLlib, users can either build it from a clone of the source repository:

pip install -r requirements.txt
python setup.py install

or attempt the (still experimental) installation via pip:

pip install -i https://test.pypi.org/simple/ tllib==0.4
Documentation
Comprehensive API documentation is hosted online, offering detailed guides for users.
Usage
Examples of using the library are provided in the examples directory. For instance, a DANN model can be trained on the Office-31 dataset with a ResNet-50 backbone using a single command-line instruction, illustrating the library's straightforward workflow.
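A representative invocation looks like the following; the exact flags, script path, and dataset location are illustrative and may differ between library versions, so consult the script's --help output.

```shell
# Train DANN on the Office-31 Amazon -> Webcam task with a ResNet-50 backbone.
# Assumes the dataset lives under data/office31 (or can be downloaded there);
# flags shown are representative of the examples scripts, not guaranteed exact.
CUDA_VISIBLE_DEVICES=0 python dann.py data/office31 -d Office31 -s A -t W \
    -a resnet50 --epochs 20 --seed 1 --log logs/dann/Office31_A2W
```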
Contributing
Contributions to the library are encouraged. For bug fixes, contributors are free to submit without prior discussion. For new features or extensions, it's recommended to open an issue for discussion first.
Contact
For further inquiries or suggestions, users can contact Baixu Chen, Junguang Jiang, or Mingsheng Long via email or GitHub issues. Chinese speakers may access a dedicated Q&A platform on Zhihu.
Citation
For academic use, the project can be cited using the provided references, supporting further research and acknowledgment of the library's utility.