Project Introduction: Active-Passive Losses
The Active-Passive Losses project implements normalized loss functions for training deep learning models on data with noisy labels. It is based on the ICML 2020 paper "Normalized Loss Functions for Deep Learning with Noisy Labels" and aims to improve the robustness of models against mislabeled data, a common issue in large datasets.
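To give a sense of what these losses look like in practice, below is a minimal PyTorch sketch of an NCE+RCE combination (the loss used in the example commands further down), following the definitions of normalized cross entropy and reverse cross entropy in the paper. The function name nce_rce_loss, the weighting factors alpha and beta, and the log-zero constant A are illustrative assumptions, not the repository's actual implementation.

import torch
import torch.nn.functional as F

def nce_rce_loss(logits, targets, num_classes, alpha=1.0, beta=1.0, A=-4.0):
    # "Active" term: Normalized Cross Entropy (NCE) -- per-sample cross entropy
    # divided by the cross entropy summed over all candidate classes.
    log_probs = F.log_softmax(logits, dim=1)
    ce = -log_probs.gather(1, targets.unsqueeze(1)).squeeze(1)
    nce = ce / (-log_probs.sum(dim=1))

    # "Passive" term: Reverse Cross Entropy (RCE) -- predictions and labels swap
    # roles, with log(0) for non-target classes replaced by the constant A.
    probs = log_probs.exp()
    one_hot = F.one_hot(targets, num_classes).float()
    log_targets = torch.where(one_hot > 0, torch.zeros_like(probs), A * torch.ones_like(probs))
    rce = -(probs * log_targets).sum(dim=1)

    # Active-Passive combination with weighting factors alpha and beta.
    return (alpha * nce + beta * rce).mean()

# Example usage with random data:
logits = torch.randn(8, 10)            # batch of 8 samples, 10 classes
targets = torch.randint(0, 10, (8,))   # integer class labels
loss = nce_rce_loss(logits, targets, num_classes=10)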
Requirements
To run the project, the following software requirements must be met:
- Python: Version 3.6 or higher.
- PyTorch: Version 1.3.1 or higher, the deep learning framework used to build and train the models.
- torchvision: Version 0.4.1 or higher, which provides datasets, model architectures, and image transformations for computer vision.
- mlconfig: A package for handling the configurations necessary for experiments.
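Assuming these packages are installed from PyPI (torch, torchvision, and mlconfig are all published there), a matching environment can typically be prepared with pip; pinning the exact versions listed above is left to the reader:

pip3 install torch torchvision mlconfig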
How to Run
To conduct experiments with this project, several configurations and command-line arguments are provided. Experiment settings are defined in YAML configuration files ('*.yaml') located in the configs folder; each experiment has its own settings documented in these files.
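For reference, a configuration file can also be inspected programmatically. The snippet below is a sketch only: it assumes mlconfig's load function and a hypothetical file name derived from the config_path and version arguments described next.

import mlconfig

# Hypothetical path: config_path 'configs/cifar10/sym' combined with version 'nce+rce'.
config = mlconfig.load('configs/cifar10/sym/nce+rce.yaml')
print(config)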
Key Arguments
- noise_rate: This parameter specifies the rate of noise in the labels.
- asym: Denotes whether asymmetric noise is used; if this is not specified, the noise is symmetric (an asymmetric example follows the symmetric commands below).
- config_path: The path pointing to the folder where configuration files are stored.
- version: Represents the specific configuration file to use.
- exp_name: A name assigned to each experiment for differentiation.
- seed: A random seed for ensuring the reproducibility of results.
Example Commands
To illustrate, here's how one might run experiments on the CIFAR-10 and CIFAR-100 datasets with a symmetric noise rate of 0.4, using the NCE+RCE loss function:
- For the CIFAR-10 dataset:
python3 main.py --exp_name test_exp \
                --noise_rate 0.4 \
                --version nce+rce \
                --config_path configs/cifar10/sym \
                --seed 123
- For the CIFAR-100 dataset:
python3 main.py --exp_name test_exp \
                --noise_rate 0.4 \
                --version nce+rce \
                --config_path configs/cifar100/sym \
                --seed 123
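For asymmetric noise, the --asym flag described under Key Arguments would be added. The command below is a sketch only: it assumes --asym is a simple switch and that a corresponding configs/cifar10/asym directory exists, so the repository's actual configs and argument parser should be checked first.

python3 main.py --exp_name test_exp \
                --noise_rate 0.4 \
                --asym \
                --version nce+rce \
                --config_path configs/cifar10/asym \
                --seed 123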
Citing This Work
If you use this code for research or practical applications, please cite the original paper using the following BibTeX entry:
@inproceedings{ma2020normalized,
title={Normalized Loss Functions for Deep Learning with Noisy Labels},
author={Ma, Xingjun and Huang, Hanxun and Wang, Yisen and Romano, Simone and Erfani, Sarah and Bailey, James},
booktitle={ICML},
year={2020}
}
This project provides a structured approach to tackling the challenge of noisy labels in machine learning through robust loss functions, which can substantially improve model performance and accuracy in real-world applications where mislabeled data is unavoidable.