Online Harmonizing Gradient Descent for Imbalanced Data Streams One-Pass Classification

Authors: Han Zhou, Hongpeng Yin, Xuanhong Deng, Yuyu Huang

IJCAI 2023

| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | "Extensive experimental results demonstrate the high efficiency and effectiveness in handling imbalanced data streams." |
| Researcher Affiliation | Academia | "Han Zhou, Hongpeng Yin, Xuanhong Deng and Yuyu Huang, The School of Automation, Chongqing University, Chongqing, China, 400044." |
| Pseudocode | Yes | "Finally, we summarize the pseudo-code of the proposed OHGD in Algorithm 1." (Algorithm 1: The Proposed Online Harmonized Gradient Descent Algorithm.) |
| Open Source Code | Yes | "The implementations of this work can be found in https://github.com/Kan9594/OHGD." |
| Open Datasets | Yes | "Twenty-four datasets from the UCI repository and KEEL with different imbalance ratios were selected as the test rigs for performance evaluation. More details can be found in the UCI and KEEL websites." (http://archive.ics.uci.edu/ml/index.php and https://sci2s.ugr.es/keel/datasets.php) |
| Dataset Splits | No | The paper describes an online learning setting where "data are sequentially received and models need to commit to an immediate decision at each round"; it does not specify fixed training, validation, and test splits with percentages or sample counts. |
| Hardware Specification | No | The paper does not report the hardware used for the experiments, such as GPU/CPU models, processor types, or memory. |
| Software Dependencies | No | The paper does not list software dependencies with version numbers (e.g., Python, PyTorch, or TensorFlow versions) needed to reproduce the experiments. |
| Experiment Setup | Yes | "The learning rate ηt was set as 1/√t. Other parameters were set as the original work suggested. The number of base learners M was set as 10." |
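The reported setup (learning rate ηt = 1/√t and M = 10 base learners) can be sketched as a one-pass online ensemble. Only those two values come from the paper; everything else below, including the logistic-regression base learner, the Poisson(1) online-bagging update, and the averaged-probability vote, is an illustrative assumption and not the paper's OHGD update rule:

```python
import math
import random

def sigmoid(z):
    # Clip to avoid overflow in exp for extreme scores.
    return 1.0 / (1.0 + math.exp(-max(min(z, 30.0), -30.0)))

class OnlineLogisticLearner:
    """One base learner: logistic regression updated one example at a time."""
    def __init__(self, dim):
        self.w = [0.0] * dim
        self.t = 0

    def predict_proba(self, x):
        return sigmoid(sum(wi * xi for wi, xi in zip(self.w, x)))

    def update(self, x, y):  # y in {0, 1}
        self.t += 1
        eta = 1.0 / math.sqrt(self.t)      # eta_t = 1/sqrt(t), as in the paper
        g = self.predict_proba(x) - y      # gradient of the logistic loss
        self.w = [wi - eta * g * xi for wi, xi in zip(self.w, x)]

class OnlineEnsemble:
    """M = 10 base learners; prediction is the averaged probability."""
    def __init__(self, dim, m=10, seed=0):
        self.rng = random.Random(seed)
        self.learners = [OnlineLogisticLearner(dim) for _ in range(m)]

    def predict(self, x):
        p = sum(l.predict_proba(x) for l in self.learners) / len(self.learners)
        return 1 if p >= 0.5 else 0

    def update(self, x, y):
        for learner in self.learners:
            # Online-bagging-style diversity: each learner sees Poisson(1)
            # copies of the example (an assumption, not from the paper).
            for _ in range(self._poisson1()):
                learner.update(x, y)

    def _poisson1(self):
        # Sample Poisson(lambda = 1) by inversion, stdlib only.
        threshold, k, p = math.exp(-1.0), 0, 1.0
        while True:
            p *= self.rng.random()
            if p <= threshold:
                return k
            k += 1
```

In the one-pass setting each example is used for an immediate prediction and a single update, then discarded, which matches the paper's description of sequentially received data.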