Gradient Aligned Regression via Pairwise Losses
CoRR(2024)
Abstract
Regression is a fundamental task in machine learning that has garnered
extensive attention over the past decades. The conventional approach to
regression employs loss functions that focus on aligning each model
prediction with the ground truth of the corresponding individual data
sample. Recent work has introduced a new perspective that incorporates
label similarity into regression by imposing extra pairwise regularization
on the latent feature space, and has demonstrated its effectiveness.
However, these approaches have two drawbacks: i) their pairwise operations
in the latent feature space are computationally more expensive than
conventional regression losses; ii) they lack theoretical justification for
such regularization. In this work, we propose GAR (Gradient Aligned
Regression) as a competitive alternative that operates in label space,
combining a conventional regression loss with two pairwise label-difference
losses that align gradients in both magnitude and direction. GAR enjoys: i)
the same level of efficiency as conventional regression losses, because the
quadratic complexity of the proposed pairwise losses can be reduced to
linear complexity; ii) theoretical insights connecting learning pairwise
label differences to learning the gradient of the ground-truth function. We
limit our current scope to regression in the clean-data setting, without
noise, outliers, or distributional shifts. We demonstrate the effectiveness
of the proposed method on two synthetic datasets and on eight real-world
tasks from six benchmark datasets, against eight competitive baselines.
Running-time experiments demonstrate the superior efficiency of GAR over
existing methods that use pairwise regularization in the latent feature
space, and ablation studies confirm the effectiveness of each component of GAR.
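The abstract states that the quadratic complexity of the proposed pairwise losses can be reduced to linear. The paper's exact loss formulation is not given here, but for a squared penalty on pairwise residual differences, such a reduction follows from a standard algebraic identity: with residuals d_i = pred_i - y_i, the sum over all pairs of (d_i - d_j)^2 equals 2n·Σd_i² − 2(Σd_i)². The sketch below (hypothetical function names, NumPy) illustrates this kind of reduction, not GAR's actual losses:

```python
import numpy as np

def pairwise_diff_loss_naive(pred, target):
    """O(n^2): penalize mismatch between all pairwise prediction
    differences and the corresponding pairwise label differences."""
    d = pred - target                    # per-sample residuals
    diff = d[:, None] - d[None, :]       # (d_i - d_j) for every pair (i, j)
    return np.sum(diff ** 2)

def pairwise_diff_loss_linear(pred, target):
    """O(n): same value via the identity
    sum_{i,j} (d_i - d_j)^2 = 2n * sum(d^2) - 2 * (sum(d))^2."""
    d = pred - target
    n = d.size
    return 2 * n * np.sum(d ** 2) - 2 * np.sum(d) ** 2
```

Both functions return the same value, so the pairwise penalty can be optimized at the same per-batch cost as a conventional pointwise regression loss.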