Abstract
We introduce a new supervised learning model based on a discriminative regression approach. The model estimates a regression vector that represents the similarity between a test example and the training examples, while seamlessly integrating class information into the similarity estimation. This distinguishes our model from standard regression models and locally linear embedding approaches, making it well suited to supervised learning problems in high-dimensional settings. The model extends easily to account for nonlinear relationships and applies to general data, both high- and low-dimensional. Its objective function is convex, and we provide two optimization algorithms for it; these induce two scalable solvers with mathematically provable linear time complexity. Experimental results verify the effectiveness of the proposed method on various kinds of data: it performs comparably to several widely used classifiers on low-dimensional data and outperforms them on high-dimensional data, and the linear solvers achieve promising performance on large-scale classification.
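The abstract's core idea, representing a test example as a regression over training examples and folding class information into the decision, can be illustrated with a simple reconstruction-based classifier. The sketch below is not the paper's exact model; it uses plain ridge regression and per-class reconstruction residuals (in the spirit of collaborative-representation classifiers) purely to make the "similarity vector + class information" idea concrete. All names and the regularization parameter `lam` are illustrative assumptions.

```python
import numpy as np

def similarity_regression_classify(X_train, y_train, x_test, lam=1.0):
    """Illustrative sketch only (not the paper's exact model):
    represent the test point as a linear combination of training
    examples via ridge regression, then assign the class whose
    training examples best reconstruct it.

    X_train: (n, d) array, one training example per row.
    y_train: (n,) array of class labels.
    x_test:  (d,) array, the test example.
    """
    # Closed-form ridge solution of w = argmin ||x - X^T w||^2 + lam ||w||^2;
    # w plays the role of a similarity/regression vector over training examples.
    G = X_train @ X_train.T + lam * np.eye(X_train.shape[0])
    w = np.linalg.solve(G, X_train @ x_test)

    # Class information enters here: score each class by how well its
    # own examples (with their learned weights) reconstruct the test point.
    best_cls, best_res = None, np.inf
    for c in np.unique(y_train):
        mask = (y_train == c)
        recon = X_train[mask].T @ w[mask]
        res = np.linalg.norm(x_test - recon)
        if res < best_res:
            best_cls, best_res = c, res
    return best_cls
```

Because the Gram matrix `G` is n-by-n, this toy version scales with the number of training examples, which is exactly the bottleneck the paper's linear-time solvers are said to address.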
Original language | English
---|---
Article number | 30
Journal | ACM Transactions on Intelligent Systems and Technology
Volume | 8
Issue number | 2
DOIs | 
State | Published - Nov 2016
Bibliographical note
Funding Information: This work was supported by the National Science Foundation under grant IIS-1218712.
Publisher Copyright:
© 2016 ACM.
Keywords
- Classification
- Discriminative regression
- High dimension
- Large-scale data
- Supervised learning
ASJC Scopus subject areas
- Theoretical Computer Science
- Artificial Intelligence