ℓ1-2 Regularized Logistic Regression

Jing Qin, Yifei Lou

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

9 Scopus citations


Logistic regression has become a fundamental tool to facilitate data analysis and prediction in a variety of applications, including health care and the social sciences. Depending on different sparsity assumptions, logistic regression models often incorporate various regularizations, including the ℓ1-norm, the ℓ2-norm, and some nonconvex regularizations. In this paper, we propose a nonconvex ℓ1-2-regularized logistic regression model, assuming that the coefficients to be recovered are highly sparse. We derive two numerical algorithms with guaranteed convergence, based on the alternating direction method of multipliers and the proximal operator of ℓ1-2. Numerical experiments on real data demonstrate the great potential of the proposed approach.
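To make the model in the abstract concrete, below is a minimal illustrative sketch of ℓ1-2-regularized logistic regression, i.e. minimizing the logistic loss plus λ(‖w‖₁ − ‖w‖₂). This is not the paper's ADMM or proximal-ℓ1-2 algorithm: it uses a simple proximal-gradient scheme in which the smooth −λ‖w‖₂ term is absorbed into the gradient and the ℓ1 term is handled by soft-thresholding. The step size, λ, and iteration count are arbitrary choices for the sketch.

```python
import numpy as np

def l12_logistic_loss(w, X, y, lam):
    """Logistic loss plus the nonconvex l1-2 penalty lam * (||w||_1 - ||w||_2).

    Labels y are assumed to be in {-1, +1}.
    """
    z = X @ w
    loss = np.mean(np.logaddexp(0.0, -y * z))  # stable log(1 + exp(-y*z))
    return loss + lam * (np.linalg.norm(w, 1) - np.linalg.norm(w, 2))

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fit_l12(X, y, lam=0.05, step=0.5, iters=300):
    """Proximal-gradient sketch for the l1-2-regularized logistic model.

    Gradient step on the smooth part (logistic loss minus lam*||w||_2,
    the latter differentiable away from w = 0), then a soft-thresholding
    step for the l1 part.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        z = X @ w
        # Gradient of the mean logistic loss: -X^T (y * sigmoid(-y*z)) / n
        grad = -(X.T @ (y / (1.0 + np.exp(y * z)))) / n
        nw = np.linalg.norm(w)
        if nw > 0:
            grad -= lam * w / nw  # gradient of -lam * ||w||_2
        w = soft_threshold(w - step * grad, step * lam)
    return w
```

On a sparse synthetic problem, the fitted coefficients typically concentrate on the few truly active features, which is the behavior the ℓ1-2 penalty is designed to promote.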

Original language: English
Title of host publication: Conference Record - 53rd Asilomar Conference on Signals, Systems and Computers, ACSSC 2019
Editors: Michael B. Matthews
Number of pages: 5
ISBN (Electronic): 9781728143002
State: Published - Nov 2019
Event: 53rd Asilomar Conference on Signals, Systems and Computers, ACSSC 2019 - Pacific Grove, United States
Duration: Nov 3, 2019 – Nov 6, 2019

Publication series

Name: Conference Record - Asilomar Conference on Signals, Systems and Computers
ISSN (Print): 1058-6393


Conference: 53rd Asilomar Conference on Signals, Systems and Computers, ACSSC 2019
Country/Territory: United States
City: Pacific Grove

Bibliographical note

Publisher Copyright:
© 2019 IEEE.

ASJC Scopus subject areas

  • Signal Processing
  • Computer Networks and Communications

