FLSVR: Solving Lagrangian Support Vector Regression Using Functional Iterative Method

Document Type

Article

Publication Title

Neural Processing Letters

Abstract

The Lagrangian dual of the 2-norm support vector regression (LSVR) solves a quadratic programming problem (QPP) in 2m variables subject to non-negativity constraints, where m is the size of the training set. By applying the Karush–Kuhn–Tucker (KKT) necessary and sufficient optimality conditions, this work derives a novel formulation of the problem as a fixed-point problem in only m variables. The problem can be solved by functional iterative methods, either in its original form involving the non-smooth "plus" function or via an equivalent absolute value equation. A linear convergence rate of the proposed iterative methods is rigorously established under appropriate assumptions, guaranteeing convergence to the unique optimal solution. Numerical experiments on several synthetic and real-world benchmark datasets demonstrate that the proposed formulation, solved by the iterative methods, achieves similar or better generalization capability with a learning speed much faster than support vector regression (SVR), very close to least squares SVR (LS-SVR), and comparable to ULSVR, indicating its effectiveness and superiority.
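The abstract's core idea, recasting an optimality condition with the plus function as a fixed-point problem and solving it by functional (Picard) iteration, can be illustrated with a minimal sketch. This is a generic example, not the paper's actual LSVR iteration: the map u ↦ (Au + b)_+ with the matrix A, vector b, and contraction scaling all chosen hypothetically so that linear convergence holds.

```python
import numpy as np

# Hypothetical illustration (not the paper's exact formulation):
# solve the fixed-point problem  u = (A u + b)_+ ,  where (x)_+ = max(x, 0),
# by the Picard iteration  u_{k+1} = (A u_k + b)_+ .
# Since the plus function is 1-Lipschitz, the map is a contraction whenever
# ||A||_2 < 1, so the iterates converge linearly to the unique fixed point.

def plus(x):
    """Elementwise plus function (x)_+ = max(x, 0)."""
    return np.maximum(x, 0.0)

def fixed_point_solve(A, b, tol=1e-10, max_iter=1000):
    """Picard iteration for u = (A u + b)_+; returns (solution, iterations)."""
    u = np.zeros_like(b)
    for k in range(max_iter):
        u_new = plus(A @ u + b)
        if np.linalg.norm(u_new - u) < tol:
            return u_new, k + 1
        u = u_new
    return u, max_iter

rng = np.random.default_rng(0)
m = 5
A = rng.standard_normal((m, m))
A *= 0.5 / np.linalg.norm(A, 2)   # scale so ||A||_2 = 0.5 < 1 (contraction)
b = rng.standard_normal(m)
u_star, iters = fixed_point_solve(A, b)
residual = np.linalg.norm(u_star - plus(A @ u_star + b))
```

The contraction property is what the paper's convergence analysis would need to establish for its specific LSVR iteration map; here it is simply enforced by rescaling A.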

DOI

10.1007/s11063-025-11780-8

Publication Date

8-1-2025
