Global and quadratic convergent Newton method for sparse optimisation -- Talk by Shenglong Zhou (Southampton)
- Time:
- 14:00 - 16:00
- Date:
- 13 December 2018
- Venue:
- Building 54, Room 8031, School of Mathematical Sciences, University of Southampton, Highfield Campus, SO17 1BJ
For more information regarding this event, please email Alain Zemkoho at A.Zemkoho@southampton.ac.uk.
Event details
Abstract: The Newton method for continuous optimisation is known to be notoriously slow when the linear equation solved at each step is large (e.g., more than ten thousand variables/equations), yet it converges fast (often at a quadratic rate) once the iterates reach a neighbourhood of a solution. How does it perform on sparsity constrained optimisation (SCO), a combinatorial problem? This talk answers that question. Theoretically, we give the first proof that our proposed Newton method converges to a stationary point of SCO globally and quadratically under standard assumptions. Numerically, the Newton method in general incurs a cost of O(n^3) per step to solve a linear equation in n variables, whereas our proposed method enjoys a significantly lower complexity of O(s^3), where s << n is a given scalar controlling the number of non-zeros of a solution; this makes it computationally fast even for large n. When compared against other state-of-the-art methods on compressed sensing in signal processing and sparse logistic regression in machine learning with big data, our proposed method exhibits remarkable advantages.
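The O(s^3)-per-step idea can be illustrated with a minimal sketch that is not the speaker's actual algorithm: a restricted Newton-type step for sparsity-constrained least squares, where each iteration picks a working support of size s and solves only an s-by-s linear system instead of an n-by-n one. The objective, the support-selection rule, and the names `sparse_newton_ls` and `hard_threshold_support` are illustrative assumptions, not details from the talk.

```python
import numpy as np

def hard_threshold_support(x, s):
    """Return indices of the s largest-magnitude entries of x."""
    return np.argsort(np.abs(x))[-s:]

def sparse_newton_ls(A, b, s, iters=50, step=1.0, tol=1e-10):
    """Restricted Newton-type iteration for min ||Ax - b||^2 s.t. ||x||_0 <= s.

    Each iteration solves an s-by-s system on the working support
    (cost O(s^3)) rather than an n-by-n Newton system (cost O(n^3)).
    This is an illustrative sketch, not the method presented in the talk.
    """
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                        # full gradient
        T = hard_threshold_support(x - step * grad, s)  # working support of size s
        # Restricted Newton step: s x s normal equations on the support T.
        AT = A[:, T]
        xT = np.linalg.solve(AT.T @ AT, AT.T @ b)
        x_new = np.zeros(n)
        x_new[T] = xT
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Tiny compressed-sensing style example with a planted 3-sparse signal.
rng = np.random.default_rng(0)
A = rng.standard_normal((64, 256))
x_true = np.zeros(256)
x_true[[5, 50, 200]] = [1.0, -2.0, 0.5]
b = A @ x_true
x_hat = sparse_newton_ls(A, b, s=3)
print(np.allclose(x_hat, x_true, atol=1e-6))
```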
Speaker information
Shenglong Zhou, University of Southampton