The University of Southampton
CORMSIS Centre for Operational Research, Management Sciences and Information Systems

CORMSIS Seminar - SDNA: Stochastic Dual Newton Ascent for Empirical Risk Minimization

Time:
16:00 - 17:00
Date:
17 November 2016
Venue:
Room 3041, Building 2, Southampton Business School

For more information regarding this event, please email Dr Yuan Huang at yuan.huang@soton.ac.uk.

Event details

Abstract of the talk: We propose a new algorithm for minimizing regularized empirical loss: Stochastic Dual Newton Ascent (SDNA). Our method is dual in nature: in each iteration we update a random subset of the dual variables. However, unlike existing methods such as stochastic dual coordinate ascent (SDCA), SDNA is capable of utilizing all curvature information contained in the examples, which leads to striking improvements in both theory and practice, sometimes by orders of magnitude. In the special case when an L2-regularizer is used in the primal, the dual problem is a concave quadratic maximization problem plus a separable term. In this regime, SDNA in each step solves a proximal subproblem involving a random principal submatrix of the Hessian of the quadratic function; whence the name of the method. SDNA converges linearly and, in addition, enjoys superlinear speedup in minibatch size. The latter property cannot, for fundamental reasons, be found in any previous stochastic minibatch first-order method.
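For concreteness, the L2-regularized regime described in the abstract can be sketched in a few lines of code. The Python/NumPy sketch below is illustrative only, not the authors' implementation; the ridge-regression setup, the function name sdna_ridge_dual, and all parameter defaults are assumptions. The point it demonstrates: the dual of ridge regression is a concave quadratic (plus a separable term), so each iteration can maximize it exactly over a random block of dual variables by solving a small linear system with the corresponding principal submatrix of the Hessian.

import numpy as np

def sdna_ridge_dual(X, y, lam, batch_size=8, iters=2000, seed=0):
    """Illustrative SDNA-style method for the ridge-regression dual.

    Hypothetical sketch, not the authors' code. Primal problem:
        min_w (1/(2n)) ||X w - y||^2 + (lam/2) ||w||^2.
    The dual D(alpha) is a concave quadratic, so each block
    subproblem over a sampled subset S can be solved exactly.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    G = X @ X.T                                  # Gram matrix of the examples
    # Negative Hessian of D(alpha): positive definite, so solves are safe.
    M = np.eye(n) / n + G / (lam * n ** 2)
    alpha = np.zeros(n)
    for _ in range(iters):
        S = rng.choice(n, size=batch_size, replace=False)
        # Gradient of D restricted to the sampled block S.
        g = (y[S] - alpha[S]) / n - (G[S] @ alpha) / (lam * n ** 2)
        # Exact block maximization: solve with the principal submatrix M[S,S].
        alpha[S] += np.linalg.solve(M[np.ix_(S, S)], g)
    # Standard dual-to-primal recovery: w = (1/(lam * n)) X^T alpha.
    return X.T @ alpha / (lam * n)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 10))
    y = X @ np.ones(10) + 0.1 * rng.normal(size=200)
    w = sdna_ridge_dual(X, y, lam=0.1)
    # Sanity check against the closed-form ridge solution.
    w_star = np.linalg.solve(X.T @ X / 200 + 0.1 * np.eye(10), X.T @ y / 200)
    print("distance to closed-form solution:", np.linalg.norm(w - w_star))

With batch_size=1 the update reduces to an SDCA-style single-coordinate step; larger batches exploit the off-diagonal curvature inside each block, which is where the superlinear minibatch speedup described in the abstract comes into play.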

Speaker information

Dr Peter Richtárik, University of Edinburgh

Peter Richtárik is an Associate Professor in the School of Mathematics, University of Edinburgh. He obtained his PhD from Cornell University in 2007, after which he spent two years as a postdoctoral fellow at Université Catholique de Louvain. Recently, he has been a visiting researcher at the Simons Institute at UC Berkeley. He is a Faculty Fellow at the Alan Turing Institute, the UK national research centre for data science. Prof Richtárik has made fundamental contributions to “big data optimization”, an emerging field at the intersection of mathematical optimization, convex analysis, probability theory, computer science, machine learning and high performance computing. He is the recipient of an EPSRC Fellowship in the Mathematical Sciences, which allows him to focus on further developing this line of work. In a sequence of influential papers, mostly written in collaboration with his PhD students and postdocs, he developed the theory of randomized coordinate descent and stochastic gradient methods of various flavours for convex optimization in very high dimensions. These algorithms have substantially better complexity bounds than previous deterministic methods, and state-of-the-art practical performance on key statistical learning tasks involving large data. The underlying complexity theory blends ideas and tools from optimization, convex analysis, linear algebra, computer science and probability theory. Prof Richtárik is the recipient of the 2016 SIAM SIGEST Paper Award and a 2016 EUSA Best Research or Dissertation Supervisor Award. His PhD students and postdocs have received a number of national and international prizes and awards for joint work, including the 17th and 16th IMA Leslie Fox Prizes (2015 and 2013), the BASP Frontiers Best Contribution Award (2015), the Optimization and Big Data Best Contribution Award (2015), the OR Society Best Doctoral Dissertation Award (2014), and the INFORMS Computing Society Best Student Paper Prize (2012).
