CCICADA Research Group Lab Meeting – CCICADA Seminar Series in Homeland Security
Date/Time: Thursday, February 8, 2018, 12:15pm to 1:00pm
Location: 4th Floor Conference Room – 433, Computing Research & Education Building (CoRE)
Busch Campus, Rutgers, The State University of New Jersey
REMOTE – 800 868-1123/41814849 and BigBlueButton meeting (to be announced)
FEATURED SPEAKER: Mert Gürbüzbalaban, Assistant Professor
Department of Management Science and Information Systems, Rutgers University
Title: Incremental methods for additive convex cost optimization
Abstract:
Motivated by machine learning problems over large data sets and by distributed optimization over networks, we consider the problem of minimizing the sum of a large number of convex component functions. We study incremental gradient methods for solving such problems, which process the component functions one at a time. We first consider deterministic cyclic incremental gradient methods, which process the component functions in a fixed cyclic order, and provide new convergence rate results under some assumptions. We then consider a randomized incremental gradient method, the random reshuffling (RR) algorithm, which picks a uniformly random permutation and processes the component functions one at a time in that order (i.e., it samples functions without replacement in each cycle). We provide the first convergence rate guarantees showing that this method outperforms its popular with-replacement counterpart, stochastic gradient descent (SGD). We finally consider proximal incremental aggregated gradient methods, which compute a single component gradient at each iteration while using outdated gradients of all component functions to approximate the gradient of the global cost, and provide state-of-the-art linear rate results.
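As a concrete illustration of the without-replacement sampling that distinguishes RR from SGD, the following minimal Python sketch contrasts the two update loops on a toy additive least-squares cost. The component functions, step size, and all names (random_reshuffling, sgd, component_grads) are illustrative assumptions and are not taken from the talk.

import numpy as np

def random_reshuffling(component_grads, x0, step, n_epochs, seed=0):
    # RR: each epoch draws a fresh uniformly random permutation and
    # processes every component gradient exactly once (no replacement).
    rng = np.random.default_rng(seed)
    x, n = x0.copy(), len(component_grads)
    for _ in range(n_epochs):
        for i in rng.permutation(n):
            x = x - step * component_grads[i](x)
    return x

def sgd(component_grads, x0, step, n_iters, seed=0):
    # SGD: each iteration samples one component independently and
    # uniformly WITH replacement, so some components may repeat.
    rng = np.random.default_rng(seed)
    x, n = x0.copy(), len(component_grads)
    for _ in range(n_iters):
        i = rng.integers(n)
        x = x - step * component_grads[i](x)
    return x

# Toy additive convex cost: f(x) = (1/n) * sum_i 0.5 * (a_i . x - b_i)^2
rng = np.random.default_rng(1)
A, b = rng.standard_normal((50, 5)), rng.standard_normal(50)
grads = [lambda x, a=a_i, bi=b_i: (a @ x - bi) * a for a_i, b_i in zip(A, b)]
x_rr = random_reshuffling(grads, np.zeros(5), step=0.01, n_epochs=100)
x_sg = sgd(grads, np.zeros(5), step=0.01, n_iters=5000)

Note that RR touches every component exactly once per cycle, which is the property the talk's convergence guarantees exploit.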
In the second part of the talk, we focus on coordinate descent (CD) methods, which have seen a recent resurgence of interest because of their applicability to machine learning and large-scale data analysis, and because of their superior empirical performance. CD methods come in two variants: cyclic coordinate descent (CCD) and randomized coordinate descent (RCD), the deterministic and randomized versions of CD. In contrast to RCD, there is a large gap between the theory and practice of CCD: existing convergence guarantees for CCD are far worse than those for RCD, with the exception of a few trivial cases, even though CCD works well in practice on many problem instances. In this work, we exhibit problem classes for which CCD (or CD with any deterministic order) is provably faster than RCD in terms of asymptotic worst-case convergence, and we quantify this improvement.
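To make the CCD/RCD distinction concrete, here is a small Python sketch of exact coordinate minimization on a strongly convex quadratic f(x) = 0.5 x^T A x - b^T x. The quadratic instance and the function name are assumptions chosen for illustration; they do not reproduce the problem classes analyzed in the talk.

import numpy as np

def coordinate_descent(A, b, n_epochs, randomized=False, seed=0):
    # Minimize f(x) = 0.5 x^T A x - b^T x for symmetric positive
    # definite A by exact minimization along one coordinate at a time.
    rng = np.random.default_rng(seed)
    n = len(b)
    x = np.zeros(n)
    for _ in range(n_epochs):
        # CCD sweeps coordinates in the fixed order 0, 1, ..., n-1;
        # RCD samples a coordinate uniformly with replacement each step.
        order = rng.integers(n, size=n) if randomized else range(n)
        for i in order:
            # Holding x_j (j != i) fixed, the minimizer over x_i is
            # x_i = (b_i - sum_{j != i} A_ij x_j) / A_ii.
            x[i] = (b[i] - A[i] @ x + A[i, i] * x[i]) / A[i, i]
    return x

# Toy symmetric positive definite instance
M = np.random.default_rng(2).standard_normal((6, 6))
A = M @ M.T + 6 * np.eye(6)
b = np.ones(6)
x_ccd = coordinate_descent(A, b, n_epochs=200)                    # cyclic
x_rcd = coordinate_descent(A, b, n_epochs=200, randomized=True)   # randomized

Both variants perform the same per-step work; they differ only in the order in which coordinates are visited, which is exactly the dimension along which the talk compares their worst-case rates.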
This is joint work with Denizcan Vanli, Asu Ozdaglar and Pablo Parrilo.
Bio:
Mert Gürbüzbalaban is an assistant professor in the Department of Management Science and Information Systems at Rutgers University and an affiliated faculty member of the Department of Electrical and Computer Engineering and of DIMACS at Rutgers. Previously, he was a postdoctoral associate at the Laboratory for Information and Decision Systems (LIDS) at MIT. He is broadly interested in optimization and computational science driven by applications in large-scale information and decision systems and networks. He received B.Sc. degrees in Electrical Engineering and in Mathematics as valedictorian from Boğaziçi University, Istanbul, Turkey, the “Diplôme d’ingénieur” degree from École Polytechnique, France, and M.S. and Ph.D. degrees in Applied Mathematics from the Courant Institute of Mathematical Sciences, New York University.
Dr. Gürbüzbalaban received the Kurt Friedrichs Prize (awarded by the Courant Institute of New York University for an outstanding thesis) in 2013, the Bronze Medal in the École Polytechnique Scientific Project Competition in 2006, the Nadir Orhan Bengisu Award from Boğaziçi University in 2005, and the Bülent Kerim Altay Award from the Electrical-Electronics Engineering Department of Middle East Technical University in 2001.