I am a Ph.D. student at IMT Lucca (Italy) in the track of Computer, Decision and Systems Science (CDSS), XXIX cycle, in the Dynamical Systems, Control, and Optimization (DYSCO) research group under Prof. Alberto Bemporad, and at KU Leuven (Belgium), Department of Electrical Engineering (ESAT) in the Center for Dynamical Systems, Signal Processing and Data Analytics (STADIUS) research group under Prof. Panagiotis Patrinos.
I received my Bachelor's Degree in Mathematics in April 2010 with the thesis "Representations of the General Linear Group and Young tableaux," supervised by Prof. Giorgio Ottaviani, and my Master's Degree in Mathematics in April 2013 with the thesis "Development of the theory of probability 'foldings' for the study of the convergence of stochastic processes," supervised by Prof. Alberto Gandolfi, both from the University of Florence (Italy).
Overview & research interests
Today's world is experiencing an ever-increasing demand for efficiency and speed in solving problems. The definition of “problem” in this context is as broad as it sounds, as it is meant to cover fields ranging from engineering (control, embedded MPC, the automotive industry, signal processing, image analysis, …) to data-driven sciences (machine learning, data mining, statistics, …) and finance (investment management, market forecasting, …), to name a few. All these fields are themselves broad, as they serve as ground frameworks for real-world applications: cars, computers, hospital equipment, banking systems, and so on.
What unifies all these “problem” instances is their mathematical formulation, namely the minimization of a “cost” function; Optimization is the science that addresses such problems.
The challenge of my research is to derive optimization methods that are suitable for most applications, yet without trading off performance. Owing to my background in pure mathematics, I mainly work on the theoretical aspects, which involve advanced mathematics and are fundamental for deriving convergence guarantees and optimal performance under the least conservative assumptions, with the aim of providing competitive optimization schemes applicable to the widest possible range of real-world problems.
Below is a list of papers published, accepted, or currently under review (see also my Google Scholar page).
- A. Themelis and P. Patrinos. Douglas-Rachford splitting and ADMM for nonconvex optimization: tight convergence results (2017)
(submitted to the SIAM Journal on Optimization in January 2018)
- L. Stella, A. Themelis and P. Patrinos. Newton-type alternating minimization algorithm for convex optimization (2017)
(accepted for publication in IEEE Transactions on Automatic Control in March 2018)
- A. Themelis and P. Patrinos. SuperMann: a superlinearly convergent algorithm for finding fixed points of nonexpansive operators (2016)
(in the second review round at IEEE Transactions on Automatic Control since March 2018)
- A. Themelis, L. Stella and P. Patrinos. Forward-backward envelope for the sum of two nonconvex functions: further properties and nonmonotone line-search algorithms (2016)
(accepted for publication in the SIAM Journal on Optimization in March 2018)
- L. Stella, A. Themelis and P. Patrinos. Forward-backward quasi-Newton methods for nonsmooth optimization problems, Computational Optimization and Applications (2017) 67:443. doi: 10.1007/s10589-017-9912-y
- A. Sathya, P. Sopasakis, R. Van Parys, A. Themelis, G. Pipeleers and P. Patrinos. "Embedded nonlinear model predictive control for obstacle avoidance using PANOC." 2018 European Control Conference (ECC), Limassol, 2018
- L. Stella, A. Themelis, P. Sopasakis and P. Patrinos, "A simple and efficient algorithm for nonlinear model predictive control," 2017 IEEE 56th Annual Conference on Decision and Control (CDC), Melbourne, VIC, 2017, pp. 1939-1944.
- P. Sopasakis, A. Themelis, J. Suykens and P. Patrinos, "A primal-dual line search method and applications in image processing," 2017 25th European Signal Processing Conference (EUSIPCO), Kos, 2017, pp. 1065-1069.
- A. Themelis, S. Villa, P. Patrinos and A. Bemporad, "Stochastic gradient methods for stochastic model predictive control," 2016 European Control Conference (ECC), Aalborg, 2016, pp. 154-159.
- Proximal envelopes. ECC 2018 Workshop on "Advances in Distributed and Large-Scale Optimization", Limassol (Cyprus), Jun. 12-15, 2018. http://www.ecc18.eu/index.php/workshop-6/
- Newton-type operator splitting algorithms. EUCCO 2016: 4th European Conference on Computational Optimization, Leuven (Belgium), Sep. 12-14, 2016. https://kuleuvencongres.be/eucco2016/programme
- A variable metric stochastic aggregated gradient algorithm for convex optimization. EURO 2016: 28th European Conference on Operational Research, Poznan (Poland). Jul. 3-6, 2016. https://www.euro-online.org/conf/euro28/edit_session?sessid=154
- A Globally and Superlinearly Convergent Algorithm for Finding Fixed Points of Nonexpansive Operators. CORE@50: Center for Operations Research and Econometrics Conference, Louvain-la-Neuve (Belgium), May 23-27, 2016
Below is a list of software packages developed by others based on (some of) the publications above. More info at the GitHub repository KUL-ForBES and on Lorenzo Stella's page.
ForBES (Forward-Backward Envelope Solver)
by Lorenzo Stella and Panos Patrinos
Newton-type alternating minimization algorithm for convex optimization
Forward-backward envelope for the sum of two nonconvex functions: further properties and nonmonotone line-search algorithms
A MATLAB solver for nonsmooth optimization; it contains a library of mathematical functions to formulate problems arising in control, machine learning, image and signal processing.
SuperSCS (Superlinear Splitting Conic Solver)
by Pantelis Sopasakis and Panos Patrinos
SuperMann: a superlinearly convergent algorithm for finding fixed points of nonexpansive operators
SuperSCS is being developed on top of SCS, a splitting conic solver for convex optimization problems based on
B. O'Donoghue, E. Chu, N. Parikh and S. Boyd. Conic Optimization via Operator Splitting and Homogeneous Self-Dual Embedding, JOTA vol. 169 pp. 1042-1068 (2016) https://link.springer.com/article/10.1007/s10957-016-0892-3
SuperSCS implements the SuperMann scheme on the fixed-point iterations of SCS. By enhancing these iterations with (limited-memory) quasi-Newton directions, it reaches high-accuracy solutions with superlinear convergence rates.
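The general principle can be illustrated with a toy sketch: given a nonexpansive operator T, try a quasi-Newton step on the fixed-point residual R(x) = x - T(x), and fall back to a nominal Krasnosel'skii-Mann relaxation step unless the residual decreases sufficiently. This is a deliberately simplified illustration of the idea, not the actual SuperMann algorithm: the plain Broyden update, the acceptance test, and the contraction example below are my own simplifications of its globalization rules.

```python
import numpy as np

def safeguarded_broyden(T, x0, tol=1e-10, max_iter=1000, sigma=0.5):
    """Schematic safeguarded quasi-Newton scheme on R(x) = x - T(x).

    A trial Broyden step is accepted only if it shrinks the residual
    by a factor sigma; otherwise a nominal Krasnosel'skii-Mann step
    x+ = x - (1/2) R(x) is taken instead.
    """
    n = x0.size
    H = np.eye(n)                        # estimate of the inverse Jacobian of R
    x = x0.copy()
    r = x - T(x)                         # fixed-point residual
    for _ in range(max_iter):
        if np.linalg.norm(r) <= tol:
            break
        d = -H @ r                       # quasi-Newton direction
        x_trial = x + d
        r_trial = x_trial - T(x_trial)
        if np.linalg.norm(r_trial) <= sigma * np.linalg.norm(r):
            # accept the fast step and update H with a (good) Broyden
            # rank-one correction: H+ = H + (s - H y) s'H / (s'H y)
            s, y = d, r_trial - r
            Hy = H @ y
            denom = s @ Hy
            if abs(denom) > 1e-12:       # skip update if ill-conditioned
                H += np.outer(s - Hy, s @ H) / denom
            x, r = x_trial, r_trial
        else:
            # safeguard: fall back to the nominal averaged (KM) step
            x = x - 0.5 * r
            r = x - T(x)
    return x

# Toy example: T(x) = A x + b with ||A||_2 < 1 is a contraction,
# hence nonexpansive, with fixed point x* = (I - A)^{-1} b.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
A *= 0.9 / np.linalg.norm(A, 2)
b = rng.standard_normal(5)
T = lambda x: A @ x + b
x_star = np.linalg.solve(np.eye(5) - A, b)

x = safeguarded_broyden(T, np.zeros(5))
print("distance to fixed point:", np.linalg.norm(x - x_star))
```

The safeguard is what makes such schemes globally convergent: the quasi-Newton direction provides fast local convergence near the fixed point, while the fallback KM step guarantees progress even when the Jacobian estimate is poor.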