Parameter-free optimization, universal prediction, and Kolmogorov complexity


Project Details

Program
Computer Science
Field of Study
Computer Science, Mathematics, or a related discipline
Division
Computer, Electrical and Mathematical Sciences and Engineering

Project Description

Online parameter-free optimization algorithms are optimization algorithms that require no parameter tuning, yet provably achieve optimal performance. The basic idea behind these algorithms lies in the link between universal strategies for prediction with log loss and online convex optimization, first unveiled by Orabona & Pál (2016). By now, this line of work has produced a number of parameter-free optimization algorithms. However, universal prediction with log loss is also connected to the Kolmogorov complexity of binary strings, as shown by Cover (1974). In this project, we aim to study all of these links, going from parameter-free optimization to Kolmogorov complexity by way of universal prediction with log loss. We want to construct explicit reductions that transform each of these problems into the others. The final aim is to directly link convex optimization to an appropriate notion of computational complexity for functions. Moreover, we want to extend some of these concepts from binary strings to strings of bounded real numbers. Given the theoretical nature of this project, the ideal candidate must have an excellent mathematical background.
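To make the first of these links concrete, the following is a minimal sketch of the coin-betting reduction for one-dimensional online convex optimization, in the spirit of Orabona & Pál (2016), using the Krichevsky-Trofimov (KT) bettor. The function name, the initial endowment eps, and the assumption that all subgradients lie in [-1, 1] are illustrative choices, not specifications from the project.

    import math

    def kt_coin_betting(subgradient, num_rounds, eps=1.0):
        # Parameter-free 1-D online convex optimization via coin betting.
        # Assumes each value returned by `subgradient` lies in [-1, 1].
        # `eps` is the bettor's initial endowment (a hypothetical default),
        # not a learning rate: nothing here needs tuning.
        wealth = eps      # current wealth of the bettor
        coin_sum = 0.0    # running sum of past coin outcomes c_i = -g_i
        for t in range(1, num_rounds + 1):
            beta = coin_sum / t    # KT betting fraction, always in (-1, 1)
            x = beta * wealth      # prediction = signed wager on the next coin
            g = subgradient(x, t)  # environment reveals a subgradient at x
            c = -g                 # coin outcome: rewards moving against g
            wealth += c * x        # wealth update W_t = W_{t-1} + c_t * x_t
            coin_sum += c
            yield x

    # Example: minimize f(x) = |x - 10| with no step-size tuning.
    subgrad = lambda x, t: math.copysign(1.0, x - 10.0) if x != 10.0 else 0.0
    iterates = list(kt_coin_betting(subgrad, 10000))

The point of the sketch is the correspondence the project studies: the KT bettor's betting fraction is exactly the KT universal log-loss predictor for binary sequences, and its wealth guarantee against the best constant betting fraction translates, by duality, into a regret bound competitive with optimally tuned gradient descent up to logarithmic factors.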

About the Researcher

Francesco Orabona
Associate Professor, Computer Science
Computer, Electrical and Mathematical Science and Engineering Division

Education Profile

  • Postdoctoral Researcher, University of Milano, 2010-2011
  • Postdoctoral Researcher, IDIAP Research Institute, 2007-2009
  • Ph.D., University of Genoa, 2007
  • B.Sc. and M.Sc. ("Laurea"), University of Napoli "Federico II", 2003

Research Interests

Professor Orabona's research lies at the intersection of practical and theoretical machine learning. His research interests encompass online learning, optimization, and statistical learning theory. His current research focus is on designing "parameter-free" machine learning algorithms, which are algorithms that operate effectively without requiring costly parameter tuning.

Selected Publications

  • A. Cutkosky, H. Mehta, F. Orabona. "Optimal Stochastic Non-smooth Non-convex Optimization through Online-to-Non-convex Conversion". In: Proc. of the International Conference on Machine Learning. 2023
  • N. Cesa-Bianchi and F. Orabona. "Online Learning Algorithms". In: Annual Review of Statistics and Its Application 8 (2021)
  • A. Cutkosky and F. Orabona. "Momentum-Based Variance Reduction in Non-Convex SGD". In: Advances in Neural Information Processing Systems 32. 2019
  • X. Li and F. Orabona. "On the Convergence of Stochastic Gradient Descent with Adaptive Stepsizes". In: Proc. of the International Conference on Artificial Intelligence and Statistics (AISTATS). 2019
  • F. Orabona and D. Pál. "Coin Betting and Parameter-Free Online Learning". In: Advances in Neural Information Processing Systems 29. 2016

Desired Project Deliverables

Original research – contribution to a research paper

Recommended Student Background

Knowledge of convex optimization
Knowledge of Kolmogorov complexity
