Large scale convex optimization and monotone operators, 7,5 hp,
Spring 2024.
(FSF3828 Selected Topics in Optimization and Systems Theory)
In this course we will study first-order methods related to the proximal
point method. These methods have in the last decade received widespread
use due to their effectiveness for solving large-scale nonsmooth
convex optimization problems. The goal of this course is to give the
students an understanding of these methods and their theoretical
foundations, which build on the interplay between convex analysis and
monotone operator theory. In particular, we will consider methods
such as the proximal gradient method, the Douglas-Rachford method, ADMM,
and the Chambolle-Pock method.
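To illustrate the flavor of these methods, below is a minimal sketch of the proximal gradient method applied to the lasso problem minimize (1/2)||Ax - b||^2 + lam*||x||_1. The problem data, function names, and parameter choices are illustrative assumptions, not part of the course material.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1 (soft thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, num_iters=500):
    # Illustrative sketch: fixed step size 1/L, where L = ||A||_2^2 is a
    # Lipschitz constant of the gradient of the smooth term (1/2)||Ax - b||^2.
    L = np.linalg.norm(A, 2) ** 2
    step = 1.0 / L
    x = np.zeros(A.shape[1])
    for _ in range(num_iters):
        grad = A.T @ (A @ x - b)                         # gradient step on smooth part
        x = soft_threshold(x - step * grad, step * lam)  # prox step on nonsmooth part
    return x

# Small synthetic example (assumed data, for illustration only).
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
x_true = np.zeros(10)
x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true
x_hat = proximal_gradient(A, b, lam=0.1)
```

The method alternates a gradient step on the smooth term with a proximal step on the nonsmooth term, which is the basic pattern underlying the splitting methods studied in the course.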
Lecturer and examiner:
Johan Karlsson,
email: johan.karlsson _at_ math.kth.se,
room 3550, Lindstedtsv 25, tfn 790 8440
Schedule:
Tuesdays, 10.15-12.00. First class February 6. (No class 2/4.)
Class week 9 is moved to Wednesday 28/2, 10.15-12.00. Location 3418.
Location: 3721, Lindstedtsvägen 25.
Tentative course plan:
Lectures 1-4: Convex functions, conjugation and duality. (Feb 6 & 13 &
20 & 27)
Lectures 5-7: Operators and basic iterative methods. (Mar 5 & 12 & 19)
Lectures 8-10: Algorithms and operator splitting methods. (Mar 26 & Apr
9 & 16)
Lectures 11-12: Applications to inverse problems. (Apr 23 & 30)
Lectures 13-14: Topics related to specific research papers. Student
presentations. (May 7 & 14)
Lecture 15: Backup. (Tentatively May 21)
Prerequisites:
Basic knowledge in optimization and convex analysis is expected (duality, projections, Hilbert spaces, etc.).
Literature:
Ryu and Yin, Large-Scale Convex Optimization: Algorithms and Analyses via Monotone Operators,
Cambridge University Press, 2022
Bauschke and Combettes, Convex Analysis and Monotone Operator Theory in Hilbert Spaces,
Springer, 2011
Examination:
Mandatory homework assignments and
a project with a presentation in class.
There will be four homework sets in total.
Homework 1