

S4E1 - Graduate Seminar on Scientific Computing (Winter Term 2015/16)

Continuous Optimization Methods

Prof. Dr. André Uschmajew

The analysis of nonlinear optimization methods for minimizing a continuous function, with or without smooth constraints, is a challenging and fascinating task. Among the vast number of available algorithms, the following topics are suggested for the seminar:

  • Local convergence of block coordinate descent (BCD) methods, such as the nonlinear Gauss-Seidel/SOR method (a short sketch follows the next paragraph).
  • The Łojasiewicz gradient inequality for real-analytic functions and single-point convergence of gradient methods.
  • Line-search methods on Riemannian manifolds with application to matrix manifolds.
  • Alternating projection methods.
  • Optimization methods for low-rank matrix/tensor approximation.
  • ...
Most of these topics focus on first-order methods, which have recently regained considerable interest in the context of big-data applications. If you are interested in second-order, Newton-type methods, we will find an interesting topic for you as well.
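
To make the first topic on the list concrete, here is a minimal sketch (illustrative only, not course material) of a block coordinate descent / nonlinear Gauss-Seidel iteration in Python. It minimizes the quadratic model problem f(x, y) = ||Ax + By - c||^2 by exact least-squares minimization over each block in turn; the random data, dimensions, and sweep count are assumptions made for this example.

    # Block coordinate descent (nonlinear Gauss-Seidel) for the
    # illustrative model problem  min_{x,y} ||A x + B y - c||^2,
    # alternating exact minimization over the blocks x and y.
    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((30, 5))   # assumed problem data
    B = rng.standard_normal((30, 5))
    c = rng.standard_normal(30)

    x = np.zeros(5)
    y = np.zeros(5)
    for _ in range(50):
        # minimize over x with y fixed: least-squares solve A x = c - B y
        x, *_rest = np.linalg.lstsq(A, c - B @ y, rcond=None)
        # minimize over y with x fixed: least-squares solve B y = c - A x
        y, *_rest = np.linalg.lstsq(B, c - A @ x, rcond=None)

    f = np.linalg.norm(A @ x + B @ y - c) ** 2
    print(f"objective after 50 sweeps: {f:.6e}")

Because every block update minimizes its subproblem exactly, the objective value is nonincreasing from sweep to sweep; whether and how fast the iterates themselves converge is precisely the question studied in the first seminar topic.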

Date & time: Wednesdays, 10:15–11:45, Wegelerstr. 6, Room 6.020.

If you are interested, please contact me by email, or come to the first meeting on October 28, 2015 (the date was changed due to the Panorama conference).