105.753 AKANA AKOR Convex and Tame Optimization
This course is in all assigned curricula part of the STEOP.

2022W, VO, 3.0h, 4.5EC

Properties

  • Semester hours: 3.0
  • Credits: 4.5
  • Type: VO Lecture
  • Format: Presence

Learning outcomes

After successful completion of the course, students are familiar with the two main paradigms of modern nonsmooth analysis, namely the convex and the semialgebraic (tame) paradigm. They will recognize the crucial role of these paradigms in obtaining fundamental theoretical results in Optimization and Operations Research, such as good a priori estimates in numerical descent algorithms.

Subject of course

Contents

1. Convex and Nonsmooth Analysis 

From smooth manifolds to tangent and normal cones

Convex functions and subdifferentials 

Lipschitz functions, Clarke subdifferential
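
For orientation, the two subdifferentials listed above admit short standard definitions; the following display is a minimal sketch in textbook notation (not taken from the course notes), for f convex and f locally Lipschitz respectively.

```latex
% Convex subdifferential of f : R^n -> R (standard definition):
\[
  \partial f(x) = \{\, v \in \mathbb{R}^n : f(y) \ge f(x) + \langle v, y - x \rangle
  \ \text{for all } y \in \mathbb{R}^n \,\}
\]
% For f merely locally Lipschitz, Rademacher's theorem gives differentiability
% almost everywhere, and the Clarke subdifferential collects limits of nearby
% gradients:
\[
  \partial^{\circ} f(x) = \operatorname{conv}
  \bigl\{\, \lim_{k \to \infty} \nabla f(x_k) : x_k \to x,\
  f \text{ differentiable at } x_k \,\bigr\}
\]
```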

 

2. Tame variational analysis

Semialgebraic functions, o-minimal structures

Stratification vs Clarke subdifferential

Sard's theorem for nonsmooth tame functions

Łojasiewicz inequality and generalizations
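
To fix ideas, the gradient inequality just mentioned and Kurdyka's generalization can be stated briefly; the following is a standard formulation (a sketch, not necessarily the course's exact statement). For f real-analytic near a critical point x̄, there exist C > 0, θ ∈ [1/2, 1) and a neighbourhood U of x̄ with

```latex
\[
  |f(x) - f(\bar{x})|^{\theta} \le C \, \|\nabla f(x)\|
  \qquad \text{for all } x \in U .
\]
% Kurdyka's generalization replaces t -> t^{1-\theta} by an abstract concave
% "desingularizing" function \varphi with \varphi(0) = 0, giving the
% Kurdyka-Lojasiewicz (KL) inequality:
\[
  \varphi'\bigl(f(x) - f(\bar{x})\bigr) \, \|\nabla f(x)\| \ge 1 .
\]
```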

 

3. Asymptotic analysis of descent systems

Proximal algorithm – steepest descent (see the sketch after this list)

Kurdyka's desingularization: characterization and applications

A convex counterexample to Kurdyka's desingularization

Asymptotic equivalence between continuous and discrete systems

Self-contracted curves, Manselli-Pucci mean width technique
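
To make the descent systems of Part 3 concrete, here is a minimal numerical sketch (illustrative only; function names and parameters are chosen here, not taken from the course) of the proximal point iteration x_{k+1} = argmin_y { f(y) + ||y - x_k||^2 / (2λ) }, applied to the nonsmooth convex function f(x) = |x|, whose proximal map is the explicit soft-thresholding operator.

```python
# Minimal sketch of the proximal point algorithm x_{k+1} = prox_{lam*f}(x_k),
# illustrated on f(x) = |x|. Illustrative only; not course material.

def prox_abs(x: float, lam: float) -> float:
    """Proximal map of f(t) = |t|: argmin_y |y| + (y - x)^2 / (2*lam),
    i.e. soft-thresholding of x at level lam."""
    if x > lam:
        return x - lam
    if x < -lam:
        return x + lam
    return 0.0

def proximal_point(x0: float, lam: float = 0.5, steps: int = 10) -> float:
    """Iterate x_{k+1} = prox_{lam*f}(x_k); for f = |.| the iterates
    decrease by lam each step and reach the minimizer 0 in finitely
    many steps."""
    x = x0
    for k in range(steps):
        x = prox_abs(x, lam)
        print(f"k={k + 1:2d}  x={x:+.6f}")
    return x

if __name__ == "__main__":
    proximal_point(3.2)  # converges to the unique minimizer x* = 0
```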

 

Teaching methods

These lectures present the main theory of convex analysis and tame variational analysis from the optimization viewpoint. Particular emphasis will be placed on descent methods. Research topics will also be discussed. The lectures will be given in English, unless the participants decide otherwise.

The course will be supported by partial lecture notes and research papers. 


Mode of examination

Oral

Additional information

Nonsmoothness pervades optimization in most of its theoretical and practical aspects. Even if the starting point is a smooth (or even a polynomial) model, natural operations like marginal/value functions, min/max selections, etc., destroy smoothness. In addition, extrema of nonsmooth functions occur, in general, at points of nondifferentiability. This has inevitably led to the development of modern variational analysis and of nonsmooth optimization algorithms. Since the seminal example of Weierstrass, dating back to 1872, exhibiting a univariate continuous real-valued function which is nowhere differentiable, it has been commonly understood that pathologies are tightly linked with almost all theories of classical analysis. Variational analysis, handling nonsmooth objects, cannot be an exception. Notwithstanding, in most applications nonsmoothness arises together with an intrinsic structure: for instance, an initial polynomial model will give rise to a semialgebraic structure. Therefore, although a general nonsmooth theory will be full of pathological situations, it is well founded to consider the trace of this theory within well-behaved paradigms. This course aims at underlining the use of these paradigms in optimization, focusing on minimization algorithms and general descent systems. After an introductory crash course in nonsmooth analysis, we shall consider

nonsmooth optimization problems enjoying a nice intrinsic structure: the tame paradigm (what is nowadays called tame optimization, encompassing semialgebraic structures) and the (classical) convex paradigm. Convergence analysis of the proximal algorithm, a central tool in nonsmooth minimization, will be presented in the light of these two paradigms, emphasizing important convergence properties; some of these properties remain elusive so far, even in the convex case. Relations between continuous and discrete dynamical descent systems will be presented. A secondary aim of this course is to provide essential background and material for further research. Some open problems will be mentioned during the lectures.

 

Lecturers

Institute

Course dates

Day | Time | Date | Location | Description
Wed | 15:00 - 18:00 | 12.10.2022 - 25.01.2023 | Sem.R. DB gelb 09 | Convex and Tame Optimization
Mon | 10:00 - 13:00 | 24.10.2022 | Sem.R. DB gelb 09 | Convex and Tame Optimization
Thu | 15:00 - 18:00 | 03.11.2022 | Sem.R. DA grün 06B | Convex and Tame Optimization
Wed | 14:00 - 15:00 | 09.11.2022 | Sem.R. DB gelb 09 | Convex and Tame Optimization
AKANA AKOR Convex and Tame Optimization - Single appointments
Day | Date | Time | Location | Description
Wed | 12.10.2022 | 15:00 - 18:00 | Sem.R. DB gelb 09 | Convex and Tame Optimization
Wed | 19.10.2022 | 15:00 - 18:00 | Sem.R. DB gelb 09 | Convex and Tame Optimization
Mon | 24.10.2022 | 10:00 - 13:00 | Sem.R. DB gelb 09 | Convex and Tame Optimization
Thu | 03.11.2022 | 15:00 - 18:00 | Sem.R. DA grün 06B | Convex and Tame Optimization
Wed | 09.11.2022 | 14:00 - 15:00 | Sem.R. DB gelb 09 | Convex and Tame Optimization
Wed | 09.11.2022 | 15:00 - 18:00 | Sem.R. DB gelb 09 | Convex and Tame Optimization
Wed | 16.11.2022 | 15:00 - 18:00 | Sem.R. DB gelb 09 | Convex and Tame Optimization
Wed | 30.11.2022 | 15:00 - 18:00 | Sem.R. DB gelb 09 | Convex and Tame Optimization
Wed | 07.12.2022 | 15:00 - 18:00 | Sem.R. DB gelb 09 | Convex and Tame Optimization
Wed | 14.12.2022 | 15:00 - 18:00 | Sem.R. DB gelb 09 | Convex and Tame Optimization
Wed | 21.12.2022 | 15:00 - 18:00 | Sem.R. DB gelb 09 | Convex and Tame Optimization
Wed | 11.01.2023 | 15:00 - 18:00 | Sem.R. DB gelb 09 | Convex and Tame Optimization
Wed | 18.01.2023 | 15:00 - 18:00 | Sem.R. DB gelb 09 | Convex and Tame Optimization
Wed | 25.01.2023 | 15:00 - 18:00 | Sem.R. DB gelb 09 | Convex and Tame Optimization

Examination modalities

Oral presentations and oral exam

Course registration

Not necessary

Precondition

The student has to be enrolled in at least one of the studies listed below.

Curricula

Study Code | Obligation | Semester | Precon. | Info
860 GW Optional Courses - Technical Mathematics | Not specified | | |

Literature

No lecture notes are available.

Previous knowledge

Students with a good background in Analysis/Optimization, motivated by modern trends in Mathematics, are encouraged to take this course. 

Language

English