Teratec / NAG - Seminar on Mathematical Optimization
October 10th, 2019 - Bruyères-le-Châtel
This seminar will cover recent developments in state-of-the-art continuous optimization. The day is split into four thematic sessions: introduction to optimization; derivative-free optimization; modern convex optimization; and algorithmic differentiation. Each session will last approximately 90 minutes.
1. Introduction to optimization
This session focuses on general optimization principles. We will give an overview of the capabilities and limitations of modern numerical solvers and how to use them correctly.
2. Derivative-free optimization
Calibrating numerical models is a common optimization problem. Very often, evaluating these models involves heavy computations that can be either very expensive or very noisy if they are not run to full convergence. In such cases, classical derivative-based optimization methods may not be advisable, since computing the model's gradient becomes a difficult challenge: finite differences are often unrealistic in terms of computing time, and even more sophisticated methods such as algorithmic differentiation can fail in the presence of noise. In this session we describe state-of-the-art derivative-free algorithms and how they handle noise in the function evaluations. A small sketch of the idea follows.
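As an illustration only (SciPy is not part of the seminar material, and the objective, noise level and starting point below are made up), a derivative-free method can be driven purely by function values, so no gradient or finite-difference estimate is ever needed:

    # Minimal sketch: calibrating a noisy model with a derivative-free method.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)

    def noisy_misfit(x):
        # Stand-in for an expensive simulation: a smooth misfit plus evaluation noise.
        exact = (x[0] - 1.0) ** 2 + 10.0 * (x[1] + 0.5) ** 2
        return exact + 1e-3 * rng.standard_normal()

    # Nelder-Mead uses only function values; no gradient is ever requested.
    result = minimize(noisy_misfit, x0=np.array([5.0, 5.0]), method="Nelder-Mead")
    print(result.x, result.fun)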
3. Modern convex optimization
When the modelled problem is convex, numerical optimization solvers can usually exploit its much stronger theoretical properties. However, formulating the problem in a form the solver can exploit can be quite a challenge. We will present some background on conic convex problems (Second-Order Cone Programming - SOCP - and Semidefinite Programming - SDP), give some hints on the modelling possibilities such solvers open up, and show some common convex reformulations together with examples of their use in various industries; a small sketch of one such reformulation follows.
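To make the idea of a conic reformulation concrete, here is an illustrative sketch using the CVXPY modelling package (not part of the seminar material; A, b and the weight 0.1 are placeholders). The Euclidean norm is handled through its epigraph, i.e. minimise t subject to the second-order cone constraint ||Ax - b||_2 <= t, which the modelling layer generates automatically before calling a conic solver:

    # Minimal sketch: a nonsmooth least-squares problem reformulated as an SOCP.
    import numpy as np
    import cvxpy as cp

    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 5))
    b = rng.standard_normal(20)

    x = cp.Variable(5)
    # norm(.., 2) becomes a second-order cone constraint; norm(.., 1) is linearized.
    objective = cp.Minimize(cp.norm(A @ x - b, 2) + 0.1 * cp.norm(x, 1))
    problem = cp.Problem(objective)
    problem.solve()
    print(x.value)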
4. Algorithmic Differentiation (AD)
Derivatives are required in many areas of numerical computation. Algorithmic differentiation is a technique that takes a function definition written in a high-level programming language such as C++ or Fortran and automatically generates code to compute its exact derivatives. Compared to the commonly used finite-difference (or bumping) approach, AD not only computes exact derivatives but also allows fast computation of gradients. In this session we will give a short introduction to AD, demonstrate the use of an operator-overloading AD tool, and discuss the use of AD in the context of an optimization problem; the sketch below conveys the basic idea.
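The following is a minimal, illustrative sketch of the operator-overloading idea behind forward-mode AD (production AD tools are far more general and typically target C++ or Fortran; the function f below is made up). A "dual number" carries a value and its derivative together, and the overloaded operators propagate exact derivatives through the computation:

    import math

    class Dual:
        def __init__(self, value, deriv=0.0):
            self.value = value
            self.deriv = deriv

        def __add__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.value + other.value, self.deriv + other.deriv)

        __radd__ = __add__

        def __mul__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            # Product rule: (u*v)' = u'*v + u*v'
            return Dual(self.value * other.value,
                        self.deriv * other.value + self.value * other.deriv)

        __rmul__ = __mul__

    def sin(x):
        # Chain rule for the overloaded sine.
        return Dual(math.sin(x.value), math.cos(x.value) * x.deriv)

    def f(x):
        # The function is written once, as ordinary code ...
        return 3.0 * x * x + sin(x)

    # ... and evaluating it on a seeded Dual gives f(2) and the exact f'(2) = 12 + cos(2).
    y = f(Dual(2.0, 1.0))
    print(y.value, y.deriv)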
Program:
Free seminar, limited places!