Seminar Special Topics in Continuous Optimization and Optimal Control (SS 22)
Here you will find information about the seminar on Special Topics in Continuous Optimization and Optimal Control in the summer term 2022.
2022-05-12
General Information
- Organizers
- Please contact Ihno Schrot if you have questions regarding this seminar.
- Kick-Off Meeting Date
- Tuesday, April 26, 11:15, in SR1 (INF 205)
- Date
- Tuesdays, 11:00 - 13:00
- Room
- Seminar Room 1 (SR 1) in INF 205
We meet in person.
- Level
- Master students (Bachelor students with relevant experience and motivation are welcome, too!). We also offer a proseminar exclusively for Bachelor students.
- Language
- English
- Requirements
- You should have attended at least the following courses:
- Analysis 1, 2
- Linear Algebra 1
- Nonlinear Optimization
You can of course still join the seminar if you are missing one of these courses, but in that case we strongly recommend that you attend the missing course in parallel. Experience shows that students without knowledge in these areas struggle with the topics of this seminar.
- Registration
- To participate in the seminar, please register for the seminar on MÜSLI. The number of participants is limited to 18.
Kick-Off Meeting
The kick-off meeting takes place on Tuesday, April 26, 11:15, in SR 1 (INF 205). Here, we
- discuss organizational matters,
- form pairs if necessary,
- distribute the topics,
- discuss the schedule.
Topics
- CG-Methods
- Active set methods for NLPs with vanishing constraints
- Penalty and Augmented Lagrangian methods
- SQP with indefinite Hessian approximations
- BFGS-SQP Method for nonsmooth, nonconvex, constrained optimization
- Interior-point filter line-search algorithm for NLP
- A dual Newton strategy for the efficient solution of sparse QPs arising in SQP-based nonlinear model predictive control
- Global optimization methods
- Robust nonconvex optimization
Goal of the Seminar
Teach your fellow students your topic in an understandable yet professional way!
Expectations
In order to complete the seminar successfully, you have to
- attend the weekly meetings,
- prepare and give a presentation of 90 min (+ 10 min discussion) in pairs.
You do not need to write an essay.
Furthermore, we expect
- a thorough understanding of your topic,
- clear and professional communication of mathematics,
- independent work with scientific literature,
- a professional presentation (the presentation technique is up to you).
Schedule
| Date | Topic | Speaker(s) |
|---|---|---|
| April 26th | Kick-off meeting | — |
| May 24th | Active set methods for NLPs with vanishing constraints | Laura |
| May 31st | A dual Newton strategy for the efficient solution of sparse QPs arising in SQP-based nonlinear model predictive control | Jörn, Szymon |
| June 7th | An approximation technique for robust nonlinear optimization | Aarya |
| TBA | CG-Methods | TBA |
| TBA | Penalty and Augmented Lagrangian methods | TBA |
| TBA | SQP with indefinite Hessian approximations | TBA |
| TBA | BFGS-SQP Method for nonsmooth, nonconvex, constrained optimization | TBA |
| TBA | Interior-point filter line-search algorithm for NLP | TBA |
| TBA | Global optimization methods | TBA |
Grading
We primarily grade your talks based on this rubric.
Literature
We expect you to identify and use further literature on your own where necessary! The following suggestions are intended as starting points.
- General recommendations
- Nocedal, Jorge, and Stephen J. Wright. Numerical Optimization. New York, NY: Springer, 2006.
- Ulbrich, Michael, and Stefan Ulbrich. Nichtlineare Optimierung. Springer-Verlag, 2012.
- Recommendations by topics
- You can find all of the following literature online when you are connected to the university network. If you are not at the university, you can connect to the network via VPN. Information on the VPN is available here.
- CG-Methods
- J. Nocedal, S.J. Wright. Numerical Optimization, Springer, 2006. Ch. 5, pp. 101–133.
- Published papers about new research results in the field of nonlinear CG methods
- Active set methods for NLPs with vanishing constraints
- C. Kirches, A. Potschka, H.G. Bock and S. Sager. A parametric active set method for quadratic programs with vanishing constraints.
- Penalty and augmented Lagrangian methods
- J. Nocedal, S.J. Wright. Numerical Optimization, Springer, 2006. Ch. 17, pp. 497–527.
- SQP with indefinite Hessian approximations
- D. Janka. Sequential quadratic programming with indefinite Hessian approximations for nonlinear optimum experimental design for parameter estimation in differential–algebraic equations. PhD Thesis. Heidelberg. (2015).
- D. Janka, C. Kirches, S. Sager and A. Wächter. An SR1/BFGS SQP algorithm for nonconvex nonlinear programs with block-diagonal Hessian matrix (2016).
- BFGS-SQP Method for nonsmooth, nonconvex, constrained optimization
- F.E. Curtis et al. A BFGS-SQP method for nonsmooth, nonconvex, constrained optimization and its evaluation using relative minimization profiles. Optimization Methods and Software (2017).
- Interior-point filter line-search algorithm for NLP
- A. Wächter and L.T. Biegler. On the implementation of an interior-point filter line-search algorithm for large-scale nonlinear programming. Math. Program., Ser. A (2005).
- A dual Newton strategy for the efficient solution of sparse QPs arising in SQP-based nonlinear model predictive control
- J. Frasch, M. Vukov, H.J. Ferreau and M. Diehl. A dual Newton strategy for the efficient solution of sparse quadratic programs arising in SQP-based nonlinear MPC.
- J. Frasch. Parallel Algorithms for Optimization of Dynamic Systems in Real-Time. PhD Thesis, KU Leuven and U Magdeburg, 2014.
- Global optimization methods
- C.S. Adjiman et al. A global optimization method, \(\alpha\)BB, for general twice-differentiable constrained NLPs – I. Theoretical advances. Computers Chem. Engng, Vol. 22, No. 9, pp. 1137–1158 (1998).
- C.A. Meyer and C.A. Floudas. Convex underestimation of twice continuously differentiable functions by piecewise quadratic perturbation: spline \(\alpha\)BB underestimators.
- An approximation technique for robust nonlinear optimization
- M. Diehl, H.G. Bock, E. Kostina. An approximation technique for robust nonlinear optimization. Math. Program. 107, 213–230 (2006).
- Robust nonconvex optimization
- B. Houska. Robust Optimization of Dynamic Systems. Chapters 3–4 (2011).