
Spring 2023 plans

The big picture

Reading (by priority)

  • Luenberger, David G., and Yinyu Ye. Linear and nonlinear programming. Vol. 2. Reading, MA: Addison-Wesley, 1984.

  • LeVeque, Randall J. Finite difference methods for ordinary and partial differential equations: steady-state and time-dependent problems. Society for Industrial and Applied Mathematics, 2007.

  • Steele, J. Michael. The Cauchy-Schwarz master class: an introduction to the art of mathematical inequalities. Cambridge University Press, 2004.

  • Tao, Terence. Solving mathematical problems: A personal perspective. OUP Oxford, 2006.

Research

With regard to the two papers David has given me,

  • Scieur, D., Roulet, V., Bach, F., & d’Aspremont, A. (2017). Integration methods and optimization algorithms. Advances in Neural Information Processing Systems, 30.

  • Zhang, J., Mokhtari, A., Sra, S., & Jadbabaie, A. (2018). Direct Runge-Kutta discretization achieves acceleration. Advances in Neural Information Processing Systems, 31.

we will try to build a better connection between numerical methods for solving IVPs for ODEs (specifically, the gradient flow equation) and optimization algorithms, and hopefully write up a paper on it.
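To fix ideas, the simplest instance of this connection: applying the forward Euler method with step size \(h\) to the gradient flow equation

\[ \dot{x}(t) = -\nabla f(x(t)) \]

yields the iteration

\[ x_{k+1} = x_k - h\,\nabla f(x_k), \]

which is exactly gradient descent with step size \(h\); the question is how far this dictionary extends to more sophisticated integrators and to accelerated methods.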

Ideally, this should be done for both families of methods: linear multistep methods from the first paper and multistage (Runge-Kutta) methods from the second. Once the structure is fully understood, the next step will be to design new optimization methods with these tools. At least some good results are expected, since Nesterov's (optimal) gradient algorithm emerged in a very natural way from stability analysis.
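To make the multistep direction concrete, here is a sketch of the objects involved, following the setup of Scieur et al.: an explicit linear two-step method applied to the gradient flow equation takes the form

\[ x_{k+2} + a_1 x_{k+1} + a_0 x_k = -h \left( b_1 \nabla f(x_{k+1}) + b_0 \nabla f(x_k) \right), \]

and particular coefficient choices recover familiar iterations: \(a_0 = b_0 = 0\), \(a_1 = -1\), \(b_1 = 1\) gives gradient descent, while \(a_1 = -(1+\beta)\), \(a_0 = \beta\), \(b_1 = 1\), \(b_0 = 0\) gives Polyak's heavy-ball method with momentum \(\beta\). Stability analysis of the associated characteristic polynomial is what constrains the admissible coefficients.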

Building a personal website

This can be done in several ways.

Weekly plan

Feb. 5 - Feb. 11

  • Write a personal statement (2000 characters) for the ICERM workshop
  • Finish reading Chapter 6 of LeVeque, R. J. (2007).
  • Finish reading Chapter 2 of Luenberger, D. G., & Ye, Y. (2021).
  • Hopefully start building my personal website from scratch and record previous research ideas on it

Mar. 28 - Apr. 4

  • Apply the Schur-Cohn criterion to the analysis of three-step optimization methods (specifically, to whether the roots of \(\pi_{\lambda h}(z)\) satisfy the A-stability condition); start writing down the details of the development of three-step methods here in an organized way (a numerical sanity check for the root condition is sketched after this list)
  • Figure out how traveling abroad works at KAUST, get in touch with the administrative assistant and make an appointment with the U.S. consulate in Riyadh
  • Keep exploring the features of Vim and Emacs, identify the pros and cons, and commit to the steep learning curve of one of them (most likely Vim)
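Since the Schur-Cohn criterion certifies that every root of a polynomial lies inside the unit disk, a direct numerical root check makes a useful sanity test while the coefficient conditions are being derived by hand. A minimal sketch, using the three-step Adams-Bashforth method as a stand-in for the methods under development (whose \(\pi_{\lambda h}\) will differ):

```python
import numpy as np

def roots_in_disk(coeffs, eps=0.0):
    """Numerical root condition: all roots of the polynomial
    (coefficients from highest degree down) must lie strictly
    inside the disk of radius 1 - eps."""
    return bool(np.all(np.abs(np.roots(coeffs)) < 1.0 - eps))

def pi_ab3(lam_h):
    """Stability polynomial of three-step Adams-Bashforth applied
    to y' = lambda * y:
    pi(z) = z^3 - z^2 - (lam_h / 12) * (23 z^2 - 16 z + 5)."""
    return [1.0, -1.0 - 23.0 * lam_h / 12.0,
            16.0 * lam_h / 12.0, -5.0 * lam_h / 12.0]

# The real interval of absolute stability for AB3 is roughly
# (-6/11, 0), so -0.5 should pass and -0.6 should fail.
for lam_h in (-0.1, -0.5, -0.6):
    print(lam_h, roots_in_disk(pi_ab3(lam_h)))
```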

Apr. 17 - Apr. 30

  • Try deriving the conditions for \(A_\epsilon\)-stability of linear three-step methods.
  • Look into a simple case of two-stage Runge-Kutta methods and see if connecting with continuous dynamics can yield an optimal optimization algorithm (a minimal two-stage example is sketched after this list)
  • Read some of the lecture notes by Prof. He on ‘Contraction Methods for Convex Optimization and Monotone Variational Inequalities’
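For the two-stage Runge-Kutta item above, a minimal sketch of the simplest concrete instance: Heun's method applied to gradient flow on a toy quadratic. The test matrix and the step size \(h = 0.15\) are arbitrary choices; the open question is whether tuning the stage weights and step size against the continuous dynamics can reach an accelerated rate.

```python
import numpy as np

# Toy quadratic objective f(x) = 0.5 * x^T A x with mu = 1, L = 10,
# so the minimizer is x* = 0.
A = np.diag([1.0, 10.0])
grad = lambda x: A @ x

def heun_step(x, h):
    """One step of Heun's method (a two-stage explicit Runge-Kutta
    method) applied to the gradient flow x' = -grad f(x)."""
    k1 = -grad(x)
    k2 = -grad(x + h * k1)
    return x + 0.5 * h * (k1 + k2)

x = np.array([1.0, 1.0])
for _ in range(50):
    x = heun_step(x, h=0.15)
print(np.linalg.norm(x))  # distance to the minimizer after 50 steps
```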

May 1 - May 7

  • Figure out how the conditions imposed by \(A_\epsilon\)-stability connect to the original problem (\(\lambda \in [\mu, L]\)). Maybe start from a two-step case and see how to derive the optimal method with the \(A_\epsilon\) approach (a one-step warm-up is sketched after this list)
  • Most of the time will be devoted to preparing for next week's two finals (real analysis and numerical optimization)
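As a warm-up for the first item above: for plain forward Euler (i.e. gradient descent), requiring the root of the stability polynomial \(z - (1 - h\lambda)\) to have modulus at most \(1 - \epsilon\) uniformly over \(\lambda \in [\mu, L]\), and minimizing the worst case over \(h\), recovers the classical step size \(h = 2/(\mu + L)\) with contraction factor \((L - \mu)/(L + \mu)\). A quick numerical check of this (the bounds \(\mu = 1\), \(L = 10\) are arbitrary):

```python
import numpy as np

mu, L = 1.0, 10.0                 # assumed curvature bounds, for illustration
lams = np.linspace(mu, L, 400)

def worst_radius(h):
    """Worst-case root modulus over lambda in [mu, L] of the
    one-step stability polynomial z - (1 - h*lambda), i.e. the
    per-step contraction factor of gradient descent on each
    eigen-direction."""
    return np.max(np.abs(1.0 - h * lams))

hs = np.linspace(0.01, 2.0 / L, 400)   # stability requires h < 2/L
best_h = min(hs, key=worst_radius)
print(best_h, worst_radius(best_h))          # numerical optimum
print(2.0 / (mu + L), (L - mu) / (L + mu))   # known analytic optimum
```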