The program below is preliminary and will be updated periodically to follow the actual lectures.

**Topology and calculus background**
- Sets in Euclidean space: openness, closedness, boundedness and compactness
- Sequences and limits in Euclidean space
- Functions of several variables: continuity, directional derivatives and differentiability
- Functions of several variables: first and second order Taylor's formulas

**Linear algebra background**
- Normal, unitary, Hermitian, positive definite, reducible matrices
- Minimal polynomial, diagonalizability, Jordan and Schur canonical forms
- Quadratic forms
- Gerschgorin theorem for irreducible matrices
- Matrix norms induced by vector norms

**Convex functions, convex sets and optimization problems**
- Convex and strictly convex functions, convex sets: properties and characterizations
- Classification of optimization problems; local and global minima [and maxima]
- Properties of convex optimization problems

**Optimality conditions for unconstrained optimization**
- First and second order necessary optimality conditions
- Sufficient optimality conditions; the convex case

**Direct and iterative methods for linear systems**
- Matrix factorizations and elementary matrices
- LU, QR and LL^H factorizations; Gauss, Householder and Cholesky methods
- Convergence of iterative methods
- Stein-Rosenberg theorem
- Special results for tridiagonal and positive definite matrices
- The conjugate gradient method for linear systems
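As a concrete illustration of the last topic, here is a minimal NumPy sketch of the conjugate gradient iteration for a symmetric positive definite system (the function and variable names are ours, not part of the program):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Conjugate gradient for a symmetric positive definite system Ax = b."""
    n = len(b)
    max_iter = max_iter or n
    x = np.zeros(n)
    r = b - A @ x          # residual
    p = r.copy()           # search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)          # exact linesearch along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p      # next A-conjugate direction
        rs = rs_new
    return x

# small SPD example
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

In exact arithmetic the method terminates in at most n steps; here n = 2.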

**Iterative methods for nonlinear systems**
- General notions and conditions for convergence
- Newton-Raphson method and its convergence; the case of convex functions
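A minimal sketch of the Newton-Raphson iteration for a nonlinear system F(x) = 0, assuming the Jacobian J is available (example function and names are illustrative):

```python
import numpy as np

def newton(F, J, x0, tol=1e-10, max_iter=50):
    """Newton-Raphson: solve J(x_k) d = -F(x_k), then set x_{k+1} = x_k + d."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        d = np.linalg.solve(J(x), -Fx)  # Newton step
        x = x + d
    return x

# example: intersection of the circle x^2 + y^2 = 4 with the line y = x
F = lambda v: np.array([v[0]**2 + v[1]**2 - 4.0, v[1] - v[0]])
J = lambda v: np.array([[2 * v[0], 2 * v[1]], [-1.0, 1.0]])
root = newton(F, J, [1.0, 1.0])
```

Near a simple root with a nonsingular Jacobian the convergence is quadratic, which is why a handful of iterations suffices here.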

**Solution methods for unconstrained optimization**
- Gradient methods with exact and inexact linesearch
- Newton's method
- Conjugate gradient methods
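The first item above (gradient methods with inexact linesearch) can be sketched with an Armijo backtracking rule; the quadratic test problem and all names are illustrative:

```python
import numpy as np

def gradient_method(f, grad, x0, alpha0=1.0, beta=0.5, sigma=1e-4,
                    tol=1e-8, max_iter=500):
    """Gradient method with Armijo (backtracking) inexact linesearch."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        alpha = alpha0
        # Armijo condition: sufficient decrease along the direction -g
        while f(x - alpha * g) > f(x) - sigma * alpha * (g @ g):
            alpha *= beta
        x = x - alpha * g
    return x

# minimize the convex quadratic f(x) = 1/2 x^T Q x - b^T x
Q = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ Q @ x - b @ x
grad = lambda x: Q @ x - b
xmin = gradient_method(f, grad, [0.0, 0.0])
```

The minimizer satisfies the first order condition Q x = b, which the assertion below checks.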

**The least-squares problem**
- The linear problem and the normal equations
- Solving the linear problem through the QR factorization and the singular value decomposition (SVD)
- Solving the linear problem through the conjugate gradient method
- The nonlinear problem and the Gauss-Newton method
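A short sketch of solving the linear least-squares problem through the thin QR factorization, as in the second item above (the line-fitting example is ours):

```python
import numpy as np

def lstsq_qr(A, b):
    """Solve min ||Ax - b||_2 via the thin QR factorization: R x = Q^T b."""
    Q, R = np.linalg.qr(A)               # A = QR, Q with orthonormal columns
    return np.linalg.solve(R, Q.T @ b)   # upper triangular system

# fit a line y = c0 + c1 t; the data lie exactly on y = 1 + 2t
t = np.array([0.0, 1.0, 2.0, 3.0])
A = np.column_stack([np.ones_like(t), t])   # design matrix
b = 1.0 + 2.0 * t
c = lstsq_qr(A, b)
```

Compared with forming the normal equations A^T A x = A^T b directly, the QR route avoids squaring the condition number of A.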

**Iterative methods for computing eigenvalues**
- Bauer-Fike theorem
- The power method
- Unitary reduction to tridiagonal form
- Tridiagonal matrices: Sturm sequences; divide and conquer technique
- QR method for eigenvalues and its convergence
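To illustrate the power method from the list above, a minimal NumPy sketch approximating the dominant eigenpair of a Hermitian matrix (the example matrix and names are ours):

```python
import numpy as np

def power_method(A, num_iter=200, seed=0):
    """Power method: approximate the dominant eigenpair of A."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(A.shape[0])
    for _ in range(num_iter):
        x = A @ x
        x /= np.linalg.norm(x)   # normalize to avoid overflow
    lam = x @ A @ x              # Rayleigh quotient estimate
    return lam, x

A = np.array([[2.0, 1.0], [1.0, 3.0]])
lam, v = power_method(A)
```

Convergence is geometric with ratio |lambda_2 / lambda_1|, so a gap between the two largest eigenvalue moduli is essential; here the exact dominant eigenvalue is (5 + sqrt(5)) / 2.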

**Optimality conditions for constrained optimization**
- Necessary optimality conditions for abstract constrained problems
- Fritz John and Karush-Kuhn-Tucker necessary optimality conditions: Lagrange multipliers
- Necessary and sufficient condition for convex optimization

**Solution methods for constrained optimization**
- Methods of feasible directions; the conditional gradient method (aka the Frank-Wolfe method)
- Projections onto a convex set and projected gradient methods
- Elimination of variables: the linear case
- Penalty methods: exterior penalization and the augmented Lagrangian
- Barrier methods; interior point primal-dual methods for linear programming
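A minimal sketch of the projected gradient method from the list above, using a box constraint where the projection is a simple clip (the objective and names are illustrative):

```python
import numpy as np

def projected_gradient(grad, proj, x0, step=0.1, max_iter=1000):
    """Projected gradient method: x_{k+1} = P_C(x_k - step * grad(x_k))."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        x_new = proj(x - step * grad(x))
        if np.linalg.norm(x_new - x) < 1e-10:   # fixed point of the iteration
            break
        x = x_new
    return x

# minimize ||x - c||^2 over the box [0, 1]^2, with c outside the box
c = np.array([2.0, -0.5])
grad = lambda x: 2.0 * (x - c)
proj = lambda x: np.clip(x, 0.0, 1.0)   # projection onto the box
xstar = projected_gradient(grad, proj, [0.5, 0.5])
```

The minimizer is the projection of c onto the box, namely (1, 0); for a general closed convex set the projection may itself require an inner optimization.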

**The fast Fourier transform**
- The Fourier matrix and its properties
- The algorithm and some applications
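A small sketch relating the Fourier matrix to the FFT: the matrix-vector product F x, an O(n^2) operation, matches what `numpy.fft.fft` computes in O(n log n) (the helper function is ours):

```python
import numpy as np

def fourier_matrix(n):
    """The n x n Fourier matrix with entries omega^(j*k), omega = exp(-2*pi*i/n)."""
    j, k = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    return np.exp(-2j * np.pi * j * k / n)

n = 8
F = fourier_matrix(n)
x = np.random.default_rng(0).standard_normal(n)

y_fft = np.fft.fft(x)   # O(n log n)
y_mat = F @ x           # O(n^2), same result
```

A key property from the first item above: F / sqrt(n) is unitary, i.e. F^H F = n I, so the inverse transform is (1/n) F^H.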