## Second order optimality conditions for differentiable multiobjective problems

Giancarlo Bigi and Marco Castellani
### Summary

The study of optimality conditions is one of the main topics of
Optimization Theory. For multiobjective programming, some of the first
interesting results were developed in the mid-seventies;
since then, many papers have appeared dealing with first order necessary
optimality conditions for both differentiable and nondifferentiable problems.
When the problem satisfies suitable convexity assumptions, these conditions
turn out to be sufficient as well. In the general case, however, there may be
feasible points that satisfy the first order conditions but are not
optimal solutions. In order to rule them out, additional optimality
conditions, involving second order derivatives of the given
functions, can be developed. A few results in this direction
have been presented in some recent papers. This paper
aims to deepen this type of analysis, providing more general results.
First, we investigate differentiable multiobjective problems, where the
constraint is given in set form. By linearization
techniques, we obtain necessary conditions in terms of the impossibility
of nonhomogeneous linear systems, involving the Jacobians and the Hessians of
the objective functions and the second order contingent set of the feasible
region. We stress that these systems depend upon the choice of
a common descent direction for the objective functions.
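To fix ideas, a condition of this kind typically takes the following shape; the notation here is generic and illustrative, not taken from the paper itself:

```latex
% Illustrative sketch only; symbols are generic, not the paper's notation.
% Let \bar{x} be a feasible point, d a common descent direction for the
% objectives f_1,\dots,f_m, and T^2(S,\bar{x},d) the second order
% contingent set of the feasible region S at \bar{x} in the direction d.
% A second order necessary condition then states that the system
\begin{equation*}
  \nabla f_i(\bar{x})\, z + d^{\top} \nabla^2 f_i(\bar{x})\, d < 0,
  \quad i = 1,\dots,m,
  \qquad z \in T^2(S,\bar{x},d),
\end{equation*}
% admits no solution z, for every admissible descent direction d.
```

Note that the system above changes with the chosen direction $d$, which is why the resulting conditions depend on that choice.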
Moreover, we show that the gap between first order conditions for single
objective and multiobjective problems carries over to second order conditions.
Then, we apply our results to the case where the feasible region is
expressed by both inequality and equality constraints.
This can be done by exploiting the connections between the second order
contingent set of the feasible region and the second order derivatives of
the constraining functions.
By means of theorems of the alternative, we are therefore able to deduce a
John type multipliers rule, involving both the Jacobians and the Hessians of
the objective and constraining functions. We stress that the multipliers are
not fixed but depend upon the chosen descent direction.
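A John type rule of this sort can be sketched as follows; again, the notation is generic and only illustrative of the general structure, not the paper's own statement:

```latex
% Illustrative sketch only; generic notation, not the paper's.
% For a feasible region S = \{x : g_j(x) \le 0,\ h_k(x) = 0\} and a
% chosen descent direction d, a John type second order rule asserts the
% existence of multipliers (\theta, \lambda, \mu) \ne 0 with
% \theta \ge 0 and \lambda \ge 0, depending on d, such that
\begin{gather*}
  \sum_i \theta_i \nabla f_i(\bar{x})
  + \sum_j \lambda_j \nabla g_j(\bar{x})
  + \sum_k \mu_k \nabla h_k(\bar{x}) = 0, \\
  d^{\top}\Bigl( \sum_i \theta_i \nabla^2 f_i(\bar{x})
  + \sum_j \lambda_j \nabla^2 g_j(\bar{x})
  + \sum_k \mu_k \nabla^2 h_k(\bar{x}) \Bigr)\, d \ge 0.
\end{gather*}
```

When all the multipliers $\theta_i$ attached to the objectives vanish, the rule degenerates and carries no information about optimality, which motivates the constraint qualifications discussed next.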
In the last section, we analyse conditions that guarantee the
existence of nonzero multipliers corresponding to the objective functions;
following the approach developed by Kawasaki for scalar problems,
we consider a constraint qualification that is weaker than those already
introduced, and we show that the Guignard type constraint
qualification is useless without convexity assumptions; instead,
we introduce a Guignard type condition that also involves the objective
functions and needs no convexity assumptions to achieve the goal.
