Reference no: EM133614545
Convex Optimization
Question 1. (Positive (semi)definiteness) Let A ∈ Rn×n be symmetric with eigenvalues λ1, . . . , λn ∈ R (see spectral theorem). Show the following:
xT Ax ≥ 0 (x ∈ Rn) <==> λi ≥ 0 (i = 1, . . . , n);
xT Ax > 0 (x ∈ Rn \ {0}) <==> λi > 0 (i = 1, . . . , n).
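A quick numerical sanity check (not a proof) of the eigenvalue characterization, using NumPy; the matrix A = B Bᵀ is a hypothetical example chosen to be symmetric positive semidefinite by construction.

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = B @ B.T  # A = B B^T is symmetric positive semidefinite by construction

eigvals = np.linalg.eigvalsh(A)   # eigenvalues of a symmetric matrix
assert np.all(eigvals >= -1e-12)  # lambda_i >= 0, up to rounding

x = rng.standard_normal(4)
assert x @ A @ x >= -1e-12        # x^T A x >= 0 for this sample point
```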
Question 2. (Minimizing a quadratic function) Let A ∈ Sn and b ∈ Rn. Consider the quadratic function
q : Rn → R, q(x) = 1/2xTAx + bTx.
Compute ∇q(x) and ∇2q(x).
Show (without using first-order optimality conditions) that the following are equivalent:
i) inf_{x∈Rn} q(x) > -∞;
ii) A is positive semidefinite (i.e., xT Ax ≥ 0 for all x ∈ Rn) and b ∈ rge A;
iii) argmin_{x∈Rn} q(x) ≠ ∅.
Hint: You may use (if needed) without proof that a positive semidefinite matrix has only nonnegative eigenvalues.
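A minimal sketch checking the expected formulas ∇q(x) = Ax + b and ∇2q(x) = A by finite differences, under the assumption that A is symmetric PSD and b ∈ rge A (so a minimizer exists); the test instance is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((3, 3))
A = B @ B.T                      # symmetric positive semidefinite
b = A @ rng.standard_normal(3)   # b in rge A, so a minimizer exists

q = lambda x: 0.5 * x @ A @ x + b @ x
x = rng.standard_normal(3)
g = A @ x + b                    # claimed gradient

# central finite differences for the gradient
eps = 1e-6
g_fd = np.array([(q(x + eps * e) - q(x - eps * e)) / (2 * eps)
                 for e in np.eye(3)])
assert np.allclose(g, g_fd, atol=1e-5)

# a minimizer solves Ax = -b; least squares finds one even if A is singular
x_star = np.linalg.lstsq(A, -b, rcond=None)[0]
assert np.allclose(A @ x_star + b, 0, atol=1e-8)  # stationarity
```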
Question 3. (Riesz representation theorem - finite dimensional version). Let (E, <·, ·>) be a (finite dimensional) Euclidean space and L ∈ L(E, R). Show that there exists a unique b ∈ E such that
L(x) = <b, x> for all x ∈ E.
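An illustration of the statement on E = Rn with the standard inner product: the representer b can be read off from the values of L on a basis, b_i = L(e_i). The functional L below is an arbitrary hypothetical example.

```python
import numpy as np

n = 4
rng = np.random.default_rng(2)
c = rng.standard_normal(n)
L = lambda x: c @ x                      # an arbitrary linear functional

b = np.array([L(e) for e in np.eye(n)])  # b_i = L(e_i)
x = rng.standard_normal(n)
assert np.isclose(L(x), b @ x)           # L(x) = <b, x>
```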
Question 4. (Minimizing a linear function over the unit ball) Let g ∈ Rn \ {0}. Compute the solution of the optimization problem
min_{d∈Rn} <g, d> s.t. ||d|| ≤ 1.
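A numerical check of the closed form one expects here (and can verify via Cauchy-Schwarz): the minimizer is d* = -g/||g|| with optimal value -||g||; the vector g is a hypothetical sample.

```python
import numpy as np

rng = np.random.default_rng(3)
g = rng.standard_normal(5)
d_star = -g / np.linalg.norm(g)          # candidate minimizer

val = g @ d_star
assert np.isclose(val, -np.linalg.norm(g))

# Cauchy-Schwarz: any feasible d gives <g, d> >= -||g|| ||d|| >= -||g||
for _ in range(100):
    d = rng.standard_normal(5)
    d /= max(1.0, np.linalg.norm(d))     # force d into the unit ball
    assert g @ d >= val - 1e-12
```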
Question 5. (Lower semicontinuous hull) Let f : E → R¯. Show that
epi(clf ) = cl(epif )
i.e., clf is the function whose epigraph is the closure of epif.
Question 6. (Logdet Barrier) We equip the linear space Sn with the trace inner product <A, B> = trace(AB) and the induced norm ||A||F = √(trace(A2)). Consider the function f : Sn → R¯ defined by
f (X) = - ln det(X), if X > 0,
+∞, else.
In this exercise, you will derive differential properties of f and prove that it is convex. Note, from basic properties of the determinant, it immediately follows
f(X) = -∑_{i=1}^n ln(λi(X)) for any X > 0.
Here λi(X) are the eigenvalues of X. Now answer the following questions.
a) Let
Φ(x) = - log x, if x > 0,
+∞, else.
Compute Φ′(x) and Φ′′(x) for x > 0.
b) Let Φ : Rn → R¯ be defined by
Φ(x) = -∑_{i=1}^n log xi, if x ∈ Rn++,
+∞, else.
Find the derivatives ∇Φ(x) and ∇2Φ(x) for x ∈ Rn++ .
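A finite-difference sanity check of the derivatives one expects in parts a) and b): Φ′(x) = -1/x, Φ′′(x) = 1/x², and for the separable sum ∇Φ(x) = -1/x componentwise with ∇2Φ(x) = diag(1/x²). This is a numerical sketch, not a derivation.

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.uniform(0.5, 2.0, size=3)        # a point in R^n_{++}
Phi = lambda x: -np.sum(np.log(x))

# gradient via central differences
eps = 1e-6
grad_fd = np.array([(Phi(x + eps * e) - Phi(x - eps * e)) / (2 * eps)
                    for e in np.eye(3)])
assert np.allclose(grad_fd, -1.0 / x, atol=1e-5)

# Hessian via second-order central differences (larger step for stability)
h = 1e-4
hess_fd = np.array([[(Phi(x + h * (ei + ej)) - Phi(x + h * (ei - ej))
                      - Phi(x - h * (ei - ej)) + Phi(x - h * (ei + ej)))
                     / (4 * h ** 2)
                     for ej in np.eye(3)] for ei in np.eye(3)])
assert np.allclose(hess_fd, np.diag(1.0 / x ** 2), atol=1e-4)
```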
c) Prove ∇f(X) = -X-1 and ∇2f(X)[V] = X-1V X-1 for any X > 0. (Hint: Compute the gradient by computing directional derivatives limt→0 (f(X + tV) - f(X))/t for every V ∈ Sn. Compute the Hessian by computing the directional derivative of the gradient. For this part, you will need to use the expansion
(I - A)-1 = I + A + A2 + A3 + . . .
whenever the maximal singular value of A is smaller than 1 (in particular for all A close to zero). Also, do not try to take partial derivatives for this question. Use the definition of gradients, Hessians, and directional derivatives.)
d) Using the property trace(AB) = trace(BA), prove
<∇2f(X)[V], V> = ||X-1/2V X-1/2||2F
for any X > 0 and V ∈ Sn.
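A numerical illustration of parts c) and d): the directional derivative of f(X) = -ln det(X) at X > 0 in direction V matches <-X⁻¹, V> = trace(-X⁻¹V), and trace cycling gives <X⁻¹VX⁻¹, V> = ||X⁻¹ᐟ²VX⁻¹ᐟ²||²F. The test matrices are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(5)
B = rng.standard_normal((4, 4))
X = B @ B.T + 4 * np.eye(4)        # X > 0 (positive definite)
V = rng.standard_normal((4, 4))
V = (V + V.T) / 2                  # V in S^n

f = lambda X: -np.log(np.linalg.det(X))
Xinv = np.linalg.inv(X)

# directional derivative of f at X in direction V vs. <-X^{-1}, V>
t = 1e-6
dd = (f(X + t * V) - f(X - t * V)) / (2 * t)
assert np.isclose(dd, np.trace(-Xinv @ V), atol=1e-6)

# part d): <X^{-1} V X^{-1}, V> = ||X^{-1/2} V X^{-1/2}||_F^2
w, Q = np.linalg.eigh(X)
Xinvhalf = Q @ np.diag(1.0 / np.sqrt(w)) @ Q.T   # X^{-1/2} via eigendecomposition
lhs = np.trace(Xinv @ V @ Xinv @ V)
rhs = np.linalg.norm(Xinvhalf @ V @ Xinvhalf, 'fro') ** 2
assert np.isclose(lhs, rhs)
```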
Question 7. (Closedness of a positive combination) For p ∈ N, let fi : E → R ∪ {∞} be lsc and αi ≥ 0 (i = 1, . . . , p). Show that f def= ∑_{i=1}^p αifi is lsc.
Question 8. (Convexity preserving operations)
a) (Intersection) Let I be an arbitrary index set (possibly uncountable) and let Ci ⊂ E for all i ∈ I be a family of convex sets. Show that ∩i∈I Ci is convex.
b) (Linear images and preimages) Let F ∈ L(E1, E2) and let C ⊂ E1, D ⊂ E2 be convex. Show that F(C) def= {F(x) : x ∈ C} and F-1(D) def= {x : F(x) ∈ D} are convex.