Convex Optimization — Boyd & Vandenberghe (Solution)


8. Geometric problems
• extremal volume ellipsoids
• centering
• classification
• placement and facility location
Minimum volume ellipsoid around a set
Löwner-John ellipsoid of a set C: minimum volume ellipsoid E s.t. C ⊆ E
• parametrize E as E = {v | ‖Av + b‖₂ ≤ 1}; w.l.o.g. assume A = Aᵀ ≻ 0
• vol E is proportional to det A⁻¹; to compute the minimum volume ellipsoid,
minimize (over A, b)   log det A⁻¹
subject to             sup_{v ∈ C} ‖Av + b‖₂ ≤ 1
convex, but evaluating the constraint can be hard (for general C)
finite set C = {x₁,…,xₘ}:
minimize (over A, b)   log det A⁻¹
subject to             ‖Axᵢ + b‖₂ ≤ 1, i = 1,…,m
also gives the Löwner-John ellipsoid of the polyhedron conv{x₁,…,xₘ}
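For the finite-set problem above, a minimal sketch using cvxpy is given below; cvxpy, the random sample points, and the solver choice are assumptions (the slides do not prescribe an implementation), and a solver that handles log_det (e.g. SCS) must be available.

import cvxpy as cp
import numpy as np

# hypothetical data: m = 20 points in R^2
rng = np.random.default_rng(0)
pts = rng.standard_normal((20, 2))

n = pts.shape[1]
A = cp.Variable((n, n), symmetric=True)   # shape matrix of E = {v | ||A v + b||_2 <= 1}
b = cp.Variable(n)

# minimize log det A^{-1}  <=>  maximize log det A
constraints = [cp.norm(A @ x + b, 2) <= 1 for x in pts]
prob = cp.Problem(cp.Maximize(cp.log_det(A)), constraints)
prob.solve()

print("log det A^{-1} =", -prob.value)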
Maximum volume inscribed ellipsoid
maximum volume ellipsoid E inside a convex set C ⊆ Rⁿ
• parametrize E as E = {Bu + d | ‖u‖₂ ≤ 1}; w.l.o.g. assume B = Bᵀ ≻ 0
• vol E is proportional to det B; can compute E by solving
maximize   log det B
subject to sup_{‖u‖₂ ≤ 1} I_C(Bu + d) ≤ 0
(where I_C(x) = 0 for x ∈ C and I_C(x) = ∞ for x ∉ C)
convex, but evaluating the constraint can be hard (for general C)
polyhedron {x | aᵢᵀx ≤ bᵢ, i = 1,…,m}:
maximize   log det B
subject to ‖Baᵢ‖₂ + aᵢᵀd ≤ bᵢ, i = 1,…,m
(the constraint follows from sup_{‖u‖₂ ≤ 1} aᵢᵀ(Bu + d) = ‖Baᵢ‖₂ + aᵢᵀd)
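A corresponding sketch for the polyhedral case, again with cvxpy; the unit-box data are made up, and the norm constraint in the code is the reformulated constraint above.

import cvxpy as cp
import numpy as np

# hypothetical polyhedron {x | a_i^T x <= b_i}: the unit box in R^2
A_poly = np.array([[1., 0.], [-1., 0.], [0., 1.], [0., -1.]])
b_poly = np.ones(4)

n = A_poly.shape[1]
B = cp.Variable((n, n), symmetric=True)   # E = {Bu + d | ||u||_2 <= 1}
d = cp.Variable(n)

# ||B a_i||_2 + a_i^T d <= b_i encodes sup_{||u||_2 <= 1} a_i^T (Bu + d) <= b_i
constraints = [cp.norm(B @ a, 2) + a @ d <= bi for a, bi in zip(A_poly, b_poly)]
prob = cp.Problem(cp.Maximize(cp.log_det(B)), constraints)
prob.solve()

print("center d =", d.value)   # for the unit box, the origin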
Efficiency of ellipsoidal approximations
C ⊆ Rⁿ convex, bounded, with nonempty interior
• the Löwner-John ellipsoid, shrunk by a factor n, lies inside C
• the maximum volume inscribed ellipsoid, expanded by a factor n, covers C
(figure: example for two polyhedra in R²)

factor n can be improved to √n if C is symmetric
Centering
some possible definitions of ‘center’ of a convex set C:
• center of the largest inscribed ball (‘Chebyshev center’); for a polyhedron, it can be computed via linear programming
• center of the maximum volume inscribed ellipsoid (MVE center, as above)

MVE center is invariant under affine coordinate transformations
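For a polyhedron {x | aᵢᵀx ≤ bᵢ}, the Chebyshev center reduces to an LP: the ball {x_c + u | ‖u‖₂ ≤ r} lies in the halfspace aᵢᵀx ≤ bᵢ exactly when aᵢᵀx_c + r‖aᵢ‖₂ ≤ bᵢ. A minimal sketch, with cvxpy and a made-up triangle as the assumptions:

import cvxpy as cp
import numpy as np

# hypothetical polyhedron {x | a_i^T x <= b_i}: a triangle in R^2
A_poly = np.array([[1., 1.], [-1., 1.], [0., -1.]])
b_poly = np.array([1., 1., 0.])

xc = cp.Variable(2)   # center of the inscribed ball
r = cp.Variable()     # its radius

# a_i^T xc + r ||a_i||_2 <= b_i keeps the whole ball inside each halfspace
constraints = [A_poly[i] @ xc + r * np.linalg.norm(A_poly[i]) <= b_poly[i]
               for i in range(len(b_poly))]
prob = cp.Problem(cp.Maximize(r), constraints)
prob.solve()

print("Chebyshev center:", xc.value, "radius:", r.value)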
Analytic center of a set of inequalities
the analytic center of a set of convex inequalities and linear equations
fᵢ(x) ≤ 0, i = 1,…,m,   Fx = g
is defined as the optimal point of
minimize   −∑ᵢ log(−fᵢ(x))
subject to Fx = g
• more easily computed than MVE or Chebyshev center (see later)
• not just a property of the feasible set: two sets of inequalities can describe the same set, but have different analytic centers
analytic center of linear inequalities aᵢᵀx ≤ bᵢ, i = 1,…,m: x_ac is the minimizer of
φ(x) = −∑ᵢ log(bᵢ − aᵢᵀx)
inner and outer ellipsoids from analytic center:
E_inner ⊆ {x | aᵢᵀx ≤ bᵢ, i = 1,…,m} ⊆ E_outer
where
E_inner = {x | (x − x_ac)ᵀ∇²φ(x_ac)(x − x_ac) ≤ 1}
E_outer = {x | (x − x_ac)ᵀ∇²φ(x_ac)(x − x_ac) ≤ m(m − 1)}
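A sketch of computing x_ac for linear inequalities by minimizing φ directly with cvxpy; the unit-box data are made up, and in practice a Newton method on φ is the more usual route.

import cvxpy as cp
import numpy as np

# hypothetical inequalities a_i^T x <= b_i: the unit box in R^2
A_poly = np.array([[1., 0.], [-1., 0.], [0., 1.], [0., -1.]])
b_poly = np.ones(4)

x = cp.Variable(2)
# phi(x) = -sum_i log(b_i - a_i^T x), the logarithmic barrier of the inequalities
phi = -cp.sum(cp.log(b_poly - A_poly @ x))
prob = cp.Problem(cp.Minimize(phi))
prob.solve()

print("analytic center:", x.value)   # the origin, by symmetry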
Linear discrimination
separate two sets of points {x1,…,xN}, {y1,…,yM} by a hyperplane:
aᵀxᵢ + b > 0, i = 1,…,N,   aᵀyᵢ + b < 0, i = 1,…,M

homogeneous in a, b, hence equivalent to
aᵀxᵢ + b ≥ 1, i = 1,…,N,   aᵀyᵢ + b ≤ −1, i = 1,…,M
a set of linear inequalities in a, b
Robust linear discrimination
(Euclidean) distance between the hyperplanes
H₁ = {z | aᵀz + b = 1}
H₂ = {z | aᵀz + b = −1}
is dist(H₁, H₂) = 2/‖a‖₂
to separate two sets of points by maximum margin,
minimize   (1/2)‖a‖₂
subject to aᵀxᵢ + b ≥ 1, i = 1,…,N
           aᵀyᵢ + b ≤ −1, i = 1,…,M     (1)
(after squaring objective) a QP in a, b
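A sketch of the maximum margin problem (1) in cvxpy; the two shifted Gaussian point clouds are made-up, separable data.

import cvxpy as cp
import numpy as np

# hypothetical separable data in R^2
rng = np.random.default_rng(1)
X = rng.standard_normal((20, 2)) + np.array([3.0, 3.0])   # the x_i
Y = rng.standard_normal((20, 2)) - np.array([3.0, 3.0])   # the y_i

a = cp.Variable(2)
b = cp.Variable()

constraints = [X @ a + b >= 1, Y @ a + b <= -1]
prob = cp.Problem(cp.Minimize(0.5 * cp.norm(a, 2)), constraints)
prob.solve()

print("maximum margin 2/||a||_2 =", 2 / np.linalg.norm(a.value))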
Lagrange dual of maximum margin separation problem (1)
maximize   1ᵀλ + 1ᵀµ
subject to 2‖∑ᵢ λᵢxᵢ − ∑ᵢ µᵢyᵢ‖₂ ≤ 1
           1ᵀλ = 1ᵀµ,   λ ⪰ 0, µ ⪰ 0     (2)
from duality, the optimal value is the inverse of the maximum margin of separation
interpretation
• change variables to θᵢ = λᵢ/1ᵀλ, γᵢ = µᵢ/1ᵀµ, t = 1/(1ᵀλ + 1ᵀµ)
• invert objective to minimize 1/(1ᵀλ + 1ᵀµ) = t
minimize   t
subject to ‖∑ᵢ θᵢxᵢ − ∑ᵢ γᵢyᵢ‖₂ ≤ t
           θ ⪰ 0, 1ᵀθ = 1,   γ ⪰ 0, 1ᵀγ = 1
optimal value is distance between convex hulls
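The reformulated problem above is exactly the (Euclidean) distance between the convex hulls of the two point sets, which the following cvxpy sketch computes directly (same made-up data convention as before).

import cvxpy as cp
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((15, 2)) + np.array([3.0, 3.0])
Y = rng.standard_normal((15, 2)) - np.array([3.0, 3.0])

theta = cp.Variable(X.shape[0], nonneg=True)   # convex combination weights for the x_i
gamma = cp.Variable(Y.shape[0], nonneg=True)   # convex combination weights for the y_i

dist = cp.norm(X.T @ theta - Y.T @ gamma, 2)
prob = cp.Problem(cp.Minimize(dist),
                  [cp.sum(theta) == 1, cp.sum(gamma) == 1])
prob.solve()

print("distance between convex hulls:", prob.value)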
Approximate linear separation of non-separable sets
minimize   1ᵀu + 1ᵀv
subject to aᵀxᵢ + b ≥ 1 − uᵢ, i = 1,…,N
           aᵀyᵢ + b ≤ −1 + vᵢ, i = 1,…,M
           u ⪰ 0, v ⪰ 0
• an LP in a, b, u, v
• at optimum, uᵢ = max{0, 1 − aᵀxᵢ − b}, vᵢ = max{0, 1 + aᵀyᵢ + b}
• can be interpreted as a heuristic for minimizing #misclassified points

Support vector classifier
minimize   ‖a‖₂ + γ(1ᵀu + 1ᵀv)
subject to aᵀxᵢ + b ≥ 1 − uᵢ, i = 1,…,N
           aᵀyᵢ + b ≤ −1 + vᵢ, i = 1,…,M
           u ⪰ 0, v ⪰ 0
produces a point on the trade-off curve between the inverse of the margin 2/‖a‖₂ and the classification error, measured by the total slack 1ᵀu + 1ᵀv
(figure: same example as above, with γ = 0.1)
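A sketch of the support vector classifier with γ = 0.1 on made-up, non-separable data; cvxpy is an assumption, not part of the slides.

import cvxpy as cp
import numpy as np

# hypothetical non-separable data in R^2
rng = np.random.default_rng(3)
X = rng.standard_normal((30, 2)) + np.array([1.0, 1.0])
Y = rng.standard_normal((30, 2)) - np.array([1.0, 1.0])

gamma = 0.1
a = cp.Variable(2)
b = cp.Variable()
u = cp.Variable(X.shape[0], nonneg=True)   # slacks for the x_i
v = cp.Variable(Y.shape[0], nonneg=True)   # slacks for the y_i

objective = cp.norm(a, 2) + gamma * (cp.sum(u) + cp.sum(v))
constraints = [X @ a + b >= 1 - u, Y @ a + b <= -1 + v]
prob = cp.Problem(cp.Minimize(objective), constraints)
prob.solve()

print("margin 2/||a||_2 =", 2 / np.linalg.norm(a.value))
print("total slack      =", u.value.sum() + v.value.sum())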
Nonlinear discrimination
separate two sets of points by a nonlinear function:
f(xᵢ) > 0, i = 1,…,N,   f(yᵢ) < 0, i = 1,…,M
• choose a linearly parametrized family of functions
f(z) = θᵀF(z),   where F = (F₁,…,Fₖ) : Rⁿ → Rᵏ are basis functions
• solve a set of linear inequalities in θ:
θᵀF(xᵢ) ≥ 1, i = 1,…,N,   θᵀF(yᵢ) ≤ −1, i = 1,…,M
quadratic discrimination: f(z) = zᵀPz + qᵀz + r

can add additional constraints (e.g., P ⪯ −I to separate by an ellipsoid)
polynomial discrimination: F(z) are all monomials up to a given degree

(figures: separation by an ellipsoid; separation by a 4th-degree polynomial)
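A sketch of quadratic discrimination as a feasibility problem in (P, q, r), with P ⪯ −I so that the separating surface is an ellipsoid; cvxpy and the ring-shaped made-up data are assumptions, and an SDP-capable solver (e.g. SCS) is needed.

import cvxpy as cp
import numpy as np

# hypothetical data: x_i near the origin, y_i on a surrounding ring
rng = np.random.default_rng(4)
X = 0.3 * rng.standard_normal((30, 2))
ang = rng.uniform(0, 2 * np.pi, 30)
Y = np.column_stack([3 * np.cos(ang), 3 * np.sin(ang)]) + 0.2 * rng.standard_normal((30, 2))

n = 2
P = cp.Variable((n, n), symmetric=True)
q = cp.Variable(n)
r = cp.Variable()

def f(z):
    # f(z) = z^T P z + q^T z + r, affine in (P, q, r) for fixed z
    return z @ P @ z + q @ z + r

constraints = [f(x) >= 1 for x in X] + [f(y) <= -1 for y in Y]
constraints += [P << -np.eye(n)]   # P <= -I: the separating surface is an ellipsoid
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve()

print(prob.status)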
Placement and facility location
• N points with coordinates xᵢ ∈ R² (or R³)
• some positions xᵢ are given; the other xᵢ's are variables
• for each pair of points, a cost function fᵢⱼ(xᵢ, xⱼ)
placement problem
minimize ∑_{i≠j} fᵢⱼ(xᵢ, xⱼ)
variables are the positions of the free points
interpretations
• points represent plants or warehouses; fᵢⱼ is the transportation cost between facilities i and j
• points represent cells on an IC; fᵢⱼ represents wirelength
example: minimize ∑_{(i,j)∈A} h(‖xᵢ − xⱼ‖₂), with 6 free points and 27 links
(figures: optimal placement for h(z) = z, h(z) = z², h(z) = z⁴)
(figures: histograms of the connection lengths ‖xᵢ − xⱼ‖₂)
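A sketch of quadratic placement (h(z) = z²) on a small made-up instance; the fixed points, free points, and link set below are illustrative assumptions, not the 6-point, 27-link example from the figures.

import cvxpy as cp
import numpy as np

# hypothetical instance: points 0-2 fixed, points 3-4 free
fixed = {0: np.array([0.0, 0.0]), 1: np.array([1.0, 0.0]), 2: np.array([0.0, 1.0])}
free_ids = [3, 4]
links = [(0, 3), (1, 3), (2, 4), (3, 4), (1, 4)]   # the set A of connected pairs

pos = {i: cp.Variable(2) for i in free_ids}

def point(i):
    return fixed[i] if i in fixed else pos[i]

# h(z) = z^2: minimize the sum of squared connection lengths
cost = sum(cp.sum_squares(point(i) - point(j)) for i, j in links)
prob = cp.Problem(cp.Minimize(cost))
prob.solve()

for i in free_ids:
    print("point", i, "placed at", pos[i].value)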
