APM 462 Nonlinear Optimization Summer 2025
Final Project
Due date: August 23, 2025; 11:59 pm
All problems are to be handed in on Quercus.
HW 1 (25 pts) Consider the problem
min f(x, y)
s.t. (x, y) ∈ Ω
where
f(x, y) = x^2 + 9y^2
and
Ω = {(x, y) ∈ R^2 : 2x + y ≥ 1, x + 3y ≥ 1, x ≥ 0, y ≥ 0}.
(a) (2 pts) Draw Ω.
(b) (3 pts) Why is there a unique global minimizer for this problem?
(c) (2 pts) Write this problem as a constrained optimization problem.
(d) (8 pts) Find candidates for minimizers for the constrained optimization problem
using the KKT conditions.
(e) (5 pts) Find all minimizers for the constrained optimization problem using
the second order necessary and sufficient conditions.
(f) (5 pts) Show that at the unique global minimizer for this problem, the first
order necessary condition for unconstrained problems (using feasible
directions) holds.
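While drawing Ω for part (a), it can help to probe feasibility numerically. Below is a minimal pure-Python sanity check; the helper names in_omega and f_obj are my own, not part of the assignment:

```python
def in_omega(x, y):
    # Membership test for Omega = {2x + y >= 1, x + 3y >= 1, x >= 0, y >= 0}
    return 2*x + y >= 1 and x + 3*y >= 1 and x >= 0 and y >= 0

def f_obj(x, y):
    # Objective f(x, y) = x^2 + 9y^2
    return x**2 + 9*y**2

print(in_omega(1, 1), f_obj(1, 1))   # (1, 1) satisfies all four constraints
print(in_omega(0.5, 0.0))            # lies on 2x + y = 1 but violates x + 3y >= 1
```

Checking a few boundary points this way makes it easier to see which constraints can be active simultaneously.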
HW 2 (15 pts) Consider the problem of minimizing
f(x) = (1/2) x^T Q x − b^T x
with variable x ∈ R^n, where Q ∈ R^{n×n} is symmetric positive definite and b ∈ R^n is fixed.
Let {e_0, . . . , e_{n−1}} be a basis of R^n (the e_i are not necessarily unit vectors) and
define B_0 := {0}, B_i := span{e_0, . . . , e_{i−1}} for i = 1, . . . , n. Fix x_0 ∈ R^n
and let x_i be the minimizer of f over x_0 + B_i for i = 1, . . . , n.
(a) (5 pts) Find the unique α_0 ∈ R such that x_1 = x_0 + α_0 e_0.
(b) (5 pts) Find the unique α′_0, α′_1 ∈ R such that x_2 = x_0 + α′_0 e_0 + α′_1 e_1.
(c) (5 pts) Under what conditions on e_0 and e_1 is α_0 = α′_0?
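A closed form for α_0 in part (a) can be sanity-checked numerically on a toy instance. The data below (Q, b, x_0, e_0) are my own illustrative choices; the sketch scans α on a fine grid and picks the minimizer of α ↦ f(x_0 + α e_0):

```python
def f_quad(x, Q, b):
    # f(x) = 1/2 x^T Q x - b^T x, computed with plain nested lists
    n = len(x)
    quad = sum(x[i]*Q[i][j]*x[j] for i in range(n) for j in range(n))
    return 0.5*quad - sum(b[i]*x[i] for i in range(n))

# Illustrative data (not from the assignment); Q is symmetric positive definite
Q = [[2.0, 0.0], [0.0, 4.0]]
b = [1.0, 1.0]
x0 = [0.0, 0.0]
e0 = [1.0, 1.0]

# Along x0 + alpha*e0 the objective is a strictly convex 1-D quadratic,
# so a fine grid scan locates its unique minimizer to grid accuracy.
grid = [i/10000 for i in range(-20000, 20001)]
alpha0 = min(grid, key=lambda a: f_quad([x0[k] + a*e0[k] for k in range(2)], Q, b))
print(round(alpha0, 4))
```

Whatever closed form you derive in (a) should reproduce this grid value on the same data.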
HW 3 (20 pts) Consider the problem
min f(x) (1)
s.t. g_i(x) ≥ 0, i = 1, . . . , m
where x ∈ R^n, f is convex and g_i is concave for i = 1, . . . , m. Assume
that the strong duality theorem holds and that the maximum in the dual
problem is attained at μ̄ = (μ̄_1, . . . , μ̄_m).
We fix t > 0 and define a second (unconstrained) problem
min r_t(x) (2)
s.t. x ∈ R^n
where
r_t(x) := f(x) + t max_{i=1,...,m} {(g_i(x))^−}
and where for z ∈ R the function h(z) = z^− := −min(z, 0) denotes the
negative part of z.
We can express (2) as
min f(x) + ty (3)
s.t. g_i(x) ≥ −y, i = 1, . . . , m
(x, y) ∈ Ω = R^n × R_{≥0}.
(a) (5 pts) Show that r_t is a convex function.
(b) (5 pts) Find the dual function for (3) and express it in terms of the dual
function for (1).
(c) (10 pts) Use the result from (b) to show the following: If t > μ̄_1 + μ̄_2 + · · · + μ̄_m,
then any minimizer of (2) is also a minimizer of (1).
Note: Part of this exercise is to discuss the relationships between
minimizers of (2) and (3).
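The exact-penalty behavior in part (c) is easy to see in one dimension. The toy instance below is my own (min x^2 s.t. x ≥ 1, whose dual maximum is attained at μ̄ = 2, which you can verify by hand); a grid scan suggests that the minimizer of r_t lands on the constrained minimizer x = 1 once t exceeds μ̄:

```python
def r(x, t):
    # r_t(x) = f(x) + t * (g(x))^-  with f(x) = x^2 and g(x) = x - 1
    neg_part = -min(x - 1, 0.0)     # z^- = -min(z, 0)
    return x*x + t*neg_part

grid = [i/10000 for i in range(-20000, 20001)]
x_small_t = min(grid, key=lambda x: r(x, 1.0))  # t = 1 < mu_bar = 2
x_large_t = min(grid, key=lambda x: r(x, 3.0))  # t = 3 > mu_bar = 2
print(x_small_t, x_large_t)
```

For t = 1 the penalty is too weak and the unconstrained minimizer sits inside the infeasible region; for t = 3 it coincides with the solution of the constrained problem.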
HW 4 (20 pts) Let Ω ⊂ R^2 be a "nice" domain (connected, open, with smooth boundary)
and consider the problem
max J(u)
s.t. u ∈ S = {u ∈ C^2(Ω̄) : u(x, y) = u_0(x, y) for (x, y) ∈ ∂Ω}
where u_0 : ∂Ω → R is a given smooth function and
J(u) = ∫_Ω u(x, y) f(x, y) dx dy − ∫_Ω (1/2) ((|∇u(x, y)| − 1)_+)^2 dx dy
where f : Ω→ R is a given smooth function.
Here z_+ := max(z, 0) denotes the positive part of z.
(a) (10 pts) Find and simplify the Euler-Lagrange equation for J .
Hint: You may use without proof that for z ∈ R the function
g(z) = (z_+)^2 is differentiable with derivative g′(z) = 2z_+. You may
also use that for v ∈ R^2 the function h(v) = |v| is differentiable for
v ≠ 0 with ∇h(v) = v/|v|. The final equation should be in the form
div(. . . ) = . . . .
Note: The integration by parts in the derivation of the Euler-Lagrange
equation is purely formal, as neither u nor the integrand of J is
smooth enough to justify it.
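For part (a), the standard first-variation template for gradient-dependent integrands may be a useful reference point (textbook material, not a solution of this exercise):

```latex
% For J(u) = \int_\Omega F(x, y, u, \nabla u)\,dx\,dy with u prescribed on
% \partial\Omega, the formal Euler--Lagrange equation is
\operatorname{div}\!\left(\frac{\partial F}{\partial(\nabla u)}\right)
  = \frac{\partial F}{\partial u} \qquad \text{in } \Omega.
```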
(b) (5 pts) Is the functional J convex or concave? Justify your answer.
(c) (5 pts) Is the functional J strictly convex or strictly concave? Justify your
answer.
HW 5 (20 pts) Consider the problem
min J(y) (1)
s.t. y ∈ S = {y ∈ X : y(a) = y_a, y(b) = y_b}
where X = C^2([a, b]), a, b, y_a, y_b ∈ R with a < b and
J(y) = ∫_a^b f(x, y(x), y′(x)) dx
with f ∈ C^2([a, b] × R × R). In this exercise we will investigate the relationship
between the optimality conditions in the calculus of variations and
the optimality conditions for finite-dimensional optimization problems.
In order to approximate a solution y to (1) we discretize the interval [a, b]
with a fixed step size h > 0. We define
N := (b − a)/h
(assume that N ∈ N) and then we get points x_i := a + ih for i = 0, . . . , N.
We define y_i := y(x_i) and we approximate the first derivative through the
forward finite difference
y′(x_i) ≈ (y_{i+1} − y_i)/h
for i = 0, . . . , N − 1. Then we get
J(y) = ∫_a^b f(x, y(x), y′(x)) dx ≈ Σ_{i=0}^{N−1} f(x_i, y_i, (y_{i+1} − y_i)/h) · h.
We therefore consider the discretized minimization problem
min Σ_{i=0}^{N−1} f(x_i, y_i, (y_{i+1} − y_i)/h) · h (2)
s.t. y_0 = y_a
y_N = y_b
with variable y = (y_0, y_1, . . . , y_N) ∈ R^{N+1}.
(a) (5 pts) Discretize the Euler-Lagrange equation analogously to the discretization
of J.
Hint: You should get a system of equations.
(b) (10 pts) Write down the KKT conditions for (2). How are the KKT conditions
related to the discretized Euler-Lagrange equation from (a)?
(c) (5 pts) Solve the discretized problem (2) for
J(y) = ∫_a^b (y′)^2 dx.
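Part (c) can be cross-checked numerically. The sketch below (the interval, boundary values, N, step size, and iteration count are my own illustrative choices, not the required KKT derivation) runs plain gradient descent on the discretized objective Σ ((y_{i+1} − y_i)/h)^2 · h with y_0 and y_N held fixed:

```python
# Discretization of J(y) = int (y')^2 dx on [0, 1] with y(0) = 0, y(1) = 1
N, h = 10, 0.1
y = [0.0]*(N + 1)
y[N] = 1.0                          # y_0 = 0 and y_N = 1 stay fixed (constraints)

def grad_interior(y):
    # Gradient of sum_i ((y[i+1] - y[i])/h)^2 * h w.r.t. y_1, ..., y_{N-1}
    return [2*(2*y[j] - y[j-1] - y[j+1])/h for j in range(1, N)]

for _ in range(2000):               # plain gradient descent with a fixed step
    g = grad_interior(y)
    for j in range(1, N):
        y[j] -= 0.02*g[j-1]

print([round(v, 3) for v in y])     # settles on an evenly spaced profile
```

The descent iterates converge to an equally spaced profile, which is a useful hint about what the KKT conditions in (b) should force.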
HW 6 (25 pts) Consider the problem
min J(y)
s.t. y ∈ S = {y ∈ X : y(0) = 0, y(1) = c}
where c is a given constant, X = C^2([0, 1]) and
J(y) = ∫_0^1 (y′)^2 (1 + y′)^2 dx.
(a) (5 pts) Is J convex?
(b) (5 pts) Show that ȳ(x) = cx is an admissible extremal.
(c) (5 pts) Show that ȳ is a weak local minimum if c = −1/6.
(d) (5 pts) Show that ȳ is a weak local maximum if c = −1/2.
(e) (5 pts) Show that ȳ is a global minimum if c = −1/6.
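A numerical spot check consistent with part (c) (my own sketch with one sample perturbation; it is evidence, not a proof): for c = −1/6, J(ȳ + v) exceeds J(ȳ) for the admissible perturbation v(x) = 0.1 sin(πx), which vanishes at both endpoints.

```python
import math

c = -1.0/6.0

def J_from_slope(yprime, n=2000):
    # Midpoint-rule quadrature of J(y) = int_0^1 (y')^2 (1 + y')^2 dx,
    # where yprime(x) is the derivative of the candidate y
    total = 0.0
    for k in range(n):
        p = yprime((k + 0.5)/n)
        total += p*p*(1 + p)**2/n
    return total

base = J_from_slope(lambda x: c)                             # ybar(x) = c*x
pert = J_from_slope(lambda x: c + 0.1*math.pi*math.cos(math.pi*x))
print(base < pert)
```

Trying perturbations of different shapes and amplitudes this way is a good sanity check before (and after) computing the second variation by hand.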