CS365 Spring '23
Foundations of Data Science
Prof. C.E. Tsourakakis
Assignment 9
Instructions
• There are 15 points available for extra credit.
• No extensions will be granted, except for serious, documented reasons.
• Start early!
• Study the material taught in class, and feel free to do so in small groups, but your solutions must be a product of your own work.
• This is not a multiple-choice homework; reasoning and mathematical proofs are required to support your final answers.
1 SVD again [20 points]
1. (5 pts) Find the SVD of A = [1,1] without the use of computing devices/software.
2. (15 pts) Let A ∈ R^{m×n} and let σ_1 be the maximum singular value of A. For x ∈ R^n \ {0}, the spectral norm of A is defined as ||A||_2 = max_{x ≠ 0} ||Ax||_2 / ||x||_2. Prove that ||A||_2 = σ_1. (A quick numerical sanity check of this identity is sketched below.)
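For intuition only, here is a minimal numpy sketch (not a substitute for the hand computation in part 1 or the proof in part 2). It computes the SVD of the 1×2 matrix from part 1 and approximates the spectral norm by sampling random nonzero vectors; the sampled maximum only approaches σ_1 from below.

```python
# Sanity check (assumes numpy): SVD of the matrix from part 1 and a
# sampling-based approximation of the spectral norm from part 2.
import numpy as np

A = np.array([[1.0, 1.0]])            # the 1x2 matrix A = [1, 1]

U, S, Vt = np.linalg.svd(A)           # A = U @ diag(S) @ Vt
print("singular values:", S)

# Approximate max_{x != 0} ||Ax||_2 / ||x||_2 by sampling random nonzero x.
rng = np.random.default_rng(0)
X = rng.standard_normal((2, 100_000))
ratios = np.linalg.norm(A @ X, axis=0) / np.linalg.norm(X, axis=0)
print("sampled max ||Ax||/||x||:", ratios.max())          # approaches sigma_1 from below
print("np.linalg.norm(A, 2):    ", np.linalg.norm(A, 2))  # spectral norm = sigma_1
```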
2 Taylor polynomial approximation [10 points]
1. (5 pts) Let f(x) = sin(x) + cos(x). Compute the degree 5 Taylor polynomial for f at x = 0.
2. (5 pts) Compute the quadratic approximation of the function f(x, y) = x^2 + y^2 + 2xy − 3x + 2y + 5 at the point x = 5, y = 10. (A symbolic check for both parts is sketched below.)
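If you want to double-check your hand computations, the following sympy sketch (assuming sympy is available; it is not required for the written solutions) expands the degree-5 Taylor polynomial for part 1 and builds the quadratic approximation for part 2 from the gradient and Hessian.

```python
# Optional check with sympy; the written derivations are still required.
import sympy as sp

x, y = sp.symbols('x y')

# Part 1: degree-5 Taylor polynomial of sin(x) + cos(x) at x = 0.
f = sp.sin(x) + sp.cos(x)
T5 = sp.series(f, x, 0, 6).removeO()          # keep terms up to x**5
print(sp.expand(T5))

# Part 2: quadratic approximation of f(x, y) at (5, 10) via gradient and Hessian.
g = x**2 + y**2 + 2*x*y - 3*x + 2*y + 5
a, b = 5, 10
dx, dy = x - a, y - b
grad = [sp.diff(g, v).subs({x: a, y: b}) for v in (x, y)]
H = sp.hessian(g, (x, y))                      # constant here, since g is quadratic
Q = (g.subs({x: a, y: b})
     + grad[0]*dx + grad[1]*dy
     + sp.Rational(1, 2)*(H[0, 0]*dx**2 + 2*H[0, 1]*dx*dy + H[1, 1]*dy**2))
print(sp.expand(Q))                            # should match sp.expand(g)
```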
3 Derivatives [35 points]
Compute the derivative of each of the following functions. It will be helpful to first identify n and m, where f : R^n → R^m, together with the dimensions of the derivative.
(c) [5 pts] f(x) = sin(x_1) cos(x_2), x ∈ R^2.
(d) [5 pts] f(x) = x x^T, x ∈ R^n.
(e) [5 pts] f(x) = sin(log(x^T x)), x ∈ R^n.
(f) [5 pts] f(z) = log(1 + z), where z = x^T x, x ∈ R^n.
(g) [5 pts] f(x) = x^T A x, where x ∈ R^n, A ∈ R^{n×n}.
A finite-difference sanity check for hand-derived derivatives is sketched below.
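One way to validate the derivatives in this section is to compare a hand-derived Jacobian against central finite differences. The sketch below shows the pattern on a function that is not part of the assignment; the helper numerical_jacobian and the demo function are hypothetical, and the same check applies to parts (c) through (g).

```python
# Check a hand-derived Jacobian against central finite differences.
import numpy as np

def numerical_jacobian(f, x, eps=1e-6):
    """Central-difference Jacobian of f: R^n -> R^m at x, returned as an (m, n) array."""
    x = np.asarray(x, dtype=float)
    m = np.atleast_1d(f(x)).size
    J = np.zeros((m, x.size))
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = eps
        J[:, i] = (np.atleast_1d(f(x + e)) - np.atleast_1d(f(x - e))) / (2 * eps)
    return J

# Demo on a function NOT from the assignment: f(x) = (x1^2 * x2, sin(x1 * x2)).
f = lambda v: np.array([v[0] ** 2 * v[1], np.sin(v[0] * v[1])])

def jacobian_by_hand(v):
    x1, x2 = v
    return np.array([[2 * x1 * x2,          x1 ** 2],
                     [x2 * np.cos(x1 * x2), x1 * np.cos(x1 * x2)]])

x0 = np.array([0.7, -1.3])
# Largest entrywise discrepancy; should be very close to zero.
print(np.max(np.abs(jacobian_by_hand(x0) - numerical_jacobian(f, x0))))
```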
4 Optimization [15 points]
1. (7.5 pts) Consider the univariate function f(x) = x^3 + 6x^2 − 3x − 5. Find its stationary points and indicate whether they are maximum, minimum, or saddle points.
2. (7.5 pts) Explain how to minimize the least-squares loss of a linear model using (i) gradient descent and (ii) the SVD. Discuss the pros and cons of each approach. (A small numerical comparison is sketched below.)
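As a minimal illustration for part 2 (synthetic data and a hypothetical setup, not the notebook from Section 5), the sketch below solves the same least-squares problem once via the SVD/pseudoinverse and once with gradient descent, then compares the two solutions.

```python
# Least squares two ways on synthetic data: SVD/pseudoinverse vs. gradient descent.
import numpy as np

rng = np.random.default_rng(0)
m, n = 200, 5
X = rng.standard_normal((m, n))
w_true = rng.standard_normal(n)
y = X @ w_true + 0.1 * rng.standard_normal(m)

# (ii) Closed form via the thin SVD: w = V diag(1/s) U^T y.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
w_svd = Vt.T @ ((U.T @ y) / s)

# (i) Gradient descent on L(w) = ||Xw - y||^2 / (2m).
w_gd = np.zeros(n)
step = m / np.linalg.norm(X, 2) ** 2      # 1/L, where L = sigma_max(X)^2 / m
for _ in range(5000):
    grad = X.T @ (X @ w_gd - y) / m
    w_gd = w_gd - step * grad

print("||w_svd - w_gd|| =", np.linalg.norm(w_svd - w_gd))   # should be ~0
```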
5 Coding [35 points]
Check the Jupyter notebook on our Git repo.