
General Instructions

The following document contains the solutions to the theory-based questions for this assignment.

Question 1

NOTE: The property mentioned in the question holds only for real, skew-symmetric matrices, and we show the proof for that case. To prove this property for complex matrices, A would have to be a skew-Hermitian matrix; the structure of the proof would remain the same in both cases.

Given that A is a real skew-symmetric matrix, Aᵀ = −A. Let λ be an eigenvalue of A with eigenvector x, so that

Ax = λx

The squared norm of the vector Ax is given by

∥Ax∥² = (x∗)ᵀAᵀAx

= −(x∗)ᵀA²x

= −(x∗)ᵀλ²x

= −λ²∥x∥²

⟹ λ² = −∥Ax∥² / ∥x∥²

The final equation indicates that λ² must be real and non-positive (since ∥Ax∥² and ∥x∥² are always non-negative, and ∥x∥² > 0 for an eigenvector), which is only possible if λ is purely imaginary (or zero). Thus the eigenvalues of A are purely imaginary.
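This can be sanity-checked numerically. The sketch below, assuming NumPy is available and using an arbitrarily chosen skew-symmetric matrix (not taken from the question), confirms that every eigenvalue has zero real part:

```python
import numpy as np

# A hypothetical 3x3 real skew-symmetric matrix (A^T = -A); any such matrix works.
A = np.array([[ 0.0,  2.0, -1.0],
              [-2.0,  0.0,  3.0],
              [ 1.0, -3.0,  0.0]])
assert np.allclose(A.T, -A)  # confirm skew-symmetry

eigvals = np.linalg.eigvals(A)
# All eigenvalues should have (numerically) zero real part.
print(np.allclose(eigvals.real, 0.0))  # True
```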

Question 2

Given A ∈ ℝ^(m×n), whose column vectors are linearly independent, we want to prove that the eigenvalues of AᵀA are purely real and positive.

For this, let λ denote an eigenvalue of AᵀA and x the corresponding eigenvector. Hence,

AᵀAx = λx

(x∗)ᵀAᵀAx = λ(x∗)ᵀx

∥Ax∥² = λ∥x∥²

⟹ λ = ∥Ax∥² / ∥x∥²

Since x ≠ 0 and the columns of A are linearly independent, Ax ≠ 0. Both norms are therefore purely real and strictly positive, so λ is also purely real and positive.
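A minimal numerical sketch of this result, assuming NumPy and using a randomly generated matrix with independent columns (not specified in the question):

```python
import numpy as np

rng = np.random.default_rng(0)
# A hypothetical 5x3 matrix; with probability 1 its columns are independent.
A = rng.standard_normal((5, 3))
assert np.linalg.matrix_rank(A) == 3  # columns are linearly independent

# A^T A is symmetric, so eigvalsh returns its (real) eigenvalues.
eigvals = np.linalg.eigvalsh(A.T @ A)
print(np.all(eigvals > 0))  # True: eigenvalues are real and strictly positive
```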

Question 3

a)

The characteristic polynomial of A1 is

p(λ) = det(A1 − λI)

The eigenvalues of A1 are given by the roots of the characteristic polynomial; among them, λ1 = 1.

The eigenvector corresponding to each eigenvalue λi is given by the solution of

(A1 − λiI)x = 0

b)

The characteristic polynomial of A2 is

p(λ) = det(A2 − λI)

       ⎡1 − λ   1     1  ⎤
 = det ⎢  1   1 − λ   1  ⎥
       ⎣  1     1   1 − λ⎦

 = (1 − λ)·det⎡1 − λ   1  ⎤ − det⎡1   1  ⎤ + det⎡1  1 − λ⎤
              ⎣  1   1 − λ⎦      ⎣1  1 − λ⎦     ⎣1    1  ⎦

= (1 − λ)(λ² − 2λ) + λ + λ

= λ² − 2λ − λ³ + 2λ² + λ + λ

= −λ³ + 3λ²

= −λ²(λ − 3)

The eigenvalues of A2 are given by the roots of the characteristic polynomial, i.e., λ1 = 3, λ2 = λ3 = 0.

The eigenvector corresponding to λ1 = 3 is given by

(A2 − λ1I)x = 0

⎡−2  1   1 ⎤
⎢ 1  −2  1 ⎥ x = 0
⎣ 1   1  −2⎦

Solving the above equation, we get x = t[1 1 1]ᵀ.

The eigenvectors corresponding to λ2 = λ3 = 0 are given by

(A2 − λ2I)x = 0

⎡1 1 1⎤
⎢1 1 1⎥ x = 0
⎣1 1 1⎦

The solution of the above equation is given by x = t[1 −1 0]ᵀ + s[1 0 −1]ᵀ.

To summarize, the eigenvalues and eigenvectors of A2 are given by

λ1 = 3,  x1 = [1 1 1]ᵀ

λ2 = λ3 = 0,  x2 = [1 −1 0]ᵀ,  x3 = [1 0 −1]ᵀ
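These results can be verified numerically; a minimal sketch assuming NumPy:

```python
import numpy as np

A2 = np.ones((3, 3))
eigvals = np.linalg.eigvalsh(A2)  # A2 is symmetric, so eigvalsh applies
print(np.allclose(np.sort(eigvals), [0, 0, 3]))  # True

# [1 1 1]^T is an eigenvector for lambda = 3 ...
assert np.allclose(A2 @ np.array([1.0, 1.0, 1.0]), 3 * np.array([1.0, 1.0, 1.0]))
# ... and [1 -1 0]^T, [1 0 -1]^T lie in the null space (lambda = 0).
assert np.allclose(A2 @ np.array([1.0, -1.0, 0.0]), 0)
assert np.allclose(A2 @ np.array([1.0, 0.0, -1.0]), 0)
```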

Question 4

Proof via counterexample: we show that a matrix can have linearly independent eigenvectors without having distinct eigenvalues. Consider the 3 × 3 matrix A, such that

    ⎡0 1 1⎤
A = ⎢1 0 1⎥
    ⎣1 1 0⎦

We first solve the characteristic equation:

det(A − λI) = 0

         ⎡−λ  1   1 ⎤
A − λI = ⎢ 1  −λ  1 ⎥
         ⎣ 1   1  −λ⎦

⟹ −λ(λ² − 1) − 1(−λ − 1) + 1(1 − (−λ)) = 0

⟹ λ³ − 3λ − 2 = 0

⟹ (λ + 1)²(λ − 2) = 0

So, the eigenvalues are λ1 = λ2 = −1 and λ3 = 2.

Finding the eigenvectors corresponding to the eigenvalue λ = −1:

⎡0 1 1⎤ ⎡x1⎤        ⎡x1⎤
⎢1 0 1⎥ ⎢x2⎥ = (−1) ⎢x2⎥
⎣1 1 0⎦ ⎣x3⎦        ⎣x3⎦

   ⎡1 1 1⎤ ⎡x1⎤   ⎡0⎤
⟹ ⎢1 1 1⎥ ⎢x2⎥ = ⎢0⎥
   ⎣1 1 1⎦ ⎣x3⎦   ⎣0⎦

Thus,

x1 + x2 + x3 = 0

⟹ x = [α  β  −α − β]ᵀ = α[1 0 −1]ᵀ + β[0 1 −1]ᵀ

Thus, two linearly independent eigenvectors corresponding to the eigenvalue λ = −1 are:

[1 0 −1]ᵀ and [0 1 −1]ᵀ

Finding the eigenvector corresponding to the eigenvalue λ = 2:

⎡0 1 1⎤ ⎡x1⎤     ⎡x1⎤
⎢1 0 1⎥ ⎢x2⎥ = 2 ⎢x2⎥
⎣1 1 0⎦ ⎣x3⎦     ⎣x3⎦

   ⎡−2  1   1 ⎤ ⎡x1⎤   ⎡0⎤
⟹ ⎢ 1  −2  1 ⎥ ⎢x2⎥ = ⎢0⎥
   ⎣ 1   1  −2⎦ ⎣x3⎦   ⎣0⎦

   ⎡−2   1    1 ⎤ ⎡x1⎤   ⎡0⎤
⟹ ⎢ 0 −1.5  1.5⎥ ⎢x2⎥ = ⎢0⎥
   ⎣ 0   0    0 ⎦ ⎣x3⎦   ⎣0⎦

Thus,

−2x1 + x2 + x3 = 0

−1.5x2 + 1.5x3 = 0

⟹ x = [α  α  α]ᵀ

Thus, the eigenvector corresponding to the eigenvalue λ = 2 is:

[1 1 1]ᵀ

Clearly, the eigenvectors are linearly independent, but the eigenvalues are not distinct.

Thus, the given proposition is false.

The converse of the above proposition is true, i.e., if the eigenvalues are distinct, then the eigenvectors are linearly independent.
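The counterexample can be checked numerically; a minimal sketch assuming NumPy:

```python
import numpy as np

A = np.array([[0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0],
              [1.0, 1.0, 0.0]])

# A is symmetric, so eigvalsh returns real eigenvalues (sorted ascending).
eigvals = np.linalg.eigvalsh(A)
print(np.allclose(eigvals, [-1.0, -1.0, 2.0]))  # True: repeated eigenvalue

# Columns are the eigenvectors found above: [1 0 -1], [0 1 -1], [1 1 1].
V = np.array([[ 1.0,  0.0, 1.0],
              [ 0.0,  1.0, 1.0],
              [-1.0, -1.0, 1.0]])
print(np.linalg.matrix_rank(V))  # 3, i.e., linearly independent
```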

Question 5

The matrix given is as follows:

    ⎡6 −2 −1⎤
A = ⎢3  1 −1⎥
    ⎣2 −1  2⎦

As it is a 3 × 3 matrix, we use I3 as the identity matrix. This gives us the characteristic equation:

det(A − λI3) = 0

       ⎡6 − λ  −2    −1 ⎤
⟹ det ⎢  3   1 − λ  −1 ⎥ = 0
       ⎣  2    −1   2 − λ⎦

⟹ (6 − λ)[(1 − λ)(2 − λ) − 1] + 2[3(2 − λ) + 2] − [3(−1) − 2(1 − λ)] = 0

⟹ (6 − λ)[λ² − 3λ + 1] + 2[6 − 3λ + 2] − [−3 − 2 + 2λ] = 0

⟹ (6 − λ)[λ² − 3λ + 1] + 2[8 − 3λ] − [−5 + 2λ] = 0

⟹ (6 − λ)[λ² − 3λ + 1] + 16 − 6λ + 5 − 2λ = 0

⟹ 6λ² − 18λ + 6 − λ³ + 3λ² − λ − 8λ + 21 = 0

⟹ −λ³ + 9λ² − 27λ + 27 = 0

⟹ λ³ − 9λ² + 27λ − 27 = 0

⟹ (λ − 3)³ = 0

⟹ λ = 3, 3, 3

Note that the algebraic multiplicity of the eigenvalue 3 is 3. We now find the corresponding eigenvectors. We have:

(A − λI3)v = 0

   ⎡3 −2 −1⎤ ⎡v1⎤   ⎡0⎤
⟹ ⎢3 −2 −1⎥ ⎢v2⎥ = ⎢0⎥
   ⎣2 −1 −1⎦ ⎣v3⎦   ⎣0⎦

On swapping the last two rows and performing row operations, we get:

⎡3 −2 −1⎤ ⎡v1⎤   ⎡0⎤
⎢0  1 −1⎥ ⎢v2⎥ = ⎢0⎥
⎣0  0  0⎦ ⎣v3⎦   ⎣0⎦

⟹ v = [α  α  α]ᵀ = α[1 1 1]ᵀ,  α ∈ ℝ

Taking α = 1, we get the eigenvector as v = [1 1 1]T.

As the solution has only one free parameter, the geometric multiplicity of the eigenvalue 3 is 1. In order to span the entire space ℝ³, we need 2 more linearly independent vectors (u and w). We can obtain these using the generalized eigenvector approach. We have:

                   ⎡3 −2 −1⎤ ⎡u1⎤   ⎡1⎤
(A − λI3)u = v ⟹  ⎢3 −2 −1⎥ ⎢u2⎥ = ⎢1⎥
                   ⎣2 −1 −1⎦ ⎣u3⎦   ⎣1⎦

On swapping the last two rows and performing row operations, we get:

⎡3 −2 −1⎤ ⎡u1⎤   ⎡1⎤
⎢0  1 −1⎥ ⎢u2⎥ = ⎢1⎥  ⟹ u = [α + 1  α + 1  α]ᵀ
⎣0  0  0⎦ ⎣u3⎦   ⎣0⎦

Taking α = −1, we get the generalized eigenvector u = [0 0 −1]ᵀ.

Lastly, we have:

                   ⎡3 −2 −1⎤ ⎡w1⎤   ⎡ 0⎤
(A − λI3)w = u ⟹  ⎢3 −2 −1⎥ ⎢w2⎥ = ⎢ 0⎥
                   ⎣2 −1 −1⎦ ⎣w3⎦   ⎣−1⎦

On swapping the last two rows and performing row operations, we get:

⎡3 −2 −1⎤ ⎡w1⎤   ⎡ 0⎤
⎢0  1 −1⎥ ⎢w2⎥ = ⎢−3⎥  ⟹ w = [α − 2  α − 3  α]ᵀ
⎣0  0  0⎦ ⎣w3⎦   ⎣ 0⎦

Taking α = 2, we get the generalized eigenvector w = [0 −1 2]ᵀ.

Thus, the eigenvector and generalized eigenvectors corresponding to A are

v = [1 1 1]ᵀ,  u = [0 0 −1]ᵀ,  w = [0 −1 2]ᵀ

NOTE: The values of α above were chosen arbitrarily. The vectors obtained are linearly independent for every α ∈ ℝ, i.e., the triplet always spans the entire space ℝ³.
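The Jordan-chain relations (A − 3I)v = 0, (A − 3I)u = v, (A − 3I)w = u and the independence of the triplet can be checked numerically; a minimal sketch assuming NumPy:

```python
import numpy as np

A = np.array([[6.0, -2.0, -1.0],
              [3.0,  1.0, -1.0],
              [2.0, -1.0,  2.0]])
N = A - 3 * np.eye(3)  # A - lambda*I with lambda = 3

v = np.array([1.0, 1.0, 1.0])   # ordinary eigenvector
u = np.array([0.0, 0.0, -1.0])  # first generalized eigenvector
w = np.array([0.0, -1.0, 2.0])  # second generalized eigenvector

print(np.allclose(N @ v, 0))  # True
print(np.allclose(N @ u, v))  # True
print(np.allclose(N @ w, u))  # True
print(np.linalg.matrix_rank(np.column_stack([v, u, w])))  # 3
```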
