2026 Prof.Jiang@ECE NYU 163
Lecture IV
Key Issues:
Real symmetric matrices and canonical forms
2026 Prof.Jiang@ECE NYU 164
Symmetric Matrices
Recall that a symmetric matrix $A = [a_{ij}]$ satisfies $a_{ij} = a_{ji}$, $1 \le i, j \le n$. It is a real symmetric matrix if, additionally, all the $a_{ij}$'s are real.

Notation: $A = A^T$, $A \in \mathbb{R}^{n \times n}$.
2026 Prof.Jiang@ECE NYU 165
Fact 1 about Symmetric Matrices
The eigenvalues of a real symmetric matrix are
always real.
2026 Prof.Jiang@ECE NYU 166
Proof of Fact 1
By contradiction, assume that a real symmetric $A$ has a complex eigenvalue, say $\lambda$. Then $Ax = \lambda x$; taking complex conjugates and transposing, $\bar{x}^T A = \bar{\lambda}\, \bar{x}^T$, because $A$ is real and symmetric. This further implies that
$$\bar{x}^T A x = \lambda\, \bar{x}^T x \quad \text{and} \quad \bar{x}^T A x = \bar{\lambda}\, \bar{x}^T x.$$
Hence $(\lambda - \bar{\lambda})\, \bar{x}^T x = 0$. Since $\bar{x}^T x \ne 0$, it follows that $\lambda - \bar{\lambda} = 0$, a contradiction.
2026 Prof.Jiang@ECE NYU 167
Fact 2 about Symmetric Matrices
For any real symmetric matrix, its eigenvectors
associated with distinct eigenvalues are
orthogonal.
Remarks: Two vectors $x, y \in \mathbb{R}^n$ are orthogonal if $x^T y = 0$. Orthogonal vectors are linearly independent.
2026 Prof.Jiang@ECE NYU 168
Proof of Fact 2
For a real symmetric $A$, consider a pair of eigenvectors $x, y$ associated with distinct eigenvalues $\lambda, \mu$, i.e.,
$$Ax = \lambda x \quad \text{and} \quad Ay = \mu y.$$
This further implies that
$$y^T A x = \lambda\, y^T x \quad \text{and} \quad x^T A y = \mu\, x^T y.$$
Because $A$ is symmetric, $y^T A x = (y^T A x)^T = x^T A y$, so
$$(\lambda - \mu)\, x^T y = 0.$$
Since $\lambda \ne \mu$, $x^T y = 0$, as wished.
2026 Prof.Jiang@ECE NYU 169
Canonical Form – First Pass
Consider a real symmetric matrix $A \in \mathbb{R}^{n \times n}$, with $n$ distinct (real, by Fact 1) eigenvalues $\lambda_1, \ldots, \lambda_n$. Then, there is an orthogonal $n \times n$ matrix $O$, i.e., $O^T O = O O^T = I$, such that
$$O^T A O = \mathrm{diag}(\lambda_i) = \begin{pmatrix} \lambda_1 & & 0 \\ & \ddots & \\ 0 & & \lambda_n \end{pmatrix}.$$
2026 Prof.Jiang@ECE NYU 170
Constructive Proof
For each eigenvalue $\lambda_i$, take an eigenvector $x^i$ which has unit norm, i.e., $(x^i)^T x^i = 1$. Define an $n \times n$ matrix $O$ as:
$$O = \begin{pmatrix} x^1 & \cdots & x^n \end{pmatrix}. \qquad \text{Then } O^T = \begin{pmatrix} (x^1)^T \\ \vdots \\ (x^n)^T \end{pmatrix}.$$
2026 Prof.Jiang@ECE NYU 171
Constructive Proof (cont’d)
It is directly checked using Fact 2 that $O^T O = I$, i.e., $O$ is an orthogonal matrix. In addition, $O^T A O = \mathrm{diag}(\lambda_i)$.
2026 Prof.Jiang@ECE NYU 172
Exercise
Compute the eigenvalues $\lambda_1, \lambda_2$ of
$$A = \begin{pmatrix} 1 & 2 \\ 2 & 3 \end{pmatrix}$$
and find a transformation matrix $O$ s.t.
$$O^T A O = \begin{pmatrix} \lambda_1 & 0 \\ 0 & \lambda_2 \end{pmatrix}.$$
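A worked sketch of this exercise in plain Python (no external libraries). The closed-form eigenvalues $2 \pm \sqrt{5}$ come from the characteristic equation $t^2 - 4t - 1 = 0$; the eigenvector formula $(2, t-1)$ is read off from $(A - tI)v = 0$.

```python
import math

# A = [[1, 2], [2, 3]]; characteristic polynomial: t^2 - 4t - 1 = 0.
lam1 = 2 + math.sqrt(5)
lam2 = 2 - math.sqrt(5)

# For each eigenvalue t, (A - t I) v = 0 gives v = (2, t - 1); normalize it.
def unit_eigvec(t):
    v = (2.0, t - 1.0)
    n = math.hypot(*v)
    return (v[0] / n, v[1] / n)

x1, x2 = unit_eigvec(lam1), unit_eigvec(lam2)

# With O = [x1 x2], check O^T A O = diag(lam1, lam2) entry by entry.
A = [[1.0, 2.0], [2.0, 3.0]]
def quad(u, w):  # computes u^T A w
    Aw = [A[0][0]*w[0] + A[0][1]*w[1], A[1][0]*w[0] + A[1][1]*w[1]]
    return u[0]*Aw[0] + u[1]*Aw[1]

assert abs(quad(x1, x1) - lam1) < 1e-12   # (1,1) entry is lam1
assert abs(quad(x2, x2) - lam2) < 1e-12   # (2,2) entry is lam2
assert abs(quad(x1, x2)) < 1e-12          # off-diagonal entries vanish
```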
2026 Prof.Jiang@ECE NYU 173
What if A is not necessarily symmetric? Can it still be diagonalized?
Answer: Yes! As long as the eigenvalues are mutually distinct, there is a nonsingular matrix $P$ such that
$$P^{-1} A P = \mathrm{diag}(\lambda_i), \quad \text{denoted } A \sim \mathrm{diag}(\lambda_i).$$
However, this $P$ may not be orthogonal.
2026 Prof.Jiang@ECE NYU 174
Show that the following matrix
$$A = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}$$
is not diagonalizable.
Remark: A non-symmetric matrix may not be diagonalizable.
2026 Prof.Jiang@ECE NYU 175
Comment
Two similar matrices have the same eigenvalues. So, if $A \sim \mathrm{diag}(\lambda_i)$, i.e., $P^{-1} A P = \mathrm{diag}(\lambda_1, \ldots, \lambda_n)$, the eigenvalues of $A$ are simply the $\lambda_i$. However, the converse is not true.
2026 Prof.Jiang@ECE NYU 176
Exercise
Show that the following matrix
$$A = \begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix}$$
is not diagonalizable. In other words, it is not similar to
$$\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}.$$
Two matrices having the same eigenvalues may not be similar.
2026 Prof.Jiang@ECE NYU 177
Spectral Theorem
Question (Necessity and Sufficiency): When is a matrix similar to a diagonal matrix?
2026 Prof.Jiang@ECE NYU 178
Necessary and Sufficient Condition
for the Canonical Diagonal Form
An $n \times n$ matrix $A$ is similar to a diagonal matrix iff $A$ has $n$ linearly independent eigenvectors.
When $A$ has $n$ distinct eigenvalues, it is similar to a diagonal matrix.
2026 Prof.Jiang@ECE NYU 179
Proof
First, note that Statement 2 follows from Statement 1 and a result proved previously.
Assume $A$ is similar to a diagonal matrix $\mathrm{diag}(\lambda_i)$. Then, $\exists$ nonsingular $P$ s.t. $P^{-1} A P = \mathrm{diag}(\lambda_i)$. Let $P = \begin{pmatrix} p^1 & p^2 & \cdots & p^n \end{pmatrix}$, with the $p^i$ linearly independent. Then
$$AP = P\, \mathrm{diag}(\lambda_i) \;\Longrightarrow\; A p^i = \lambda_i p^i, \quad i = 1, 2, \ldots, n,$$
implying that $p^i$ is an eigenvector for eigenvalue $\lambda_i$.
2026 Prof.Jiang@ECE NYU 180
Proof (cont’d)
Conversely, assume that $A$ has $n$ linearly independent eigenvectors $p^i$, i.e., $A p^i = \lambda_i p^i$, $i = 1, \ldots, n$. Then, $P = \begin{pmatrix} p^1 & p^2 & \cdots & p^n \end{pmatrix}$ is nonsingular and satisfies (by direct computation) that
$$P^{-1} A P = \mathrm{diag}(\lambda_1, \ldots, \lambda_n).$$
Comment
2026 Prof.Jiang@ECE NYU 181
From the proof of Part 1, it follows that the following is an equivalent condition for diagonalization of $A$:
$$\dim N(A - \lambda_1 I) + \cdots + \dim N(A - \lambda_k I) = n,$$
where $\lambda_1, \ldots, \lambda_k$ are the distinct eigenvalues of $A$, $k \le n$.
2026 Prof.Jiang@ECE NYU 182
An Example
Bring the matrix
$$A = \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix}$$
into a diagonal form.
2026 Prof.Jiang@ECE NYU 183
The eigenvalues of $A$ are $\lambda_1 = j$, $\lambda_2 = -j$. As can be directly checked, the associated independent eigenvectors are
$$c_1 \begin{pmatrix} 1 \\ j \end{pmatrix} \quad \text{and} \quad c_2 \begin{pmatrix} 1 \\ -j \end{pmatrix}, \qquad c_1, c_2 \ne 0.$$
Then, with $P$ formed from these two eigenvectors as columns, it follows that
$$P^{-1} A P = \begin{pmatrix} j & 0 \\ 0 & -j \end{pmatrix}.$$
Diagonalizable Matrix
2026 Prof.Jiang@ECE NYU 184
A matrix is said to be "diagonalizable" if it is similar to a diagonal matrix.
Are the following statements true or false?
(1) Two diagonalizable matrices always commute.
(2) The block-diagonal matrix $B = \mathrm{block\text{-}diag}(B_1, \ldots, B_k)$, $B_i \in \mathbb{R}^{n_i \times n_i}$, is diagonalizable if and only if each $B_i$ is diagonalizable.
2026 Prof.Jiang@ECE NYU 185
Let’s stop for a short review…
• Review of the results on nontrivial solutions to homogeneous equations: $Ax = 0$, $A \in \mathbb{R}^{m \times n}$, $x \in \mathbb{R}^n$.
• How about inhomogeneous linear equations?
2026 Prof.Jiang@ECE NYU 186
A Quiz?
Any set of vectors $x^i \in \mathbb{R}^n$, with $1 \le i \le N$, is always linearly dependent if $N > n$.
2026 Prof.Jiang@ECE NYU 187
Real and Symmetric Matrices
• The eigenvalues are always real.
• Eigenvectors associated with distinct
eigenvalues are always orthogonal.
• Any matrix with no repeated eigenvalues is
diagonalizable.
• How to transform a real and symmetric
matrix into a diagonal form?
2026 Prof.Jiang@ECE NYU 188
A General Result for General
Symmetric Matrices
For any real and symmetric matrix $A \in \mathbb{R}^{n \times n}$, there always exists an orthogonal matrix, say $O$, $O^T O = I$, such that
$$O^T A O = \begin{pmatrix} \lambda_1 & & 0 \\ & \ddots & \\ 0 & & \lambda_n \end{pmatrix}.$$
2026 Prof.Jiang@ECE NYU 189
Special case: A Trivial Example
$$A = \begin{pmatrix} a_1 & & 0 \\ & \ddots & \\ 0 & & a_n \end{pmatrix}$$
Clearly, the identity matrix is an orthogonal matrix.
2026 Prof.Jiang@ECE NYU 190
Before proving this general and fundamental
result, let us introduce some useful tools.
2026 Prof.Jiang@ECE NYU 191
The Gram-Schmidt Orthogonalization
Process
Question: How to successively generate a set of mutually orthogonal vectors $\{y^i\}_{i=1}^N$ from a set of $N$ real, linearly independent, $n$-dimensional vectors $\{x^i\}_{i=1}^N$?
2026 Prof.Jiang@ECE NYU 192
Let us start with a set of vectors $\{x^i\}_{i=1}^N$. Here is the systematic procedure.
First,
$$y^1 := x^1, \qquad y^2 := x^2 - a_{11} x^1,$$
where $a_{11}$ is a real-valued scalar to be determined so that the inner product $\langle y^1, y^2 \rangle = 0$:
$$(x^1)^T (x^2 - a_{11} x^1) = 0.$$
2026 Prof.Jiang@ECE NYU 193
From $\langle x^1, x^2 \rangle - a_{11} \langle x^1, x^1 \rangle = 0$:
$$a_{11} = \langle x^1, x^2 \rangle / \langle x^1, x^1 \rangle, \quad \text{with } D_1 := \langle x^1, x^1 \rangle \ne 0.$$
Next, construct $y^3$ as:
$$y^3 := x^3 - a_{21} x^1 - a_{22} x^2,$$
where $a_{21}, a_{22}$ are scalars to be determined s.t.
$$\langle y^3, y^1 \rangle = 0, \qquad \langle y^3, y^2 \rangle = 0,$$
equivalently,
$$\langle y^3, x^1 \rangle = 0, \qquad \langle y^3, x^2 \rangle = 0.$$
2026 Prof.Jiang@ECE NYU 194
$$\langle y^3, x^1 \rangle = 0, \;\; \langle y^3, x^2 \rangle = 0 \;\Longleftrightarrow\;
\begin{cases} \langle x^3, x^1 \rangle - a_{21} \langle x^1, x^1 \rangle - a_{22} \langle x^2, x^1 \rangle = 0 \\ \langle x^3, x^2 \rangle - a_{21} \langle x^1, x^2 \rangle - a_{22} \langle x^2, x^2 \rangle = 0 \end{cases}$$
which has a (unique) solution $a_{21}, a_{22}$ if
$$D_2 := \det \begin{pmatrix} \langle x^1, x^1 \rangle & \langle x^1, x^2 \rangle \\ \langle x^2, x^1 \rangle & \langle x^2, x^2 \rangle \end{pmatrix} \ne 0.$$
2026 Prof.Jiang@ECE NYU 195
By contradiction, assume that
$$D_2 := \det \begin{pmatrix} \langle x^1, x^1 \rangle & \langle x^1, x^2 \rangle \\ \langle x^2, x^1 \rangle & \langle x^2, x^2 \rangle \end{pmatrix} = 0.$$
Then, there are two scalars $r, s$, not both $0$, such that
$$r \langle x^1, x^1 \rangle + s \langle x^1, x^2 \rangle = 0, \qquad r \langle x^2, x^1 \rangle + s \langle x^2, x^2 \rangle = 0,$$
i.e.,
$$\langle x^1, r x^1 + s x^2 \rangle = 0, \qquad \langle x^2, r x^1 + s x^2 \rangle = 0.$$
2026 Prof.Jiang@ECE NYU 196
From $\langle x^1, r x^1 + s x^2 \rangle = 0$ and $\langle x^2, r x^1 + s x^2 \rangle = 0$, it follows that
$$\langle r x^1 + s x^2, \, r x^1 + s x^2 \rangle = 0 \;\Longrightarrow\; r x^1 + s x^2 = 0.$$
Contradiction with $x^1, x^2$ being linearly independent. Thus,
$$D_2 := \det \begin{pmatrix} \langle x^1, x^1 \rangle & \langle x^1, x^2 \rangle \\ \langle x^2, x^1 \rangle & \langle x^2, x^2 \rangle \end{pmatrix} \ne 0.$$
2026 Prof.Jiang@ECE NYU 197
So, we have obtained three mutually orthogonal vectors:
$$y^1 := x^1, \qquad y^2 := x^2 - a_{11} x^1, \qquad y^3 := x^3 - a_{21} x^1 - a_{22} x^2.$$
2026 Prof.Jiang@ECE NYU 198
Continuing this process, we can find the other mutually orthogonal vectors:
$$y^i := x^i - \sum_{k=1}^{i-1} a_{(i-1)k}\, x^k,$$
with the scalars $a_{(i-1)k}$ chosen to achieve the mutual-orthogonality condition:
$$\langle y^i, y^j \rangle = 0, \quad i \ne j,$$
or, equivalently, $\langle y^i, x^j \rangle = 0$, $1 \le j \le i - 1$.
2026 Prof.Jiang@ECE NYU 199
Orthonormal Vectors
They are defined as follows:
$$u^i := y^i / \|y^i\|, \quad i = 1, 2, \ldots, N.$$
It is easy to show that, if $n = N$,
$$O = \begin{pmatrix} u^1 & u^2 & \cdots & u^N \end{pmatrix}$$
is an orthogonal matrix.
2026 Prof.Jiang@ECE NYU 200
An Example
Consider the linearly independent vectors:
$$x^1 = \begin{pmatrix} 1 \\ 0 \\ 1 \end{pmatrix}, \qquad x^2 = \begin{pmatrix} 0 \\ 1 \\ 1 \end{pmatrix}.$$
By means of the Gram-Schmidt process, find a set of orthonormal vectors $u^1, u^2$.
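The process can be sketched in plain Python and applied to these two vectors. The function name `gram_schmidt` is ours; subtracting projections onto the previously built $y$'s is the standard, equivalent way of determining the scalars $a_{(i-1)k}$ from the slides.

```python
import math

def inner(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(xs):
    """Orthogonalize, then normalize, a list of linearly independent vectors."""
    ys = []
    for x in xs:
        y = list(x)
        for p in ys:                       # subtract projection onto each earlier y
            c = inner(p, x) / inner(p, p)
            y = [yi - c * pi for yi, pi in zip(y, p)]
        ys.append(y)
    # normalize: u^i = y^i / ||y^i||
    return [[yi / math.sqrt(inner(y, y)) for yi in y] for y in ys]

u1, u2 = gram_schmidt([[1, 0, 1], [0, 1, 1]])
assert abs(inner(u1, u2)) < 1e-12          # orthogonal
assert abs(inner(u1, u1) - 1) < 1e-12      # unit norm
assert abs(inner(u2, u2) - 1) < 1e-12
```

For this example, $y^1 = (1,0,1)^T$, $a_{11} = 1/2$, and $y^2 = (-1/2, 1, 1/2)^T$, which the code reproduces after normalization.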
Exercise
2026 Prof.Jiang@ECE NYU 201
Show that if $\{v^1, \ldots, v^k\}$ is a set of $k$ linearly independent vectors in $\mathbb{R}^n$, then there exists an invertible upper-triangular matrix $T \in \mathbb{R}^{k \times k}$ such that the matrix $U = VT$, with $V := \begin{pmatrix} v^1 & \cdots & v^k \end{pmatrix}$, has orthonormal columns.
2026 Prof.Jiang@ECE NYU 202
Comment
During the Gram-Schmidt process, we proved that the determinants $D_k$, called Gramians, are nonzero. Indeed, we can prove that
$$D_k = \det \left[ \langle x^i, x^j \rangle \right]_{i,j=1}^k > 0, \quad 1 \le k \le N,$$
for any set of linearly independent vectors $\{x^i\}$.
2026 Prof.Jiang@ECE NYU 203
Indeed, each Gramian $D_k = \det [\langle x^i, x^j \rangle]_{i,j=1}^k$ is associated with a positive-definite quadratic form:
$$Q_k(u) = \sum_{i,j=1}^k \langle x^i, x^j \rangle\, u_i u_j = \Big\langle \sum_{i=1}^k u_i x^i, \; \sum_{j=1}^k u_j x^j \Big\rangle$$
in $u = (u_1, \ldots, u_k)^T$. Positive definite: $Q_k(u) \ge 0$, where equality holds only when $u = 0$. (Each $D_k$ is a leading principal minor of the full Gram matrix.)
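A small numeric illustration (the vectors reuse the earlier Gram-Schmidt example; everything is a direct check of the definitions above):

```python
def inner(u, v):
    return sum(a * b for a, b in zip(u, v))

x1, x2 = [1, 0, 1], [0, 1, 1]          # linearly independent vectors
G = [[inner(x1, x1), inner(x1, x2)],
     [inner(x2, x1), inner(x2, x2)]]   # Gram matrix [<x^i, x^j>]
D2 = G[0][0] * G[1][1] - G[0][1] * G[1][0]
assert D2 > 0                          # the Gramian is positive

# Q(u) = sum_{i,j} <x^i, x^j> u_i u_j equals <u1 x1 + u2 x2, u1 x1 + u2 x2> >= 0
u = (0.3, -0.7)                        # an arbitrary nontrivial u
w = [u[0] * a + u[1] * b for a, b in zip(x1, x2)]
Q = sum(G[i][j] * u[i] * u[j] for i in range(2) for j in range(2))
assert abs(Q - inner(w, w)) < 1e-9 and Q > 0
```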
2026 Prof.Jiang@ECE NYU 204
An Interesting Result
For any positive-definite quadratic form
$$Q = \sum_{i,j=1}^N a_{ij}\, u_i u_j,$$
the associated determinant $D = \det [a_{ij}]$ is always positive.
2026 Prof.Jiang@ECE NYU 205
Proof
First, we prove that $D \ne 0$. By contradiction, assume otherwise; then there is a nontrivial solution $u$ to
$$\sum_{j=1}^N a_{ij}\, u_j = 0, \quad i = 1, 2, \ldots, N.$$
Then, it follows that
$$Q = \sum_{i=1}^N u_i \sum_{j=1}^N a_{ij}\, u_j = 0,$$
a contradiction.
2026 Prof.Jiang@ECE NYU 206
Second, we prove that $D > 0$. For $\theta \in [0, 1]$, consider a family of quadratic forms defined as
$$P_\theta(u) = (1 - \theta) \sum_{i=1}^N u_i^2 + \theta\, Q(u).$$
Clearly, $P_\theta(u) > 0$ for all nontrivial $u$. Then, based on the above analysis, the associated determinants are nonzero.
At $\theta = 0$, the determinant is $\det I > 0$. So, by continuity, at $\theta = 1$, the determinant is $D$, which cannot be negative; being nonzero, it must be positive.
2026 Prof.Jiang@ECE NYU 207
General 2x2 Symmetric Matrices
We begin with the two-dimensional case:
$$A = \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix},$$
which is symmetric, i.e., $a_{12} = a_{21}$. Consider a pair of eigenvalue $\lambda_1$ and associated (normalized) eigenvector $x^1 = (x_1^1, x_2^1)^T$:
$$A x^1 = \lambda_1 x^1, \quad \text{i.e.,} \quad \lambda_1 x_1^1 = a_{11} x_1^1 + a_{12} x_2^1, \quad \lambda_1 x_2^1 = a_{21} x_1^1 + a_{22} x_2^1.$$
2026 Prof.Jiang@ECE NYU 208
General Symmetric Matrices (Cont’d)
 1 2 1 12
1
2 2
2
Using the Gram-Schmidt process, take a 2 2
orthogonal matrix , with := the
eigenvector.
It will be shown that
0

given normalize

0
d
T
O y y y x
O AO


    
2026 Prof.Jiang@ECE NYU 209
General Symmetric Matrices (Cont’d)
First, show that $A y^2 = a\, y^1 + b\, y^2$ for some scalars $a, b$, so that
$$O^T A O = \begin{pmatrix} \lambda_1 & a \\ 0 & b \end{pmatrix}.$$
Then $a = 0$, using symmetry: $(O^T A O)^T = O^T A^T O = O^T A O$.
And $b = \lambda_2$, because the eigenvalues are unchanged under $O$.
2026 Prof.Jiang@ECE NYU 210
Exercise 1
Try to reduce the real symmetric matrix
$$A = \begin{pmatrix} 1 & k \\ k & 1 \end{pmatrix}$$
to a diagonal form.
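A hedged solution sketch, easy to verify by hand: the eigenvalues of this $A$ are $1 \pm k$, with orthonormal eigenvectors $\frac{1}{\sqrt{2}}(1,1)^T$ and $\frac{1}{\sqrt{2}}(1,-1)^T$. The check below picks $k = 0.5$ purely for illustration; the same $O$ works for every real $k$.

```python
import math

k = 0.5                                # any real k; 0.5 is illustrative
A = [[1.0, k], [k, 1.0]]
s = 1 / math.sqrt(2)
O = [[s, s], [s, -s]]                  # orthogonal: columns (1,1)/sqrt2, (1,-1)/sqrt2

def matmul(X, Y):
    return [[sum(X[i][r] * Y[r][j] for r in range(2)) for j in range(2)]
            for i in range(2)]

OT = [[O[0][0], O[1][0]], [O[0][1], O[1][1]]]
D = matmul(OT, matmul(A, O))           # expect diag(1 + k, 1 - k)
assert abs(D[0][0] - (1 + k)) < 1e-12
assert abs(D[1][1] - (1 - k)) < 1e-12
assert abs(D[0][1]) < 1e-12 and abs(D[1][0]) < 1e-12
```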
2026 Prof.Jiang@ECE NYU 211
Exercise 2
Define the real bilinear form
$$Q(x, y) = y^T A x = \sum_{i,j=1}^n a_{ij}\, y_i x_j, \quad x, y \in \mathbb{R}^n,$$
which reduces to the inner product when $A = I$.
Prove that $Q$ is symmetric, i.e., $Q(x, y) = Q(y, x)$, if and only if $A$ is symmetric.
See the text (Horn & Johnson, 2nd edition, 2013; page 226)
2026 Prof.Jiang@ECE NYU 212
Homework #4
1. Does the singular matrix
$$A = \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix}$$
have two independent eigenvectors?
2. Show that $A$ and $A^T$ have the same eigenvalues.
2026 Prof.Jiang@ECE NYU 213
Homework #4
3. Show by direct calculation, for $A$ and $B$ $2 \times 2$ matrices, that $AB$ and $BA$ have the same characteristic equation.
4. Can you give two matrices that are reducible to the following canonical diagonal matrix?
$$\begin{pmatrix} 2 & 0 \\ 0 & 1 \end{pmatrix}$$
Justify your answer.