Maximilian Weigand, Andreas Hense, Petra Friederichs
InvMod SS 2021, in-person exercise (Übung in Anwesenheit)
17. April 2019
Solve the following exercises using the JupyterHub. Submit a running Python 3 Jupyter Notebook (.ipynb),
and (optionally) a PDF of a fully executed Jupyter notebook to the corresponding eCampus exercise. Please refer
to the eCampus exercise for the deadline.
Total number of points: 25.0
1 Vector and Matrix calculus (19.0 Points)
Matrix multiplication for two matrices $A_{M \times N}$ and $B_{N \times M}$ is defined as

$$(AB)_{\mu,\lambda} = \sum_{i=1}^{N} a_{\mu,i}\, b_{i,\lambda} = c_{\mu,\lambda}$$
This relationship can be simplified for vectors, as vectors can be seen as matrices with dimension 1 in the second dimension:

$$v_M = v_{M \times 1}, \qquad v^T = v^T_{1 \times M}$$
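As a quick numerical sanity check of the element-wise sum formula (the sizes and entries below are arbitrary example choices), one can compare a hand-written double loop against numpy's built-in matrix product:

import numpy as np

rng = np.random.default_rng(0)
M, N = 3, 4
A = rng.standard_normal((M, N))   # A is M x N
B = rng.standard_normal((N, M))   # B is N x M

# (AB)_{mu,lambda} = sum_i a_{mu,i} * b_{i,lambda}
C = np.zeros((M, M))
for mu in range(M):
    for lam in range(M):
        C[mu, lam] = sum(A[mu, i] * B[i, lam] for i in range(N))

print(np.allclose(C, A @ B))      # True: both give the same result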
The L2-norm of a vector is defined as

$$\|v\| = \|v\|_2 = \sqrt{\sum_{i=1}^{M} v_i^2}$$
Note also that the dot product of two vectors can be represented as a matrix multiplication:

$$u \cdot v = u^T v$$
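A short numerical illustration of both statements, using an arbitrary example vector:

import numpy as np

v = np.array([1.0, -2.0, 3.0])

# squared norm from the definition: sum of squared components
norm_sq = np.sum(v**2)

# the same value as a dot product / matrix product v^T v
print(np.isclose(norm_sq, v @ v))               # True
print(np.isclose(np.linalg.norm(v)**2, v @ v))  # True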
1.1 Matrix-vector product (Points: 1)
Reformulate the matrix-matrix product for a matrix-vector product between $A_{M \times N}$ and $v_N$.
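Once the sum formula has been written down, it can be cross-checked numerically, for example as follows (sizes are assumed example values):

import numpy as np

rng = np.random.default_rng(1)
M, N = 3, 4
A = rng.standard_normal((M, N))
v = rng.standard_normal(N)

# (Av)_mu = sum_i a_{mu,i} * v_i
Av = np.array([sum(A[mu, i] * v[i] for i in range(N)) for mu in range(M)])
print(np.allclose(Av, A @ v))   # True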
1.2 Vector Norm (Points: 1.5)
Using the definition of the vector norm, show the validity of the equation

$$\|v\|^2 = v^T v, \qquad v \in \mathbb{R}^M$$

Start by treating $v$ and $v^T$ as matrices.
1.3 Show $(AB)^T = B^T A^T$ (Points: 2)
Use only the equation for the matrix-matrix product and the fact that the transposed matrix $A^T$ of $A$ can be written element-wise as $a^T_{i,j} = a_{j,i}$.

$$A \in \mathbb{R}^{M \times N}, \qquad B \in \mathbb{R}^{N \times P}$$
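A numerical spot check with random matrices of compatible (arbitrarily chosen) sizes can accompany the analytic proof:

import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 4))   # A in R^{M x N}
B = rng.standard_normal((4, 5))   # B in R^{N x P}

print(np.allclose((A @ B).T, B.T @ A.T))   # True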
1.4 Show $\|Wm\|^2 = m^T W^T W m$ (Points: 2)
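Again purely as a numerical cross-check (random $W$ and $m$ of assumed sizes), not as a replacement for the derivation:

import numpy as np

rng = np.random.default_rng(3)
W = rng.standard_normal((5, 3))
m = rng.standard_normal(3)

lhs = np.linalg.norm(W @ m)**2   # ||W m||^2
rhs = m @ W.T @ W @ m            # m^T W^T W m
print(np.isclose(lhs, rhs))      # True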
1.5 Determine the type and size of the following computation results (Points: 2)
Using $A \in \mathbb{R}^{R \times Q}$, $B \in \mathbb{R}^{Q \times S}$, $m \in \mathbb{R}^R$:
• $Am = ?$
• $AB = ?$
• $v^T v = ?$
• $v^T = ?$
1.6 Matrix calculus (Points: 2)
Certain derivative operations can be simplified considerably by employing matrix calculus. Using the appropriate identities greatly simplifies several computations that are central to deriving some inversion problems.
More information on matrix calculus, including a comprehensive list of identities, can be found here:
https://en.wikipedia.org/wiki/Matrix_calculus
The derivative of a scalar $s$ with respect to a vector $m \in \mathbb{R}^N$ is the column vector of partial derivatives:

$$\frac{\partial s}{\partial m} = \begin{pmatrix} \frac{\partial s}{\partial m_1} \\ \vdots \\ \frac{\partial s}{\partial m_N} \end{pmatrix}$$
The Jacobian matrix is defined as:
For $v \in \mathbb{R}^M$ and $m \in \mathbb{R}^N$:

$$\frac{\partial v}{\partial m} = \begin{pmatrix} \frac{\partial v_1}{\partial m_1} & \cdots & \frac{\partial v_1}{\partial m_N} \\ \vdots & & \vdots \\ \frac{\partial v_M}{\partial m_1} & \cdots & \frac{\partial v_M}{\partial m_N} \end{pmatrix}$$
1.6.1 Derivative of Matrix-Vector product
Show $\frac{\partial}{\partial m}(Am) = A$, with $A \in \mathbb{R}^{M \times N}$.
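To make the Jacobian definition above concrete, the sketch below (all sizes and names are example choices) approximates the Jacobian of the mapping $v(m) = Am$ by central finite differences and compares it with the analytic result of this subtask:

import numpy as np

rng = np.random.default_rng(4)
M, N = 3, 4
A = rng.standard_normal((M, N))
m = rng.standard_normal(N)

def v(m):
    return A @ m

# finite-difference approximation of the Jacobian dv/dm (an M x N matrix)
eps = 1e-6
J = np.zeros((M, N))
for j in range(N):
    dm = np.zeros(N)
    dm[j] = eps
    J[:, j] = (v(m + dm) - v(m - dm)) / (2 * eps)

print(np.allclose(J, A))   # True: the Jacobian of A m is A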
1.6.2 Derivative of the norm (Points: 4)
Show $\frac{\partial}{\partial m}\left[m^T C m\right] = 2Cm$, provided that $C$ is a symmetric matrix; $m \in \mathbb{R}^N$, $C \in \mathbb{R}^{N \times N}$.
Start with the sum equation for the matrix-vector product, as well as the formulation of a matrix-matrix product in the form $v^T A$.
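Purely as a numerical cross-check (random symmetric $C$ of an assumed size), the gradient can be compared against a central finite-difference approximation:

import numpy as np

rng = np.random.default_rng(5)
N = 4
C = rng.standard_normal((N, N))
C = 0.5 * (C + C.T)              # symmetrize C
m = rng.standard_normal(N)

def s(m):
    return m @ C @ m             # the scalar m^T C m

# central finite-difference gradient ds/dm
eps = 1e-6
grad = np.array([(s(m + eps * e) - s(m - eps * e)) / (2 * eps)
                 for e in np.eye(N)])

print(np.allclose(grad, 2 * C @ m))   # True for symmetric C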
1.7 Show that $u^T A v = v^T A^T u$ (Points: 3)
Again, use only the sum equations for matrix-matrix and matrix-vector multiplication.
$$A \in \mathbb{R}^{N \times M}, \qquad u \in \mathbb{R}^N, \qquad v \in \mathbb{R}^M$$
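A quick numerical check with random inputs of the given sizes:

import numpy as np

rng = np.random.default_rng(6)
N, M = 3, 5
A = rng.standard_normal((N, M))
u = rng.standard_normal(N)
v = rng.standard_normal(M)

print(np.isclose(u @ A @ v, v @ A.T @ u))   # True: both sides are the same scalar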
1.8 Show that $\frac{\partial}{\partial m}\, v^T A^T u = A^T u$ (Points: 1.5)
2 Linear Algebra with Numpy and Scipy (6 Points)
Using the numpy module, especially functions from numpy.linalg, solve the following tasks. Where possible, use
the provided functions instead of implementing the solutions by hand.
The documentation for the numpy linear algebra submodule can be found here:
https://numpy.org/doc/stable/reference/routines.linalg.html
[2]: import numpy as np
2.1 Compute the dot product of the following two vectors x and y: (Points: 1)
$$x = \begin{pmatrix} 2 \\ 3 \\ 6 \end{pmatrix}, \qquad y = \begin{pmatrix} 9 \\ 3 \\ 5 \end{pmatrix}$$
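One possible way to compute this (a minimal sketch; the variable names are free choices):

import numpy as np

x = np.array([2, 3, 6])
y = np.array([9, 3, 5])

print(np.dot(x, y))   # 2*9 + 3*3 + 6*5 = 57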
2.2 Compute the inverse matrix of: (Points: 2)
$$A = \begin{pmatrix} 1 & 2 & 7 \\ 3 & 4 & 5 \\ 6 & 7 & 8 \end{pmatrix}$$
Check the result by multiplying it with $A$. Briefly explain any unexpected results that you encounter.
While the diagonal entries are 1, as expected, some off-diagonal entries are not exactly zero, although they are close to it. This is caused by numerical inaccuracies. You can check that the result is close to the identity matrix with:
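A plausible completion (A and A_inv are assumed variable names for the matrix and its computed inverse):

import numpy as np

A = np.array([[1, 2, 7],
              [3, 4, 5],
              [6, 7, 8]])
A_inv = np.linalg.inv(A)

print(A_inv @ A)                          # close to, but not exactly, the identity
print(np.allclose(A_inv @ A, np.eye(3)))  # True within floating-point tolerance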
2.3 Eigenvectors and eigenvalues (Points: 3)
How are the eigenvalues $\lambda_i$ and eigenvectors $v_i$ of $A$ (from the previous task) defined? Using numpy, compute the eigenvalues and eigenvectors of $A$. Check the results.
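A possible sketch using numpy.linalg.eig, with one of several ways to verify the defining relation $A v_i = \lambda_i v_i$:

import numpy as np

A = np.array([[1, 2, 7],
              [3, 4, 5],
              [6, 7, 8]])

# eigenvalues w[i] and eigenvectors v[:, i], such that A v_i = lambda_i v_i
w, v = np.linalg.eig(A)

for i in range(len(w)):
    print(np.allclose(A @ v[:, i], w[i] * v[:, i]))   # True for each pair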