Homework 3

1

Show that if \(P\) and \(Q\) are stochastic matrices of the same size, then \(P Q\) is also stochastic.
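Before writing the proof, it can help to see the claim numerically. The sketch below (an illustration, not the proof) uses random column-stochastic matrices, i.e., nonnegative entries with columns summing to 1; the argument is the same for the row-stochastic convention.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_stochastic(n):
    """Random column-stochastic matrix: nonnegative entries, columns sum to 1."""
    M = rng.random((n, n))
    return M / M.sum(axis=0)

P = random_stochastic(4)
Q = random_stochastic(4)
PQ = P @ Q

# The product should again be nonnegative with unit column sums.
print(PQ.sum(axis=0))  # each column sum is 1 (up to roundoff)
print((PQ >= 0).all())  # True
```

The proof itself should show both properties symbolically: nonnegativity of each entry of \(PQ\), and that each column (or row) of \(PQ\) sums to 1.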

2

The digraph \(H\) that results from reversing all the arrows in a digraph \(G\) is called the reverse digraph of \(G\). Show that if \(A\) is the adjacency matrix for \(G\), then \(A^{T}\) is the adjacency matrix for the reverse digraph \(H\).
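A small numerical illustration of the claim (the digraph below is a hypothetical example, and the convention assumed is \(A_{ij}=1\) iff there is an edge \(i \to j\); adjust if your text uses the opposite convention):

```python
import numpy as np

# Hypothetical digraph on nodes {0, 1, 2} with edges 0->1, 1->2, 2->0, 0->2.
edges = [(0, 1), (1, 2), (2, 0), (0, 2)]

# Adjacency matrix of G: A[i, j] = 1 iff edge i -> j.
A = np.zeros((3, 3), dtype=int)
for i, j in edges:
    A[i, j] = 1

# Reversing every edge i -> j to j -> i gives the adjacency matrix of H.
B = np.zeros((3, 3), dtype=int)
for i, j in edges:
    B[j, i] = 1

print((B == A.T).all())  # True
```

The proof amounts to making this entrywise observation precise: \(H\) has edge \(j \to i\) exactly when \(G\) has edge \(i \to j\), i.e., its adjacency matrix has \((j,i)\) entry equal to \(A_{ij}\).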

3

Solve the PageRank problem with \(P\) as in Example 2.46, teleportation vector \(\mathbf{v}=\frac{1}{6} \mathbf{e}\), and teleportation parameter \(\alpha=0.8\). (In that example the correction vector was \(\frac{1}{3}(1,1,1,0,0,0)\); use it here as well.)
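One way to carry out the computation is power iteration on \(\mathbf{x} = \alpha P \mathbf{x} + (1-\alpha)\mathbf{v}\). The sketch below assumes a column-stochastic \(P\); the matrix shown is an illustrative 6-node surfing matrix, not the one from Example 2.46 (which is not reproduced here), so substitute the book's corrected \(P\).

```python
import numpy as np

def pagerank(P, v, alpha, iters=200):
    """Power iteration for x = alpha*P*x + (1-alpha)*v,
    assuming P is column-stochastic and v is a probability vector."""
    x = v.copy()
    for _ in range(iters):
        x = alpha * (P @ x) + (1 - alpha) * v
    return x

# Illustrative placeholder surfing matrix -- replace with P from Example 2.46.
P = np.array([
    [0,   0,   1/2, 0,   1/3, 0  ],
    [1/2, 0,   0,   0,   1/3, 0  ],
    [1/2, 1/2, 0,   0,   1/3, 0  ],
    [0,   1/2, 0,   0,   0,   1/2],
    [0,   0,   1/2, 1/2, 0,   1/2],
    [0,   0,   0,   1/2, 0,   0  ],
])
v = np.full(6, 1/6)
x = pagerank(P, v, alpha=0.8)
print(x, x.sum())  # x is a probability vector (components sum to 1)
```

The returned \(\mathbf{x}\) satisfies the fixed-point equation above, which is the PageRank vector for the given \(P\), \(\mathbf{v}\), and \(\alpha\).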

4

Modify the surfing matrix \(P\) of Example 2.46 by using the correction vector \(\frac{1}{5}(1,1,1,0,1,1)\) and solve the resulting PageRank problem with teleportation vector \(\mathbf{v}=\frac{1}{6} \mathbf{e}\) and teleportation parameter \(\alpha=0.8\).

2.5.22

Solution vector is \(\mathbf{x}=(95,133,189,75,123,123) / 738\) exactly, or approximately \(\mathbf{x}=(0.129,0.180,0.256,0.102,0.167,0.167)\).

5

Show that there is more than one stationary state for the Markov chain of Example 2.46.

6

Repair the dangling node problem of the graph of Figure 2.7 by creating a correction vector that makes transition to all nodes equally likely. (Note that this means all nodes, including transitioning back to the dangling node itself.)

Next, find all stationary states for the resulting Markov chain.

7

Solve the nonlinear system of equations of Example 2.48 by using nine iterations of the vector Newton formula (2.5), starting with the initial guess \(\mathbf{x}^{(0)}=(0,1)\). Evaluate \(F\left(\mathbf{x}^{(9)}\right)\).

8

Find the minimum value of the function \(F(x, y)=\left(x^{2}+y+1\right)^{2}+x^{4}+y^{4}\) by using the Newton method to find critical points of the function \(F(x, y)\), i.e., points where \(f(x, y)=F_{x}(x, y)=0\) and \(g(x, y)=F_{y}(x, y)=0\).

9

Apply the following digital filter to the noisy data of Example 2.71 and graph the results. Does it appear to be a low pass filter?

\[ y_{k}=\frac{1}{2} x_{k}+\frac{1}{2} x_{k-1}, \quad k=1,2, \ldots, 33 \]

10

Apply the following digital filter to the noisy data of Example 2.71 and graph the results. Does it appear to be a high pass filter?

\[ y_{k}=\frac{1}{2} x_{k}-\frac{1}{2} x_{k-1}, \quad k=1,2, \ldots, 33 \]
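Both filters can be applied with one line of array slicing. The data below are a synthetic stand-in for the noisy data of Example 2.71 (the book's values are not reproduced here): a smooth signal plus noise, which you should replace with the actual data before graphing (e.g., with `matplotlib.pyplot.plot`).

```python
import numpy as np

# Synthetic stand-in for Example 2.71: smooth signal plus noise, 34 samples.
rng = np.random.default_rng(1)
t = np.linspace(0, 1, 34)
x = np.sin(2 * np.pi * t) + 0.2 * rng.standard_normal(34)

# Averaging filter: y_k = x_k/2 + x_{k-1}/2, k = 1, ..., 33.
y_avg = 0.5 * x[1:] + 0.5 * x[:-1]
# Differencing filter: y_k = x_k/2 - x_{k-1}/2, k = 1, ..., 33.
y_diff = 0.5 * x[1:] - 0.5 * x[:-1]

# The averaging output tracks the slowly varying signal, while the
# differencing output retains mostly the rapid (noisy) variation.
print(np.std(y_avg), np.std(y_diff))
```

Graphing `y_avg` and `y_diff` against the original data should suggest which filter smooths (passes low frequencies) and which emphasizes rapid changes (passes high frequencies).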

11

Show that \(L=\left[\begin{array}{lll}1 & 0 & 0 \\ 1 & 1 & 0 \\ 2 & 1 & 1\end{array}\right]\) and \(U=\left[\begin{array}{rrr}2 & -1 & 1 \\ 0 & 4 & -3 \\ 0 & 0 & -1\end{array}\right]\) give an LU factorization of \(A=\left[\begin{array}{rrr}2 & -1 & 1 \\ 2 & 3 & -2 \\ 4 & 2 & -2\end{array}\right]\).
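The verification is a single matrix multiplication (plus noting that \(L\) is unit lower triangular and \(U\) is upper triangular), which can be checked numerically:

```python
import numpy as np

L = np.array([[1, 0, 0],
              [1, 1, 0],
              [2, 1, 1]])
U = np.array([[2, -1,  1],
              [0,  4, -3],
              [0,  0, -1]])
A = np.array([[2, -1,  1],
              [2,  3, -2],
              [4,  2, -2]])

# L is unit lower triangular, U is upper triangular, and L U = A.
print(L @ U)
print((L @ U == A).all())  # True
```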

12

By hand:

Find an LU factorization of the matrix \(A=\left[\begin{array}{rrr}2 & 1 & 0 \\ -4 & -1 & -1 \\ 2 & 3 & -3\end{array}\right]\).

13

By hand:

Find a PLU factorization of the matrix \(A=\left[\begin{array}{rrr}2 & 1 & 3 \\ -4 & -2 & -1 \\ 2 & 3 & -3\end{array}\right]\).
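A hand computation for Problems 12 and 13 can be checked against a short elimination routine. The sketch below is a minimal partial-pivoting Gaussian elimination (not a production routine), using the convention \(PA = LU\) with \(P\) a permutation matrix; some texts instead write \(A = PLU\), so a row permutation may be needed to match your by-hand answer.

```python
import numpy as np

def plu(A):
    """LU factorization with partial pivoting: returns P, L, U with P A = L U.
    A minimal sketch of Gaussian elimination with row swaps."""
    A = A.astype(float)
    n = A.shape[0]
    P, L, U = np.eye(n), np.eye(n), A.copy()
    for k in range(n - 1):
        p = k + np.argmax(np.abs(U[k:, k]))   # row of largest pivot candidate
        if p != k:                            # swap rows of U and P, and the
            U[[k, p], k:] = U[[p, k], k:]     # already-computed columns of L
            P[[k, p]] = P[[p, k]]
            L[[k, p], :k] = L[[p, k], :k]
        for i in range(k + 1, n):             # eliminate below the pivot
            L[i, k] = U[i, k] / U[k, k]
            U[i, k:] -= L[i, k] * U[k, k:]
    return P, L, U

A = np.array([[ 2,  1,  3],
              [-4, -2, -1],
              [ 2,  3, -3]])
P, L, U = plu(A)
print(np.allclose(P @ A, L @ U))  # True
```

Note that without pivoting, eliminating the first column of this \(A\) produces a zero in the \((2,2)\) pivot position, which is why Problem 13 asks for a PLU rather than a plain LU factorization.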