Definition: A discrete linear dynamical system is a sequence of vectors \mathbf{x}^{(k)}, k=0,1, \ldots, called states, which is defined by an initial vector \mathbf{x}^{(0)} and by the rule
\mathbf{x}^{(k+1)}=A \mathbf{x}^{(k)},
where A is a square matrix called the transition matrix.
We can continue into future quarters by multiplying by A again:
\mathbf{x}^{(2)}=A \mathbf{x}^{(1)}=A \cdot\left(A \mathbf{x}^{(0)}\right)=A^{2} \mathbf{x}^{(0)}
This is true of any homogeneous linear dynamical system!
For any positive integer k and discrete dynamical system with transition matrix A and initial state \mathbf{x}^{(0)}, the k-th state is given by
\mathbf{x}^{(k)}=A^{k} \mathbf{x}^{(0)}
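The formula \mathbf{x}^{(k)}=A^{k} \mathbf{x}^{(0)} can be checked numerically: iterating the rule \mathbf{x}^{(k+1)}=A\mathbf{x}^{(k)} agrees with applying the matrix power directly. The matrix and initial vector below are hypothetical stand-ins, not from the example in the text.

```python
import numpy as np

# Hypothetical 2x2 transition matrix and initial state.
A = np.array([[0.8, 0.3],
              [0.2, 0.7]])
x0 = np.array([0.6, 0.4])

# Iterate the rule x^(k+1) = A x^(k) three times...
x = x0
for _ in range(3):
    x = A @ x

# ...and compare with applying A^3 to x^(0) directly.
x_direct = np.linalg.matrix_power(A, 3) @ x0
print(np.allclose(x, x_direct))  # True
```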
Distribution vector and stochastic matrix
The states \mathbf{x}^{(k)} of the toothpaste example are column vectors with nonnegative coordinates that sum to 1.
Such vectors are called distribution vectors.
Also, each of the columns of the matrix A is a distribution vector.
A square matrix A whose columns are distribution vectors is called a stochastic matrix.
Markov Chain definition
A Markov chain is a discrete dynamical system whose initial state \mathbf{x}^{(0)} is a distribution vector and whose transition matrix A is stochastic, i.e., each column of A is a distribution vector.
Checking that our toothpaste example is a Markov chain
We can define the 1-norm of a vector \mathbf{x} as \|\mathbf{x}\|_{1}=\sum_{i=1}^{n}\left|x_{i}\right|.
If all the elements of a vector are nonnegative, then \|\mathbf{x}\|_{1} is the sum of the elements.
Can show: For any stochastic matrix P and compatible vector \mathbf{x},\|P \mathbf{x}\|_{1} \leq\|\mathbf{x}\|_{1}, with equality if the coordinates of \mathbf{x} are all nonnegative.
→ if a state in a Markov chain is a distribution vector (nonnegative entries summing to 1), then the coordinates of the next state also sum to one
→ all subsequent states in a Markov chain are themselves distribution vectors.
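This 1-norm property is easy to check numerically. The stochastic matrix below is a hypothetical example chosen for illustration.

```python
import numpy as np

# Hypothetical stochastic matrix: each column is a distribution vector.
P = np.array([[0.5, 0.1],
              [0.5, 0.9]])

# A distribution vector: nonnegative entries summing to 1.
x = np.array([0.3, 0.7])

# Multiplying by P preserves the 1-norm of a nonnegative vector,
# so the next state is again a distribution vector.
y = P @ x
print(np.isclose(np.linalg.norm(y, 1), 1.0))  # True

# For a vector with mixed signs, the 1-norm can only shrink.
z = np.array([1.0, -1.0])
print(np.linalg.norm(P @ z, 1) <= np.linalg.norm(z, 1))  # True
```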
Moving into the future
Suppose that initially Brand A has all the customers (i.e., Brand B is just entering the market). What are the market shares 2 quarters later? 20 quarters?
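The computation amounts to evaluating A^{2}\mathbf{x}^{(0)} and A^{20}\mathbf{x}^{(0)}. The toothpaste transition matrix is not reproduced in this excerpt, so the matrix below is a hypothetical stand-in; the pattern of the calculation is the same.

```python
import numpy as np

# Hypothetical stand-in for the toothpaste transition matrix
# (columns sum to 1, so it is stochastic).
A = np.array([[0.7, 0.2],
              [0.3, 0.8]])

# Brand A initially has all the customers.
x0 = np.array([1.0, 0.0])

# Market shares 2 and 20 quarters later.
x2  = np.linalg.matrix_power(A, 2)  @ x0
x20 = np.linalg.matrix_power(A, 20) @ x0
print(x2)   # shares after 2 quarters
print(x20)  # after 20 quarters the shares have nearly stabilized
```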
A graph is a set V of vertices (or nodes), together with a set or list E of unordered pairs of elements of V, called edges.
Digraph
A directed graph (or “digraph”) has directed edges.
Walks
A walk is a sequence of edges \left\{v_{0}, v_{1}\right\},\left\{v_{1}, v_{2}\right\}, \ldots,\left\{v_{m-1}, v_{m}\right\} that goes from vertex v_{0} to vertex v_{m}.
The length of the walk is m.
A directed walk is a sequence of directed edges.
Winning and losing
Suppose this represents wins and losses by different teams. How can we rank the teams?
Power
Vertex power: the number of walks of length 1 or 2 originating from a vertex.
This makes most sense for digraphs with no self-loops and at most one edge between each pair of vertices; such graphs are called dominance-directed graphs.
How can we compute the number of walks for a given graph?
Adjacency matrix
Adjacency matrix: A square matrix whose (i, j) th entry is the number of edges going from vertex i to vertex j
Edges in non-directed graphs appear twice (at (i,j) and at (j,i))
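The adjacency matrix answers the walk-counting question: the (i, j) entry of A^{m} is the number of walks of length m from vertex i to vertex j, so A + A^{2} counts walks of length 1 or 2. The small dominance-directed graph below is a hypothetical example.

```python
import numpy as np

# Adjacency matrix of a hypothetical 3-team dominance-directed graph:
# entry (i, j) is 1 if team i beat team j.
A = np.array([[0, 1, 1],
              [0, 0, 1],
              [0, 0, 0]])

# The (i, j) entry of A^m counts directed walks of length m from i to j,
# so A + A^2 counts walks of length 1 or 2.
walks = A + np.linalg.matrix_power(A, 2)

# The "power" of each vertex is its row sum: the number of
# length-1 or length-2 walks originating there.
print(walks.sum(axis=1))  # [3 1 0]
```

Ranking teams by these row sums puts team 0 first, matching the intuition that it beat everyone directly or indirectly.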