Markov Matrix
\(\bullet\)
An
\(n\times n\)
matrix whose entries are
non-negative and whose columns each sum to
\(1\)
is known as
a Markov matrix (a.k.a. a
stochastic matrix).
\(\bullet\)
A Markov matrix \(M\) always has \(1\) as one of its eigenvalues: each column of \(M-I\) sums to \(0\), so the rows of \(M-I\) are linearly dependent and \(M-I\) is singular.
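A quick numerical check of this property (a minimal sketch in NumPy; the \(3\times 3\) matrix below is just an illustrative example):

```python
import numpy as np

# Example 3x3 Markov (column-stochastic) matrix:
# entries are non-negative and every column sums to 1.
M = np.array([[0.5, 0.2, 0.3],
              [0.3, 0.7, 0.2],
              [0.2, 0.1, 0.5]])

eigenvalues, _ = np.linalg.eig(M)
print(eigenvalues)                       # one eigenvalue is (numerically) 1
print(np.isclose(eigenvalues, 1).any())  # True
```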
Assume an
\(n\times n\)
Markov matrix
\(M\)
and a vector
\(\vec{u}\in\mathbb{R}^n\)
that evolves as
\(\vec{u}_{t=k+1}=M\vec{u}_{t=k};\quad k\in\{0,1,2,3,\cdots\}\)
Suppose the eigenvalues of
\(M\)
are
\(\lambda_1\)
and
\(\lambda_2\)
with corresponding eigenvectors
\(x_1\)
and
\(x_2\)
(keeping just two terms for simplicity; in general there are \(n\)).
Writing the initial vector in this eigenbasis as
\(\vec{u}_{t=0}=c_1x_1 + c_2x_2\)
, repeated multiplication by \(M\) gives
\[\displaystyle \vec{u}_{t=k}=M^k\vec{u}_{t=0}=c_1\lambda_1^k x_1 + c_2\lambda_2^k x_2 \]
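A minimal sketch of this decomposition (assuming a diagonalizable \(2\times 2\) example; the matrix and initial vector are made up for illustration):

```python
import numpy as np

# Example 2x2 Markov matrix (columns sum to 1) and an initial vector.
M = np.array([[0.9, 0.2],
              [0.1, 0.8]])
u0 = np.array([1.0, 0.0])

lam, X = np.linalg.eig(M)    # eigenvalues lam, eigenvectors as columns of X
c = np.linalg.solve(X, u0)   # coefficients so that u0 = c1*x1 + c2*x2

k = 10
u_k_power = np.linalg.matrix_power(M, k) @ u0  # u_k by repeated multiplication
u_k_eigen = X @ (c * lam**k)                   # u_k = c1*lam1^k*x1 + c2*lam2^k*x2

print(np.allclose(u_k_power, u_k_eigen))  # True
```

Since one eigenvalue here is \(1\) and the other has magnitude less than \(1\), the \(\lambda=1\) term dominates as \(k\) grows, which is why \(\vec{u}_{t=k}\) settles into a steady state.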
Expansion with an Orthonormal Basis
Assume a vector
\(\vec{v}\in\mathbb{R}^n\)
in this
\(n\)-dimensional vector space, and let
\(\vec{q}_1, \vec{q}_2, \cdots, \vec{q}_n\)
be an orthonormal basis (mutually orthogonal unit vectors) for that space.
If
\(\vec{v}=x_1\vec{q}_1 + x_2\vec{q}_2 + \cdots + x_n\vec{q}_n\)
, then taking the dot product of both sides with
\(\vec{q}_i\)
leaves only the
\(i\)-th term, so we can get the
\(x_i\)
's as,
\[\displaystyle x_i=\vec{v}^T\vec{q}_i;\quad\forall i\in\{1,2,\cdots, n\}\]
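A minimal sketch of recovering the coefficients this way (NumPy; the orthonormal basis here comes from a QR factorization of a random matrix, which is just one convenient way to construct one):

```python
import numpy as np

n = 4
rng = np.random.default_rng(0)

# Columns of Q form an orthonormal basis q_1, ..., q_n (via QR factorization).
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
v = rng.standard_normal(n)

# x_i = v^T q_i for each i; equivalently x = Q^T v.
x = Q.T @ v

# Reconstruct v from the expansion v = x_1 q_1 + ... + x_n q_n.
v_reconstructed = Q @ x
print(np.allclose(v, v_reconstructed))  # True
```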