Time-Independent Gaussian channel

For channel coefficients independent over time, $f({\bf H})$ can be expressed as

 \begin{displaymath}f({\bf H}) = \ \prod_{k=1}^{n} f({\bf h}_k).
\end{displaymath} (4)

If $f({\bf h}_k)$ is Gaussian with mean ${\bf d}_k$ and covariance ${\bf C}_k$, then


 \begin{displaymath}f({\bf h}_k) =
\frac {1}
{\pi^M \vert{\bf C}_k\vert}
e^{-
({\bf h}_k-{\bf d}_k)^{*T} {\bf C}_k^{-1} ({\bf h}_k-{\bf d}_k)
} ,
\end{displaymath} (5)

where * denotes conjugate, M is the FIR channel length, and ${\bf d}_k$ and ${\bf C}_k$ are assumed known. Substituting (2) and (4) into (3), taking the negative natural logarithm of the result of the integration, and ignoring the terms that are irrelevant to the minimization, we obtain:

 \begin{displaymath}- \log ( f({\bf r}\vert{\bf A})) \doteq
\sum_{k=1}^n
\frac { \vert r_k - {\bf a}_k^T {\bf d}_k \vert^2 }
{ \sigma_k^2 }
+ \log ( \sigma_k^2 ) ~ ,
\end{displaymath} (6)

where

\begin{displaymath}\sigma_k^2 = \sigma^2 + {\bf a}_k^T {\bf C}_k {\bf a}_k^* ~
\end{displaymath} (7)

($\doteq$ denotes equivalence for optimization purposes, i.e., equality up to terms that do not affect the minimization).
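
To make the metric computation concrete, the following sketch (Python/NumPy, not part of the paper) evaluates (7) and (9) for a single hypothesized symbol vector. The argument names r_k, a_k, d_k, C_k, and sigma2 are illustrative placeholders for $r_k$, ${\bf a}_k$, ${\bf d}_k$, ${\bf C}_k$, and $\sigma^2$.

import numpy as np

def per_symbol_cost(r_k, a_k, d_k, C_k, sigma2):
    # Effective variance of (7): sigma_k^2 = sigma^2 + a_k^T C_k a_k^*.
    # The quadratic form is real for Hermitian C_k; np.real() drops numerical residue.
    sigma_k2 = sigma2 + np.real(a_k @ C_k @ np.conj(a_k))
    # Incremental cost of (9): |r_k - a_k^T d_k|^2 / sigma_k^2 + log(sigma_k^2).
    return np.abs(r_k - a_k @ d_k) ** 2 / sigma_k2 + np.log(sigma_k2)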

The time-recursive form for (6) can be expressed as


 \begin{displaymath}\Lambda _{n} =
\Lambda _{n-1} + \lambda _{n} ~ ,
\end{displaymath} (8)

where $\Lambda _{n}$ is the cumulative cost up to time n. From (6), $\lambda _{n}$ is the incremental cost from time n-1 to n:


 \begin{displaymath}\lambda _{n}=
\frac { \vert r_n - {\bf a}_n^T {\bf d}_n \vert^2 }
{ \sigma_n^2 }
+ \log ( \sigma_n^2 ) ~ .
\end{displaymath} (9)

Since the incremental cost depends only on the state transition at time n, the Viterbi algorithm can be applied directly to minimize (8). Therefore, (8) yields a computationally efficient, time-recursive, exact MAP (optimum) algorithm.
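
As an illustration of how (7)-(9) feed the Viterbi recursion, here is a minimal sketch (Python/NumPy, not the paper's implementation), assuming a real binary alphabet, a trellis state given by the M-1 most recent symbols, zero initial path costs, and an illustrative ordering of the symbols inside ${\bf a}_k$; all names are hypothetical.

import itertools
import numpy as np

def viterbi_map(r, d, C, sigma2, alphabet=(-1.0, 1.0), M=2):
    # State at time k: the M-1 most recent symbols; each branch appends a new
    # symbol s_k, giving the hypothesized vector a_k = [s_k, ..., s_{k-M+1}].
    states = list(itertools.product(alphabet, repeat=M - 1))
    cost = {s: 0.0 for s in states}      # surviving-path costs Lambda_{k-1}
    path = {s: [] for s in states}       # surviving symbol sequences
    for k, r_k in enumerate(r):
        new_cost, new_path = {}, {}
        for prev in states:
            for s in alphabet:
                a_k = np.array((s,) + prev)
                # Branch metric from (7) and (9)
                sigma_k2 = sigma2 + np.real(a_k @ C[k] @ np.conj(a_k))
                lam = np.abs(r_k - a_k @ d[k]) ** 2 / sigma_k2 + np.log(sigma_k2)
                c = cost[prev] + lam     # recursion (8)
                nxt = (s,) + prev[:-1]   # next trellis state
                if nxt not in new_cost or c < new_cost[nxt]:
                    new_cost[nxt], new_path[nxt] = c, path[prev] + [s]
        cost, path = new_cost, new_path
    best = min(cost, key=cost.get)
    return path[best], cost[best]        # symbol decisions and final Lambda_n

For instance, given n received samples in r, lists d and C of the known prior means and covariances, and the noise variance sigma2, viterbi_map(r, d, C, sigma2) returns the symbol decisions that minimize the cumulative cost, together with the minimized value of $\Lambda_n$.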

