
Report, 20150402, Formulas on Entropy, Part I

Date: 2015-04-03 23:47:05


Section 1: Papoulis's Formula

Lemma 1: If the random variables $y_1,\ldots,y_n$ are linear combinations of the random variables $x_1,\ldots,x_n$, say ${\left[ y_1,\ldots,y_n \right]^T} = A{\left[ x_1,\ldots,x_n \right]^T}$ with $A$ nonsingular, then
\begin{equation} \nonumber
h\left( {{y_1}, \ldots ,{y_n}} \right) = h\left( {{x_1}, \ldots ,{x_n}} \right) + \log \left| \det A \right|
\end{equation}
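As a quick numerical sanity check of lemma 1 in the Gaussian case (the covariance $K$ and the map $A$ below are illustrative choices, not from the report): if $X \sim N(0,K)$ then $h(X) = \frac{1}{2}\ln\left((2\pi e)^n \det K\right)$ and $Y = AX \sim N(0, AKA^T)$, so $h(Y) - h(X)$ should equal $\ln|\det A|$.

```python
import numpy as np

# Check h(Y) - h(X) = ln|det A| for Gaussian X and Y = A X.
rng = np.random.default_rng(0)
n = 3
K = np.eye(n)                       # covariance of X (illustrative)
A = rng.standard_normal((n, n))     # an (almost surely) invertible map

def gaussian_entropy(cov):
    """Differential entropy in nats of a zero-mean Gaussian with covariance cov."""
    return 0.5 * np.linalg.slogdet(2 * np.pi * np.e * cov)[1]

gap = gaussian_entropy(A @ K @ A.T) - gaussian_entropy(K)
log_det = np.log(abs(np.linalg.det(A)))
print(gap, log_det)   # the two values agree
```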

Lemma 2: If a function $H\left( z \right) = \sum\nolimits_{n = 0}^\infty  {{h_n}{z^{ - n}}} $ is minimum-phase and $h_0 \ne 0$, then
\begin{equation} \nonumber
\ln h_0^2 = \frac{1}{{2\pi }}\int_{ - \pi }^\pi  {\ln {{\left| {H\left( {{e^{j\omega }}} \right)} \right|}^2}d\omega }
\end{equation} Remark: If $h_0 = 0,h_1 \ne 0$ and $H\left( z \right)$ is minimum-phase, then $zH\left( z \right) = \sum\nolimits_{n = 1}^\infty  {{h_n}{z^{ - n + 1}}}  = {h_1} + {h_2}{z^{ - 1}} +  \cdots $ is also minimum-phase. Thus, from lemma 2, we have
\begin{equation} \nonumber
\begin{aligned}
\ln h_1^2 &= \frac{1}{{2\pi }}\int_{ - \pi }^\pi  {\ln {{\left| {{e^{j\omega }}H\left( {{e^{j\omega }}} \right)} \right|}^2}d\omega } \\
&= \frac{1}{{2\pi }}\int_{ - \pi }^\pi  {\ln {{\left| {H\left( {{e^{j\omega }}} \right)} \right|}^2}d\omega }.
\end{aligned}
\end{equation} Similarly, if $H\left( z \right)$ is minimum-phase and $h_0 = \dots = h_{k-1} = 0,~h_k \ne 0$ for some positive integer $k$ (i.e., $k$ is the relative degree of the system), then $\ln h_k^2 = \frac{1}{{2\pi }}\int_{ - \pi }^\pi  {\ln {{\left| {H\left( {{e^{j\omega }}} \right)} \right|}^2}d\omega }$.
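Lemma 2 can also be checked numerically on a hand-picked minimum-phase example (the filter below is an illustrative assumption): $H(z) = 2 + 0.8z^{-1}$ has its only zero at $z = -0.4$, inside the unit circle, so it is minimum-phase with $h_0 = 2$, and the lemma predicts that the average of $\ln|H(e^{j\omega})|^2$ over one period equals $\ln h_0^2 = \ln 4$.

```python
import numpy as np

# Average ln|H(e^{jw})|^2 over one period and compare with ln(h_0^2).
w = np.linspace(-np.pi, np.pi, 200000, endpoint=False)
H = 2 + 0.8 * np.exp(-1j * w)
# a uniform-sample mean over a full period approximates (1/2pi) * integral
integral = np.mean(np.log(np.abs(H) ** 2))
print(integral, np.log(4))   # both approximately 1.38629
```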

Lemma 3: Let the random vector $X \in \mathbb{R}^n$ have zero mean and covariance $K = EXX^T$ (i.e., $K_{ij} = EX_iX_j,~1 \le i,j \le n$). Then $h(X) \le \frac{1}{2}\log {\left( {2\pi e} \right)^n}\left| K \right|$, with equality iff $X \sim N\left( {0,K} \right)$.

Theorem 1: If $H(z)$ is minimum-phase, then the entropy rate $\bar h(y)$ of the output $y_n = \sum\nolimits_i {{h_i}{r_{n - i}}}$ is
\begin{equation} \nonumber
\bar h\left( y \right) = \bar h\left( r \right) + \frac{1}{{2\pi }}\int_{ - \pi }^\pi  {\ln \left| {H\left( {{e^{j\omega }}} \right)} \right|d\omega }
\end{equation} where $\bar h\left( r \right) = \mathop {\lim }\limits_{n \to \infty } \frac{1}{n}h\left( {{r_0}, \cdots ,{r_{n - 1}}} \right)$ is the entropy rate of the input $r_n$.

Proof: Suppose first that $H\left( z \right) = \sum\nolimits_{i = 0}^\infty  {{h_i}{z^{ - i}}} $ with $h_0 \ne 0$. Then
\[\left[ {\begin{array}{*{20}{c}}
{{y_0}}\\
{{y_1}}\\
 \vdots \\
{{y_{n - 1}}}
\end{array}} \right] = \left[ {\begin{array}{*{20}{c}}
{{h_0}}&0& \cdots &0\\
{{h_1}}&{{h_0}}& \cdots &0\\
 \vdots & \vdots &{}& \vdots \\
{{h_{n - 1}}}&{{h_{n - 2}}}& \cdots &{{h_0}}
\end{array}} \right]\left[ {\begin{array}{*{20}{c}}
{{r_0}}\\
{{r_1}}\\
 \vdots \\
{{r_{n - 1}}}
\end{array}} \right] \buildrel \Delta \over = A_n\left[ {\begin{array}{*{20}{c}}
{{r_0}}\\
{{r_1}}\\
 \vdots \\
{{r_{n - 1}}}
\end{array}} \right]\] and $\det A_n = h_0^n$. From lemma 1 we have
\begin{equation} \nonumber
\begin{aligned}
h\left( {{y_0}, \ldots ,{y_{n-1}}} \right) &= h\left( {{r_0}, \ldots ,{r_{n-1}}} \right) + \log \left| \det A_n \right| \\
&= h\left( {{r_0}, \ldots ,{r_{n - 1}}} \right) + \log \left| {h_0^n} \right| \\
&= h\left( {{r_0}, \ldots ,{r_{n - 1}}} \right) + n\log \left| {{h_0}} \right|
\end{aligned}
\end{equation} and
\begin{equation} \label{Hy_Hr_logh0}
\bar h\left( y \right) = \bar h\left( r \right) + \log \left| {{h_0}} \right|
\end{equation} which, combined with lemma 2, completes the proof.
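The proof above can be sketched numerically for a white Gaussian input $r_n \sim N(0,1)$ and the illustrative minimum-phase filter $H(z) = 2 + 0.8z^{-1}$ (both choices are assumptions, not from the report): for Gaussian white noise $\bar h(r) = \frac{1}{2}\ln(2\pi e)$, and theorem 1 predicts $\bar h(y) = \bar h(r) + \frac{1}{2\pi}\int_{-\pi}^{\pi}\ln|H(e^{j\omega})|d\omega$.

```python
import numpy as np

# Build the lower-triangular Toeplitz A_n for h = (2, 0.8, 0, ...).
n = 400
A_n = 2.0 * np.eye(n) + 0.8 * np.eye(n, k=-1)
K_y = A_n @ A_n.T                     # covariance of (y_0, ..., y_{n-1})

# per-sample entropy of the Gaussian output block (lemma 3 with equality)
rate_r = 0.5 * np.log(2 * np.pi * np.e)
entropy_rate_y = rate_r + np.linalg.slogdet(K_y)[1] / (2 * n)

# spectral side: (1/2pi) * integral of ln|H(e^{jw})| over one period
w = np.linspace(-np.pi, np.pi, 200000, endpoint=False)
spectral = np.mean(np.log(np.abs(2 + 0.8 * np.exp(-1j * w))))
print(entropy_rate_y, rate_r + spectral)   # the two rates agree
```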

Remark: If $h_0 = 0$ but $h_1 \ne 0$, we then have
\[\left[ {\begin{array}{*{20}{c}}
{{y_1}}\\
{{y_2}}\\
 \vdots \\
{{y_n}}
\end{array}} \right] = \left[ {\begin{array}{*{20}{c}}
{{h_1}}&0& \cdots &0\\
{{h_2}}&{{h_1}}& \cdots &0\\
 \vdots & \vdots &{}& \vdots \\
{{h_n}}&{{h_{n - 1}}}& \cdots &{{h_1}}
\end{array}} \right]\left[ {\begin{array}{*{20}{c}}
{{r_0}}\\
{{r_1}}\\
 \vdots \\
{{r_{n - 1}}}
\end{array}} \right] \buildrel \Delta \over = {\bar A_n}\left[ {\begin{array}{*{20}{c}}
{{r_0}}\\
{{r_1}}\\
 \vdots \\
{{r_{n - 1}}}
\end{array}} \right]\] and
\begin{equation} \label{hy_hr_h1}
\begin{aligned}
h\left( {{y_1}, \ldots ,{y_n}} \right) &= h\left( {{r_0}, \ldots ,{r_{n - 1}}} \right) + \log \left| {\det {{\bar A}_n}} \right| \\
&= h\left( {{r_0}, \ldots ,{r_{n - 1}}} \right) + n\log \left| {{h_1}} \right|
\end{aligned}
\end{equation} Now, since $ h\left( {{y_0}, \ldots ,{y_n}} \right) = h\left( {{y_1}, \ldots ,{y_n}} \right) + h\left( {{y_0}\left| {{y_1}, \ldots ,{y_n}} \right.} \right)$, we get
\begin{equation} \nonumber
\begin{aligned}
\frac{{h\left( {{y_1}, \ldots ,{y_n}} \right)}}{n} &= \frac{{h\left( {{y_{\rm{0}}}, \ldots ,{y_n}} \right) - h\left( {{y_0}\left| {{y_1}, \ldots ,{y_n}} \right.} \right)}}{n} \\
&= \frac{{h\left( {{y_{\rm{0}}}, \ldots ,{y_n}} \right)}}{{n + 1}} \cdot \frac{{n + 1}}{n} - \frac{{h\left( {{y_0}\left| {{y_1}, \ldots ,{y_n}} \right.} \right)}}{n}
\end{aligned}
\end{equation} which implies that \[\mathop {\lim }\limits_{n \to \infty } \frac{{h\left( {{y_1}, \ldots ,{y_n}} \right)}}{n} = \bar h\left( y \right).\] Therefore, dividing (\ref{hy_hr_h1}) by $n$ and letting $n$ tend to infinity, we get
\begin{equation} \nonumber
\bar h\left( y \right) = \bar h\left( r \right) + \log \left| {{h_1}} \right|
\end{equation} which, combined with the remark after lemma 2, implies that theorem 1 holds for all minimum-phase systems, not only those with $h_0 \ne 0$.

Section 2: Papoulis's Formula for Non-MP Systems

Suppose that a transfer function $H\left( z \right) = \sum\nolimits_{n = 0}^\infty  {{h_n}{z^{ - n}}} $ is stable (minimum-phase is not required), and let $k$ be the relative degree of the system, so that $h_0 = \ldots = h_{k-1} = 0$ and $h_k \ne 0$. By Cauchy's residue theorem (in fact, the inverse $Z$-transform), we have
\[\frac{1}{{2\pi i}}\int_{\left| z \right| = 1} {H\left( z \right){z^{k - 1}}dz}  = {h_k}\] or
\begin{equation} \nonumber
\frac{1}{{2\pi }}\int_{ - \pi }^\pi  {H\left( {{e^{j\omega }}} \right){e^{jk\omega }}d\omega }  = {h_k}.
\end{equation} Then theorem 1 can be rewritten as \[\bar h\left( y \right) = \bar h\left( r \right) + \ln \left| {\frac{1}{{2\pi }}\int_{ - \pi }^\pi  {H\left( {{e^{j\omega }}} \right){e^{jk\omega }}d\omega } } \right|.\] On the other hand, every stable transfer function can be factored as $$H\left( z \right) = {H_{{\rm{mp}}}}\left( z \right){H_{{\rm{ap}}}}\left( z \right) = {z_{{u_1}}} \cdots {z_{{u_l}}}\bar H\left( z \right){H_{{\rm{ap}}}}\left( z \right)$$ where ${{z_{{u_1}}},\ldots,{z_{{u_l}}}}$ are the unstable zeros of $H(z)$, ${H_{{\rm{ap}}}}\left( z \right)$ is all-pass (i.e., $\left| {{H_{{\rm{ap}}}}\left( {{e^{j\omega }}} \right)} \right| = 1$), and $\bar H(z)$ is minimum-phase with the same leading Laurent-series coefficient as $H(z)$; that is, $\bar H(z) = {h_0} + {\bar h_1}{z^{ - 1}} +  \cdots $ iff $H(z) = {h_0} + {h_1}{z^{ - 1}} +  \cdots $. Therefore, we have
\begin{equation} \nonumber
\begin{aligned}
&\frac{1}{{2\pi }}\int_{ - \pi }^\pi  {\ln {{\left| {H\left( {{e^{j\omega }}} \right)} \right|}^2}d\omega }  \\
&= \frac{1}{{2\pi }}\int_{ - \pi }^\pi  {\ln {{\left| {{z_{{u_1}}} \cdots {z_{{u_l}}}\bar H\left( {{e^{j\omega }}} \right){H_{{\rm{ap}}}}\left( {{e^{j\omega }}} \right)} \right|}^2}d\omega }  \\
&= \frac{1}{{2\pi }}\int_{ - \pi }^\pi  {\ln {{\left| {{z_{{u_1}}} \cdots {z_{{u_l}}}\bar H\left( {{e^{j\omega }}} \right)} \right|}^2}d\omega }  \\
&= \sum\limits_{i = 1}^l {\ln {{\left| {{z_{{u_i}}}} \right|}^2}}  + \frac{1}{{2\pi }}\int_{ - \pi }^\pi  {\ln {{\left| {\bar H\left( {{e^{j\omega }}} \right)} \right|}^2}d\omega }  \\
&= \sum\limits_{i = 1}^l {\ln {{\left| {{z_{{u_i}}}} \right|}^2}}  + \ln h_0^2
\end{aligned}
\end{equation} or simply, \[\ln \left| {{h_0}} \right| = \frac{1}{{2\pi }}\int_{ - \pi }^\pi  {\ln \left| {H\left( {{e^{j\omega }}} \right)} \right|d\omega }  - \sum\limits_{i = 1}^l {\ln \left| {{z_{{u_i}}}} \right|} .\] Finally, by equation (\ref{Hy_Hr_logh0}), we get \[\bar h\left( y \right) = \bar h\left( r \right) + \frac{1}{{2\pi }}\int_{ - \pi }^\pi  {\ln \left| {H\left( {{e^{j\omega }}} \right)} \right|d\omega }  - \sum\limits_{i = 1}^l {\ln \left| {{z_{{u_i}}}} \right|} \] provided only that the system is stable, where the $z_{u_i}$'s are the unstable zeros of $H(z)$.
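The non-minimum-phase correction can be checked numerically on an illustrative example (an assumption, not from the report): $H(z) = 1 - 2z^{-1}$ is stable (FIR) but has an unstable zero at $z_u = 2$, with $h_0 = 1$, so the formula predicts $\ln|h_0| = \frac{1}{2\pi}\int_{-\pi}^{\pi}\ln|H(e^{j\omega})|d\omega - \ln|z_u|$, i.e., $0 = (\text{spectral integral}) - \ln 2$.

```python
import numpy as np

# H(z) = 1 - 2 z^{-1}: stable FIR with one unstable zero at z = 2.
w = np.linspace(-np.pi, np.pi, 200000, endpoint=False)
H = 1 - 2 * np.exp(-1j * w)
spectral = np.mean(np.log(np.abs(H)))   # approx ln 2 by Jensen's formula
lhs = np.log(abs(1.0))                  # ln|h_0| = 0
rhs = spectral - np.log(2)              # subtract ln of the unstable zero
print(lhs, rhs)                         # both approximately 0
```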

Section 3: Extended Case for 1st-order System

Consider the following stable discrete-time linear system with state equation
\begin{equation} \nonumber
\begin{aligned}
{x_{k + 1}} &= A{x_k} + b{u_k}\\
{y_k} &= c{x_k}.
\end{aligned}
\end{equation} The initial state ${x_0}\sim N\left( {0,\sigma _0^2I} \right)$. By simple computation, we have ${x_{k + 1}} = {A^{k + 1}}{x_0} + \sum\nolimits_{i = 0}^k {{A^{k - i}}b{u_i}} $ and ${y_k} = c{A^k}{x_0} + \sum\nolimits_{i = 0}^{k - 1} {c{A^{k - 1 - i}}b{u_i}} $. Then
\begin{equation} \label{hy_inequal}
\begin{aligned}
&h\left( {{y_0}, \ldots ,{y_{n - 1}}\left| {{u_0}, \ldots ,{u_{n - 1}}} \right.} \right) \\
&= h\left( {c{x_0}, \ldots ,c{A^{n-1}}{x_0} + \sum\limits_{i = 0}^{n - 2} {c{A^{n - 2 - i}}b{u_i}} \left| {{u_0}, \ldots ,{u_{n - 1}}} \right.} \right) \\
&= h\left( {c{x_0}, \ldots ,c{A^{n-1}}{x_0}\left| {{u_0}, \ldots ,{u_{n - 1}}} \right.} \right) \\
& \le h\left( {c{x_0}, \ldots ,c{A^{n-1}}{x_0}} \right) \\
& \le \sum\limits_{i = 0}^{n-1} {h\left( {c{A^i}{x_0}} \right)}.
\end{aligned}
\end{equation} Since ${x_0}\sim N\left( {0,\sigma _0^2I} \right)$, the variance of the scalar $cA^ix_0$ is
\begin{equation} \nonumber
\begin{aligned}
{K_i} &= \varepsilon \left\{ {c{A^i}{x_0}{{\left( {c{A^i}{x_0}} \right)}^T}} \right\} \\
&= c{A^i}\varepsilon \left\{ {{x_0}{x_0}^T} \right\}{\left( {{A^i}} \right)^T}{c^T} \\
&= c{A^i}\sigma _0^2{\left( {{A^i}} \right)^T}{c^T} \\
&= \sigma _0^2{\left\| {c{A^i}} \right\|^2}.
\end{aligned}
\end{equation} Therefore, by lemma 3, we have
\begin{equation} \label{hCAX0_inequal}
\begin{aligned}
h\left( {c{A^i}{x_0}} \right) &\le \frac{1}{2}\log \left( {2\pi e\sigma _0^2{{\left\| {c{A^i}} \right\|}^2}} \right) \\
&\le \frac{1}{2}\log \left( {2\pi e\sigma _0^2{{\left\| c \right\|}^2}{{\left\| A \right\|}^{2i}}} \right) \\
&= \frac{1}{2}\log \left( {2\pi e\sigma _0^2{{\left\| c \right\|}^2}} \right) + i\log \left\| A \right\|
\end{aligned}
\end{equation} where $\left\| c \right\|$ and $\left\| A \right\|$ refer, respectively, to the $l_2$-norm of the row vector $c$ and the spectral norm of the matrix $A$ (i.e., $\left\| c \right\| = {\left( {\sum\nolimits_{i = 1}^n {{{\left| {{c_i}} \right|}^2}} } \right)^{1/2}},~ \left\| A \right\| = {\sigma _{\max }}\left( A \right)$, the largest singular value), and the second inequality in (\ref{hCAX0_inequal}) follows from the submultiplicativity $\left\| {c{A^i}} \right\| \le \left\| c \right\|{\left\| A \right\|^i}$.
Now, substituting (\ref{hCAX0_inequal}) into (\ref{hy_inequal}), we get
\begin{equation} \nonumber
\begin{aligned}
&h\left( {{y_0}, \ldots ,{y_{n - 1}}\left| {{u_0}, \ldots ,{u_{n - 1}}} \right.} \right) \\
&\le \sum\limits_{i = 0}^{n - 1} {h\left( {c{A^i}{x_0}} \right)} \\
& \le \sum\limits_{i = 0}^{n - 1} {\left( {\frac{1}{2}\log \left( {2\pi e\sigma _0^2{{\left\| c \right\|}^2}} \right) + i\log \left\| A \right\|} \right)}  \\
&= \frac{n}{2}\log \left( {2\pi e\sigma _0^2{{\left\| c \right\|}^2}} \right) + \frac{{n\left( {n - 1} \right)}}{2}\log \left\| A \right\|
\end{aligned}
\end{equation} and
\begin{equation} \label{Iyu_ineuqal}
\begin{aligned}
&\frac{{I\left( {{y_0}, \ldots ,{y_{n - 1}};{u_0}, \ldots ,{u_{n - 1}}} \right)}}{n} \\
&= \frac{{h\left( {{y_0}, \ldots ,{y_{n - 1}}} \right) - h\left( {{y_0}, \ldots ,{y_{n - 1}}\left| {{u_0}, \ldots ,{u_{n - 1}}} \right.} \right)}}{n} \\
&\ge \frac{{h\left( {{y_0}, \ldots ,{y_{n - 1}}} \right)}}{n} - \frac{1}{2}\log \left( {2\pi e\sigma _0^2{{\left\| c \right\|}^2}} \right) + \frac{{\left( {n - 1} \right)}}{2}\log \frac{1}{{\left\| A \right\|}}
\end{aligned}
\end{equation} However, since the system under consideration is stable with $\left\| A \right\| < 1$ (automatic in the first-order case, where $\left\| A \right\| = \left| A \right|$), we have $\log \frac{1}{{\left\| A \right\|}} > 0$. Letting $n \to \infty $ in (\ref{Iyu_ineuqal}) then gives $\bar I\left( {y;u} \right) = \infty $, which coincides with the first-order case.
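The bound chain above can be sketched numerically (the matrix $A$, vector $c$, and $\sigma_0$ below are illustrative assumptions): the variance of $cA^ix_0$ with $x_0 \sim N(0, \sigma_0^2 I)$ is $\sigma_0^2\|cA^i\|^2$, and $\|cA^i\| \le \|c\|\|A\|^i$ gives the per-term entropy bound of (\ref{hCAX0_inequal}).

```python
import numpy as np

# Verify h(c A^i x_0) <= (1/2) log(2 pi e sigma0^2 ||c||^2) + i log||A||
# for an illustrative stable A with spectral norm ||A|| = 0.8 < 1.
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
A /= np.linalg.norm(A, 2) / 0.8      # rescale so the spectral norm is 0.8
c = rng.standard_normal(3)           # the row vector c
sigma0 = 1.5

h_vals, bounds = [], []
for i in range(6):
    cAi = c @ np.linalg.matrix_power(A, i)
    K_i = sigma0 ** 2 * (cAi @ cAi)          # variance of c A^i x_0
    h_vals.append(0.5 * np.log(2 * np.pi * np.e * K_i))
    bounds.append(0.5 * np.log(2 * np.pi * np.e * sigma0 ** 2 * (c @ c))
                  + i * np.log(0.8))
print(h_vals[-1], bounds[-1])   # each Gaussian entropy sits below its bound
```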


Original source: http://www.cnblogs.com/aujun/p/4391179.html
