Capacity of a SISO system over a fading channel

As described in the previous article, a MIMO system is used to increase the capacity of a communication link dramatically and also to improve its quality. Increased capacity is obtained by spatial multiplexing, and improved quality is obtained by diversity techniques (space-time coding). Capacity equations of a MIMO system over a variety of channels (AWGN, fading channels) are of primary importance. It is desirable to know the capacity improvement offered by a MIMO system over the capacity of a SISO system.

To begin with, we will look into the capacity equations for a conventional SISO system over AWGN and fading channels, followed by the capacity equations for MIMO systems. As a prerequisite, readers are encouraged to go through the detailed discussion on channel capacity and Shannon's theorem.

First, a few definitions are needed.

Entropy

The average amount of information per symbol (measured in bits/symbol) is called entropy. Given a set of N discrete information symbols – represented as a random variable X \in \{x_1,x_2,...,x_{N} \} with probabilities denoted by the probability mass function p(x)=\{p_1,p_2,...,p_N \} – the entropy of X is given by

H(X) = -\sum_{i=1}^{N} p_i \; log_2 \left( p_i \right) \quad\quad\quad\quad (1)

Entropy is a measure of the uncertainty of a random variable X, and therefore reflects the amount of information required on average to describe it. In general, it has the following bounds

0 \leq H(X) \leq log_2(N) \quad\quad\quad\quad (2)

Entropy hits the lower bound of zero (no uncertainty, therefore no information) for a completely deterministic system (one symbol occurs with probability p_i=1). It reaches the upper bound when the input symbols x_i are equiprobable.
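As a quick numerical illustration of equation (1) and the bounds in (2), here is a minimal Python sketch (the helper name `entropy` is my own, not from the article):

```python
import numpy as np

def entropy(p):
    """Entropy H(X) = -sum(p_i * log2(p_i)) in bits/symbol, per equation (1)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # convention: 0 * log2(0) = 0
    return float(-np.sum(p * np.log2(p)))

print(entropy([1.0, 0.0, 0.0, 0.0]))      # deterministic source: lower bound of (2)
print(entropy([0.25, 0.25, 0.25, 0.25]))  # equiprobable symbols: upper bound log2(4)
```

The deterministic PMF gives zero entropy, while the equiprobable PMF attains log2(N) bits/symbol.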

Capacity and mutual information

The following figure represents a discrete memoryless channel (the noise term corrupts the input symbols independently), where the input and output are represented as random variables X and Y respectively. Statistically, such a channel can be expressed by transition or conditional probabilities. That is, given a set of inputs to the channel, the probability of observing the output of the channel is expressed as the conditional probability p(Y|X).

Figure: discrete memoryless channel

For such a channel, the mutual information I(X;Y) denotes the amount of information that one random variable contains about the other random variable.

I(X;Y) = H(X) - H(X|Y) \quad\quad\quad\quad (3)

H(X) is the amount of information in X before observing Y, and thus the above quantity can be seen as the reduction in the uncertainty of X from the observation of Y.

The information capacity C is obtained by maximizing this mutual information taken over all possible input distributions p(x) [1].

C = \max_{p(x)} I(X;Y) \quad\quad\quad\quad (4)
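To make the maximization in (4) concrete, here is a sketch for a binary symmetric channel (an assumed example, not from the article; names `Hb`, `mutual_info_bsc`, `eps` are mine). It evaluates I(X;Y) over a grid of input distributions and picks the maximum, which lands at the uniform input:

```python
import numpy as np

def Hb(p):
    """Binary entropy function in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return float(-p*np.log2(p) - (1 - p)*np.log2(1 - p))

def mutual_info_bsc(q, eps):
    """I(X;Y) = H(Y) - H(Y|X) for a BSC: P(X=1) = q, crossover probability eps."""
    py1 = q*(1 - eps) + (1 - q)*eps   # P(Y=1)
    return Hb(py1) - Hb(eps)          # H(Y|X) = Hb(eps) regardless of the input

eps = 0.1
I = [mutual_info_bsc(q, eps) for q in np.linspace(0, 1, 101)]
print(max(I))   # capacity C = 1 - Hb(eps), attained at q = 0.5
```

For the BSC the maximizing input distribution is uniform, recovering the well-known C = 1 - Hb(eps).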

SISO fading Channel

A SISO fading channel can be represented as the convolution of the complex channel impulse response (represented as a random variable h) and the input x.

y = h \ast x + n \quad\quad\quad\quad (5)

Here, n is complex baseband additive white Gaussian noise, and the above equation is for a single realization of the complex output y. If the channel is assumed to be flat fading or of block fading type (the channel does not vary over a block of symbols), the above equation can be simply written without the convolution operation (refer this article to know how equations (5) & (6) are equivalent for a flat-fading channel).

y = h x + n \quad\quad\quad\quad (6)

For different communication fading channels, the channel impulse response can be modeled using various statistical distributions. Some of the common distributions are Rayleigh, Rician, Nakagami-m, etc.
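For instance, Rayleigh fading realizations of h can be generated as circularly symmetric complex Gaussians normalized to unit average power (a sketch in Python, assuming E[|h|^2] = 1):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100000

# Rayleigh fading: h = (x + jy)/sqrt(2) with x, y ~ N(0,1), so E[|h|^2] = 1
h = (rng.standard_normal(N) + 1j*rng.standard_normal(N)) / np.sqrt(2)

print(np.mean(np.abs(h)**2))   # average channel power, close to 1
print(np.mean(np.abs(h)))      # mean envelope of the Rayleigh-distributed |h|
```

Here |h| follows a Rayleigh distribution and |h|^2 an exponential distribution, which is what the capacity averages below operate on.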

Capacity with transmit power constraint

Now, we would like to evaluate the capacity for the most practical scenario, where the average power, given by P = \sigma_x^2, that can be expended at the transmitter is limited to P_t. Thus, the channel capacity is now constrained by this average transmit power, given as

C = \max_{p(x) \;,\; P \leq P_t} I(X;Y) \quad\quad\quad\quad (7)

For the further derivations, we assume that the receiver possesses perfect knowledge about the channel. Furthermore, we assume that the input random variable X is independent of the noise N and that the noise is zero-mean complex Gaussian distributed with variance \sigma_n^2 – i.e., N \sim \mathcal{CN}(0,\sigma_n^2).

Note that both the input symbols x and the output symbols y take continuous values upon transmission and reception, while the values are discrete in time (continuous-input continuous-output discrete memoryless channel – CCMC). For such continuous random variables, the differential entropy H_d(\cdot) is considered. Expressing the mutual information in terms of differential entropy,

I(X;Y) = H_d(Y) - H_d(Y|X) = H_d(Y) - H_d(hX + N|X) \quad\quad\quad\quad (8)


Since it is assumed that the channel is perfectly known at the receiver, the uncertainty of hX conditioned on X is zero, i.e., H_d(hX|X)=0. Furthermore, since the noise N is independent of the input X, H_d(N|X) = H_d(N). Thus, the mutual information is

I(X;Y) =H_d(Y) - H_d(N) \quad\quad\quad\quad (9)

For complex Gaussian noise N with mean \mu_n and variance \sigma_n^2, i.e., N \sim \mathcal{CN}(\mu_n,\sigma_n^2), the PDF of the noise is given by

f_N(n) = \frac{1}{\pi \sigma_n^2} exp\left[ - \frac{\left | n - \mu_n \right |^2}{\sigma_n^2}\right] \quad\quad\quad\quad (10)

The differential entropy for the noise H_d(N) is given by

H_d(N) = log_2 \left( \pi e \sigma_n^2 \right) \quad\quad\quad\quad (11)
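The closed form H_d(N) = log_2(\pi e \sigma_n^2) can be sanity-checked by Monte Carlo, estimating E[-log_2 f_N(n)] from samples of the complex Gaussian PDF above (a sketch; the helper name `hd_estimate` and the value of sigma2 are arbitrary choices of mine). Note that the estimate comes out the same whether the mean is zero or not:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma2 = 2.0   # noise variance sigma_n^2, an arbitrary choice for the check

def hd_estimate(mu, n_samples=200000):
    """Monte Carlo estimate of H_d(N) = E[-log2 f_N(n)] for complex Gaussian noise."""
    n = mu + np.sqrt(sigma2/2)*(rng.standard_normal(n_samples)
                                + 1j*rng.standard_normal(n_samples))
    f = np.exp(-np.abs(n - mu)**2 / sigma2) / (np.pi*sigma2)
    return np.mean(-np.log2(f))

closed_form = np.log2(np.pi*np.e*sigma2)
# both estimates agree with log2(pi*e*sigma2), independent of the mean
print(hd_estimate(0.0), hd_estimate(3 + 4j), closed_form)
```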

This shows that the differential entropy does not depend on the mean of N. Therefore, it is immune to translations (shifting of the mean value) of the PDF. For the problem of computing the capacity,

C = \max_{p(x) \;,\; P \leq P_t} I(X;Y) = \max_{p(x) \;,\; P \leq P_t} \left[H_d(Y) - H_d(N) \right] \quad\quad\quad\quad (12)

and given the differential entropy H_d(N), the mutual information I(X;Y)= H_d(Y) - H_d(N) is maximized by maximizing the differential entropy H_d(Y). Gaussian random variables are themselves differential-entropy maximizers; therefore, the mutual information is maximized when Y is also Gaussian, and the differential entropy is H_d(Y)=log_2(\pi e \sigma_y^2), where the average received power is given by

\sigma_y^2 = E[\left | Y \right |^2] = E[(hX+N)(hX+N)^*] = \sigma_x^2 \left | h \right |^2 + \sigma_n^2 \quad\quad\quad\quad (13)

Thus the capacity is given by

C = H_d(Y) - H_d(N) = log_2 \left( \pi e \sigma_y^2 \right) - log_2 \left( \pi e \sigma_n^2 \right) = log_2 \left( 1 + \frac{\sigma_x^2 \left | h \right |^2}{\sigma_n^2} \right) \quad\quad\quad\quad (14)

Representing the received signal-to-noise ratio as \gamma = \frac{P_t}{\sigma_n^2} \left | h \right |^2, the capacity of a SISO system over a fading channel is given by

\boxed{C=log_2(1+\gamma)=log_2 \left ( 1 + \frac{P_t}{\sigma_n^2} \left | h \right |^2 \right ) }
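The boxed formula is a one-liner to evaluate; a minimal sketch, taking the SNR \gamma in dB (the function name is mine):

```python
import numpy as np

def siso_capacity(snr_db):
    """C = log2(1 + gamma) in bits/s/Hz, gamma = received SNR in linear scale."""
    return np.log2(1 + 10**(snr_db/10))

for snr_db in (0, 10, 20):
    print(snr_db, "dB ->", siso_capacity(snr_db), "bits/s/Hz")
```

At 0 dB (\gamma = 1) the capacity is exactly 1 bit/s/Hz, and at high SNR each additional 3 dB buys roughly one more bit/s/Hz.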

For the fading channel considered above, the channel term h is modeled as a random variable. Thus, the capacity equation above is also a random variable, and for fading channels two different capacities can be defined.

Ergodic Capacity

Ergodic capacity is defined as the statistical average of the mutual information, where the expectation is taken over \left | h \right |^2

\boxed{C_{erg}=\mathbb{E} \left\{ log_2 \left ( 1 + \frac{P_t}{\sigma_n^2} \left | h \right |^2 \right ) \right\} }
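The expectation over |h|^2 can be approximated by Monte Carlo averaging over many channel realizations; a sketch assuming Rayleigh fading with E[|h|^2] = 1 (a full Matlab simulation follows in the next article of this series):

```python
import numpy as np

rng = np.random.default_rng(7)

def ergodic_capacity(snr_db, n_trials=200000):
    """Monte Carlo average of log2(1 + snr*|h|^2) over Rayleigh fading, E[|h|^2] = 1."""
    snr = 10**(snr_db/10)
    h = (rng.standard_normal(n_trials) + 1j*rng.standard_normal(n_trials))/np.sqrt(2)
    return np.mean(np.log2(1 + snr*np.abs(h)**2))

c_erg = ergodic_capacity(10)
c_awgn = np.log2(1 + 10**(10/10))   # AWGN capacity at the same average SNR
print(c_erg, c_awgn)                # c_erg < c_awgn: fading costs capacity (Jensen)
```

Since log2 is concave, Jensen's inequality guarantees the ergodic capacity of the fading channel stays below the AWGN capacity at the same average SNR.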

Outage Capacity

The outage capacity C_{out, q\%} is defined as the information rate below which the instantaneous mutual information falls with a prescribed probability, expressed as a percentage q\%.

\boxed{ Pr \left( log_2 \left [ 1 + \frac{P_t}{\sigma_n^2} \left | h \right |^2 \right ] < C_{out,q \%} \right) = q\%}
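Numerically, the outage capacity is simply the q-th percentile of the instantaneous capacity across channel realizations; a sketch assuming Rayleigh fading with E[|h|^2] = 1 (the function name is mine):

```python
import numpy as np

rng = np.random.default_rng(3)

def outage_capacity(snr_db, q_percent, n_trials=200000):
    """Empirical C_out: Pr(log2(1 + snr*|h|^2) < C_out) = q%, Rayleigh fading."""
    snr = 10**(snr_db/10)
    h = (rng.standard_normal(n_trials) + 1j*rng.standard_normal(n_trials))/np.sqrt(2)
    inst = np.log2(1 + snr*np.abs(h)**2)           # instantaneous mutual information
    return np.percentile(inst, q_percent)           # q-th percentile = C_out

c_out = outage_capacity(10, 10)
print(c_out)   # 10% outage capacity at 10 dB SNR
```

For Rayleigh fading |h|^2 is exponentially distributed, so this empirical percentile can be cross-checked against the closed form log2(1 - \gamma \ln(1 - q/100)).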

Continue reading on simulating ergodic capacity of a SISO channel…


References:

[1] Andrea J. Goldsmith and Pravin P. Varaiya, “Capacity, mutual information, and coding for finite-state Markov channels,” IEEE Transactions on Information Theory, Vol. 42, No. 3, May 1996.

Articles in this series
[1] Introduction to Multiple Antenna Systems
[2] MIMO - Diversity and Spatial Multiplexing
[3] Characterizing a MIMO channel - Channel State Information (CSI) and Condition number
[4] Capacity of a SISO system over a fading channel
[5] Ergodic Capacity of a SISO system over a Rayleigh Fading channel - Simulation in Matlab
[6] Capacity of a MIMO system over Fading Channels
[7] Single Input Multiple Output (SIMO) models for receive diversity
[8] Receiver diversity - Selection Combining
[9] Receiver diversity – Maximum Ratio Combining (MRC)

