CMI (channel mean information)

Introduction:

In the field of communication theory, the performance of a communication channel is usually measured by its channel capacity, which is the maximum rate of reliable data transmission over the channel. However, channel capacity can be a difficult quantity to calculate and is not always a practical measure of performance in real-world scenarios. Channel mean information (CMI) is an alternative measure of performance that can be more easily calculated and is often more relevant to practical communication scenarios.

Definition:

CMI is defined as the average amount of information conveyed per channel use. It is a statistical quantity: the information that the received signal provides about the transmitted message, averaged over all possible messages and channel outcomes.

Mathematical Definition:

The mathematical definition of CMI is as follows:

CMI = H(Y) - H(Y|X)

Where H(Y) is the entropy of the output (received) signal Y, and H(Y|X) is the conditional entropy of the output signal given the input signal X. In other words, CMI is the difference between the entropy of the output signal and the conditional entropy of the output signal given the input signal. For a fixed input distribution, this quantity is exactly the mutual information I(X; Y) between input and output; channel capacity is the maximum of this quantity over all possible input distributions.

The entropy of a signal is a measure of its uncertainty, or the amount of information it carries. The conditional entropy is a measure of the remaining uncertainty in the output signal after the input signal has been observed. Therefore, the CMI can be thought of as the amount of information that is transmitted over the channel, on average, per channel use.
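
To make this definition concrete, the following sketch (in Python; the function name cmi and its arguments are illustrative, not from any standard library) computes CMI directly from an input distribution and a channel transition matrix, assuming a discrete memoryless channel:

    import numpy as np

    def cmi(p_x, p_y_given_x):
        """CMI = H(Y) - H(Y|X), in bits per channel use.

        p_x         -- input distribution, shape (nx,)
        p_y_given_x -- transition matrix, entry [i, j] = P(Y=j | X=i)
        """
        p_x = np.asarray(p_x, dtype=float)
        p_y_given_x = np.asarray(p_y_given_x, dtype=float)

        def entropy(p):
            p = p[p > 0]              # treat 0 log 0 as 0
            return -np.sum(p * np.log2(p))

        p_y = p_x @ p_y_given_x       # marginal output distribution P(Y)
        h_y = entropy(p_y)            # H(Y)
        # H(Y|X) = sum_i P(X=i) * H(Y | X=i), one row per input symbol
        h_y_given_x = np.sum(p_x * np.array([entropy(row) for row in p_y_given_x]))
        return h_y - h_y_given_x

For instance, for the binary symmetric channel discussed in the next section with error probability p = 0.1 and equiprobable inputs, cmi([0.5, 0.5], [[0.9, 0.1], [0.1, 0.9]]) returns approximately 0.531 bits per channel use.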

Explanation:

To understand CMI, it is useful to consider a simple example of a binary communication channel. A binary channel can transmit one of two possible symbols, 0 or 1. Suppose that the channel is noisy, and there is a probability p that the transmitted symbol will be flipped (i.e., a 0 is received when a 1 was transmitted, or vice versa). This is the binary symmetric channel (BSC) with crossover probability p.

Assume the input symbols are equiprobable, i.e., P(X=0) = P(X=1) = 1/2. Because the channel treats the two symbols symmetrically, the output symbols are then also equiprobable, so the entropy of the output signal Y is

H(Y) = -(1/2) log2 (1/2) - (1/2) log2 (1/2) = 1 bit

This is the uncertainty in the received signal before anything is known about the input. Given the input symbol, the output is flipped with probability p and left unchanged with probability 1-p, so the conditional entropy of the output signal given the input signal is

H(Y|X) = H_b(p) = -p log2 p - (1-p) log2 (1-p)

Where H_b(p) is the binary entropy function. For example, with p = 0.1 this gives H(Y|X) ≈ 0.469 bits.

Substituting these expressions into the CMI formula, we obtain:

CMI = H(Y) - H(Y|X) = 1 - H_b(p)

This expression gives the amount of information that is transmitted over the channel, on average, per channel use. It is a function of the error probability: it equals 1 bit per channel use when p = 0 (a noise-free channel) and falls to 0 at p = 1/2, where the output is statistically independent of the input and the channel conveys nothing. (A channel with p close to 1 is again nearly noise-free, since the receiver can simply invert every received bit.)
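
As a numerical check, the short sketch below (assuming the illustrative cmi helper defined earlier is in scope) evaluates the closed form 1 - H_b(p) against the direct computation for several crossover probabilities:

    import numpy as np

    def h_b(p):
        """Binary entropy function H_b(p) in bits, with H_b(0) = H_b(1) = 0."""
        if p in (0.0, 1.0):
            return 0.0
        return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

    for p in [0.0, 0.01, 0.1, 0.25, 0.5]:
        bsc = [[1 - p, p], [p, 1 - p]]    # BSC transition matrix
        print(f"p = {p:<5} CMI = {cmi([0.5, 0.5], bsc):.4f}"
              f"  closed form = {1 - h_b(p):.4f}")

Both computations agree, confirming that the channel delivers a full bit per use at p = 0 and nothing at p = 1/2.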

Importance:

CMI is an important quantity in communication theory because it provides a more practical measure of channel performance than channel capacity. Channel capacity is a theoretical limit that is difficult to achieve in practice: it is approached only with very long code blocks and an input distribution optimized for the channel. In many real-world scenarios, however, the channel is used for a finite number of transmissions, and the goal is to achieve a high level of reliability in each transmission.

CMI is a more relevant measure of performance in these scenarios because it takes into account the uncertainty that noise and other impairments introduce into the received signal. It measures the amount of information actually delivered per channel use under the signalling scheme in use, which is often a more practical figure than the capacity limit.

Furthermore, CMI can be used to compare the performance of different communication systems under the input distributions and signalling schemes they actually use. For example, consider two systems using the same binary alphabet over channels with different noise levels: with equiprobable inputs, a BSC with p = 0.05 has a CMI of about 0.71 bits per use, while one with p = 0.15 achieves only about 0.39, so the quieter channel reliably conveys more information per channel use.

CMI is also useful in designing communication systems that must achieve high levels of reliability. By choosing the input distribution and other system parameters to maximize the CMI, it is possible to design a system that can reliably transmit a desired amount of information over the channel, even in the presence of noise and other sources of uncertainty. Indeed, the maximum of the CMI over all input distributions is, by definition, the channel capacity.
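
As one sketch of this kind of optimization (again assuming the illustrative cmi helper from the definition section), the code below searches over input distributions for a Z-channel, an asymmetric channel in which a transmitted 0 always arrives intact but a transmitted 1 is flipped with probability q; the value of q and all variable names are assumptions chosen for the example:

    import numpy as np

    q = 0.3                                 # P(a transmitted 1 is received as 0)
    z_channel = [[1.0, 0.0],                # X = 0 always arrives correctly
                 [q,   1.0 - q]]            # X = 1 flips to 0 with probability q

    best_a, best_cmi = 0.0, -1.0
    for a in np.linspace(0.0, 1.0, 10001):  # a = P(X = 0)
        value = cmi([a, 1.0 - a], z_channel)
        if value > best_cmi:
            best_a, best_cmi = a, value

    print(f"optimal P(X=0) = {best_a:.3f}, maximum CMI = {best_cmi:.4f} bits/use")

For q = 0.3 the search settles on P(X=0) near 0.58: the optimizer favors the symbol the channel never corrupts. The uniform input, which is optimal for the symmetric channel above, is not optimal here, which is why maximizing CMI is a design step in its own right.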

Finally, CMI, being an instance of mutual information, has applications in many fields beyond communication theory, including statistics and machine learning. It is a fundamental quantity that describes the information content of signals and the statistical dependence between them, and it has many important practical applications in these fields.

Conclusion:

In conclusion, channel mean information (CMI) is a statistical quantity that describes the average amount of information that can be transmitted per channel use. It is a more practical measure of performance than channel capacity and takes into account the uncertainty in the received signal due to noise and other factors.

CMI is an important quantity in communication theory because it provides a measure of the amount of information that can be reliably transmitted over the channel, which is a more relevant measure of performance in many real-world scenarios. It is also useful in designing communication systems that can achieve high levels of reliability and has applications in many fields beyond communication theory.