MMSE Minimum Mean Square Error


Minimum Mean Square Error (MMSE) estimation is a statistical approach widely used in signal processing and communication systems to estimate unknown quantities such as signal amplitudes, signal frequencies, or channel characteristics. It is based on the principle of minimizing the mean square error between the true value of a parameter and its estimate, and it is particularly useful in systems where noise or other forms of interference can affect the accuracy of the estimates.

MMSE is often used in the context of linear estimation problems, where the unknown parameter can be represented as a linear combination of a set of observed random variables. For example, in a communication system, the received signal can be modeled as a linear combination of the transmitted symbols weighted by the channel coefficients, which are unknown. The MMSE estimator is then used to estimate the channel coefficients based on the received signal.

The MMSE estimator is derived by minimizing the expected value of the squared difference between the true value of the parameter and its estimate; this expectation is known as the mean square error (MSE), and the MMSE estimator is, by definition, the estimator that minimizes it.

Let's assume that we have a random vector x, which is the true value of the parameter we want to estimate. We also have a set of observed random variables y1, y2, ..., yn, collected into a vector y, which are related to x through a linear relationship of the form:

y = Ax + n

where A is a known matrix of coefficients and n is a zero-mean noise vector, uncorrelated with x, that represents the random variations and uncertainties in the system. Our goal is to estimate x given y.
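As a concrete illustration, here is a minimal numerical sketch of this model using numpy. The dimensions, prior, and noise level (m, p, C_x, C_n) are arbitrary choices for the example, not anything prescribed by the method:

    import numpy as np

    rng = np.random.default_rng(0)

    m, p = 8, 3                          # observations and unknown parameters
    A = rng.normal(size=(m, p))          # known coefficient matrix

    x_mean = np.zeros(p)                 # prior mean of x
    C_x = np.eye(p)                      # prior covariance of x
    C_n = 0.1 * np.eye(m)                # noise covariance

    x = rng.multivariate_normal(x_mean, C_x)         # true parameter (one draw)
    n = rng.multivariate_normal(np.zeros(m), C_n)    # noise
    y = A @ x + n                                    # observed vector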

The MMSE estimator of x, denoted by x̂, is the estimator that minimizes the MSE:

MSE = E[(x - x̂)'(x - x̂)]

where E is the expectation operator and ' denotes the transpose (for a scalar parameter this reduces to E[(x - x̂)²]).
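When the expectation cannot be evaluated in closed form, it can be approximated by averaging over many independent draws. A small helper along those lines, reusing the setup from the previous sketch (the estimator argument is any function mapping y to an estimate of x):

    def empirical_mse(estimator, trials=10_000):
        """Approximate E[(x - x_hat)'(x - x_hat)] by Monte Carlo averaging."""
        total = 0.0
        for _ in range(trials):
            x = rng.multivariate_normal(x_mean, C_x)
            n = rng.multivariate_normal(np.zeros(m), C_n)
            y = A @ x + n
            total += np.sum((x - estimator(y)) ** 2)
        return total / trials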

To derive the MMSE estimator, we need to find the estimator that minimizes the MSE. Let's define the estimation error as:

e = x - x̂

We can rewrite the MSE as:

MSE = E[e'e]

Since x̂ must be computed from the observations, we restrict attention to estimators that are linear (affine) functions of y:

x̂ = Wy + b

where W is a weight matrix and b is a constant vector, both to be chosen. The estimation error is then:

e = x - x̂ = x - Wy - b

We can now substitute e into the expression for the MSE. Setting the derivative of the MSE with respect to b to zero gives b = E[x] - W E[y], so the error has zero mean. Writing x̃ = x - E[x] and ỹ = y - E[y] for the centered variables, the MSE becomes:

MSE = E[(x̃ - Wỹ)'(x̃ - Wỹ)] = tr(Cx) - 2 tr(W Cyx) + tr(W Cyy W')

where Cx = E[x̃x̃'], Cyx = E[ỹx̃'], and Cyy = E[ỹỹ'] are covariance matrices, and we have used the linearity of the expectation operator.

To find the MMSE estimator, we need the value of W that minimizes the MSE. We can do this by taking the derivative of the MSE with respect to W and setting it to zero:

dMSE/dW = 2W Cyy - 2Cxy = 0

where Cxy = Cyx' = E[x̃ỹ']. Solving for W, we get:

W = Cxy Cyy⁻¹

and therefore:

x̂ = E[x] + Cxy Cyy⁻¹ (y - E[y])

For the model y = Ax + n, the fact that n is zero-mean and uncorrelated with x gives E[y] = A E[x], Cxy = Cx A', and Cyy = A Cx A' + Cn, where Cn is the noise covariance. Substituting:

x̂ = E[x] + Cx A' (A Cx A' + Cn)⁻¹ (y - A E[x])

This is the MMSE estimator of x, which is a linear function of the observed vector y.
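Continuing the numerical sketch from above, the estimator follows directly from this formula (np.linalg.inv is used for brevity; in practice one would solve the linear system instead of forming the inverse):

    # LMMSE estimate: x_hat = E[x] + C_x A' (A C_x A' + C_n)^{-1} (y - A E[x])
    C_yy = A @ C_x @ A.T + C_n           # covariance of the observations
    C_xy = C_x @ A.T                     # cross-covariance between x and y
    W = C_xy @ np.linalg.inv(C_yy)       # optimal weight matrix

    def lmmse(y):
        return x_mean + W @ (y - A @ x_mean)

    print("true x:  ", x)
    print("estimate:", lmmse(y))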

The MMSE estimator derived above is the best linear estimator of x: among all estimators that are linear (affine) functions of y, it achieves the smallest mean square error. Moreover, because b was chosen so that E[x̂] = E[x], the estimation error has zero mean. In this sense, the MMSE estimator is the most efficient linear estimator in terms of minimizing the estimation error.

To see why the MMSE estimator is optimal in this class, let's consider an arbitrary linear estimator, denoted by x̂*. For simplicity, take x to be a scalar. Like x̂, this estimator is a linear function of the observed vector y:

x̂* = a'y + b

where a is a vector of coefficients (a' is its transpose) and b is a constant. We can calculate the MSE of this estimator as:

MSE* = E[(x - x̂*)²] = E[(x - a'y - b)²]

We can minimize this MSE by taking the derivatives with respect to a and b and setting them to zero:

dMSE*/da = -2E[(x - a'y - b)y] = 0

dMSE*/db = -2E[x - a'y - b] = 0

Solving these equations for a and b (the second equation gives b = E[x] - a'E[y]; substituting this into the first yields a linear system in a), we get:

a = Cyy⁻¹ cyx,    b = E[x] - a'E[y]

where Cyy = E[yy'] - E[y]E[y'] is the covariance matrix of y and cyx = E[yx] - E[y]E[x] is the cross-covariance vector between y and x.

Substituting these values of a and b into the expression for x̂*, we get:

x̂* = E[x] + cyx'Cyy⁻¹(y - E[y])

which is exactly the MMSE estimator x̂ derived earlier, specialized to a scalar parameter.
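When the moments E[y], E[yy'], and E[yx] are not known in advance, a and b can be estimated from sample statistics. A sketch for a scalar parameter, here arbitrarily taking the first component of x from the earlier simulation setup:

    # Estimate a and b for the first component of x from simulated (x, y) pairs.
    N = 50_000
    xs = rng.multivariate_normal(x_mean, C_x, size=N)                  # (N, p)
    ys = xs @ A.T + rng.multivariate_normal(np.zeros(m), C_n, size=N)  # (N, m)
    x0 = xs[:, 0]                        # scalar parameter for this example

    C_yy_hat = np.cov(ys, rowvar=False)  # sample covariance of y
    c_yx_hat = (ys - ys.mean(axis=0)).T @ (x0 - x0.mean()) / (N - 1)

    a = np.linalg.solve(C_yy_hat, c_yx_hat)   # a = Cyy^{-1} cyx
    b = x0.mean() - a @ ys.mean(axis=0)       # b = E[x] - a'E[y]
    print("estimate of x[0]:", a @ y + b, "  true:", x[0])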

Finally, consider any other linear estimator x̂* = a'y + b whose coefficients are not the minimizing ones. The error of the optimal estimator, e = x - x̂, is zero-mean and uncorrelated with y, and hence uncorrelated with any linear function of y, including the difference x̂ - x̂*. The MSE of x̂* therefore decomposes as:

MSE* = E[(x - x̂*)²]
     = E[((x - x̂) + (x̂ - x̂*))²]
     = E[(x - x̂)²] + 2E[(x - x̂)(x̂ - x̂*)] + E[(x̂ - x̂*)²]
     = MSE + E[(x̂ - x̂*)²]

because the cross term vanishes. Since E[(x̂ - x̂*)²] ≥ 0, we have MSE* ≥ MSE, with equality only when x̂* coincides with x̂. This confirms that the MMSE estimator has the smallest MSE among all linear estimators of x.
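This inequality can be checked numerically by comparing the empirical MSE of the MMSE estimator against any other linear estimator, for example the ordinary least-squares estimate x̂ = A⁺y, which ignores the prior on x. Using the empirical_mse helper sketched earlier:

    # Compare against ordinary least squares, which ignores the prior on x.
    A_pinv = np.linalg.pinv(A)

    print("MMSE estimator MSE: ", empirical_mse(lmmse))
    print("least-squares MSE:  ", empirical_mse(lambda y: A_pinv @ y))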

In summary, the MMSE estimator is a linear estimator that minimizes the mean square error between the true value of a parameter and its estimate. It is derived by minimizing the expected value of the squared estimation error, which is a function of the observed vector and the unknown parameter. Among all linear estimators of the unknown parameter, the MMSE estimator achieves the smallest mean square error. It is widely used in signal processing and communication systems to estimate unknown parameters in the presence of noise and other forms of interference.