CLMI (Closed Loop Mutual Information)
Closed Loop Mutual Information (CLMI) is a measure of causal interaction between two time series that influence one another in a feedback loop. It quantifies how much information is transferred between two causally connected systems: in simpler terms, how strongly one system influences the other and how much information the two share.
To understand CLMI, it is important to first understand mutual information (MI). MI is a statistical measure of the dependence between two random variables: it quantifies the amount of information that can be obtained about one variable by observing the other. Mutual information is built on the idea of entropy, which is a measure of the uncertainty or randomness of a system.
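As a concrete illustration, mutual information can be computed from the identity MI(X; Y) = H(X) + H(Y) - H(X, Y), where H denotes entropy. The short Python sketch below does exactly that for a small, made-up joint distribution of two binary variables (the probability table is purely illustrative):

import numpy as np

def entropy(p):
    # Shannon entropy in bits of a probability vector (zero entries are ignored).
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical joint distribution of two binary variables X and Y.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

p_x = p_xy.sum(axis=1)   # marginal distribution of X
p_y = p_xy.sum(axis=0)   # marginal distribution of Y

# Mutual information via the identity MI(X; Y) = H(X) + H(Y) - H(X, Y).
mi = entropy(p_x) + entropy(p_y) - entropy(p_xy.flatten())
print(f"MI(X; Y) = {mi:.3f} bits")

A value of zero would mean the two variables are independent; larger values mean that observing one variable tells us more about the other.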
In the context of time series data, mutual information can be used to quantify the degree of dependence between two time series. This can be useful in many applications, such as studying the relationships between financial market variables, analyzing brain activity, or modeling environmental systems.
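In practice the true distributions are unknown, so mutual information between two time series has to be estimated from samples. A common and simple approach is to discretize the values into bins and count co-occurrences; the sketch below does this with a 2-D histogram on synthetic autoregressive data (the coefficients, series length, and bin count are arbitrary choices for illustration, and the histogram estimator is only one of many possible estimators):

import numpy as np

def binned_mi(x, y, bins=16):
    # Histogram-based mutual information estimate (in nats) between two 1-D series.
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy = joint / joint.sum()
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal of x, as a column vector
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal of y, as a row vector
    nonzero = p_xy > 0
    return np.sum(p_xy[nonzero] * np.log(p_xy[nonzero] / (p_x @ p_y)[nonzero]))

rng = np.random.default_rng(0)
n = 5000
x = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):
    x[t] = 0.8 * x[t - 1] + rng.normal()
    y[t] = 0.5 * x[t - 1] + rng.normal()   # y depends on x's past, so the series share information

print("estimated MI(x; y):", binned_mi(x, y))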
However, in many cases, the relationship between two time series is not one-way. Instead, the two systems interact with each other in a feedback loop: the behavior of each system is influenced by the other, making the relationship more complex than simple one-directional cause and effect. In such cases, mutual information alone may not be sufficient to capture the full extent of the causal interactions between the two systems.
This is where CLMI comes in. CLMI is a modification of mutual information that takes into account the feedback loop between the two systems. It is a way to quantify the degree of causality between two time series when the causal relationship is bidirectional.
To compute CLMI, the time series data from the two systems are divided into two parts: the past and the future. The past includes all data up to a certain point in time, and the future includes all data after that point. CLMI is then calculated as the difference between two quantities: the mutual information between system 1's future and its own past, and the mutual information between system 1's future and system 2's past. This can be expressed mathematically as:
CLMI = MI(S1_future; S1_past) - MI(S1_future; S2_past)
where S1 and S2 are the two time series, MI(S1_future; S1_past) is the mutual information between system 1's future and its own past, and MI(S1_future; S2_past) is the mutual information between system 1's future and system 2's past.
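A minimal sketch of this calculation, reusing the binned_mi helper defined earlier, is shown below. It uses a one-step embedding, so each past value is the sample at time t and each future value is the sample at time t + lag; the lag, bin count, and histogram estimator are not part of the definition above and are chosen here only for illustration:

def clmi(s1, s2, lag=1, bins=16):
    # CLMI = MI(S1_future; S1_past) - MI(S1_future; S2_past), following the formula above.
    s1_future = s1[lag:]    # system 1 "future": values shifted forward by `lag`
    s1_past = s1[:-lag]     # system 1 "past": values up to the split point
    s2_past = s2[:-lag]     # system 2 "past", aligned with s1_past
    return binned_mi(s1_future, s1_past, bins) - binned_mi(s1_future, s2_past, bins)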
The intuition behind the CLMI formula is that it compares how well system 1's future is explained by its own past with how well it is explained by system 2's past. When MI(S1_future; S2_past) is large relative to MI(S1_future; S1_past), so that the difference is small or negative, system 2's history carries substantial information about where system 1 is heading, indicating influence flowing from system 2 to system 1. A large positive value indicates that system 1's future is accounted for mostly by its own history, with comparatively little information arriving from system 2. Computing the same quantity with the roles of the two systems swapped characterizes the other direction of the loop, and comparing the two values indicates which direction of influence dominates the feedback.
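Continuing the sketch above, the following usage example evaluates CLMI in both directions on a synthetic pair of series in which y drives x with a one-step delay and receives no feedback (the coupling coefficients are made up for illustration):

rng = np.random.default_rng(1)
n = 10_000
x = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.9 * y[t - 1] + rng.normal()                    # y evolves on its own
    x[t] = 0.3 * x[t - 1] + 0.7 * y[t - 1] + rng.normal()   # y's past drives x

print("CLMI with x as system 1:", clmi(x, y))
print("CLMI with y as system 1:", clmi(y, x))

Comparing the two printed values against each other, rather than reading either one in isolation, is what gives an indication of the dominant direction of influence in the loop.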
CLMI has been used in a variety of applications, such as studying the interactions between brain regions, analyzing the dynamics of ecosystems, and modeling financial markets. One of the advantages of CLMI is that it can capture the complex causal relationships between two systems, even when the relationship is bidirectional and involves feedback loops. This can provide valuable insights into the underlying mechanisms that govern the behavior of these systems.
In conclusion, Closed Loop Mutual Information (CLMI) is a metric that quantifies the degree of causal interaction between two time series that are influenced by one another in a feedback loop. CLMI is a modification of mutual information that takes into account the bidirectional nature of the causal relationship, and it can provide valuable insights into the dynamics of complex systems.