In a multivariate evolutionary system, the present state of a variable is the outcome of all interacting variables through the temporal history of the system. How can we quantify the information transferred from the history of all variables to the outcome of a specific variable at a specific time? We develop information-theoretic metrics to quantify the information transfer from the entire history, called causal history. We partition this causal history into an immediate causal history, as a function of lag τ from the present, to capture the influence of recent dynamics, and the complementary distant causal history. Each of these influences is further decomposed into self- and cross-feedbacks. By employing the Markov property of directed acyclic time-series graphs, we reduce the dimensionality of the proposed information-theoretic measures to facilitate an efficient estimation algorithm. This approach further reveals an information aggregation property: the information from historical dynamics is accumulated at the preceding time directly influencing the present state of the variable(s) of interest. These formulations allow us to analyze complex inter-dependencies in unprecedented ways. We illustrate our approach by: (1) characterizing memory dependency through analysis of a synthetic system with short memory; (2) distinguishing our measures from traditional methods such as lagged mutual information using the Lorenz chaotic model; (3) comparing the memory dependencies of two long-memory processes, with and without a strange attractor, using the Lorenz model and a linear Ornstein-Uhlenbeck process; and (4) illustrating how dynamics in a complex system are sustained through the interactive contribution of self- and cross-dependencies in both immediate and distant causal histories, using the Lorenz model and observed stream chemistry data known to exhibit 1/f long memory.
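As a minimal numerical sketch of the short-memory baseline referred to above, the snippet below estimates lagged mutual information I(X_t; X_{t−τ}) for a synthetic AR(1) process. This is not the paper's estimator: it uses the Gaussian closed form I = −½ ln(1 − ρ²), which is exact only for jointly Gaussian variables, and the AR(1) system and all parameter values are assumed for illustration. It shows how information about the present carried by increasingly distant history decays for a short-memory process.

```python
import numpy as np

def lagged_mutual_info(x, tau):
    """Gaussian approximation of I(X_t; X_{t-tau}) in nats:
    I = -0.5 * ln(1 - rho^2), exact for jointly Gaussian processes."""
    rho = np.corrcoef(x[tau:], x[:-tau])[0, 1]
    return -0.5 * np.log(1.0 - rho**2)

# Synthetic short-memory system (assumed for illustration):
# AR(1) with coefficient a, x_t = a * x_{t-1} + noise.
rng = np.random.default_rng(0)
n, a = 200_000, 0.6
x = np.empty(n)
x[0] = rng.standard_normal()
for t in range(1, n):
    x[t] = a * x[t - 1] + rng.standard_normal()

# For AR(1), the lag-tau autocorrelation is a**tau, so the lagged MI
# decays roughly geometrically: the distant history carries almost no
# additional information about the present state.
mi = [lagged_mutual_info(x, tau) for tau in (1, 2, 5, 10)]
```

For this AR(1) system the lag-1 value is near −½ ln(1 − 0.6²) ≈ 0.22 nats, and the sequence shrinks toward zero with lag, which is the signature of short memory that the causal-history measures are designed to generalize beyond pairwise lags.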