TY - JOUR
T1 - Causality detection based on information-theoretic approaches in time series analysis
AU - Hlaváčková-Schindler, Katerina
AU - Paluš, Milan
AU - Vejmelka, Martin
AU - Bhattacharya, Joydeep
N1 - K.H.-S. was supported by a grant of the Austrian Research Fund FWF-H-226 (2005) under the Charlotte Bühler Program and partly by ASCR 1ET 100 750 401, Project Baddyr. M.P. and M.V. were supported by the EC FP6 project BRACCIA (Contract No. 517133 NEST) and partly by the Institutional Research Plan AV0Z10300504. J.B. was supported by the JST ERATO Shimojo project.
Publisher Copyright:
© 2007 Elsevier B.V. All rights reserved.
PY - 2007/3
Y1 - 2007/3
N2 - Synchronization, a basic nonlinear phenomenon, is widely observed in diverse complex systems studied in the physical, biological and other natural sciences, as well as in the social sciences, economics and finance. When studying such complex systems, it is important not only to detect synchronized states but also to identify causal relationships (i.e. who drives whom) between the (sub)systems concerned. Knowledge of information-theoretic measures (e.g. mutual information, conditional entropy) is essential for analysing the information flow between two systems or between the constituent subsystems of a complex system. However, estimating these measures from a finite set of samples is not trivial. The extensive current literature on entropy and mutual information estimation provides a wide variety of approaches: approximation-statistical methods that study the rate of convergence or consistency of an estimator for a general distribution, learning algorithms operating on a partitioned data space, and heuristic approaches. The aim of this paper is to provide a detailed overview of information-theoretic approaches to measuring causal influence in multivariate time series, with a focus on diverse approaches to entropy and mutual information estimation.
AB - Synchronization, a basic nonlinear phenomenon, is widely observed in diverse complex systems studied in the physical, biological and other natural sciences, as well as in the social sciences, economics and finance. When studying such complex systems, it is important not only to detect synchronized states but also to identify causal relationships (i.e. who drives whom) between the (sub)systems concerned. Knowledge of information-theoretic measures (e.g. mutual information, conditional entropy) is essential for analysing the information flow between two systems or between the constituent subsystems of a complex system. However, estimating these measures from a finite set of samples is not trivial. The extensive current literature on entropy and mutual information estimation provides a wide variety of approaches: approximation-statistical methods that study the rate of convergence or consistency of an estimator for a general distribution, learning algorithms operating on a partitioned data space, and heuristic approaches. The aim of this paper is to provide a detailed overview of information-theoretic approaches to measuring causal influence in multivariate time series, with a focus on diverse approaches to entropy and mutual information estimation.
KW - Causality
KW - Entropy
KW - Estimation
KW - Mutual information
UR - https://www.scopus.com/pages/publications/33947524701
UR - https://www.sciencedirect.com/science/article/abs/pii/S0370157307000403?via%3Dihub
U2 - 10.1016/j.physrep.2006.12.004
DO - 10.1016/j.physrep.2006.12.004
M3 - Review article
AN - SCOPUS:33947524701
SN - 0370-1573
VL - 441
SP - 1
EP - 46
JO - Physics Reports
JF - Physics Reports
IS - 1
ER -