The lecture will consist of three parts. In the first part we will deal with the identification and analysis of statistical dependencies using graphical models and methods from information theory. This corresponds to the paradigm that the complexity of a system emerges from the interactions of its subsystems. Topics will include Bayesian networks, Granger causality, conditional mutual information and information-theoretic complexity measures. In the second part we will explore a second paradigm for complexity: criticality. Keywords are phase transitions, self-organized criticality and power-law distributions. In the third part we will combine both paradigms by exploring the foundation of slogans such as "Computation at the edge of chaos", i.e. the idea that critical states are states of maximal complexity on the one hand, and that they have particular computational power or adaptability on the other.

The lecture will cover textbook material, but more recent research problems and results will also be discussed.

- Thomas Cover and Joy A. Thomas, Elements of Information Theory, Wiley
- Didier Sornette, Critical Phenomena in Natural Sciences, Springer
- Remo Badii and Antonio Politi, Complexity - Hierarchical structures and scaling in physics, Cambridge University Press

A still readable overview of the early days of the "science of complexity", and in particular the activities at the Santa Fe Institute, is provided by

- M. Mitchell Waldrop, Complexity: The Emerging Science at the Edge of Order and Chaos, Simon and Schuster, New York, 1992. German translation: Inseln im Chaos. Die Erforschung komplexer Systeme, Rowohlt, Hamburg, 1993.

- Introduction
- First lecture.
- Second lecture. An exercise.
- Third lecture. Literature:
- Peter Grassberger, Toward a quantitative theory of self-generated complexity, *International Journal of Theoretical Physics*, **25** (1986), 907-938.
- James P. Crutchfield and David P. Feldman, Regularities unseen, randomness observed: Levels of entropy convergence, *Chaos*, **13** (2003), 25-54.
- William Bialek, Ilya Nemenman and Naftali Tishby, Predictability, complexity, and learning, *Neural Computation*, **13** (2001), 2409-2463.
- Peter Grassberger, Entropy estimates from insufficient samplings, arXiv: physics/0307138.
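The entropy-convergence ideas in these papers can be tried out directly. Below is a minimal Python sketch (the function and variable names are my own, not from the lecture): it estimates block entropies H_n of a binary string and the conditional entropies h_n = H_n - H_{n-1}, whose slow convergence to the entropy rate is what Grassberger's effective measure complexity and Crutchfield/Feldman's excess entropy quantify.

```python
import math
import random
from collections import Counter

def block_entropy(s, n):
    """Shannon entropy (in bits) of the length-n blocks occurring in s."""
    blocks = [s[i:i + n] for i in range(len(s) - n + 1)]
    counts = Counter(blocks)
    total = len(blocks)
    return -sum(c / total * math.log2(c / total) for c in counts.values())

# Compare a perfectly predictable and a random binary string.
periodic = "01" * 500
rng = random.Random(0)
noisy = "".join(rng.choice("01") for _ in range(1000))

# The conditional entropies h_n = H_n - H_{n-1} converge to the entropy
# rate h; for the periodic string they drop to 0 immediately, for the
# random one they stay near 1 bit/symbol.
for n in range(1, 5):
    h_periodic = block_entropy(periodic, n) - block_entropy(periodic, n - 1)
    h_noisy = block_entropy(noisy, n) - block_entropy(noisy, n - 1)
    print(n, round(h_periodic, 3), round(h_noisy, 3))
```

For real data the plug-in estimate is biased once the number of blocks 2^n approaches the sample size; that is the problem addressed in the last reference above.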

- Fourth lecture.
- Fifth lecture on properties of statistical complexities of composite systems, conditional independence and graphical models. Further reading:
- Kevin Murphy's page on Graphical Models and Bayesian Networks
- Steffen L. Lauritzen, Graphical Models, Oxford University Press, 1996
- Judea Pearl, Causality. Models, Reasoning and Inference. Cambridge University Press, 2000.
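The defining property of such graphical models is that the joint distribution factorizes according to the graph and thereby encodes conditional independences. This can be checked numerically: the Python sketch below (with made-up CPT values; none of this is from the lecture) builds the joint distribution of a three-node chain X -> Z -> Y and verifies that I(X;Y|Z) = 0 exactly.

```python
import math
from itertools import product

# A three-node chain X -> Z -> Y with hypothetical binary CPTs.
pX = {0: 0.6, 1: 0.4}
pZgX = {(0, 0): 0.9, (1, 0): 0.1, (0, 1): 0.2, (1, 1): 0.8}    # p(z | x)
pYgZ = {(0, 0): 0.7, (1, 0): 0.3, (0, 1): 0.25, (1, 1): 0.75}  # p(y | z)

# The joint distribution factorizes over the graph.
joint = {(x, z, y): pX[x] * pZgX[(z, x)] * pYgZ[(y, z)]
         for x, z, y in product((0, 1), repeat=3)}

def marginal(dist, keep):
    """Sum out all variables except those at the index positions in keep."""
    out = {}
    for k, p in dist.items():
        kk = tuple(k[i] for i in keep)
        out[kk] = out.get(kk, 0.0) + p
    return out

def cmi_xy_given_z(joint):
    """Exact I(X;Y|Z) in bits, computed from the full joint p(x, z, y)."""
    pxz = marginal(joint, (0, 1))
    pzy = marginal(joint, (1, 2))
    pz = marginal(joint, (1,))
    return sum(p * math.log2(p * pz[(z,)] / (pxz[(x, z)] * pzy[(z, y)]))
               for (x, z, y), p in joint.items() if p > 0)

print(cmi_xy_given_z(joint))  # 0 up to rounding: X is independent of Y given Z
```

The unconditional mutual information I(X;Y) is of course nonzero here; only conditioning on the separating node Z removes the dependence, which is exactly what d-separation in the graph predicts.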

- Sixth lecture on interdependence measures for time series and in particular on the concept of Granger causality.
- Granger causality: C.W.J. Granger, Investigating causal relations by econometric models and cross-spectral methods, Econometrica, 37 (1969), 424-438; C.W.J. Granger, Testing for causality: A personal viewpoint, Journal of Economic Dynamics and Control, 2 (1980), 329-352.
- Transfer entropy: Thomas Schreiber, Measuring Information Transfer, PRL, 85(2000) 461.
- An interesting application: R. Marschinski and H. Kantz, Analysing the information flow between financial time series, European Physical Journal B, 30(2002), 275-281.
- For a connection to graphical models see e.g. Michael Eichler's contribution to the Handbook of Time Series Analysis: Graphical modelling of dynamical relationships in multivariate time series.
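The idea behind Granger causality can be sketched in a few lines (this is an illustration under my own naming and parameter choices, not an implementation from the lecture, and it assumes numpy is available): x Granger-causes y if including x's past reduces the residual variance when predicting y from its own past.

```python
import numpy as np

def granger_index(x, y):
    """Linear Granger causality x -> y at lag 1: log of the ratio of the
    residual sums of squares of the restricted model (y regressed on its
    own past) and the full model (y's and x's past)."""
    target = y[1:]
    restricted = np.column_stack([np.ones(len(target)), y[:-1]])
    full = np.column_stack([np.ones(len(target)), y[:-1], x[:-1]])

    def rss(design):
        beta, *_ = np.linalg.lstsq(design, target, rcond=None)
        resid = target - design @ beta
        return resid @ resid

    return np.log(rss(restricted) / rss(full))

# Test data: x drives y with one time step delay, but not vice versa.
rng = np.random.default_rng(0)
n = 5000
x = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):
    x[t] = 0.7 * x[t - 1] + rng.standard_normal()
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + rng.standard_normal()

print(granger_index(x, y), granger_index(y, x))
```

The index for the x -> y direction is clearly positive, while the reverse direction stays near zero (up to overfitting of order 1/n). Transfer entropy generalizes this comparison beyond linear models.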

- 7th lecture on critical phenomena - starting with an introduction to equilibrium phase transitions in physics.
- 8th lecture on the 2-D Ising model and starting with the modern theory of critical phenomena - scaling, universality and the renormalization group.
- 9th lecture on the renormalization group for the Ising model. Literature:
- J.M. Yeomans, Statistical Mechanics of Phase Transitions, Oxford University Press, 1992, ch.8 and 9.

- 10th lecture on self-organized criticality (SOC). Literature:
- Henrik Jeldtoft Jensen, Self-Organized Criticality, Cambridge University Press, 1998.
- Per Bak, How nature works - the science of self-organized criticality, Springer New York, 1996.
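The standard illustration of SOC is the Bak-Tang-Wiesenfeld sandpile discussed in both books: sites topple when their height reaches 4, redistributing one grain to each neighbour, and slow driving organizes the system into a state with avalanches of all sizes. A Python sketch (names and parameters are my own):

```python
import random

def relax(grid, L):
    """Topple until the grid is stable; return the avalanche size
    (number of topplings). Open boundaries: grains leaving are lost."""
    size = 0
    unstable = [(i, j) for i in range(L) for j in range(L) if grid[i][j] >= 4]
    while unstable:
        a, b = unstable.pop()
        if grid[a][b] < 4:
            continue
        grid[a][b] -= 4
        size += 1
        if grid[a][b] >= 4:                # may need to topple again
            unstable.append((a, b))
        for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            na, nb = a + da, b + db
            if 0 <= na < L and 0 <= nb < L:
                grid[na][nb] += 1
                if grid[na][nb] >= 4:
                    unstable.append((na, nb))
    return size

L = 20
grid = [[0] * L for _ in range(L)]
rng = random.Random(42)
sizes = []
for _ in range(20000):                     # slow driving: one grain at a time
    i, j = rng.randrange(L), rng.randrange(L)
    grid[i][j] += 1
    s = relax(grid, L)
    if s > 0:
        sizes.append(s)

# In the self-organized critical state the avalanche sizes span many
# scales, with a power-law-like distribution cut off by the system size.
print(len(sizes), max(sizes))
```

Note the separation of time scales: the drive adds one grain only after the previous avalanche has relaxed completely, which is essential for the criticality to self-organize.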

- 11th lecture on power laws --- mechanisms and detection. Literature:
- M.E.J. Newman, Power laws, Pareto distributions and Zipf's law, Contemporary Physics, 46 (2005), 323-351. arXiv: cond-mat/0412004
- A. Clauset, C.R. Shalizi and M.E.J. Newman, Power-law distributions in empirical data, arXiv: 0706.1062
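A central point of both papers is that fitting a straight line to a log-log histogram is a poor way to estimate a power-law exponent; the maximum-likelihood estimator alpha = 1 + n / sum_i ln(x_i / x_min) (for continuous data above a lower cutoff x_min) should be used instead. A sketch checking it on synthetic Pareto samples (all names are my own):

```python
import math
import random

def powerlaw_mle(xs, xmin):
    """Continuous maximum-likelihood estimate of the power-law exponent:
    alpha = 1 + n / sum ln(x/xmin), restricted to the tail x >= xmin."""
    tail = [x for x in xs if x >= xmin]
    return 1 + len(tail) / sum(math.log(x / xmin) for x in tail)

# Synthetic Pareto data with known alpha = 2.5, xmin = 1, by inverse
# transform sampling: x = xmin * (1 - u)^(-1/(alpha - 1)), u uniform.
rng = random.Random(0)
alpha, xmin, n = 2.5, 1.0, 50000
xs = [xmin * (1 - rng.random()) ** (-1 / (alpha - 1)) for _ in range(n)]

print(powerlaw_mle(xs, xmin))  # recovers a value close to 2.5
```

Clauset et al. additionally estimate x_min itself via a Kolmogorov-Smirnov criterion and test the power-law hypothesis against alternatives such as the lognormal; the estimator above is only the first step of that procedure.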

- 12th lecture on "Computation at the edge of chaos" --- What is the relationship between criticality, complexity, computational capabilities and evolution?

Some MATLAB functions from the exercises

- Logistic map:
- logistic.m generates a trajectory of the logistic map observed via a binary partition.
- temporal_entropies.m estimates entropies on a binary string.
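For readers without MATLAB, here is a Python sketch of what logistic.m is described to do (function and parameter names are my own guesses, not taken from the actual file):

```python
def logistic_symbols(n, r=4.0, x0=0.4, transient=100):
    """Binary symbol sequence from the logistic map x -> r x (1 - x),
    observed via the partition s = 0 if x < 1/2 else s = 1."""
    x = x0
    for _ in range(transient):        # discard the initial transient
        x = r * x * (1 - x)
    out = []
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(0 if x < 0.5 else 1)
    return out

print("".join(map(str, logistic_symbols(40))))
```

For r = 4 this partition is generating, so block entropies of the symbol sequence (the job of temporal_entropies.m) recover the map's Kolmogorov-Sinai entropy of 1 bit per iteration.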

- Cellular automata:
- CA_1d.m generates a space-time pattern for an elementary cellular automaton with a random initial condition.
- ca_temporal_entropies.m estimates the entropies of temporal strings on the output of CA_1d.m.
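Similarly, a Python sketch of an elementary cellular automaton update in the spirit of CA_1d.m (names and the choice of rule 110 are mine; the Wolfram rule number encodes the lookup table for the three-cell neighbourhood):

```python
import random

def ca_step(row, rule):
    """One synchronous update of an elementary CA with the given Wolfram
    rule number and periodic boundary conditions."""
    n = len(row)
    table = [(rule >> k) & 1 for k in range(8)]   # output for neighbourhood k
    return [table[(row[(i - 1) % n] << 2) | (row[i] << 1) | row[(i + 1) % n]]
            for i in range(n)]

# Space-time pattern from a random initial condition, rule 110.
rng = random.Random(0)
row = [rng.randint(0, 1) for _ in range(80)]
pattern = [row]
for _ in range(40):
    row = ca_step(row, 110)
    pattern.append(row)

for r in pattern[:5]:
    print("".join(".#"[c] for c in r))
```

Rule 110 is a natural choice in the context of the final lecture: it is a prime example of complex, "edge of chaos" behaviour and is known to be computationally universal.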

- Multivariate test data for transfer entropy estimation
- var_data.m iterates a VAR-model.
- Two coupled tent maps.
- transfer-entropies.m estimates the transfer entropies between two symbol sequences (untested).
- shuffle.m for producing a randomly permuted version of a time series. Useful for testing non-zero transfer entropies.
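A Python sketch of the same idea as transfer-entropies.m and shuffle.m (all names are my own; first-order transfer entropy T_{X->Y} = I(Y_{t+1}; X_t | Y_t) following Schreiber 2000), with a shuffled surrogate as a zero baseline:

```python
import math
import random
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in estimate of the first-order transfer entropy
    T_{X->Y} = I(Y_{t+1}; X_t | Y_t) in bits, for two symbol sequences."""
    trip = list(zip(y[1:], x[:-1], y[:-1]))       # (y_next, x_now, y_now)
    n = len(trip)
    p3 = Counter(trip)
    p_yy = Counter((a, c) for a, b, c in trip)    # (y_next, y_now)
    p_xy = Counter((b, c) for a, b, c in trip)    # (x_now, y_now)
    p_y = Counter(c for a, b, c in trip)          # y_now
    te = 0.0
    for (a, b, c), k in p3.items():
        te += k / n * math.log2(k * p_y[c] / (p_yy[(a, c)] * p_xy[(b, c)]))
    return te

# Test data: y copies x with one step delay 80% of the time.
rng = random.Random(3)
x = [rng.randint(0, 1) for _ in range(20000)]
y = [0] + [xi if rng.random() < 0.8 else rng.randint(0, 1) for xi in x[:-1]]

shuffled = x[:]            # cf. shuffle.m: destroy the temporal order of x
rng.shuffle(shuffled)

print(transfer_entropy(x, y), transfer_entropy(shuffled, y))
```

The shuffled surrogate keeps the marginal statistics of x but destroys the directed coupling, so its transfer entropy estimate reflects only the small positive bias of the plug-in estimator; a measured value is deemed significant only if it clearly exceeds this baseline.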

**Back to Eckehard Olbrich's page.**