This is a presentation about joint work between Hector Zenil and Jean-Paul Delahaye.

Zenil presents Experimental Algorithmic Theory as Algorithmic Information Theory and NKS, put together in a mixer.

Algorithmic Complexity Theory defines the algorithmic complexity K(s) as the length of the shortest program that produces s. But since finding this shortest program is in general undecidable, the only way to approximate K(s) is to use compression algorithms. He shows how to use the Compress function in Mathematica to give an idea of the compressibility of various sequences. However, the idea of applying a compression algorithm breaks down for very short sequences. This is true not only for the Compress function, but for any other compression algorithm as well.
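
As a rough illustration (using Python's zlib here rather than Mathematica's Compress, so the exact byte counts are only indicative), one can see both why compressed length works as a complexity estimate and why it fails for short inputs:

```python
import zlib

# A long, highly regular sequence compresses to far below its original size,
# signalling low algorithmic complexity...
regular = b"01" * 1000
print(len(regular), len(zlib.compress(regular)))

# ...but for a very short sequence the compressor's fixed overhead dominates,
# and the "compressed" form comes out longer than the input itself.
short = b"0110"
print(len(short), len(zlib.compress(short)))
```

The second print shows the breakdown: for a 4-byte input the header and checksum overhead swamp any actual compression, so compressed length tells us nothing about the complexity of short sequences.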

Zenil's approach is to construct a metric of algorithmic complexity for short sequences from scratch. He defines the algorithmic probability as the probability that an arbitrary program produces a sequence. The basic idea is to run a whole class of computational devices such as Turing Machines or Cellular Automata, and compute the distributions of the sequences they generate.
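
A toy version of this idea (my own Python sketch, not Zenil's actual setup, which enumerates 2-state 3-color Turing machines and radius-1 CAs at much larger scale) runs every elementary cellular automaton rule from a single black cell and tallies how often each center-column output string occurs:

```python
from collections import Counter

def eca_step(cells, rule):
    """One step of an elementary CA (2 colors, radius 1, periodic boundary)."""
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def output_string(rule, steps=8, width=17):
    """Run a rule from a single black cell and read off the center column."""
    cells = [0] * width
    cells[width // 2] = 1
    bits = []
    for _ in range(steps):
        bits.append(cells[width // 2])
        cells = eca_step(cells, rule)
    return "".join(map(str, bits))

# Frequency distribution of output strings across all 256 elementary rules;
# frequent strings are the "algorithmically probable" (hence simple) ones.
dist = Counter(output_string(r) for r in range(256))
for s, freq in dist.most_common(5):
    print(s, freq)
```

Sorting the tally by frequency gives an empirical ranking in the spirit of algorithmic probability: strings produced by many programs rank as simple, rare strings as complex.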

Zenil presents a comparison of frequency distributions of sequences generated by 2-state 3-color Turing Machines and 2-color radius 1 Cellular Automata. He also compared these distributions to distributions found in real-world data, and found not only that there is correlation across different systems, but also that the distributions are rather stable, and that the difference between the distributions of abstract systems and real-world data can be attributed to noise. In his paper Zenil elaborates on the nature of the noise he has encountered.

Zenil conjectures that the correlation distances between different systems decrease with a larger number of steps, and converge in the infinite limit.

More details can be found at:

http://www.mathrix.org/experimentalAIT/UniversalDistribution.htm

A normal blog post just doesn't seem right, so I cordially invite everyone to post their impressions and thoughts about Greg's talk in the comments.

Dodig-Crnkovic has been inspired by a lecture by Chaitin to think about the relationship between epistemology and information theory -- the question of how ideas in mathematics and biology emerge. In an "info-computational" universe, the interactions of organisms with their environment and with one another generate persistent patterns that imprint the organisms themselves and create a basis for epistemology.

Dodig-Crnkovic then moves on to present an overview of "computationalist" thinkers and ideas framing natural processes in terms of computational systems, as well as the idea that the types of computation done in nature may go beyond the Turing machine model.

She then moves to the question of how ideas from organismic computation and computationalism may provide a new framework for epistemology -- the idea that the universe is a computer running on random initial conditions may provide a basis for understanding how cognition emerged in natural evolution.

During his work as a "government clerk", Svozil was involved in creating a database whose main engineer stumbled onto a very complicated subproblem, which Karl found could be reduced to the halting problem. The engineer would not give up trying to find a solution to the problem, which probably shows that he was not educated in Vienna.

Svozil discusses the randomness encountered in Chaitin's constant Omega, as well as the complexity observed in the three-body problem -- and mentions that it is possible to encode a universal computer into an n-body problem. Svozil mentions that it seemed to him a contradiction that Chaitin computed the first digits of Omega, while at the same time Omega is an uncomputable number. Svozil then discussed issues of retrieving the informational content from a variety of systems, from Mathematics to Modern Art.

The notebooks from Stephen Wolfram's live experiment at NKS 2007 last night are now available!

Download NKS2007-LiveExperiment-01.nb (76.0K)

Download NKS2007-LiveExperiment-02.nb (5124.2K)

*(Right-click and select "Save Target As" or "Save Link As")*

Stephen explored various interesting properties of ECA rules 18, 22 and 122.

Calude starts by talking about the volume in honor of Gregory Chaitin's 60th birthday: "Randomness and Complexity from Leibniz to Chaitin". The book is divided into technical contributions, philosophy, and reminiscence essays. Calude mentions that this book is special among the several he has edited because it presents original contributions and is as much forward-looking as recollecting the past.

Calude then presents a number of remarkable results produced by Chaitin in the course of his scientific career. He was among the first to connect physics and computation. The original paper on algorithmic information theory already contained such connections, although they were removed from the paper "at the strong request" of one of the referees. He also mentions Chaitin's contribution to the development of RISC processor architecture.

Next, Calude presents his contribution to the volume. Computers are having a high impact on the way mathematics is done, and computer systems such as Mathematica now enable mathematics to be done through a mix of Hilbert-style proof schemes and computational experiments. However, this doesn't mean that there is any less rigor in the way mathematics is done. As Chaitin says, it's much harder to convince a computer than a referee. Implementing ideas in a computer also forces practitioners to develop clearer ideas about the work they are doing. And the time will come when the checking of mathematical proofs will be done only by machines.

**Natural Sciences lecture series**

Mark Burgin discusses complexity and automata

High-level languages compress information and create complexity. In order to reduce complexity, we need to build hierarchies. Mark describes the grid automaton as a way to allow hierarchy, and in particular to make use of cellular automata. In a sense, we program cellular automata to obtain more complex devices.

Mark described a mathematical model for a grid automaton, comprised of three sets and three mappings.

Brian Silverman showed us a project called Turtle Art that he is developing. It's a "simple programming" environment for kids. Silverman contrasted simple programming and simple programs. The user starts with a turtle in a blank gray field and has a collection of graphical programming building blocks that can be snapped together to control the turtle's path and the kinds of colored trails the turtle can leave. Each graphical "command" has sockets where other commands can fit, usually one socket for the previous action and one socket for the next one. Control structures are also building blocks, so for instance, "repeat" has a socket for the previous action, a socket for the command to be repeated, and a socket for the action to continue with after the repetition is done.
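
The snap-together blocks can be sketched as nested data, with a tiny interpreter standing in for the graphical environment (this representation is my own illustration, not Turtle Art's actual internals):

```python
import math

# Hypothetical block representation: each block is a tuple, and "repeat"
# nests a list of blocks the way Turtle Art's repeat socket does.
def run(blocks, state=None):
    """Execute a list of blocks; state is (x, y, heading in degrees)."""
    x, y, heading = state or (0.0, 0.0, 0.0)
    for block in blocks:
        if block[0] == "forward":
            x += block[1] * math.cos(math.radians(heading))
            y += block[1] * math.sin(math.radians(heading))
        elif block[0] == "turn":
            heading = (heading + block[1]) % 360
        elif block[0] == "repeat":
            for _ in range(block[1]):
                x, y, heading = run(block[2], (x, y, heading))
    return x, y, heading

# A square: repeat 4 [forward 10, turn 90] -- the turtle ends where it began.
square = [("repeat", 4, [("forward", 10), ("turn", 90)])]
x, y, h = run(square)
print(round(x, 6), round(y, 6), h)
```

The nesting mirrors the sockets: the list inside the "repeat" tuple plays the role of the socket holding the command to be repeated, while the enclosing list is the chain of previous/next actions.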

Silverman showed that unpredictable behavior is possible even with this limited set of primitives, by changing parameters of some intrinsically defined curves with unpredictable overall behavior.

He ended his talk with a call for long-term "project-based" learning for children, saying that there is a trend to have shorter and shorter term assignments, and that he thinks that many of the long term projects that adults find interesting would also be interesting for children.

**Natural Sciences lecture series**

Alfred Hubler discusses the relationship between CA and continuous systems

Cellular automata are low-dimensional dynamical systems which are discrete and local in time and space and have discrete amplitude. Which high-dimensional continuous systems are well modeled as CA?

In order to find a low-dimensional model, the first step is to separate the fast and slow variables in the system; since the fast variables typically decay rapidly, they are not of interest. This is usually accomplished by integrating over one period of the fast motion, on both sides of the differential equation. Next, a flow vector field is approximated from the resulting integro-difference equation. This gives a difference equation, which is then approximated by a differential equation, which is easy to integrate analytically.

Low-dimensional models are good if the separation of time scales is very large, e.g. in rigid-body motion but not for "soft-body" motion. The equations of motion can be improved through discretization, using Euler's method. Continuous-time motion equations are good if the time scale is very large.

Time-discrete models are accurate for both rigid and soft systems, while continuous models are only accurate for rigid systems. Models in the form of difference equations are often qualitatively better than time-continuous models with the same number of variables. Cellular automata can be shown to be more accurate than PDEs in some situations.