"High-information context" here is in the sense of entropy.

This talk explores the distribution of digits of powers of 2 written in base 3. More precisely, a balanced ternary system (digits -1, 0, 1) is used. Adrian used a Mealy machine (a finite-state machine whose outputs are determined by the current state and the input) to model the multiplication operation. The observation was that the probabilities of transition between states are approximately equal to 1/3, the distribution of digits becomes increasingly random, and the representation is incompressible. Hence, high entropy.
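As a rough illustration of the empirical claim (this is my own sketch, not the Mealy-machine construction from the talk): the snippet below converts powers of 2 to balanced ternary and tallies the digit frequencies, which come out close to 1/3 each.

```python
from collections import Counter

def balanced_ternary(n):
    """Return the balanced-ternary digits of n (least significant first),
    using digits -1, 0, 1."""
    digits = []
    while n:
        r = n % 3
        if r == 2:
            # A remainder of 2 is represented as -1 with a carry into
            # the next position: 2 = 3 - 1.
            digits.append(-1)
            n = (n + 1) // 3
        else:
            digits.append(r)
            n //= 3
    return digits

# Tally digit frequencies across the representations of 2^1 .. 2^200.
counts = Counter()
for k in range(1, 201):
    counts.update(balanced_ternary(2 ** k))

total = sum(counts.values())
for d in (-1, 0, 1):
    print(d, round(counts[d] / total, 3))  # each frequency is near 1/3
```

Equidistribution here is an empirical observation, not a proved theorem; the script only shows that the frequencies look uniform over this range.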

Everything is correct, just a typo: content, not context. High information content, that is, the strings carry a lot of information (in the sense of information entropy, http://en.wikipedia.org/wiki/Information_entropy).

I also had a typo on one of the slides, so I totally understand.

Thanks a lot for the review. All the best.

... Adrian

Posted by: Adrian German | Wednesday, June 21, 2006 at 10:21 PM