This post proposes a way to measure chaos in a computational system.
I have studied reversible algorithms for arithmetic multiplication (the factorization problem). I found an interesting fact: when the information describing the ensemble grows quickly, the process becomes difficult to reverse, i.e. it is hard to find a feasible reverse computation. Let me explain with a computational example.
Suppose there is an initial set of N bits {x}. If all bits are independent and each can initially be set to either 0 or 1, then this is an ensemble of 2^N states. Now suppose we have a function G describing all forbidden states of the ensemble. [The same arguments, up to inversion, would apply to the allowed states.] Initially G = 0, since all values of x are allowed.
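As a minimal sketch (the variable names are mine, not from the post), the ensemble and the trivial initial G can be written down explicitly:

```python
from itertools import product

N = 3
states = list(product((0, 1), repeat=N))  # all 2**N joint states of the bits
G = lambda x: 0                           # G = 0: no state is forbidden yet
allowed = [s for s in states if not G(s)]
assert len(allowed) == 2 ** N             # every state is still allowed
```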
Next suppose that we have a computational process which proceeds in atomic steps. Each step is of one of two kinds: creating a new bit or forgetting an old one. For example, add a new bit y1 = x1 & x2 (let us write & for AND, | for OR, and ^ for XOR). The G function becomes y1 ^ (x1 & x2) if the previous G = 0, or in the general case G_next = G_prev | (y1 ^ (x1 & x2)). Forgetting a bit x2 makes G_next = G[x2=0] & G[x2=1], where G[x2=0] is G with x2 replaced by 0.
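The following is a toy sketch of these two atomic steps, keeping G as an explicit set of forbidden assignments (exponential in the number of bits, so an illustration only; the class and method names are my own):

```python
from itertools import product

class Ensemble:
    """G kept as an explicit set of forbidden assignments over the named bits."""

    def __init__(self, names):
        self.names = list(names)   # e.g. ["x1", "x2"]
        self.forbidden = set()     # G = 0: nothing is forbidden yet

    def _assignments(self):
        for values in product((0, 1), repeat=len(self.names)):
            yield dict(zip(self.names, values))

    def create(self, new_name, fn):
        # Atomic step 1: add new_name = fn(bits).
        # G_next = G_prev | (new_name ^ fn(...)): states where the new bit
        # disagrees with its defining expression become forbidden.
        old_names, old_forbidden = self.names, self.forbidden
        self.names = old_names + [new_name]
        self.forbidden = set()
        for a in self._assignments():
            was_forbidden = tuple(a[n] for n in old_names) in old_forbidden
            if was_forbidden or a[new_name] != fn(a):
                self.forbidden.add(tuple(a[n] for n in self.names))

    def forget(self, name):
        # Atomic step 2: drop a bit.  G_next = G[name=0] & G[name=1]:
        # the reduced state is forbidden only if both completions were.
        old_names, old_forbidden = self.names, self.forbidden
        self.names = [n for n in old_names if n != name]
        self.forbidden = set()
        for a in self._assignments():
            completions = []
            for v in (0, 1):
                full = dict(a, **{name: v})
                completions.append(tuple(full[n] for n in old_names))
            if all(c in old_forbidden for c in completions):
                self.forbidden.add(tuple(a[n] for n in self.names))

# The example from the text:
e = Ensemble(["x1", "x2"])
e.create("y1", lambda a: a["x1"] & a["x2"])  # G = y1 ^ (x1 & x2)
e.forget("x2")                               # G = G[x2=0] & G[x2=1]
print(e.names, sorted(e.forbidden))          # ['x1', 'y1'] [(0, 1)]
```

After forgetting x2, the only forbidden state over (x1, y1) is (0, 1): if x1 = 0 then y1 = x1 & x2 must be 0.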
It seems easy to keep the G function defined while carrying out the calculation. This is important because, as long as we know G, the calculation is immediately reversible: at each backward step it is easy to find out the previously forgotten bit. But a problem appears when the representation of G becomes impractically huge. This happens, for example, with arithmetic multiplication. The only way to attack this problem is to find symmetries in the function that make its representation computationally practical, i.e. smaller.
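To illustrate the backward step, here is a hedged sketch of recovering a forgotten bit from the G that held just before the forget step (the helper name and its interface are hypothetical): whenever the forgotten bit was actually determined by the remaining bits, exactly one of its two values is allowed.

```python
def recover_forgotten_bit(pre_forget_G, names, forgotten, current):
    """pre_forget_G: set of forbidden tuples over `names` (before the forget);
    current: dict assigning the remaining bits.  Returns the unique allowed
    value of `forgotten`, assuming the forget step lost no information."""
    allowed = []
    for v in (0, 1):
        full = dict(current, **{forgotten: v})
        if tuple(full[n] for n in names) not in pre_forget_G:
            allowed.append(v)
    assert len(allowed) == 1, "not reversible here: the bit is not determined"
    return allowed[0]
```

Continuing the toy example above, with pre_forget_G taken over ["x1", "x2", "y1"] and the current state {"x1": 1, "y1": 1}, the helper returns x2 = 1; for a state such as {"x1": 0, "y1": 0} the assertion fires, since there x2 is genuinely not determined and a careful reversible program would not have forgotten it.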
Suppose that, with a given intellectual strength, we find the smallest representation of the G function. This function can be written as a sequence of atomic steps A = B & C, where A, B, C are bits or their negations (say, A = B | C is written as ~A = ~B & ~C). Negation is not an operation but rather a perception of the bit. With this representation the function becomes measurable by its number of atomic computational steps.
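A minimal sketch of that size measure, with my own (illustrative) node layout: G is a DAG of A = B & C steps whose inputs may carry a free negation, and the size is simply the number of AND nodes.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Lit:
    node: "Node | str"      # another gate or a named input bit
    negated: bool = False   # negation is free: a perception, not a step

@dataclass(frozen=True)
class Node:                 # one atomic step: out = left & right
    left: Lit
    right: Lit

def size(g, seen=None):
    """Number of atomic A = B & C steps in this representation of G."""
    if seen is None:
        seen = set()
    if isinstance(g, str) or id(g) in seen:
        return 0
    seen.add(id(g))
    return 1 + size(g.left.node, seen) + size(g.right.node, seen)

# Example: G = y1 ^ (x1 & x2), written with AND steps and free negations:
# y1 ^ a == (y1 & ~a) | (~y1 & a) == ~(~(y1 & ~a) & ~(~y1 & a))
a  = Node(Lit("x1"), Lit("x2"))
t1 = Node(Lit("y1"), Lit(a, negated=True))
t2 = Node(Lit("y1", negated=True), Lit(a))
g  = Node(Lit(t1, negated=True), Lit(t2, negated=True))  # outer ~(...) is free
print(size(g))   # 4 atomic steps for this (not necessarily minimal) form
```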
This makes it very natural to define chaos as the growth of the size of the G function during the computation. Note that the definition includes the intellectual strength of the observer, which seems natural for a definition of chaos. It also qualitatively resembles Lyapunov exponents as a measure of chaos.