## Entropic Overunity Paradox


Pure information (every part unique) is an interesting thing. One can sample a temporal sequence [analog-to-digital or analog only] from a suitably random source (nanomechanical noise, quantum noise [Josephson junction receivers, currently used in the P4+ hardware RNGs], etc.).

Now any particular sequence drawn from a purely random source must itself be random. For a symmetric source, random also means the values average to zero when summed. However, the largest amount of information any single channel can hold is reached when all of its bits are unique.

This is a problem because at all scales there are segments with nonunique values. These segments are compressible with a positive reduction coefficient.
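The compressibility claim is easy to illustrate with an ordinary general-purpose compressor. The sketch below is illustrative only: zlib detects gross redundancy rather than all of it, and the nonunique segment is constructed by hand rather than located inside the stream.

```python
import os
import zlib

def reduction_coefficient(data: bytes) -> float:
    """Fraction of the input removed by compression (negative = expansion)."""
    return 1.0 - len(zlib.compress(data, 9)) / len(data)

# Stand-in for a sampled random stream: 64 KiB from the OS entropy pool.
stream = os.urandom(64 * 1024)

# A hand-built segment of nonunique values: one byte value repeated.
repeated_segment = bytes([0x5A]) * 1024

print(f"repeated segment: {reduction_coefficient(repeated_segment):+.3f}")
print(f"whole stream:     {reduction_coefficient(stream):+.3f}")
```

The repeated segment yields a strongly positive reduction coefficient, while the stream as a whole compresses to roughly its original size or slightly larger, the excess being container overhead.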

In order for the sum of all information in the totality to be limited only by the dynamic range of the sensor or by the structural range of a numeric abstraction system (NAS), never mind infinity, the areas of less-than-total unique saturation must be compensated for by areas of above-average saturation.

However, the average entropy content of the totality of a random stream is the maximum density of the receptor or counting system (as observed), or infinite (in theory).
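For the quantized case this ceiling is easy to observe: a byte channel can carry at most 8 bits per byte, and the empirical Shannon entropy of a good random stream sits essentially at that maximum. A minimal sketch, using the OS entropy pool as a stand-in for the measured stream:

```python
import math
import os
from collections import Counter

def entropy_bits_per_byte(data: bytes) -> float:
    """Empirical Shannon entropy of the byte histogram, in bits per byte."""
    n = len(data)
    counts = Counter(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

sample = os.urandom(1 << 20)  # 1 MiB
h = entropy_bits_per_byte(sample)
print(f"{h:.4f} bits/byte (channel maximum: 8.0)")
```

The result lands just under 8.0; the small shortfall is finite-sample bias, not structure in the source.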

Thus while the system requires complete saturation in its totality, any segment exhibiting partial saturation must be compensated by one or more supersaturated segments.

However, supersaturation is impossible, because by definition the median saturation of a random stream equals the maximum: the stream makes complete use of the given representation system’s descriptive ability at minimum, and complete use of the stream’s physical measuring system at maximum. Beyond even these constraints lies the saturation point of the physical system of the universe in question.

Thus a means of achieving full saturation, one that abolishes the problem of supersaturation in a measured, real, or theorized system, must be explored.

In a real-world system, of course, no stream exists in isolation. All grains of a granular worldsystem interact, and all indefinites of a non-granular worldsystem interact.

Thus, intersegment saturation exchange between neighboring flows could annul part of the problem, but not relieve it entirely; it merely closes in toward a solution.

The ultimate, and perhaps most convenient, solution is to suppose that a counteracting reality-segment exists parallel to the original, in realtime. The mirror segment counter-oscillates with the prime segment, preserving the observed perfectly random streams. Call it Interuniversal Entropy Exchange, or IEE for short. However, this theory requires the use of another universe, inverted entropy, and maybe more. Not that such things are impossible, but there are other solutions too.

This problem need not exist in a quantized or simulated-quantized universe. The reason is that if a universe is quantized, then the state-time spaces of all its measurable components can produce only limited variation by definition, and thus over time will complete grand cycles. Depending on the precision of the simulation, inhabitants may or may not be able to detect them, at least in the case of a ‘sealed’ universe.
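The grand-cycle claim is the pigeonhole principle applied to a deterministic update rule on a finite state space. A toy sketch (the update rule is an arbitrary linear congruential step, chosen only for brevity):

```python
# State space: the integers modulo 2**16, i.e. a "quantized universe"
# with exactly 65,536 possible states.
M = 1 << 16

def step(x: int) -> int:
    """A deterministic update rule (arbitrary linear congruential step)."""
    return (5 * x + 1) % M

# Walk the trajectory until some state recurs. With only M states,
# the pigeonhole principle guarantees a repeat within M + 1 steps.
seen: dict[int, int] = {}
x, t = 12345, 0
while x not in seen:
    seen[x] = t
    x = step(x)
    t += 1

cycle_length = t - seen[x]
print(f"first repeat at step {t}; cycle length {cycle_length}")
```

No matter which rule or seed is chosen, the repeat must arrive; only the cycle length varies.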

Such is an argument that our universe is either spacetime quantized or simulated-quantized. Quantization sets the ceiling for entropic density in measurable and unmeasurable channels.

Thus it may be that in the real world, even with infinite-precision instruments, we may never measure the totality of states within any moment, while with abstract math functions we can write nongranular equations describing potentials that the underlying universe is incapable of living up to.

A painful yet decisively necessary dividing line must be drawn between the math which resembles the real universe and that which is merely abstraction. For example, the perfect information source P, which provides nothing but nice noise forever, can be compared to the God’s sensor p on a nicely vibrating particle in a normal universe like ours. The God’s sensor is a deus ex machina contrived to show the best of what could in theory be done to measure the cleanest random stream(s) in a real universe.

The God Sensor reads out position data without affecting the system, which according to current thinking is impossible.

Even with the God Sensor, which provides the cleanest data possible by not affecting the system being measured, it seems likely that real-world random streams can’t reach the perfect level, because our Universe can’t do superunity at all.

A subunity system could have close-to-random values but would characteristically never reach saturation. If so, observation of real-world sources may be able to detect the difference as compressibility.
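One crude form of that observation is a compression test: a source that never reaches full saturation leaves redundancy that a general-purpose compressor can find. The sketch below fakes a subunity source by restricting it to 128 of the 256 possible byte values (about 7 bits of entropy per 8-bit byte); both streams are stand-ins, not real measurements.

```python
import os
import random
import zlib

def compression_ratio(data: bytes) -> float:
    """Compressed size over original size; below 1.0 means redundancy found."""
    return len(zlib.compress(data, 9)) / len(data)

N = 256 * 1024

# Stand-in for a fully saturated stream: the OS CSPRNG.
uniform = os.urandom(N)

# Stand-in for a hypothetical subunity stream: uniform over only
# 128 byte values, so about 7 bits of entropy per 8-bit byte.
rng = random.Random(42)
subunity = bytes(rng.randrange(128) for _ in range(N))

print(f"uniform : {compression_ratio(uniform):.4f}")
print(f"subunity: {compression_ratio(subunity):.4f}")
```

The saturated stream compresses to roughly its original size, while the subunity stream compresses noticeably below it; a real test would need far subtler statistics, since redundancy in a good physical source would be much smaller than this contrived example.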

If a positive compression ratio can be achieved, then we have a subunity universe, which means a portion of the ‘totality debt’ is lost: transferred, deferred, transformed, or absent for an as-yet-unknown reason.

I don’t know where the magic ‘missing information’ needed to fill a perfectly random channel would come from. This means the concept of a ‘perfectly random’ source, as well as that of a source of ‘pure information’, may be an intellectual divergence from the fabric of this universe, or of any of a class of universes, real or simulated.

If that is the case, the raw sample data collected from real-world sensors cannot be purely random, but must contain residual redundancy, and therefore these channels (even the ‘best’ of them) must be compressible.

There are many compression systems for quantized data, but none as yet for nonquantized data. Since compression is used to measure redundancy, the problem of real-world verification becomes more complicated.

Without a compression test for nonquantized data, there is no way to measure the information content of nonquantized streams.

Therefore, until better technical means and understanding are achieved, the dividing line between the theoretical concept of complete information saturation of an arbitrary source measurement sequence (SMS) and the real-world potential of such must be further elucidated in a subsequent paper.
