New Digital World Theory. Part 8

Temperature and Energy.

In all my work so far I have left out such an important dimension as energy. I did that deliberately, as it would not have been possible to explain its meaning without the preceding discussion. I hope to make up for it now.

Let us imagine the various possible operating modes of the Computer. One extreme option: the Processor is busy with a single task and is not distracted by anything else. What does this mean in practice? It means that the speed of completing that operation is maximal, i.e. the computation runs close to the speed of light, while the time needed to complete the operation is minimal.

The other extreme is when the Processor effectively ignores a task, completing at best a couple of operations a year, or even fewer.

As a result, we can speak of the Processor's specific engagement with a particular task. In reality, the Processor runs parallel computations and constantly switches between tasks, so the share of time spent servicing any one task is far less than 100%. I suggest that this time share, the Processor's engagement in processing one specific task, is the very energy we have been talking about. It is this attention of the Processor that all the Computer's programs compete for. The more attention the Processor pays to you, the greater your advantage over other programs. You just have to be able, and willing, to use it.
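To make the idea of "specific engagement" concrete, here is a small toy sketch of my own (all names and numbers are hypothetical, not taken from the text): under a simple weighted scheduler, a task's "energy" is just the fraction of Processor time slices it manages to win.

```python
# Toy sketch (illustrative only): "energy" as the fraction of Processor time
# slices a task receives under a simple weighted random scheduler.
from collections import Counter
import random

def simulate_time_share(task_weights, total_slices=100_000, seed=0):
    """Return each task's share of Processor attention (0..1)."""
    rng = random.Random(seed)
    tasks, weights = zip(*task_weights.items())
    slices = Counter(rng.choices(tasks, weights=weights, k=total_slices))
    return {task: slices[task] / total_slices for task in tasks}

if __name__ == "__main__":
    # A "hot" task wins most of the Processor's attention; a "cold" one almost none.
    shares = simulate_time_share({"hot_task": 90, "warm_task": 9, "cold_task": 1})
    for task, share in shares.items():
        print(f"{task}: {share:.1%} of Processor time ('energy')")
```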

It is the Processor's local, specific activity, transmitted via control lines, that characterises computational energy. Wherever the Processor "loves" something a great deal, the temperature and energy are high.

At first glance, our treatment of energy could end here. But only at first glance. In reality, we are forgetting that all data are optimised and kept in archives. What does that mean in practice? It means that the delay in unpacking arrays causes parts of bulky packed programs to stop responding to control commands even while the Processor's activity is very high. To speed up its work, the Processor forcibly destroys and simplifies packed data. It is also important that the Computer is, in principle, incapable of working with enormously complicated archives. When the size of an archive exceeds a certain threshold (a dwarf star), the Processor begins to simplify the packed contents. Most likely the Processor simply has no choice: it lacks the capacity, and some higher-level program joins in destroying everything that is too complicated. The data inside are broken down to the level needed merely to keep the archive addressable in memory. The archive's temperature becomes very high due to the Processor's excessive activity.
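As a rough illustration of the kind of rule described above (my own sketch; the threshold value and all names are invented for the example), one could imagine an oversized archive being stripped down to bare addressing data:

```python
# Illustrative sketch: an archive whose packed size exceeds a threshold is
# forcibly "simplified", keeping only what is needed to address it in memory.
from dataclasses import dataclass, field

SIMPLIFICATION_THRESHOLD = 1_000_000  # hypothetical "dwarf star" size, in bytes

@dataclass
class Archive:
    address: int
    packed_data: bytes = field(repr=False)
    simplified: bool = False

def enforce_limit(archive: Archive) -> Archive:
    """Destroy the contents of an over-sized archive, keeping only its address."""
    if len(archive.packed_data) > SIMPLIFICATION_THRESHOLD:
        archive.packed_data = b""      # contents destroyed/simplified
        archive.simplified = True      # only the memory address survives
    return archive

big = Archive(address=0xFFAA, packed_data=b"\x01" * 2_000_000)
print(enforce_limit(big).simplified)   # True: contents gone, address kept
```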

My understanding is that archives that are too large cannot, in principle, take part in valid Computer computations. For that to happen, the Processor would have to concentrate on too narrow a set of tasks, and that is prohibited. And here an astonishing thing happens: colossal archives effectively become waste that must be turned over and over so that it is not lost completely. We see two forces competing: a program trying to perform calculations and a program limiting (directing) the Processor's activity. It is like a balanced pair of scales: neither up nor down. As a result, depending on the archive's size, we observe different kinds of cosmic zombie, from "warm" red dwarfs to all sorts of superluminal quasars and giant stars. The size of the latter is so huge that the archive in effect consists of bytes in the process of destruction (molecules, atoms, and so on). It is the program limiting the Processor's activity that destroys these super-archives. The Processor clearly cannot cope with processing such an archive, and whole pieces burst out of it in the form of explosions or elementary particles (bytes and bits).

And here I would add one more process that we have temporarily forgotten about: gravitation, that is, the forced archiving of linked (nearby) data. Unfortunately for supermassive archives, the compression utility does not "sleep" and keeps adding new data from nearby archives that happen to fall into the danger zone. The super-archive keeps growing until the simplifying program starts breaking its contents into information bits. And here it turns out that the binary code 1111111111111111111 means, in effect, just 1. The problem is that the Processor has to ignore that 1. Under the applied limitations there is nothing left for it to compute, and the super-1 becomes a Black hole.
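The claim that a long string of 1s means "in effect, just 1" is essentially the idea behind run-length compression. Here is a minimal sketch of that idea (my illustration, not the author's actual scheme):

```python
# Minimal run-length sketch: a long run of identical bits collapses to a single
# value plus a count, leaving the Processor almost nothing to compute.
def run_length_encode(bits: str) -> list[tuple[str, int]]:
    """Collapse runs of identical characters into (value, count) pairs."""
    runs: list[tuple[str, int]] = []
    for bit in bits:
        if runs and runs[-1][0] == bit:
            runs[-1] = (bit, runs[-1][1] + 1)
        else:
            runs.append((bit, 1))
    return runs

print(run_length_encode("1111111111111111111"))  # [('1', 19)]: in effect, just "1"
```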

The fourth limitation that could influence the appearance of Black holes is, of course, the maximal size of a file (archive) that the Computer can work with. I described these limitations earlier when discussing gravitation. It is hard to say what particular influence this complex activity has on super-archives of data. Black holes appear under the influence of all the factors described above; perhaps in each individual case the decisive factor is the one that takes an archive out of the Computer's area of access.

In the next Chapter, I will examine possible answers to the question of where Black holes and "broken" information bits disappear to.
