New Digital World Theory. Part 7

The Computer and the limits of everything, including speed.

Like a man-made computer, the global Computer has its limitations. It cannot complete all necessary tasks instantaneously. Work happens step by step, with a certain delay, and this delay determines timing in the Computer. The time delay really exists: all running Computer programs detect it and use it in completing their tasks and making forecasts.

We established long ago that nothing in the Universe can move faster than 300,000 km/s. This is the speed of light, or the speed of one of the control lines. Electromagnetic waves also propagate at this speed, but they are more affected by the density of the data arrays they control. It follows that the maximum operating speed of our Computer is set by the shortest time needed to change the state of a control line.

Of course, measuring the Computer's speed in km/s is not really appropriate, because these are conditional concepts we invented to describe our perception of the world. Objectively, processing speed should be measured by the number of bits the Computer processes in one step of computation. In one such step the Computer can process a huge number of bits. How many, we do not know yet, because we do not know the Computer's structure. We will discuss this topic later.

And here is one more interesting point related to processing speed. The speed of commands in the control lines can be affected by the density of data arrays: the denser they are, the more slowly the commands reach their destinations. Eventually, at a certain density, a control command cannot get into the array at all and interacts only with its outer area. I don't know exactly why this happens, but I assume it is due to the work of the global backup program (gravity). It is likely that a compressed program cannot be run at all: to start it, the Computer first decompresses it and only then allows it to interact with the control lines. The denser and bigger a file is, the longer the delay, and therefore the stronger the apparent slowing of the control signals. The Processor is simply waiting for the uncompressed program to respond to a control command.

Beyond a certain density, the Processor simply ignores the most densely packed part of a compressed data array and interacts only with data from its outer area. The response from this area comes faster, so we can assume that some maximum waiting time is set in the control lines. If no response from a program arrives within the specified interval, the next command is sent to a different program, and so on. That is why too densely compressed files are not accessible to the control commands.
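To make this timeout idea concrete, here is a minimal sketch in Python. Everything in it is my own invented illustration, not something taken from a real system: the linear relation between density and decompression delay is an assumption, and so is the waiting limit.

```python
# A minimal sketch of the timeout idea above. All names and the
# delay model are invented illustrations, not real-system terms.

MAX_WAIT = 0.5  # assumed maximum waiting time set in a control line


def send_command(array_density: float) -> float | None:
    """Model: the denser the array, the longer it takes to decompress
    and respond; past MAX_WAIT the Processor gives up on it."""
    decompress_delay = array_density * 0.1  # assumed linear relation
    if decompress_delay > MAX_WAIT:
        return None  # too dense: no response within the limit
    return decompress_delay


for density in (1.0, 4.0, 9.0):
    reply = send_command(density)
    if reply is None:
        print(f"density {density}: no response, next command is dispatched")
    else:
        print(f"density {density}: response after {reply:.2f} time units")
```

The densest array never answers, so the dispatcher moves on, exactly as the paragraph above suggests the Processor does.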

By the way, a similar effect occurs if the size of the bytes (molecules) or small blocks of a data file does not match the properties or operating mode of the control channel. By this I mean the frequency of the light or electromagnetic wave: this frequency is, in effect, the operating mode of the control lines. A high-frequency mode is designed to work with micro arrays, and a low-frequency mode with large arrays. What is the difference? Simply the minimum waiting time for a program to respond. Large arrays, in principle, cannot be uncompressed quickly, so we should not expect a prompt response from them; other tasks can be processed while waiting. Micro arrays, by contrast, can be unpacked very quickly, so the waiting time can be very short, and large files can be ignored altogether.

It follows that the Processor, or a program, by setting short and long waiting limits for a reply from a respondent, can filter in only those data arrays that fall into a certain "weight category". This approach saves the Processor's time during computations.
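Here is a small sketch of this "weight category" filter. The unpack-time model and the two waiting windows (standing in for the high- and low-frequency modes described above) are my own assumptions for illustration.

```python
# Sketch of the "weight category" filter: by choosing a window of
# waiting times, the Processor "sees" only those arrays whose
# unpack time falls inside that window. The model is an assumption.

def unpack_time(array_size: int) -> float:
    return array_size * 0.01  # assumed: bigger arrays unpack more slowly


def visible_arrays(arrays, min_wait, max_wait):
    """Keep only arrays whose response would arrive inside the window."""
    return [a for a in arrays if min_wait <= unpack_time(a) <= max_wait]


arrays = [1, 10, 100, 1000]  # array sizes in arbitrary units
print(visible_arrays(arrays, 0.0, 0.2))   # "high-frequency" mode: micro arrays
print(visible_arrays(arrays, 0.5, 20.0))  # "low-frequency" mode: large arrays
```

The two calls return disjoint sets: each waiting window makes one weight category visible and leaves the other one ignored.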

In addition, I want to highlight that the control lines operate in a rather busy mode: there is a continuous stream of commands that does not stop to wait for any response. This is why we see combined white light rather than only changing coloured light. The control lines are packed densely with commands so as, on the one hand, not to lose responses and, on the other, not to wait too long. It is likely that each command in a control line carries information about the addressee, the waiting time for a response, and the command itself. As a result, only the addressee responds to the request, and the addressee knows exactly how, and to whom, to reply.
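Such a command packet might look like the sketch below. The field names (addressee, max_wait, payload) are invented for illustration; nothing we can observe tells us the actual format.

```python
from dataclasses import dataclass

# Sketch of the command format suggested above: each packet in a
# control line carries an addressee, a response deadline, and the
# command body. All field names are hypothetical.

@dataclass
class ControlCommand:
    addressee: str   # which program should respond
    max_wait: float  # how long a response will be awaited
    payload: str     # the command itself


# The line is kept busy: commands are streamed back to back rather
# than issued one at a time with a stop-and-wait for each response.
stream = [
    ControlCommand("program-A", 0.1, "report state"),
    ControlCommand("program-B", 2.0, "unpack and respond"),
    ControlCommand("program-A", 0.1, "next step"),
]

for cmd in stream:
    print(f"-> {cmd.addressee}: {cmd.payload} (deadline {cmd.max_wait})")
```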

Requests could actually be multicast, or unaddressed: the addressees simply have to match certain parameters. This is like a torch lighting things up: some things are clearly visible (a response is received), and some are not seen at all (no response).
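And here is a sketch of such an unaddressed request, where anything matching the parameters responds and everything else stays dark. The parameter names and values are, again, my own invention.

```python
# Sketch of a multicast (unaddressed) request: no single addressee is
# named; any program whose parameters match the request responds,
# like objects lit by a torch. All names and values are hypothetical.

request = {"size_class": "micro", "max_density": 5.0}

programs = [
    {"name": "dust grain", "size_class": "micro", "density": 1.2},
    {"name": "planet",     "size_class": "large", "density": 5.5},
    {"name": "molecule",   "size_class": "micro", "density": 0.8},
]

responders = [
    p for p in programs
    if p["size_class"] == request["size_class"]
    and p["density"] <= request["max_density"]
]

for p in responders:
    print(f"{p['name']} responds")  # the rest remain "invisible"
```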
