New Digital World Theory. Part 9

Does the Super Computer exist?

All previous discussions lead me to the conclusion that our Universal Computer is only part of a grander computing structure that solves even bigger tasks.

The presence of “invisible” Black Holes and of breakable bytes of information (molecules, atoms, etc.) is evidence of that. All of these disappear from our Universe into places unknown. Evidently, the macro- and micro-bits take part in some further computing process we know nothing at all about – not a scintilla! Our Computer stops following that process, but it clearly continues in those invisible parallel worlds.

The Computer is akin to a calculator that can handle, say, up to ten digits. There is both a limit on calculating capacity and a limit on the size of the data arrays that can be processed. But imagine several such calculators working together, so that when the first one reaches the biggest number it can represent, it passes part of its work over to the next calculator. The second calculator uses those partial results in its own tasks. When the second calculator reaches its limits, a third calculator joins in. And the process can run in both directions: the second calculator may arrive at a number too small for it to hold and, rather than rounding it down to zero, re-engage the first one to complete the computation.
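
To make the analogy concrete, here is a minimal sketch of such a chain – my own toy illustration, not a part of the theory. Each “calculator” holds one limb of a number in base 10^10, and whatever a calculator cannot hold is passed on to the next one, exactly as carries propagate in multi-precision arithmetic. All the names in the code (CalculatorChain, CAPACITY and so on) are mine.

```python
# A toy model of the "chain of calculators": each calculator holds at most
# CAPACITY digits (one limb of a big number in base 10**CAPACITY). When a
# limb overflows, the excess is passed to the next calculator in the chain.
# Purely illustrative names, not from the theory itself.

CAPACITY = 10          # each calculator handles up to ten digits
BASE = 10 ** CAPACITY  # one more than the largest value a calculator can hold


class CalculatorChain:
    def __init__(self):
        self.limbs = [0]  # limbs[0] is calculator number 1, and so on

    def add(self, value):
        """Add a (possibly huge) value, cascading overflow along the chain."""
        i = 0
        while value:
            if i == len(self.limbs):          # a new calculator "joins in"
                self.limbs.append(0)
            total = self.limbs[i] + value % BASE
            self.limbs[i] = total % BASE
            # Overflow: pass on the part this calculator cannot hold.
            value = value // BASE + total // BASE
            i += 1

    def __int__(self):
        return sum(limb * BASE ** i for i, limb in enumerate(self.limbs))


chain = CalculatorChain()
chain.add(9_999_999_999)   # fills calculator 1 to its limit
chain.add(1)               # the overflow engages calculator 2
print(chain.limbs)         # [0, 1] -- the excess now lives one level up
print(int(chain))          # 10000000000
```

Underflow would work symmetrically: a remainder too small for calculator N would be handed back to calculator N-1 instead of being rounded to zero.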

Such a chain can be very long – we can only guess how long. Our Computer is probably part of such a chain, positioned somewhere in the middle. Being Computer number N, it works within the chain on ever larger arrays of data, receiving portions of 0s and 1s (bits) from Computer number N-1. And when it runs out of calculating capacity, a Black Hole (a macro-unit) is formed. Here Computer number N+1 joins in, and so on.

The only real evidence of the Super Computer’s existence is the excessive size of the data arrays. We clearly see only the tip of an iceberg, not the whole. The whole is visible to someone else – someone who controls all these “minor” Computers. I don’t know what is happening there or how, but I think it is on that very level that the compressing program (gravitation) runs. Other optimising programs might also run at that level, but that is far from certain. The combined effect of those programs working at different levels could lead to a macro-unit (a Black Hole) and a micro-bit (a micro-particle) being formed under different conditions. It is not at all necessary that one Black Hole be the same size as another. They can differ; what matters is that they are not “lost” at the Super Computer level. The increasing density of data in our Computer’s compressed files does not at all lead to loss of data in the global data array. At a certain point in the growth of our data file, the excess complexity becomes unnecessary, and the file is simplified as it approaches the Black Hole state.
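
Lossless compression on an ordinary computer gives a feel for why greater density need not mean loss of data. The snippet below is only an analogy of mine, of course – zlib standing in for the “gravitation” program:

```python
# Lossless compression as an everyday analogy: the data array is packed
# far more densely, yet the original is recovered bit for bit.
import zlib

original = b"star dust " * 10_000           # a highly regular "data array"
packed = zlib.compress(original, level=9)   # the "compressing program" at work

print(len(original), "->", len(packed))     # far denser...
assert zlib.decompress(packed) == original  # ...with nothing lost
```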

The same happens at the micro-bit level. That is, a bit disappears from our Universe not necessarily under identical conditions but under different ones. Everything depends on the work of the higher-level programs. It is they that decide which part of a bit can be sent to the level below. What matters is not to lose task continuity. As it turns out, nuclear physicists have “split” matter into such a “zoo” of particles that nobody has managed to fit it into a complete system. I suppose the problem is that what has been discovered is the debris of micro-bits, not the bits themselves. Micro-bits should, in principle, be infinitely stable within the working area of our Computer, while the debris should disappear and appear only for a very short time. It is by these specifications that we should look for our micro-bit (the “indivisible” atom).

Why, then, do physicists “see” the smaller debris of a micro-bit? Very simply: our slow Computer is to blame. It just does not immediately “stitch up” the holes left after a bit’s disappearance. In a single clock cycle, it is impossible to register the disappearance of information. The control lines also begin to lag. On top of that, the optimising program of the higher Computer intervenes. So it turns out that hundreds, if not thousands, of clock cycles are required for the “annihilation” of a micro-bit. This is the visibility of the invisible.
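
This kind of deferred clean-up is easy to picture in code. The “tombstone” scheme below is entirely my own illustration, with made-up names: removing a bit does not free its cell at once, the hole is stitched up only after a fixed number of clock cycles, and until then the leftover debris remains observable.

```python
# A toy "tombstone" scheme: removal is only completed after STITCH_CYCLES
# clock ticks, so a fast observer can still "see" the hole in between.

STITCH_CYCLES = 1000  # the hundreds-to-thousands of cycles in the text


class DataArray:
    def __init__(self):
        self.cells = {}       # position -> bit
        self.tombstones = {}  # position -> clock tick when removal began
        self.clock = 0

    def remove(self, pos):
        self.cells.pop(pos, None)
        self.tombstones[pos] = self.clock  # mark the hole, don't stitch yet

    def tick(self):
        self.clock += 1
        # Stitch up only the holes that are old enough.
        self.tombstones = {p: t for p, t in self.tombstones.items()
                           if self.clock - t < STITCH_CYCLES}

    def debris(self):
        return set(self.tombstones)  # what a fast observer would "see"


universe = DataArray()
universe.cells[42] = 1
universe.remove(42)
for _ in range(999):
    universe.tick()
print(universe.debris())  # {42}: still visible after 999 cycles
universe.tick()
print(universe.debris())  # set(): the hole is finally stitched
```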

That time-consuming process creates a tunnel effect: some elementary particles (debris) appear to pass easily through dense data arrays. In reality, of course, they do not move at all; micro-bits simply appear and disappear. This creates an illusion of movement, since the files change even within that short time slot. A good example of such debris is the “elusive” neutrino, which is really a rare fragment of a micro-bit. It was extremely difficult to catch that particle.

New Digital World Theory. Part 10

The Big Bang theory, the human being, mass, and a conclusion.

You have probably noticed that I mentioned the Big Bang theory only briefly. I did so deliberately, as I do not see anything significant in that idea. So the mega data arrays grow and move apart – so what? The Computer is computing data, and the results of that work are constantly changing, which is why we notice movement in the Universe. And the fact that everything started from one point can be explained very simply: the Computer had just been switched on. That is why there was an unusual stream of data at the very beginning. The Computer’s software had to be loaded, and only then did it become possible to start processing data. That is all there is to it.

I’d like to talk a little about ourselves, Homo sapiens. We are that teeny-weeny program, but one with a very high level of adaptability. The result of that adaptability is our ability to accumulate knowledge and use that baggage to forecast the future. It is this ability that distinguishes us from other “live” programs. Unfortunately, our abilities are still underdeveloped. At the moment we are just a small virus that cannot even do any significant harm. We can exist only in very comfortable conditions: a certain density of the data array, low local Processor activity, the presence of certain information bytes. In a word, we are currently nothing but humble bugs.

What are we going to do? The way we currently are, we cannot take on any significant task. We have only one way to evolve: to transform into a new species – Mankind. In that species, each individual would have to cease existing as a separate human being and become a cell living for a common purpose, willing to sacrifice itself for the common tasks. And what do we do now? There is no real purpose. Wealth, fame, well-being – all these are purposes of a man’s ego. For Mankind to be created, that is nothing. Something grander is needed, with bigger ambitions, comparable to the Computer’s tasks.

I don’t know how realistic it is, but taking over control of the Computer could be attempted. The task is complicated by the fact that if we interfere with current processing tasks, there is a high probability of a “cleansing” by the Computer’s antivirus. In the extreme case, to get rid of us the Computer could simply be switched off, and our life would definitely end. The task must therefore be approached very delicately, with minimal disruption to the Computer’s work. For that, first of all, a collective mind should be created, able to solve far more complex tasks than any of us can alone. That mind would help us seize more Computer resources and venture tasks previously unattainable even in our dreams. For instance, conquering the Universe is impossible unless either we adapt to a wider range of living conditions, or the wider Universe is adapted to us. The higher optimising programs would always obstruct us in that. It is not a given that even the Computer itself is capable of blocking that optimisation work, even for a short time. Possibly not. And then only one option would remain – to change ourselves, in order to withstand a bigger range of conditions for existence in different data arrays.

In general, the task is complex and very time-consuming, and the ways of completing it cannot yet be seen. One thing is clear: a Super Mind is needed to control us all. Technically, we are coming closer to accomplishing that task. But in practice, we will not be able to handle such control anytime soon. The main problem is the new Super Intellect – that is, the program determining the purposes and values of the humankind. That is the nearest task for humanity.

Oops, I forgot to write about mass. It is mass that is the metric of information volume: to process more information, the Computer needs more time to complete its tasks, which means it is handling a bigger mass.
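
A toy measurement shows the shape of this claim. The mapping from bytes to “mass” below is purely my stand-in, but the proportionality between the volume of information and the time needed to process it is easy to observe on an ordinary computer:

```python
# Purely illustrative: "mass" here is just the number of bytes, and
# "processing" is a single pass over the data array.
import time

def processing_time(n_bytes):
    data = bytes(n_bytes)                 # a data array of the given volume
    start = time.perf_counter()
    sum(data)                             # the Computer "processing" it
    return time.perf_counter() - start

for n in (10**5, 10**6, 10**7):
    print(f"{n:>9} bytes -> {processing_time(n):.4f} s")
# The time grows roughly linearly with the volume of information,
# that is, in the theory's terms, with the mass being handled.
```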

This is all for now. If you have any questions worth discussing, write to me via the contact form. I apologise in advance that I will reply only to those who raise questions of interest to me. Please do not ask about God. Alas, I have no answer – like everyone else, I can only imagine Him.

 

Thank you for reading!

Igor Voroshilov

Written between 28.08.2016 and 05.03.2017.