The Degeneration of the Nation
The Three Big Questions in Science Today
What is natural in nature? Is it possible that nature is not natural? Are the big problems in science today just gaps (or enormous holes) in knowledge, which turn nature into a miracle, or do they hint at the need for a paradigm shift - a new type of scientific explanation, whose object is not the laws of the system (nature) but the learning of the system? On the difference between a networked, descriptive scientific worldview (which does away with explanation!), abundant in random and therefore improbable details - and a learning-based explanation
By: The Legend of Three and Four
Multiverse - National Gallery of Art, Washington
What is the worst thing that could happen to neuroscience in the coming century? The discovery that there is no "evolution" in the brain. That unlike biology, whose great successes stem from one algorithm - the simplest, most successful, and most comprehensive explanation in any scientific field (far more than in physics!) - we will discover that there is no comprehensive or central algorithm in the brain that can be generalized. That the brain depends on countless balances, perhaps even genetically programmed, which are very difficult to recreate, improve, or even understand. In short: the discovery that the brain is not only complex, but complicated and even tangled - a knot that cannot be unraveled. That the brain is like biology before Darwin: a multitude of related cases, that is, a network - and not an algorithm. A system of language, cataloging, and characterization - and not a learning system.

And what is the great hope of neuroscience? That the Einstein of the brain will appear and decipher basic scientific principles that explain its operation well. That is, that the brain is driven by a strong algorithm, and not merely by big data and a monstrous procedural operating system. That it does not depend on delicate fine-tuning at the motherboard level. Otherwise, even if we succeed in creating superintelligence, we will find it very difficult to create the precise balances it requires - the product of slow, insanely long evolution - and the path from there to catastrophe is short. In other words, if we do not discover that there is a new learning algorithm there, beyond the old evolution, then our situation is bad. And who can guarantee us that "deep learning" will manage to advance us beyond the most superficial stages of the brain's learning system?

This problem of fine-tuning, versus an effective learning and adaptive algorithm, is perhaps the most basic problem in science today, in all areas of the exact sciences, including mathematics. The question is how rich complexity is created at all. If we believe in the existence of a creator or designer, as in the Middle Ages, then very complex and rich explanations for our world are not problematic. But from the moment we are within modern science, we will find it difficult to accept a situation where the universe is designed for complexity at an exceptional level of precision, out of infinite possibilities, and without any apparent reason. Science seeks an occurrence that did not occur by chance. Therefore, the three most significant questions in science today are actually one question: why are the laws and constants of physics tuned, out of all possibilities, to allow rich complexity; why did evolution produce something as improbably complex as the brain; and why is mathematics full of deep, surprising order that we merely discover. All three ask where improbable, rich complexity comes from - if not from a designer, and not from a lottery win.


The three great anomalies hint at deep orders that exist in the universe, and perhaps at deep algorithms of learning, adaptation, and creation of complexity that we do not understand. Just as we did not understand evolution - and the biological world seemed to us both random and terribly complex, yet wonderfully adapted, as if created by a designer - so we can hope that the discovery of other learning algorithms will solve the problems of the cosmic clockwork. Yes, perhaps mathematics, neuroscience, and physics also have their own evolution and learning mechanisms. Or perhaps there is a deep learning reason for the creation of complexity - and for the existence of the conditions for it. We may, for example, be able to base physical complexity on mathematical complexity, or vice versa, and thereby at least reduce one problem to the other (would there be a different mathematics in another physical universe?). Or it may be that we can generalize evolutionary learning, which somehow managed, each time anew, to create the conditions that allow its own continued development (once there was no oxygen atmosphere on Earth - life created it!). There is some important component in the complexity puzzle that we do not understand, and that is not "natural". Any answer that begins with winning the lottery is not a scientific answer but a theological one - and then it is far more plausible to replace it with a designer or a simulation operator.

But if we do reach such a design answer (and it is possible! any design leaves traces) - this will be a real philosophical revolution, and we will understand that we are not here by chance. Given the current state of knowledge, in a different intellectual environment these anomalies would be considered strong evidence for the existence of such a designing God. Only our faith that evolutionary and "natural" explanations will appear later, in line with the development of the history of science so far, blocks such an obvious interpretation even today. But if we discover that these anomalies are very deep, and perhaps even broader than we thought, and no natural explanations are found for them for a very long time - science may influence belief in a way it does not anticipate at all, and we will return to a medieval worldview. The universe may seem to us immeasurably huge, nothing like a laboratory experiment on someone's table, but relative to what do we determine that it is a large universe? Do we have any objective scale? The only objectively large thing in the universe is the number of orders of magnitude in it, and in fact there are only a few dozen of these. Maybe there are thousands of orders of magnitude in a universe larger than ours? Maybe there are infinitely many orders of magnitude? Maybe our entire universe is a system of negligible size?

Our universe does not actually contain huge amounts of matter, as we tend to think. After all, the universe is almost empty. Almost everything in the universe is emptiness. The stars are tiny compared to interstellar space, in ratios with many zeros, and in the same way the particles are tiny compared to the distances between them, and so on. What operates the universe is not the matter itself, but the forces, charges, and fields that fill these empty spaces, and they are the real universe. Matter is ultimately an illusion. We are completely empty, and the only reason my finger does not pass through the keyboard keys is not that the place is full of matter there, because it is actually completely empty - but that there are repulsive forces acting at enormous distances relative to the size of the electrons in my finger and in the shin key, and therefore they repel each other with unimaginable strength relative to their size. If there were no such forces, I would immediately fall from my chair through the floor, down to the core of the Earth. And as a black hole, that is, as concentrated matter, the Earth is actually quite tiny... That is - whoever created the universe was very stingy with the amount of plasticine in it relative to its size, but very generous with the influence of tiny pieces of plasticine on each other from afar. If so, once we see such an extreme mismatch within the universe itself between the amount of matter and the amount of space, it is difficult to speak of large and small with respect to the entire universe at all: our entire universe may be tiny relative to other universes - or huge. A small experiment - or an unimaginable monster.
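
To make the stinginess concrete, here is a back-of-the-envelope sketch (my own rounded textbook numbers, not figures from the essay) of how little of the volume is actually filled at the atomic and the stellar scale, and how small the Earth becomes when treated purely as concentrated matter:

```python
# Back-of-the-envelope arithmetic with rounded textbook values (my numbers,
# not the essay's): how empty matter is at two very different scales, and how
# tiny the Earth becomes when treated purely as concentrated matter.

# Atomic scale: a nucleus (~1e-15 m) inside an atom (~1e-10 m).
nucleus_fraction = (1e-15 / 1e-10) ** 3
print(f"fraction of an atom's volume filled by its nucleus: {nucleus_fraction:.0e}")  # ~1e-15

# Stellar scale: the Sun (~7e8 m) inside a sphere reaching the nearest star (~4e16 m).
sun_fraction = (7e8 / 4e16) ** 3
print(f"fraction of local interstellar space filled by the Sun: {sun_fraction:.0e}")  # ~5e-24

# The Earth compressed into a black hole: Schwarzschild radius r_s = 2GM/c^2.
G, c, earth_mass = 6.674e-11, 3.0e8, 5.97e24  # SI units
r_s = 2 * G * earth_mass / c**2
print(f"the Earth as a black hole would have a radius of ~{r_s * 1000:.0f} mm")  # ~9 mm
```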

The same question that concerns space also concerns time. If the objective scale for space and time is the universe itself, then we, both as a species and as individuals, occupy a far larger share of it in time, by many orders of magnitude, than in space. As evolution, for example, we have been here for roughly a third of the universe's life, while as a piece of matter the Earth is less than a nano-nano-something of the universe, and our own ratio as humans to the size of the universe is similar. We are not just like those giants in time from Proust's stirring final paragraph of In Search of Lost Time - we are really noodles in time. Tiny creatures in space that occupy enormous time relative to their size. But who said that even the universe itself can be an objective scale? Can there be an objective scale for the universe at all?
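
A similarly rough comparison, again with rounded values of my own choosing rather than the essay's, makes the asymmetry concrete:

```python
# A rough comparison of our share of the universe in time versus in space,
# with rounded values (my numbers, not the essay's).

universe_age_yr   = 13.8e9   # years since the Big Bang
life_on_earth_yr  = 3.8e9    # years since the earliest known life
human_lifetime_yr = 80.0

universe_radius_m = 4.4e26   # radius of the observable universe
human_height_m    = 1.7

print(f"life's share of cosmic time:    {life_on_earth_yr / universe_age_yr:.0%}")   # ~28%
print(f"a human's share of cosmic time: {human_lifetime_yr / universe_age_yr:.0e}")  # ~6e-09
print(f"a human's share of cosmic size: {human_height_m / universe_radius_m:.0e}")   # ~4e-27
```

In time we are within one order of magnitude of the whole as a lineage, and about nine orders of magnitude away as individuals; in space we are roughly twenty-seven orders of magnitude away.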

The only distances and quantities in the universe that are objective in their size are the orders of magnitude themselves. They are what determine that the universe is in fact empty, because there are many orders of magnitude between the size of its parts, the distances over which they exert influence, and the empty spaces within it. Are the true boundaries of our universe - far more than any boundary in time and space - actually the boundaries of the orders of magnitude accessible to us? After all, it is possible that the answer to the riddle of rich complexity is unfortunately rooted precisely in orders of magnitude that are not accessible to us, whether far larger than us (the great designer? a super-universe? a different structure?) or far smaller (some substrate that creates complexity in all the orders of magnitude above it, and perhaps even creates mathematics itself and the laws of nature). But maybe complexity is an illusion? After all, there are many orders of magnitude in the universe where there is no complexity, and many simple, boring, uninteresting parts of mathematics. Or maybe complexity is created precisely by the existence of many orders of magnitude, which allow its gradual construction?

After all, if we unite the evolutionary algorithm and the neural network algorithm - the two natural learning algorithms known to us - we will discover that what creates complexity is many generations one after another, or many layers one on top of another (deep learning), where in every generation and every layer there is enormous multiplicity and redundancy of agents (organisms or neurons). That is, there is complexity created from the stratification of the time that passes over the system, and there is complexity created from spatial stratification within the system. Stratification is the father of complexity, and therefore of development and of learning. Is it not reasonable that there should also be complexity that stems from the stratification of orders of magnitude in the system - that is, where each order of magnitude expresses a greater complexity that emerges from the one below it? Could the cosmic learning mechanism not be a learning mechanism built on orders of magnitude, one which generalizes the learning mechanisms in time and in space?
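
As a toy sketch only (my illustration, not a formalism from the essay), the two stratifications can be written side by side: a layered network composes many redundant units in space, a selection loop composes many redundant organisms in time, and in both, whatever structure arises is built stage upon stage:

```python
# A toy sketch of the shared "stratification" pattern (my illustration, not a
# formalism from the essay): layers in space, generations in time.

import random

def layered_network(x, depth=8, width=64):
    """Spatial stratification: many redundant units per layer, many layers composed."""
    random.seed(0)
    activations = list(x)
    for _ in range(depth):                        # each layer builds on the one below it
        prev, activations = activations, []
        for _ in range(width):                    # multiplicity and redundancy of units
            weights = [random.uniform(-1, 1) for _ in prev]
            total = sum(w * a for w, a in zip(weights, prev))
            activations.append(max(0.0, total))   # simple nonlinearity
    return sum(activations) / len(activations)

def evolve(target=0.123456, generations=200, population=64):
    """Temporal stratification: many redundant organisms per generation, many generations composed."""
    random.seed(0)
    pop = [random.uniform(0, 1) for _ in range(population)]
    for _ in range(generations):                  # each generation builds on the previous one
        survivors = sorted(pop, key=lambda g: abs(g - target))[: population // 4]
        pop = [p + random.gauss(0, 0.01) for p in survivors for _ in range(4)]
    return min(pop, key=lambda g: abs(g - target))

print(layered_network([0.5, 0.25, 0.125]))
print(evolve())  # ends up near the target only through the stacked generations
```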

Mathematical complexity, too, is created from countless stages composed one on top of the other, according to the evolving definitions and the logical language of mathematics. Is the complexity of the world created merely from repeated computation, or perhaps from a mechanism even more basic than computation? Why are such distant branches of mathematics connected by such surprising and deep connections (the central story that repeats again and again in all parts of modern mathematics)? Is it possible that all of mathematics is built not only like a network with deep connections from sea to sea, but that there is one basic phenomenon at its foundation - that there is a sea? Or is there some deep generative process that creates complexity throughout mathematics, for example that of the prime numbers, and can we get to the bottom of it? Is there an evolution of mathematics? Mathematical logic is indeed a generative process, but it is not "deep": it yields no understanding of the order it creates, only an external linguistic description of it (and indeed it is relatively disconnected from the other parts of mathematics, despite model theory, and the recent advances connecting type theory and category theory). Will we discover learning at the foundation of the foundations of mathematics?
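
A minimal example of a generative process that is not "deep" in the sense used above (my example, not the essay's): the sieve of Eratosthenes mechanically produces every prime, yet offers no understanding of the order it generates, such as the primes' deep connection to the zeros of the Riemann zeta function:

```python
# Generation without explanation: the sieve produces the primes mechanically,
# but says nothing about why they are ordered the way they are.

def primes_up_to(n):
    """Sieve of Eratosthenes."""
    is_prime = [True] * (n + 1)
    is_prime[0] = is_prime[1] = False
    for p in range(2, int(n ** 0.5) + 1):
        if is_prime[p]:
            for multiple in range(p * p, n + 1, p):
                is_prime[multiple] = False
    return [p for p in range(2, n + 1) if is_prime[p]]

print(primes_up_to(50))  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37, 41, 43, 47]
```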

These questions are at the edge of our perception. But it is possible that if we unite the different learning algorithms we know, from the evolutionary, neural, mathematical, and physical worlds, we will reach some unified and basic theory of learning and its essence - of the deep common basis of the different learning algorithms in our world, and therefore also of the properties and limitations of learning. Is it possible that we will discover in the 21st century that the theory of everything in physics (TOE), which unifies the two great physical theories of the 20th century - quantum mechanics and relativity - into one framework, rests precisely on evolutionary learning ideas, just as biology rests on evolution? Such a theory could be even more general than the physical "theory of everything", because we may discover that it encompasses all the exact sciences - mathematics, biology, computer science, and neuroscience - under one learning framework: for example, if the different learning algorithms turn out to be special cases of one basic algorithm.

And if we choose a more modest goal: just as network theory today is beginning to provide insights into networks in all areas of science and technology, it is possible that in the future we will have an interdisciplinary, fundamental theory of learning, which may even turn out to be more basic than the idea of the network - just as learning is more fundamental to the brain than the neural network, and more fundamental to evolution than gene networks. We can certainly imagine a kind of Einstein of learning of the 21st century, who will establish it as a super-theory, and who will also derive from it a host of quantitative predictions (how fast can one learn? under what threshold conditions? how much information is needed and how much is created? how can "rich complexity" be measured through learning? and so on).
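
For one concrete taste of what such quantitative predictions can look like, classical PAC learning theory already gives a bound of this kind in a narrow setting (a standard textbook result, cited here by me rather than by the essay): for a finite hypothesis class H, in the realizable case, roughly (ln|H| + ln(1/delta)) / epsilon examples suffice to learn with error at most epsilon and confidence 1 - delta:

```python
# A standard PAC-learning sample bound: how many examples suffice to learn,
# as a function of the demanded accuracy (epsilon) and confidence (1 - delta),
# for a finite hypothesis class in the realizable case.

from math import ceil, log

def pac_sample_bound(hypothesis_count: int, epsilon: float, delta: float) -> int:
    """Number of examples sufficient for any consistent learner to reach
    error <= epsilon with probability >= 1 - delta."""
    return ceil((log(hypothesis_count) + log(1.0 / delta)) / epsilon)

# e.g. a million candidate hypotheses, 1% error, 99% confidence:
print(pac_sample_bound(10**6, epsilon=0.01, delta=0.01))  # 1843 examples
```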

Then such a theory may find a general and comprehensive explanation for complexity in the universe, and for the basic learning mechanism that drives it. And then we will stop understanding the universe as computation (which lacks depth and insight), or as a network (random), or as probability (implausible), and start thinking of it as learning. We know of no other natural process that creates rich complexity - in culture, in humans, or in evolution - that is not learning. Maybe we will discover that this is true for physics and mathematics as well? And maybe the nature of this algorithm will reveal to us why we are the only evolution that converged to a state of intelligence, and whether there can be many other types of complex development, which occurred in other evolutions and led in other directions that we do not understand today. After all, it is possible that evolution tends to converge precisely to learning algorithms other than ours (does digital computation, growing for example out of the genome, arise from it more naturally than a brain? why isn't that actually more likely? why would an additional, analog learning mechanism be created, disconnected from the digital genetic learning mechanism that already exists and is ready?). And if a general and comprehensive scientific theory of learning is created, we may even be able to measure the complexity that it is "reasonable" for evolution to create, and understand whether we are really an extreme case, or identify some rare turning point on the way to us, which led relatively easily to all the complexity that followed. And maybe, if we decipher nature's basic learning algorithm, we will also understand how artificial intelligence should develop naturally (!), and what the most n-a-t-u-r-a-l learning algorithm for intelligence in the universe is. And what is natural - is probably safer, and more correct, and more adapted to the world. And to nature. And to the nature of the world.