The Degeneration of the Nation
Can We Imagine Different Google and Facebook?
Why did Google turn the network of websites into a more cultured and higher-quality network than the user network created by Facebook, and is this a decree of fate? The anti-bureaucratic trend of our time has created a preference for flat technological and social architectures, but stratification is in fact of immense importance for the functioning of a state or a culture. A contradiction has thus arisen between current technological means and high culture and representative democracy. This contradiction will be resolved only through a new type of stratification, based not on the bureaucratic idea of control but on the idea of the sovereign user
By: Neo-Structuralist
Architecture of stratification creates high culture

The Fundamental Error of Yuval Noah Harari
The Technological Reduction of Spiritual Development


Is technology really driving the world? Every era has its Zeitgeist, but in our time it seems that Zeitgeist means technology. The meta-narrative sold to us in the media and academia is gradually converging on one central explanation, beside which other explanations seem outdated (in accordance with the technological narrative, where anything non-technological is "outdated"): Technology is the engine of history - since time immemorial.

In the absence of spiritual evidence, even material findings from the Stone Age are interpreted as technological progress, and all past revolutions are interpreted through their technological aspect: the Agricultural Revolution, the Industrial Revolution, the Compass Revolution, the Genome, Gunpowder, Steel, and other revolutions that multiply as fast as the technologies themselves. Even fundamental spiritual changes (monotheism? the Modern Era?) are explained technologically, for example through the Alphabetic Writing Revolution or the Printing Revolution. A good, convincing, and "deep" historical explanation today is a technological explanation underlying spiritual phenomena. This is a Marxist legacy that has received a contemporary update, and ironically, it fits well with our alienation from our past. And while deterministic historical explanations are widely condemned today - deterministic technological explanations are praised. At last, the deterministic engine of history (sought since time immemorial) has been found - and it is technology.

And who would dare to disagree and suggest alternative directions of technological development? Certainly not the humanities scholars, few of whom understand how the computer on which they write their musings works, or grasp the countless technological layers - physical, engineering, and mathematical - that make the Internet possible. The ignorance (and sometimes pride in ignorance) of humanities scholars regarding the natural sciences and mathematics is a very modern phenomenon, with sociological roots that have become ideological, for anyone capable of advanced quantitative thinking faces a system of incentives that will rarely channel them outside society's vast technological apparatus. Those who can think of promising directions of technological development that have not yet been tried - should go start a startup. And where ignorance reigns, deterministic thinking emerges, full of awe before the new historical god - technology.


The Return of Structuralism in the Version of Data Structures
Architecture as a Mediating Layer Between Technology and History


On the other hand, who can still deny the enormous influence of technology on history? Well, not only is technology not deterministic - there is ample room for thinking through alternative histories of alternative lines of technological development - but technology itself is not the deep explanation of past and present phenomena. Beneath technology lies another layer, which determines its development and its influence in a specific context: the architecture of technology. Did there really have to be one American Internet? Can we not imagine that in a multipolar era, with no single superpower (that is, not in the specific historical moment of the 90s), several competing Internets would have arisen, connecting poorly with one another - a Russian, a German, a Chinese Internet - each with its own architecture, matching the culture from which it came? Did Facebook have to be designed the way it is, for example around a popularity algorithm as opposed to a reputation algorithm, just because that is where the invisible hand of technology led it? Surely, they will say, popularity means profit. But Google is built on a reputation algorithm rather than a popularity one - and is that not precisely where its profitability comes from?
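
To make the architectural distinction concrete, here is a minimal sketch of the two ranking logics - pure popularity (raw engagement counts) versus reputation (a PageRank-style score, in which an endorsement is weighted by the endorser's own standing). The function names and toy data are illustrative assumptions only; neither company's actual code is implied.

def popularity_scores(engagements):
    # engagements: {item: raw engagement count (likes, shares, clicks)}
    return dict(engagements)

def reputation_scores(links, iterations=20, damping=0.85):
    # links: {item: [items it links to / endorses]}
    items = set(links) | {t for targets in links.values() for t in targets}
    score = {item: 1.0 / len(items) for item in items}
    for _ in range(iterations):
        new = {item: (1 - damping) / len(items) for item in items}
        for source, targets in links.items():
            if not targets:
                continue
            share = damping * score[source] / len(targets)
            for target in targets:
                # an endorsement is weighted by the endorser's own reputation
                new[target] += share
        score = new
    return score

engagements = {"viral_post": 9000, "careful_essay": 120, "reference_page": 300}
links = {
    "careful_essay": ["reference_page"],
    "reference_page": ["careful_essay"],
    "viral_post": [],
}
pop = popularity_scores(engagements)
rep = reputation_scores(links)
print(sorted(pop, key=pop.get, reverse=True))  # viral_post ranks first
print(sorted(rep, key=rep.get, reverse=True))  # viral_post drops to last

Under popularity the viral post wins outright; under reputation it sinks to the bottom, because nothing reputable points to it. The same content, ordered by two different architectures.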

Is the current architecture of democracy a decree of fate, something that stems from the depth of the concept of democracy, or a historical product of specific historical conditions - and can we imagine supremely democratic elections under vastly different architectures, for example a deep architecture? Does the crisis of democracy in the West stem from the very technology (deterministic, as we know) of the social network, and is it therefore a decree of fate, or does it perhaps stem from a very specific American implementation of its architecture, which would not have occurred, for example, in a reputation-based network? As long as the analysis of phenomena is based on technology itself, technology appears as a stochastic force, almost an invisible supreme force (that object of the age-old human desire for an "explanatory force", which has been embodied in things like God or the invisible hand of capitalism). But the moment the analysis is based on the concept of the architecture of technology - alternatives suddenly emerge, and a kind of thinking becomes possible that is not merely negative criticism but offers proposals for other social architectures.

The architecture of technology also affects consciousness, not just the social order. In the early days of computing, its use was esoteric knowledge entrusted to a priesthood who knew "machine language"; later, in systems like DOS, it became monarchical rule through direct commands. But only when the modern operating system arose did the current architecture regulating human-machine relations emerge. This is an operating system that empowers the "user" (a new idea of individual sovereignty) precisely by hiding from him everything that is not under his control. The vast innards of the computer in his hands are not accessible to him, and he is unaware of their existence, while he performs actions in a visual interface that hides their true mode of operation yet gives him an illusion of transparency ("windows") through simplistic graphic representations of complex and "deep" operations. In doing so, it gives him unprecedented control, expressed mainly in a sense of unlimited choice and freedom of expression, while the autonomous mechanisms operate independently but representatively - that is, according to the changing representations before the sovereign's eyes. All this resembles the democratic form of government: representative elections subject to a sovereign who is not part of the government and does not directly control the state, which is theoretically perceived as his servant, while in practice it is a vast and semi-autonomous bureaucratic system.

But such an architecture is not the only one that information technology allows, and it is being replaced by a new architecture, which shapes a new consciousness. The procedures of the operating system are essentially bureaucratic, not algorithmic. But once Google's mysterious algorithm began arranging search results, and Facebook's secret one the order of the feed, the user became passively dependent on an algorithm over whose operating parameters he has almost no active control. We cannot instruct Facebook to show us more posts dealing with a certain word or field, or tell Google that in the future we are interested in more results like the third one, or decide to tell the algorithms what issue or topic interests us in general. The realm hidden from our eyes - which we do not understand, have no idea how it operates, and possess no representation or conceptualization of whatsoever - has expanded enormously, and the individual's span of control has shrunk steeply. And all this is merely a prelude to the integration of deep artificial intelligence algorithms that will learn our preferences and ways of operating by themselves, without any possibility of controlling them directly through a representation of their operation.
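
The asymmetry is worth making explicit: the parameters exist, they are simply never surfaced. A hypothetical sketch (the weights and signal names are invented for illustration, not any platform's real code):

# Parameters that exist server-side but have no user-facing representation:
PLATFORM_WEIGHTS = {
    "recency": 0.3,
    "engagement": 0.5,
    "similarity_to_past_clicks": 0.2,
}

def rank_feed(posts, weights=PLATFORM_WEIGHTS):
    # posts: list of dicts with pre-computed signals in [0, 1]
    # for each key appearing in `weights`
    def score(post):
        return sum(weights[signal] * post[signal] for signal in weights)
    return sorted(posts, key=score, reverse=True)

def user_facing_feed(posts):
    # the entire surface offered to the "sovereign user":
    # no knob here reaches the weights above
    return rank_feed(posts)

Everything that determines the order of the feed sits in the hidden weights; the surface offered to the user accepts nothing.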


The Editors' Network vs. the Bubble Network
The Struggle Over Architecture as the Most Important Arena in Our Time


Global algorithms, unlike those that run our personal computer or phone, are gaining power at the expense of the sovereign individual, just as global systems, such as the economy and the network, are gaining power at the expense of state sovereignty. This is an architecture that creates alienation between the sovereign user and the super-platform and its conduct, and we see the results of this alienation in politics and culture all over the world: a decline of trust in institutions, the rise of populism (a backlash against the decline in sovereignty), and the rapid decline of the middle tier of systems and the vital representation it provides (for example: critics who represent "what's going on" in a literary or cultural system, journalism that represents "what's going on" in the democratic system, intellectuals who represent "what's going on" in the development of the spirit, and so on). At the peak of the phenomenon, we are witnessing the rise of non-democratic systems that know better than the citizen what is right for him (China) or use their excess power for crude manipulation of him (a series of semi-authoritarian rulers who employ constant media manipulation). But is the alienated algorithmic architecture a necessity - an outcome of the essence of technology and the "nature" of algorithms?

Certainly not. Every algorithm has parameters that can be made accessible to the user, even in a simplistic representation, if there is the will to do so. It is certainly possible to imagine a Facebook or a Google that makes the central parameters of its algorithms accessible to user control and represents them in a graphical interface that is meaningful to the user - allowing me, say, to choose to be interested in "artificial intelligence" and "cats", while giving priority to results written in high language (a simple parameter to program) or to results that mention Wikipedia entries I read in the past year. An elementary user interface for controlling the algorithm would also allow adjusting the parameters according to ready-made profiles, or profiles created by other users. Thus, for example, it would be possible to use a ready-made profile of someone interested in "Hebrew literature" and receive a feed that reflects what is happening in Hebrew literature in real time, or, say, search results of the most important innovations in genetics over the past year - something that today requires considerable search skill. Such a profile would create an editor function, and the weighting of the order of the results or the feed could also be subject to user control (for example by reputation, popularity, or trendiness). Why can I not know today which posts were the most popular in Israel this week, or which gathered the most angry emojis? Such a possibility would create a much broader and more transparent social network - instead of the current network of closed bubbles.
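
A minimal sketch of what such user sovereignty over the algorithm could look like, continuing the hypothetical example above: exposed weights, declared topics of interest, a "high language" parameter, and a shareable profile that plays the editor function. All names and numbers are illustrative assumptions, not an existing API.

from dataclasses import dataclass, field

@dataclass
class FeedProfile:
    topics: list                       # e.g. ["artificial intelligence", "cats"]
    prefer_high_language: bool = False
    weights: dict = field(default_factory=lambda: {
        "reputation": 0.4, "popularity": 0.3, "trendiness": 0.3})

def score(post, profile):
    # post: dict with 'topics' (list) and 'language_level', 'reputation',
    # 'popularity', 'trendiness' signals in [0, 1]
    s = sum(profile.weights[k] * post[k] for k in profile.weights)
    if set(profile.topics) & set(post["topics"]):
        s += 1.0                       # declared interests outrank generic signals
    if profile.prefer_high_language:
        s += 0.5 * post["language_level"]   # "a simple parameter to program"
    return s

def personal_feed(posts, profile):
    return sorted(posts, key=lambda p: score(p, profile), reverse=True)

# A shareable, ready-made profile - the editor function: anyone can adopt
# the feed of "someone interested in Hebrew literature", weighted by reputation.
hebrew_literature_editor = FeedProfile(
    topics=["Hebrew literature"],
    prefer_high_language=True,
    weights={"reputation": 0.7, "popularity": 0.1, "trendiness": 0.2},
)

Any user could publish such a profile and any other user could adopt it: the editor reborn as a controllable parameter set rather than an opaque global algorithm.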

Just as Google and Facebook control the global parameters of their algorithms, so too could every user who wishes to do so control the algorithm, in principle and in practice, in a far more personal and transparent way. Would they lose economically from this? It is not at all clear. It would simply require a little more algorithmic effort, which is certainly within their reach, and if the demand for the user's technological sovereignty gained public momentum - it might even happen. But this is not just a specific demand of Google or Facebook (and their like). It is a principled position that we must take and guard fiercely in the architecture of human-machine interfaces, ahead of the future interface with artificial intelligence. Technology whose operation we have no simplified, choice-enabling representation of is a recipe for disaster, just as a state in which we have no sovereignty and no elections of representatives and parties is a recipe for disaster.

In general, the intermediate layers of systems (mediators, editors, representatives, critics, reviewers, interfaces) are of enormous importance for their proper functioning - an importance that rarely receives the understanding and protection it deserves (who needs all these middlemen?), despite its theoretical basis (the "deep" paradigm) and its practical one (for example, brain activity, which is highly layered). If we want technology that serves us (and not the other way around), a functioning state, and a healthy cultural system - we must internalize the importance of stratification (layered architecture), and understand that absolute control by the sovereign over the system is neither desirable nor possible, but that loss of control is dangerous as well. Therefore, it is necessary to cultivate in systems precisely the kind of stratification that allows a gradual loss of control at each stage, and only thus, perhaps, will we be able to cope with the complexity of the enormous systems developing before our eyes, and particularly with artificial intelligence. Because stratification means culture - and superficiality means barbarism.