Niklas Luhmann and Jacques Ellul on the Autonomy of Technology


Adam Lovasz

ELTE University


ORCID: https://orcid.org/0000-0003-1422-0381

Citation

Lovasz, A. (2023), "Niklas Luhmann and Jacques Ellul on the autonomy of technology", Kybernetes, Vol. ahead-of-print No. ahead-of-print. https://doi.org/10.1108/K-02-2023-0287

Abstract

Purpose: The paper aims to develop a philosophy of technology from the work of Niklas Luhmann, arguing that technology can be viewed as a self-referential system which is autonomous from both human beings and other function systems of society. To achieve this aim, it draws upon the systems-theoretical work of Jacques Ellul, a philosopher of technology who focuses on the autonomous potential of technological evolution.

Design/methodology/approach: The paper draws on the work of Niklas Luhmann and Jacques Ellul to explore the theme of autonomous technology and what this means for our thinking about technological issues in the 21st century. Insights from these two thinkers, and from researchers working in the Luhmannian sociological tradition, are applied to remote work.

Findings: The sociological approach of Luhmann, coupled with Ellul’s insights into the autonomous nature of technology, can help us develop a systems theory of technology which takes seriously its irreducibility to human functions.

Research limitations/implications: The paper contributes to the growing sociological literature that thematizes the Luhmannian approach to technology, helping us better understand this phenomenon and think in new ways about what technological autonomy means.

Originality/value: The paper brings together the work of Luhmann, Ellul and contemporary researchers to advance a new understanding of technology and technological communication.

Keywords: autonomy, autopoiesis, self-organization, systems theory, technology


Introduction

The purpose of this study is to answer the following question: can technology be considered an "autopoietic" social function system in the sense used by sociologist Niklas Luhmann? We may elaborate this query in the following manner: is technology autonomous from humans and other social systems and, if so, to what extent is this the case? In addition to the serious philosophical weight of the question of technology, the question may be of interest to researchers of Luhmann's work in particular, because Luhmann never developed any systematic theory of technology, a strange anomaly for a sociologist who elaborated a comprehensive theory of society. We find many Luhmannian accounts of social functions in his oeuvre, a good part of which can be summarized as the application of the autopoiesis concept and insights gleaned from systems theories to social subfields. From eroticism to morality, law, economy, and the mass media, Luhmann applied the interpretive framework he composed to many specific functional areas. However, we look in vain for a detailed description of technology as an autonomous system in the German sociologist's work. Luhmann does not classify "technology" as one of the main function systems of modernity. It does not play as big a role as law, the economy, or other more well-known functions, and cannot (yet) be considered an autonomous, autopoietic system. For instance, in relation to information-communication technologies (ICTs), Luhmann ventures the following tentative comments: "the internal operations of the computer can be understood as communications. We must presumably leave aside all analogies of this sort and ask instead what consequences it would have if computers could establish a quite independent structural coupling between a reality they were able to construct and consciousness or communication systems. This question deserves greater attention; it is impossible to judge at present what consequences it would have for the further evolution of the societal system" (Luhmann, 2012 [1997], p. 66).

Given the general nature of the Luhmannian model and the specificities of technology, we may nevertheless envision the applicability of Luhmannian system theory to the philosophy of technology, without having to necessarily reconstruct a (nonexistent) Luhmannian theory of technological autonomy. Our task is one of extending the autonomy framework, even beyond the original intentions of the author.

That being said, precedents can already be found in the form of comments in some of Luhmann's works, for example when, in connection with the Internet and other new communication technologies, Luhmann states that "a far-reaching consequence of the evolution of dissemination technologies and the corresponding media lies in the waning need for societal


operations to be spatially integrated. By integration, I mean restriction of the degree of freedom of systems" (Luhmann, 2012 [1997], p. 188). Technology operates in a way that excludes "the actual living individual, the meaning-constituting subject" from its operations (Luhmann, 1990, p. 225). It can be stated that certain technologies do have selection capabilities, playing an active role in the construction of social reality. According to Luhmann, the function of the mass media, in addition to the interpretation of reality, can be summarized as making reality meaningful for other systems and psychic systems (humans) (Luhmann, 2000 [1995]).

In this article, I define "autonomy" as the self-determinative capability of a selectively closed system. Can the autonomy of technology be considered from a Luhmannian systems perspective? I propose that we read Luhmann alongside French sociologist and theologian Jacques Ellul. The latter, using systems theory insights similar to those of Luhmann, claimed that the system of technology is genuinely autonomous, autopoietic and even sovereign in relation to other functional systems. In my view, Ellul provides the key to a Luhmannian theory of technology as self-creative. In the second half of my paper, I shall rely on the ideas of Ellul to develop the idea of technological autonomy. The considerable differences between the two authors notwithstanding, both theorists worked within frameworks informed by systems theories. If we bracket Ellul's controversial techno-pessimist and humanist value judgments, we may consider the theologian as a most prominent representative of the technological autonomy concept, whose ideas can be incorporated into an expanded Luhmannian theory of technology.

Is Technology Autopoietic?

While Luhmann characterizes "mass media" as a genuinely new system, he does not regard technology as a whole as autonomous. Luhmann registers the increasing dependence of society's functioning on technologies such as ICTs, but he considers concepts such as "technological society" and the like to be exaggerated. A modern, functionally differentiated society would not be able to exist without technology, but we should not equate the two (Luhmann, 2012 [1997], p. 321). Certain technologies for Luhmann do point in the direction of systemic autonomy. Writing and printing in particular have destabilized communication, and Luhmann does believe computers are doing the same. Baecker emphasizes how in Luhmann's account communication technologies such as writing and printing have had enormous effects on accelerating social evolution: "Computers are an 'alternative' to the structural linkage of communication and consciousness", albeit one that increases, rather than decreases, the internal complexity associated with communication (Baecker, 2006, p. 30). References can be found throughout Luhmann's oeuvre, on the basis of which a certain idea of technological autonomy


can be reconstructed. As we shall see, Luhmann's sociology has also been applied fruitfully by many contemporary authors who conceptualize technology as a system autonomous from both human agents and other areas of society.

Fortunately, although Luhmann refrains from recognizing technology as a function system in its own right, we find some guidance regarding the social role of technology in Luhmann's 1991 book on societal risk. I choose to focus on this book in particular, because it contains the most systematic elaboration of Luhmann's views on technological autonomy. Modernity is increasingly organized around the management, distribution and minimization of risks arising from unintended consequences. According to Luhmann, the "risk society"1 is the result of accelerating technological developments. Technology is inherently related to the problematic of controllability, is in its essence instrumental, and presupposes three distinct characteristics: 1) the controllability of processes; 2) the amenability of resources to planning; 3) the localizability of errors (Luhmann, 1993 [1991], p. 88). According to Luhmann, technology should be understood as all procedures that result in the "causal closure of an operational area" (Ibid, p. 87). Anything which enables social actors to reduce complexity can be considered a technology.2 In Luhmann's framework, a mere instrument cannot be considered an autopoietic entity. Instruments are trivial machines, objects that lack the ability to move or create themselves. Autopoiesis in Luhmann's use of the term refers to "non-trivial machines" capable of creating and maintaining their own boundaries.3 A non-trivial machine creates and maintains its own internal states, being "obedient to its inner voice" as Heinz von Foerster, progenitor of the concept, put it evocatively decades ago (von Foerster, 1984, p. 10). Luhmann claims that a certain form of technology has emerged - "high technology" - which differs from the standard format of instrumentality and triviality. Because of its outsized effects, high technology is beyond the purview of "the technical regulation of technology" (Luhmann, 1993 [1991], p. 89). We may classify under this rubric all technologies that, on the one hand, result in opaque causal relationships, and on the other hand create unpredictable risks.

1 A phrase borrowed from fellow sociologist Ulrich Beck (Beck, 1992 [1986]).

2 For example, scientific concepts can be conceived of as technologies. When we talk about the concept of "communication", for instance, we are using a technology that helps us sort various occurrences into the binary "meaning/non-meaning". That being said, communication itself is also more than just a mere tool: communication is "a recursively closed, autopoietic system (...) a structurally determined system that may be specified only by its own structures and not by states of consciousness" (Luhmann, 1996, p. 264).

3 Autopoiesis refers to the self-creative abilities of complex systems: "The system generates itself. Not only does it produce its own structures, like certain computers that are able to develop programs for themselves, but it is also autonomous at the level of operations. It cannot import any operations from its environment. (...) Such operational closure is merely another way of formulating the statement that an autopoietic system by means of the network of its own operations generates the operations that it needs in order to generate operations." (Luhmann, 2013 [2002], p. 77).


High technology goes beyond purely instrumental use. In the case of such procedures or assemblages, the risks become known only during use. As Luhmann puts it, high technology "violates" the "form-defining boundary" between "enclosed and excluded causalities" (Ibid, p. 90). The advent of this particular form of technology generates not only epistemological or existential uncertainty, but also a new kind of structural path dependence. A simple example of structural path dependence is the QWERTY keyboard layout. The QWERTY layout was originally designed in the late 1800s for mechanical typewriters to reduce the likelihood of jamming by placing frequently used letter pairs far apart. As a result, the layout was not optimized for typing speed or efficiency.
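
Von Foerster's distinction between trivial and non-trivial machines, invoked above, can be made concrete with a minimal sketch. The following Python fragment is my own illustration rather than anything drawn from Luhmann or von Foerster: the trivial machine maps the same input to the same output forever, whereas the non-trivial machine's output depends on an internal state that each operation rewrites, so its behaviour diverges even under identical inputs.

```python
# Illustrative sketch only: trivial vs. non-trivial machines in von Foerster's sense.
# All names and values here are invented for the example.

def trivial_machine(x: int) -> int:
    """Trivial machine: a fixed input-output function, fully predictable."""
    return 2 * x


class NonTrivialMachine:
    """Non-trivial machine: output depends on an internal state that the
    operation itself keeps changing ("obedient to its inner voice")."""

    def __init__(self) -> None:
        self.state = 1  # internal state, opaque to an external observer

    def operate(self, x: int) -> int:
        output = x * self.state
        self.state = (self.state + x) % 7  # the operation rewrites the machine's own state
        return output


if __name__ == "__main__":
    print([trivial_machine(3) for _ in range(3)])  # [6, 6, 6]: same input, same output
    m = NonTrivialMachine()
    print([m.operate(3) for _ in range(3)])        # [3, 12, 0]: same input, diverging outputs
```

The point of the sketch is not computational: it merely shows why an observer can determine the first machine analytically but can only reconstruct the second historically, from its record of operations.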

The management of unintended risks arising from high technologies is only possible by involving additional technologies. Luhmann advances what we may term a "moderate autonomy" thesis. Technological accidents and malfunctions can only be handled by involving newer technologies. This accords well with von Foerster's observations on nontrivial machines: "the nontrivial machine can change its operations internally so that its drift results in an extension of diversity" (Foerster, 2014, p. 86). We may think of multiple examples that demonstrate this type of escalatory logic, such as the cascading of risks generated by nuclear energy or the multiple serious environmental risks associated with oil and gas exploration. In every case, the solution is always of a technical nature. Fail-safes or, failing that, remedial technologies are deployed to resolve or at least mitigate the damage. Because of their unforeseeable capabilities, high technologies acquire a kind of autonomy beyond a certain point, precisely on account of this unpredictability.4 Additional technologies are required, control techniques or "a machine within a machine, a trivial machine within a machine that can be trivialized only to a limited degree," so as to stabilize aggregate functionality (Luhmann, 1993 [1991], p. 93). Far from reducing overall social complexity, high technology generates further increases in complexity. As Luhmann warns, "the machine can reconstruct itself in unexpected ways" (Ibid).

In late modernity, the boundary between controllable and uncontrollable, deterministic and inexplicably indeterministic causalities, becomes uncertain. We could say that here we discover several crucial elements of autopoiesis: operative closure, having one's own "agency", self-organization, and so on. But certain elements are not mentioned by Luhmann, such as the availability of a unique binary code, or the ability to process information. It is difficult to state

4 Von Foerster observes that when "non-trivial tendencies (the car won't start, etc.)" manifest themselves, "we call upon a specialist in trivialization who remedies the situation" (Foerster, 1984, p. 12). Trivialization denotes ways of reducing the complexity introduced by unexpectedly non-trivial technology.


in all seriousness that a power plant "communicates" in any meaningful sense, unless we consider, for example, the outbound transmission of data relating to the state of the plant and its equipment as communication. Luhmann regards technology as part of the "environment" of society, rather than composing an autonomous social subsystem: "ecological problems are actuated precisely by technology functioning and attaining its ends" (Ibid, p. 96). The problem of technology is an environmental one, not a system-theoretical issue. Technology is an environmental aspect of society, just like any natural entity (Ibid, p. 98). On this view, there is no substantive difference between technological and ecological issues, since the two are identical. Like human beings or nature, technology is situated outside of functionally differentiated society. Luhmann would not deny that technology can certainly build up complexity, but the emphasis on risk seems to underline that the temporal scope of this complexity is severely limited and can even be destructive of society, at least in the context of late modernity (Valentinov, 2014). No matter how "self-regulating" even the most extreme forms of technology may appear, we cannot speak of a specifically technological form of autopoiesis.

Or can we? If this position were so obvious, then surely Luhmannian sociology could never be applied in any meaningful way to the topic of technology. Yet there are many recent examples from scholarship that convincingly prove otherwise. Technology as a whole, or one of its specific manifestations, can be considered an autopoietic system in the sense used by Luhmann. André Reichel formulates the aforementioned thesis in an ambitious study. According to Reichel's hypothesis, today technology has become a functional system suitable for creating differences of real importance for society as a whole. "Technology," writes Reichel, "establishes a circular and recursive relationship between itself and its environments - the physical world, society and human beings" (Reichel, 2011, p. 106). It is more than just an instrument subordinated to human intentions. In Reichel's view, Luhmann's contribution can be summed up precisely in the fact that the sociologist places human beings in the environment of society, productively decentering the anthropocentrism still prevalent in much of the social sciences. Instead of an instrument passively serving human beings, technology is a complex assemblage made up of technological artefacts, physical and social environments, and human users (Ibid, p. 105). We are components, however, and not the prime movers, of the system of technology. Another significant contribution of Luhmann is the recognition that autopoiesis (self-relation) can exist in the case of inorganic systems in addition to living systems. Neither life nor consciousness is a necessary factor when it comes to postulating a self-referential and self-creative ability. In Luhmann's case, society is built from


"communications" generated by inorganic function systems, not from human individuals. Each function system has its own media and binary code, with the help of which it is able to reduce the infinite complexity of reality into information. According to Reichel's thesis, the specific binary code of technology is the duality of "work/fail,” and its media is "operativeness” (Ibid, p. 109; 111).

Indeed, these do seem to be categories relevant for many, if not all, existing technologies. Of course, autonomy does not mean the complete isolation of a single functional system. Operative closure entails that there is also "interpenetration" or structural coupling between different function systems. For example, Reichel refers to tools that imitate the shape of the human hand (Ibid, p. 112).5 Biomimicry is a well-known phenomenon in the history of the evolution of technological innovation. There are many methods of incorporating natural forms into artificial constructions. One possible objection could be that Reichel's position makes no distinction between trivial machines and non-trivial machines, that is, he commits an ontological category error. It is undeniable that there is a significant difference of degree between an inert hammer and an intelligent robot. However, in systems theory self-organization is a broader category than intelligence, so in this regard the thorny philosophical problem of what exactly constitutes intelligence is bracketed. As Reichel states, "intelligence is not a characteristic of autopoiesis, nor is it necessary for it" (Ibid). In the case of technological objects, external agents give objectives to these entities, so the totality of technical assemblages cannot be considered as an endogenously formed order. Against this possible counter-argument, Reichel claims that engineers can only "irritate technology" or, more precisely, react to operations originating from within the system of technology (Ibid, p. 113). There are capabilities of the system of technology that allow us to infer some degree of autonomy, i.e., an agency surplus.6 In terms of the big picture, technology as a whole cannot be reduced to being an instrument of purely human, social or other extratechnological purposes. Even if some technical artifacts are undoubtedly trivial, the system of technology as an assemblage can be considered a non-trivial machine.
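
Reichel's proposal that technology observes the world through the binary code "work/fail" can likewise be illustrated with a small sketch. The fragment below is purely my own illustration, with invented event data and an invented Event class; it shows how a system coded in this way registers heterogeneous occurrences only as working or failing, while every other semantic dimension (legal, moral, aesthetic) remains invisible to it.

```python
# Illustrative sketch only: observing events through the binary code "work/fail".
# The event descriptions and the Event class are invented for this example.

from dataclasses import dataclass


@dataclass
class Event:
    description: str   # rich, multi-dimensional social meaning...
    performed: bool    # ...of which the technology system registers only this


def observe(event: Event) -> str:
    """The system's sole distinction: does it work or does it fail?"""
    return "work" if event.performed else "fail"


events = [
    Event("server responds within the agreed latency", True),
    Event("payment gateway rejects a valid card", False),
    Event("legally contested but functioning surveillance camera", True),
]

for e in events:
    print(observe(e), "<-", e.description)
```

The third event makes the point: its legal contestation simply does not register within the code; only operativeness does.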

Following this path, even if one denies the autopoietic nature of technology as a whole,

5 One may also use the phrase "resonance" to denote the way different function systems interact with and complicate each other's communications, often in productive ways, although Luhmann only uses this phrase in the Introduction to Systems Theory lecture series to describe the ability of a social system to sense changes in its environment and adjust its operations and structures accordingly (Clark, 2020, p. 2499). Of course, social systems also function as each other's environments.

6 While it is a different framework from that of Luhmann, Bruno Latour's Actor-Network-Theory is one sociologically plausible example of how the concept of agency may be extended to nonhuman and/or inanimate entities (Latour, 2007). Unsurprisingly, ANT has been used extensively in the study of technology. For some recent examples, see: (Spöher, 2018); (Blok, Farias and Roberts, 2020).


more moderate versions of the autonomy thesis are also conceivable. This proposal can be found in an article by Antonio Ascenio-Guillén and Julio Navío-Marco. Focusing on a privileged area of technology, the digital "cyberspace," instead of technology as a whole, the co-authors establish the predominance of anthropocentrism in conventional thinking about technology. Two basic theses are formulated in the literature in connection with technology and the Internet: the homogenization and diversification theses. While the first one interprets cultural globalization as a tendency towards the grayness of uniformity, the second one rather highlights the way in which the Internet disrupts the public sphere (Ascenio-Guillén and Navío-Marco, 2017, p. 24). Whatever we think of these ideas (nothing compels us to accept any of them as axiomatic), Ascenio-Guillén and Navío-Marco are correct in highlighting that both aforementioned positions are inherently anthropocentric, since they focus primarily on people's media consumption habits. If we were pollsters or market researchers, individual consumer decisions might have a special relevance, but subject-centered positions leave us in the dark when it comes to asking questions about the autonomy of technology. According to the authors, the same applies to the so-called "techno-optimistic" or "techno-pessimistic" scenarios. These too usually start from the perspective of human agents rather than the perspective of technology itself. Whether it is Manuel Castells's optimistic, pluralistic communicative utopia (Castells, 2003) or Jan Van Dijk's pessimistic "infocracy" (Van Dijk, 2020), the limitation of such models is that they are primarily based on human meanings, as well as the relative advantages or disadvantages affecting humanity (Ascenio-Guillén and Navío-Marco, 2017, p. 27). User-centric models of cyberspace are unable to reckon with technology as an autonomous agency. The authors identify four "epistemological obstacles" to be removed before we are able to think about society in a deanthropomorphized way: 1) society consists of concrete (human) individuals and their actions; 2) social integration exists through consensus; 3) societies form regional units; 4) societies can be observed from an external point of view (Ibid, p. 29).7 For Ascenio-Guillén and Navío-Marco, Luhmann's sociology is useful because it breaks with all four of these anthropocentric social science assumptions. For Luhmann, society consists of communications and irritations communicated by systems, social differentiation takes priority over consensus, society is global (functional systems are organized as a global society), and there is no critical point of view extraneous to society.

Ascenio-Guillén and Navío-Marco identify cyberspace as an extreme form of

7 While these points themselves originate from Luhmann, the concept of "epistemological obstacle" more broadly stems from philosopher of science Gaston Bachelard. This concept describes the sum of all implicit and explicit presuppositions hindering the emergence of a new scientific paradigm (Zwart, 2020).


technology, which indeed is beginning to become an independently organized system at the beginning of the 21st century. An autopoietic system has the ability to self-create and self-stimulate, and also connects selectively to its own environment. The matter is complicated by the fact that cyberspace - in contrast to the "field" social function systems identified by Luhmann - does not reduce complexity but, like "high technologies", increases it. According to Ascenio-Guillén and Navío-Marco, cyberspace is "a system that is as complex as its environment" (Ibid, p. 32). According to their view, cyberspace is only a partially autopoietic closed system: it is autonomous, but at the same time it also exists as an environment for society as a whole, as an actual "hypersystem" (Ibid, p. 33). In contrast to function-specific social systems, cyberspace is unhindered by any structural or spatio-temporal factors in the self-stimulation of its own complexity, allowing it to further unfold its internal complexity. Whereas the communications of other social systems are function-specific (the economy only communicates economic data, law only deals with legal cases, and so on), cyberspace in Ascenio-Guillén and Navío-Marco's view is capable of absorbing any and all forms of communication. Nowadays, there is a politically and morally motivated demand for comprehensive legal and political regulation of the Internet, but these politically motivated efforts ignore the fact that "cyberspace is not structurally connected to its social environment" (Ibid). There is no necessary structural coupling between the Internet and other sectors of society, apart from that of Internet users and social media algorithms.8 The Internet has never been a very consistent driver of economic profits (many other industries are more profitable, while Internet companies are nevertheless valued generously, with high multiples), its political controllability is doubtful, its legal status is a source of perplexity, and its relationship to the science system is often downright hostile and anti-scientific. All these factors might be conceived as lending support to Ascenio-Guillén and Navío-Marco's contention that the Internet is indeed a separate social system. While one could argue that cyberspace depends on a material infrastructure and support from other social systems, this by no means entails that its operations are controlled by these systems. Cyberspace is simultaneously dependent on environmental support and operationally independent. This view is in line with Luhmannian theory (we refer here to the "paradox of simultaneous total dependence and total independence",

8 This does not presuppose any distinction between human-owned accounts and fake accounts ("bots"). Some users are nonetheless necessary for a social media platform to operate. Hence we can speak of a structural coupling between either organic or artificial interaction systems (i.e., users) and social media as a technological social system: "what distinguishes the social networks from the other social systems is the structural coupling with systems of interaction: in other words the conversational exchanges of the users, generated by contents posted and shared" (Artieri and Gemini, 2019, p. 10).


see: Luhmann, 2013 [2002], p. 201). Because of its radical autonomy, cyberspace is supposedly also more complex than society considered as a whole. Digital reality does not need to select among components; it can accept everything into itself, turning any and all content into binary coded information. In the case of cyberspace, the most concentrated level of circularity pertains to the area of computer data storage networks acting as its hardware substrate. Such hardware needs to be relatively isolated from its environment to remain functional: servers cannot operate in dusty or muddy conditions, for example. In contrast to cyberspace as a whole, its infrastructure is approaching the ideal of an operationally closed autopoietic system (Ibid, p. 36).9 It is noteworthy that Ascenio-Guillén and Navío-Marco do not identify any binary code or program along the lines of which cyberspace could select what is "uploaded." As they write, "cyberspace does not 'select,' but 'absorbs' the irritations and signs, from which it creates new signs, new complexity" (Ibid, p. 34). On this reading, cyberspace would be the new ecology, an artificial environment functioning as the shell of society through which all connections are made.

Elena Esposito, building on Luhmann's insights, has argued for viewing human interactions with algorithms as real communication. In relation to Web 2.0 and the optimization of search engines, we are facing "an unprecedented condition in which machines parasitically take advantage of the user participation on the web to develop their own ability to communicate competently and informatively" (Esposito, 2017, p. 251). At first blush, it would be all too simple to dismiss social media bots or AI algorithms as being unresponsive, unreal communication partners or, worse, passively mechanical tools. Analogously to the way in which Luhmann takes account of high technology as being irreducible to human intentions, Esposito argues for viewing algorithms as autonomous. Referencing both search engines and significantly more complex programs such as Google's AlphaGo, which famously beat world champion Go player Lee Sedol in a widely publicized 2016 event, Esposito claims that such technologies point toward "a kind of Artificial Communication which provides our society with unforeseen and unpredictable information" despite the absence of consciousness on the part of these algorithms (Esposito, 2017, p. 253). Esposito's broader point is that communication does not necessitate the imputation of intelligence on the part of the sender, an insight extracted, albeit rather arbitrarily, from Luhmann's communicative model.10 Anything generative of a

9 While not producing nearly enough to power themselves and still reliant on external electricity sources, the "Big Five" American internet conglomerates (Google, Amazon, Facebook, and Microsoft) all own electricity production units. Therefore, one cannot exclude the possibility that these entities could indeed become completely self-reliant and self-referential in terms of their power supply (Bryce, 2020).

10 Esposito's emphasis on Luhmann's use of a file-card system, the Zettelkasten, for his research, while an amusing anecdote, does not shed much light on Luhmann's views on technology and communication (Esposito, 2017, p. 256). The suspicion of superficiality is heightened by Esposito's recent monograph, Artificial Communication. Ostensibly, this work is dedicated to elaborating a partly Luhmannian-influenced philosophy of technological autonomy, yet the book utterly fails to mention Luhmann's central concept, social autopoiesis (Esposito, 2022).


difference can function as a communicative event: even a non-thinking "sender" can emit meanings. Consciousness certainly needs communication to achieve intersubjective goals and construct shared meanings, but for Luhmann the reverse does not hold: "for consciousness, even communication can only be conducted consciously and is invested in further possible consciousness. But for communication itself this is not so. Communication is only possible as an event that transcends the closure of consciousness" (Luhmann, 1995 [1984], p. 99). Any difference amenable to categorization matters. Deep learning algorithms work effectively not because they involve any sort of intelligence (being non-thinking machines, they lack anything like human consciousness), but precisely because they repeat recursive loops without thinking. In the case of a trivial machine, indeterminacy is not generated. Smart algorithms, at least of the AlphaGo variety, however, allow the human user to experience a genuinely surprising event.11 By utilizing existing moves in the Go game, AlphaGo was able to produce an eerie experience. Users of smart algorithms "face a contingency that is not their own", encountering "information from a different perspective" (Esposito, 2017, p. 258). We may agree with Esposito's verdict: at the very least, one can say that such deep learning algorithms empirically demonstrate Luhmann's idea of the separation of consciousness from communication, as well as the non-trivial nature of these specific technologies. Function takes precedence over such vague metaphysical notions as consciousness: "the algorithm does not become more informed or more intelligent; it just learns to work better", increasing the overall complexity of social communication in often unexpected ways (Ibid, p. 262).
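
Esposito's formula that the algorithm "just learns to work better" without becoming more intelligent can be illustrated with a toy sketch. The following fragment is my own minimal example, not Esposito's or AlphaGo's method: a blind trial-and-error loop improves its output against a hidden criterion purely by keeping whatever "works", with no representation of what the task means.

```python
# Illustrative sketch only: improvement without understanding.
# The target, step size and schedule are arbitrary values chosen for the example.

import random

random.seed(0)
target = 42.0   # the environment's hidden criterion of "working"
guess = 0.0
step = 8.0

for _ in range(200):
    candidate = guess + random.uniform(-step, step)
    # the only feedback is a work/fail-like comparison: is the new state better?
    if abs(candidate - target) < abs(guess - target):
        guess = candidate   # keep whatever works better
    step *= 0.99            # gradually narrow the search

print(round(guess, 2))      # converges near 42, yet nothing here "knows" what 42 means
```

The loop never becomes "informed" about its task; it only accumulates adjustments that happened to work, which is precisely the sense in which such procedures remain non-trivial without being intelligent.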

However, Esposito's emphasis on the supposed surprise-element involved in technology demands some qualification. One can only be surprised while witnessing the levels of agency technologies demonstrate if one had previously accepted an instrumental view of technology as the fundamental common-sense position from the outset. Technological autonomy as a concept has been popular since at least the 1970s, and dates back to fears prevalent in the 1930s of a "machine civilization" substantially distorting, or even taking control over, human destiny (Winner, 1978; Mumford, 1934; Spengler, 1932). From total domination to quasi-independence from humans to a very limited degree of agency, technological autonomy comes in many flavors. For instance, a moderate autonomy thesis would hold that humans can exist in the environment

11 Luhmann summarizes the concept of "double contingency" with disarming simplicity: "do what you want if you do what I want" (Luhmann, 1995 [1984], p. 166). We may also call this phenomenon "mutually responsive adjustment of plans in real-time."


of technology without necessarily being its components. If technology is an assemblage, the elements of the assemblage can still retain their mutual operational autonomy. The systems of science, education, healthcare, etc., can be elements of a large-scale technological assemblage without sacrificing their respective operational or systemic autonomy. Human psychological or interaction systems can also possibly retain operational autonomy or autopoietic closure even in the face of technological embeddedness. The technology system on this view irritates nature, social systems, organizations, psychic systems, interaction systems, and biological systems. These systems in the environment of technology then respond according to their own codes. While social systems, psychic systems, etc., cannot ignore technology's irritations, they can still respond in their own ways. But what if technology has actually become sovereign? A stronger possibility presents itself, namely technological hegemony, the dominance of technology as a social system over other social systems. On this view, the technological system is so powerful as to eclipse other forms of agency. The autonomy of technology becomes clearly conceivable when we proceed from its "high-tech" manifestations. As we have seen, nuclear energy or cyberspace can be examples of uncontrollable or even autonomous technologies. But what about those cases where complete uncontrollability prevails? What can chaotic examples tell us? Similarly to the authors discussed above, Eliana Herrera-Vega's point of departure is also the break with anthropocentric concepts of technology. Technology is "comprehensive", and theories that continue to treat technological systems exclusively in "intentionalist" or "instrumentalist" ways are unable to really understand the implications of autonomy (Herrera-Vega, 2014, pp. 25-6). According to the predominant paradigm, only human agency, and especially human intention in particular, matters. The situation is not much better in the case of representatives of "social constructivism" who deny that technology has any essence, substance or autonomy that can be separated from other areas of society. Social constructivists consider technology to be basically the product of social groups or institutions (Ibid, p. 29).12 According to Herrera-Vega, however useful such approaches may be when analysing human uses of technological objects, they are not worth much when it comes to grasping the essential nature of technology itself. "Inter-systemic technological communication" is a real possibility, and the

12 For social constructionists, human decisions determine and frame technology, on both an individual and a collective level of description (Bijker, 2009). In their defense, social constructivists would claim that while the meaning (or our understanding) of technology is socially constructed, this does not necessarily have to be intentioned. Unintentional social constructs are certainly possible, and probably most of the important social institutions have emerged through processes of spontaneous evolution (Elder-Vass, 2010). Social constructionism is not the same as instrumentalism, but it shares with the latter a dismissal of technological autonomy. Similarly, those "functionalist" explanations that only identify technological functions with their usefulness to human users or human societies are also unsatisfactory (McLaughlin, 2001). For Luhmann, functions are self-referential, and are not describable by reference to a whole, especially not by reference to human institutions, intentions or uses.


author finds the Luhmannian social theory framework extremely useful for interpreting this phenomenon (Ibid, p. 30). According to Herrera-Vega, it is correct to say that technology has acquired a certain "self-directing" and "self-referential" character. There is more in play here than "alienation" in the Marxist sense. What may seem like alienation from a human point of view is, from the perspective of the emerging technological hyper-system, the unfolding of its autonomy (Ibid, p. 32). According to the so-called "rupture thesis," technological development has already exceeded the level that would allow for deliberate human planning or control. The system of technology has acquired such a degree of complexity that it is impossible to subordinate it to any political, economic, or subjective objectives. Similarly to Reichel and Luhmann, Herrera-Vega draws attention to the complexity-increasing nature of technology. While traditional function systems reduce complexity, the system of technology does not demonstrate this ability. Quite the reverse holds. In connection with the 2010 oil drilling accident in the Gulf of Mexico, the author considers administrative/bureaucratic apparatuses, interacting organizational forms, the oil rig itself as an artefact, and laws regulating oil drilling as being components of a single technological assemblage (Ibid, p. 35). As in Luhmannian systems theory generally, the phenomenon of interpenetration comes to the fore in this example as well. The technological accident, registered primarily as an ecological disaster, was nonetheless first coded as an economic catastrophe, then a political scandal, and finally a legal case, yet it is irreducible to any of these registers. The Deepwater Horizon accident is a sovereign13 technological event, and the authorities and companies involved were only able to "solve" the situation through massive deployments of technology. Human agents, or even governments, are not capable of definitive solutions these days. Using the words of Herrera-Vega, "subhuman" commands now appear as "autonomous, independent capacities" in social life (Ibid, p. 39). Luhmann can be viewed as providing an ideal model for grasping the self-referentiality of technology. According to Herrera-Vega, Jacques Ellul's thesis of technological autonomy is also highly relevant (Ibid, p. 37). Hence I propose that we may productively supplement the emerging Luhmannian research paradigm of technology as an autopoietic system with Ellul's insights regarding the autonomy of technology.

13 Under this term, I understand the classical idea of one system determining or overcoding the movements, changes and tendencies of another system. A. W. Pink's definition of sovereignty, albeit in a theological context, works well when applied to the topic of technological sovereignty, or "strong technological autonomy": "To say that God is sovereign is to declare that He is the Almighty, the Possessor of all power in heaven and earth, so that none can defeat His counsels, thwart His purpose, or resist His will" (Pink, 1949, p. 14). To paraphrase, for the strong autonomy thesis, the technological system possesses all power, its "counsels" cannot be ignored, and its "purpose" (or internal teleology) cannot be thwarted without negative social consequences, while its agency is "almighty."


The System of Technology

In the following, I attempt to conceptualize technology as a self-organizing system which is sovereign over other systems of society. In the second half of this essay, I propose to fill the empty space of the "missing" Luhmannian theory of technology with Jacques Ellul's philosophy of technology. The latter worked with a systems theory background. The explosive growth of teleworking can be more broadly related to the process that, following Ellul, we characterize as the development of the "technological society" and "technological system." The human component is undergoing a comprehensive reconditioning to achieve closer integration into the technological infrastructure. There are increasing signs that technology can maintain its operational closure. The rise of remote work means that, at least in certain sectors, permanent connection and spatial separation are simultaneously viable. This technological development can be interpreted as a correlate of ever closer integration into the technological system, which separates itself from its environment to an unprecedented degree. Differently put, intersubjective separation can actually function as a new form of closer interpenetration between systems, a more intimate connection with technology as an ahuman force. We read the two tendencies – intersubjective separation or, to borrow a Marxist terminology, "alienation", and technological integration – as interrelated phenomena.

According to Ellul, recognizing the autonomy of technology is essential for understanding contemporary social processes. At every level, technology gradually isolates and immunizes itself from other sectors of society, becoming a self-referential and self-programming system. In the late stages of his career, Ellul consciously applied concepts from cybernetics and systems theories to describe the phenomenon of technological autonomy. From technique, we reach an elaboration of technology as both an autonomous and sovereign system (Cérézuelle, 2018). Ellul held that technology, while building up its own recursivity, also subordinates other areas of society to itself. This is a strong view of autonomy. Be it culture, politics, economics, human existence, or nature, in Ellul's pessimistic vision nothing is immune to integration into self-operating technology. The segregation of elements from each other is a prerequisite for recoding them as technological components. Everything unsuitable for integration into the system of technology becomes discarded. I would like to focus on two of Ellul's main works in the remainder of this paper: La technique ou l'enjeu du siècle (The Technological Society) and Le système technicien (The Technological System). While the moderate autonomy thesis holds that the technological system is autonomous but not sovereign, Ellul goes beyond this and argues for radical autonomy coupled with deterministic capacity. Ellul frames


technology as a holistic system that is both autonomous and sovereign. Furthermore, the systemic closure of technology and its interrelationship with human alienation is also important. An obvious question which presents itself from the outset is: why autonomy at all? To quote Joseph C. Pitt, a skeptical reader of Ellul and an opponent of any concept of technological autonomy, "(1) If technology is a product, and (2) unless we add some additional properties to technology beyond its being a thing we manipulate, then (3) there is no reason why we should even begin to think of technology as not within our control" (Pitt, 1987, p. 112). The burden of proof is on the proponent of the autonomy thesis to demonstrate what characteristic the internal relations of the technological system contain which is irreducible to instrumentalization. In an interview, Ellul explains his distance from Marxist critical theory by saying that he undertakes to continue the tradition of social criticism, with a new focus: "I was sure that if Marx had lived in 1940, he would not have studied economics or capitalist structures, but technology. So I began to research technology using the same method that Marx used to investigate capital a century earlier" (Ellul, 1981, p. 155). The dominant social fact of the 20th century is the purported inevitability of technological domination.14 Ellul's work has achieved widespread influence, especially in the field of philosophy of technology. To understand the concept of technological society, it is worth investigating the way Ellul uses the concept of "technique." In the broadest sense, the "technical phenomenon" refers to "the search for greater efficiency": all utilitarian forms of action that are assumed to contribute to the achievement of well-defined practical goals (Ellul, 1964 [1954], p. 20). Technology is a utilitarian mode of action, a coherent effort to maximize efficiency. Once identified as the most effective course of action, technique tends to become the exclusive course of action. Who doesn't want to finish their daily tasks as quickly as possible? Who among us would choose the longer route to an urgent business meeting? Modern social life, through a series of automatic adaptations, excludes inefficient, wasteful activities step by step. As a result, "today no human activity escapes this technical imperative" (Ibid, p. 22).

Based on the above considerations, technology obviously cannot be reduced to the machine. One could claim that technology lacks agency altogether. To quote Pitt, "the problem is man. The tools by themselves do nothing" (Pitt, 1987, p. 113). Ellul rejects such received wisdom. The imperialism of the efficiency principle includes economic, organizational and "human" techniques. Under the latter ("human techniques"), Ellul understands all forms of

14 Of course, one can accuse Ellul – and rightly so – of oversimplifying things. If society is truly complex, then technological domination is not inevitable. We cannot say that anything is inevitable. This applies to discourses relating to the supposed „domination” of society by „capital” too.


psychological and biomedical interventions that condition humanity to living in the emerging "technological milieu." Technology, defined as the dominance of the efficiency principle in social life, also entails a fundamentally new environment compared to previous ecologies. The natural world is doomed to be replaced by technology. Somewhat fatalistically, Ellul notes that "we are rapidly moving toward the time when there will be no longer any natural environment at all" (Ibid, p. 79). In itself, this futurological speculation does not seem particularly radical, considering the current state of our planet. What makes the idea of technology as an autonomous environment noteworthy is that the elimination of other environments by technology does not end at the "external," natural dimension. The "social milieu" is not immune to invasion, colonization and integration by technology either. Referring to the enormous uprooting of rural populations in 19th-century Europe, Ellul emphasizes the vested interest of technology in the transformation of society. Technology creates for itself "the greatest possible plasticity" in society by eliminating "all natural groups" (Ibid, p. 51). The more widespread non-familial loose ties become, the greater the social mobility and levelling, greatly facilitating the influx of unemployed labor into cities.

While from a 21st-century perspective the term "natural social group" seems disconcertingly essentialist, Ellul thinks of technology as a third environment (besides natural and human environments).15 In a later work, Ellul describes the modern condition as a "technological environment" that transcends and eradicates previous natural and social ecologies (Ellul, 1989, p. 134). This applies to human culture too. As Ellul notes elsewhere, "each culture is made obsolete" by the triumphant onslaught of technology (Ellul, 1990 [1988], p. 144). As Kevin Garrison points out, the dominance of the technological milieu subordinates all social issues to increasing efficiency, which "as an end game, can never satisfy itself" (Garrison, 2012, p. 62). In contrast to conceptions of technology that conceptualize techniques instrumentally, Ellul's philosophy of technology can be characterized as "substantivist." However, Garrison's definition of substantivism as "the idea that technology controls people" is overly simplistic (Ibid, p. 58). What is important is the identification of an independent technological essence, one that can be separated from human presence. Regarding the technical phenomenon, Ellul firmly states that, while it is (also) made of human actions, technique cannot be subordinated to human goals or

15 In this regard, Ellul follows the lead of Gilbert Simondon, who associated technology as an artificial mode of being with the advent of a new "milieu." Indeed, Ellul cites Simondon frequently in later works. Simondon was one of the first Continental thinkers, alongside Martin Heidegger, to ascribe technology an autonomous essence, its own mode of being contrasted with the human subject (Simondon, 2016 [1958]; Heidegger, 1977 [1964]).


intentions. Andrew Feenberg, in a now classic monograph, defines the substantivist view in the following manner: "According to substantivism, modernity is also an epistemological event that discloses the hidden secret of the essence of technology. And what was hidden? Rationality itself, the pure drive for efficiency, for increasing control and calculability. This process unfolds autonomously once technology is released from the restraints that surround it in premodern societies" (Feenberg, 1999, p. 3). This summary is useful for our purposes, although Feenberg later connects substantivism with pessimism about technology in a misleading manner. Substantivism, according to Feenberg, would also imply that "technology is inherently biased toward domination. Far from correcting its flaws, further advance can only make things worse" (Ibid). For our present purposes, it is more than sufficient to identify substantivism with the autonomy of technology vis-à-vis human goals and desires, without attributing any negative or positive value to this. Processes are always part of broader evolutionary tendencies. In summary, according to substantivism, "means and ends" become inseparable from each other, the means transforming into ends in themselves (Ibid, p. 9). Instead of the subject, techniques hitherto treated as tools now become the final determining factor, overriding all other perspectives. In what sense can technology as a whole be considered "autonomous"?

A key area where Ellul identifies autonomy in play is the inability of moral codes to restrict the deployment and growth of technology. According to Ellul, it is illusory to suppose that technology can be used in a humanistic or moral way. All conceivable uses will be introduced sooner or later, while the abandonment of an immoral technology can be difficult.16 It is misconceived to assume that technology is value-neutral, because its "use is inseparable from being" (Ellul, 1964 [1954], p. 95). Technology forces its own use. Morality is alien to technology: the creation of an invention necessarily entails its utilization, provided that it can be incorporated into a previously developed assemblage.17 According to Ellul, those who are enthusiastic about the peaceful uses of nuclear energy while opposing nuclear weapons are

16 As a counterpoint to Ellul’s rather one-sided pessimism, one could point to international bans on the use of specific weapons technologies, such as the 1997 Ottawa Convention prohibiting anti-personnel landmines and the 1925 Geneva Protocol banning poison gas in warfare. A remarkably small number of outliers aside, such agreements have generally been respected, at least thus far, eliminating certain weapons from active battlefield use.

17 Naturally, if we employ a systems theory perspective, then technology is not exceptional in this regard. In modern, functionally differentiated societies morality is also alien to the economy, law, science, religion, art, mass media, and the other function systems. Morality is promoted mostly by organizations and interaction systems. If morality were predominant, the functional areas of society would be unable to follow their own codes and would become swiftly inoperative, weighted down by values alien to their logic. Vladislav Valentinov has argued for an interesting form of "Luhmannian" ethics that would ground its normativity upon the imperative to sustain the functioning of society: "the normative relevance of systemic sustainability does not rest on the desires and needs of human beings", but rather upon the need to maintain systemic functionality (Valentinov, 2019, p. 110).


making a mistake and failing to acknowledge the true nature of the technological phenomenon: "technology never observes the distinction between moral and immoral use" (Ibid, p. 97). When it comes to rational efficiency, all other normative considerations are bracketed. In a Luhmannian semantic framework, we could say that technique is only capable of observing the "works/does not work" binary distinction. The primary characteristic of autonomy is therefore the separation of technology from moral requirements. True autonomy implies separation from every normative standpoint and value.18

But Ellul does not stop here. Since it manages to frame itself as the sole effective way of acting, the choice always falls upon technical solutions in modern society. Only those patterns of action believed to be the most effective are given priority. Moral or immoral consequences are of merely secondary importance as compared to implementability. Technology itself is use, and the only valid, overriding imperative is the use of every invention (Ellul, 1964 [1954], p. 99). However, the substantive content of technological autonomy is not limited to amorality. Contrary to the economic and power-based reductivism of Marxists, for which technology is merely a means of consolidating class power, according to Ellul technique is "autonomous with respect to economics and politics" too, and is even "independent of the social situation," being "the prime mover of all the rest" (Ibid, p. 133). A functional boundary separates technology from other areas of society. Far from society shaping the ways in which technology is used, society itself is shaped, transformed and conditioned by the latter. The process of technicization also implies that human components of society are not resistant to integration into the system of technology. Ellul envisions the complete integration of humans and technology, during which "it will be inadmissible for any part of the individual not to be integrated in the drive toward technicization" (Ibid, p. 139). Indeed, those wary of subordinating their bodies to beneficial technological interventions are widely viewed as irrational, or even dangerously irresponsible. Even so-called leisure activities are coming under the control of technology. Half a century before the invention of Internet social media, Ellul registers the merging of leisure activity into the technical whole: "all recreation is saturated by technical mechanisms of compensation and integration. Doing nothing no longer means an empty domain

18 In this respect, Wha-Chul Son's well-intentioned idea, according to which the effectiveness principle could be somehow mitigated in society by supposedly "human-centered" forms of technology, seems extremely naive. According to Son's view, "purpose-driven technological innovation (...) revives humanity's initiative in the realm of technology" (Son, 2012, p. 60). This position does not take seriously the radically inhuman otherness of technology as a phenomenon. Compared to human values and goals, technology represents something utterly different. The merit of Ellul's technological philosophy lies precisely in the fact that it makes the autonomy of technology visible, albeit in a pessimistic and unacceptably humanistic discourse. Despite his humanism, Ellul recognizes that it is impossible to domesticate or dominate technology, for the latter moves on its own evolutionary trajectory.


(...) Free time is also mechanized time and is exploited by techniques that, although different from the forms experienced during working time, are just as invasive, burdensome and leave as little real space for freedom as the work" (Ibid, p. 401). Critical theories regarding the use of social media and its monetization in the form of unpaid, gamified "play labor/playbour" were anticipated by Ellul, when he dismissed the misconception that automation would increase the amount and quality of free time (Fuchs and Sevignani, 2013). In fact, the growth of "free time" is achieved at the cost of complete psychological and functional oversaturation of the human element by technological factors, and the transformation of play into goal-oriented work that produces data for the technological complex.

Despite the apparent similarity of their diagnoses, Ellul differs from Marxism in that he describes the autonomy of technology in a much more radical way. Technology is not a product of the socioeconomic milieu. The exact opposite is true: technology creates a "plastic" environment for itself, perfectly suitable for its own proliferation. Despite the fact that certain social groups, such as top managers, bureaucrats and data engineers, occupy a privileged position in "automatic cooperation,” this does not imply that any human elite can control or even supervise the entire process (Ellul, 1964 [1954], p. 137). Drawing on the language of cybernetics, Ellul characterizes the technological society as a "feedback process" leading to the „progressive elimination of man from the circuit” (Ibid, p. 136). Human components themselves become interchangeable, and human freedom becomes diminished. Technology has "intrinsic finality" and is independent of external ends (Ibid, p. 141). All discourses that trace technology back to social or economic factors are fatally misguided. The phenomenon of technology underlies social, cultural, economic and political factors today. While technique contributes to the centralization of state power, the accumulation of capital, the psychological disorganization of the masses (media consumption, chemical mind-altering) and the erosion of civil liberties, in Ellul's view these are mere symptoms of the sovereignty exercised by technology over modern society. It may seem that Ellul has a decidedly fatalistic attitude towards technology as a factor in history. No matter how humanitarian its orientation, no social arrangement can prevent the development of technology. Whatever we believe about fatalism as a philosophical concept, the description that Ellul provides of technological autonomy is indeed relevant. If we seek an adequate description of technology, we must consider it a self-organizing system. Explanations that devalue autonomy are of limited usefulness. The instrumentalist conception of technology deals with everything but the essence. As we have seen, technology for Ellul is a system which „incorporates processes, rules (or codes), and institutions as well. Technique is a totalizing system of methods that has become the new environment, or milieu, for all of human existence” (Hanks and Hanks, 2015, p. 461).

In his later volume, The Technological System, Ellul draws more explicitly on systems theory concepts in order to deepen his own analysis. Technology is a system, even a separate universe, which is both "exclusive and total” (Ellul, 1980 [1977], p. 35). Technology is more than just an infrastructure or one mode of communication among many: as the sole mediator now recognized, it actually "escapes any system of values” (Ibid, p. 36). Unbounded technicization leads to the replacement and recoding of all previously existing relationships. In an incisive essay, Yuk Hui characterizes the process of technicization as "desymbolization." The technical phenomenon is „a vital force within the objects themselves that constitutes their progress from one stage to another,” while excluding all elements too complex or chaotic for technical integration (Hui, 2012, p. 74). Replacing traditional values, which are usually completely alien to the efficiency principle, desymbolization goes hand in hand with de-signification. According to Hui, “desymbolization” describes a “process of short-circuiting” that undoes traditional values and ways of life while bringing forth „an efficient and automatic technological system” (Ibid, p. 75). Ellul himself emphasizes, not without nostalgia, the loss of the "ability to symbolize" on the part of people brought up in modernity, a result of the development of technology. Meaning becomes artificial, only accessible through technical mediations. Desymbolization transforms everything into something comparable and interchangeable, and as a result "modern man is torn apart”19 (Ellul, 1980 [1977], p. 40). In Ellul’s view, even seemingly creative activities such as "design" or architecture bear the traces of the technological invasion that pervades society (Ellul, 1980 [1977], p. 170). It is no exaggeration to say that design is a de-signation, a methodical removal of extra-technological meanings that cannot be processed by technical means. Noise-producing and disturbing factors must be removed from the system; functional stability demands this of technicians and officials. Problematically, everything is slowly becoming endogenous to the third, technological environment. The system of technology undertakes to incorporate all elements: integrable moments will have a place in universal interconnectivity. After all, what else would be the purpose of the data sciences, other than the construction of „an emerging digital milieu and a concretizing technological system, in which different entities can be digitized and thus connected by data links”? (Hui, 2012, p. 78).

19 Before dismissing this statement as too bombastic, it is worth pointing out that Luhmann talks about exactly the same thing when describing the modern subject as an assemblage ("psychic system") situated at the intersection of different functional systems, breaking down into separate, specialized functions. We could also mention social psychologist Erving Goffman's „dramaturgical theory,” according to which the modern subject falls apart into a multitude of institutional roles (Goffman, 1959).


The predominant social situation of our time, allegedly largely due to the COVID-19 epidemic, has exiled a significant proportion of the human workforce to the digital space. Digital modes of being are becoming the rule, not an exception or a luxury. This is the accelerated, drastic manifestation of a process that was already taking place at the time of Ellul's writing: the merging of the natural and social milieus into the third, technological milieu. With acutely disturbing foresight, Ellul observes that „the mediating technological system becomes the universal mediator, excluding any other mediation but its own. That is the highest degree of its autonomy” (Ellul, 1980 [1977], p. 38).

In-person contact is replaced by digital communication, be it remote work or dating apps. Intersubjectivity becomes ever less possible outside of the telecommunications complex. We are technicized subjects (even the writer of these very lines!), while living digitally is orders of magnitude more convenient than any alternative. The luxurious isolation and claustrophilia of our time dramatically underline Ellul's point; in modern society "technology is the living environment” (Ibid, p. 39). Outside of technologically mediated intimacy, the 21st century subject is endangered by a devastated, uninhabitable and crowded urban environment. Technology is the only ecology in which we can live. It would be absurdly naïve to assume that we can reverse this overnight, through some Rousseauian return to nature. Escape into the wilderness is impossible, because the phenomenon of technology is present everywhere, re-editing reality and the perception of reality according to its own systemic needs. Technology is a connectivity that affects and integrates everything. To separate ourselves from this system is to risk permanent social disconnection (Gertz, 2016).20 Any direct, immediate, unmediated connectivity is impossible: technology has carried out a kind of phenomenological reduction, reducing users to blocks of data, synthetic aggregates of predictable behavior, monetizable data products.

Drawing on systems theory and on insights from Gilbert Simondon’s philosophy, Ellul tries to prove in The Technological System that technology has become a closed, self-referential system. In order to achieve operational closure, the previous milieus must be integrated into the technological one. The technical milieu „must be regarded as an ’organism’ tending toward closure and self-determination: it is an end in itself. Autonomy is the very condition of technological development” (Ibid, p. 125). Of course, the extent to which technology can really be considered a truly full-fledged functional system is debatable.

20 One cannot live effectively without using social media, but the latter can operate without real human engagement or users apart from bots, although the economic value of such platforms is still to a large extent determined along the lines of how many supposedly „human” eyeballs are reached.


Luhmann, for example, was skeptical that technology had actually reached the level of organization of an autonomous social system by the 1990s. However, Luhmann also admitted that there is a tendency towards closure in the technical dimension: "We see the problem of technology in the isolation of its operations from an interfering charge of meaning, in, one could say, its unirritability" (Luhmann, 2013 [1997], p. 243).21 Whatever we think about technological autonomy as a concept,22 according to Ellul it is the tendency that is paramount. Hui puts it appropriately when he states, in relation to the phenomenon of technology, that the latter is inseparable from its own "progression” (Hui, 2012, p. 74). Differently put, in the case of technology, system is process. If we think of the autonomy of technology as a dynamic process and not a fixed state, then Ellul can be exonerated from the accusation that his description is too abstract, simplistic or totalizing. Instead of reading Ellul's philosophy of technology as a description of the current state of technology, it is more fruitful to interpret this view as a representation of the evolutionary tendencies inherent in technology as a third milieu, regardless of the precise extent of their realization.

Our task is not to define exactly how self-referential the system of technology is at a given moment in time. Rather, of greater importance for us is awareness of the existence of a large-scale process tending towards systemic closure. As Hanks and Hanks remind us, „technology becomes autonomous when it ceases to be primarily a means or a tool and begins to become an end or a goal, and as it tends toward self-perpetuation and intractability” (Hanks and Hanks, 2015, p. 464). The third milieu seeks to replace all natural entities. To take a very relevant example, animals produced in industrial agriculture are also considered a source of disease. Industrial-scale animal husbandry is, ironically, too natural and spontaneous from a technocratic perspective, because it can cause uncontrollable and unpredictable pandemics. Despite the micromanaged environment, nature remains stubbornly full of uncertainty. For humanitarian and health reasons, some advocate a radical reduction in the consumption of meat products of animal origin and a switch to plant-based or lab-grown ersatz "meat products.” This example also supports the observation that the system of technology/science23 can only propose solutions for dysfunctions arising from earlier technologies (industrial animal husbandry) via the mobilization of newer technologies (artificial meat).

21 Luhmann here is referring to the same phenomenon as Ellul and Hui, namely the self-referentiality of technology. The autonomy of technology from human meanings is the very essence of the desymbolization process.

22 David Roden holds that Ellul’s holism gets in the way of a realistic portrayal of technology. As Roden writes, „modern technology is not in control of anything, unlike the Molochs of Heidegger and Ellul. On the other hand, there are good reasons for thinking that it is out of control, since its very flexibility means that techniques can be iterated in ways that deform or alter its systemic properties” (Roden, 2022, p. 8).

23 These two systems are, for Ellul, one and the same. From a Luhmannian perspective, however, science (as distinct from technology) is already a separate function system, which operates along the lines of the „true/false” binary (Baraldi and Corsi, 2017; Leydesdorff, 2001, pp. 180-211; Taschwer, 1996).


We should not look for ideology or a mental attitude behind this: in fact, it can be attributed to systemic structural inertia. The technology system is path-dependent: all problems can only be addressed when translated into the system's own language. External questions cannot be posed; the technology system can only respond to technical, i.e. endogenously generated, questions.

As soon as a dysfunction occurs, the technical solution is implemented. Since the system is unable to categorize anything different from its own coding, every problem is recorded as a technical insufficiency, a temporary disturbance (Ellul, 1980 [1977], p. 49). According to Ellul, it is not sufficient to assume that technology has freed itself from dependence on other social systems. It has actually become the ecology of every other system, the environment of environments. No existing political or social system can resist the efficiency principle.24 Ellul also draws our attention to specific cases where political interests attempted to override technological requirements. In general, it can be said that the consequences of the direct intervention of politicians are "always catastrophic,” and the result is a further delegitimization of the political sphere, in addition to the relative appreciation of technical knowledge (Ibid, p. 130). Neither the market nor other types of social arrangements can resist the allure of efficiency, which overrides all other considerations. The accusation of inefficiency is crucial to the delegitimation of social forms. No matter how morally relevant or politically useful an ideology may be, it seems impossible today to argue for a social model that is inefficient. If all other solutions fail, one must be able to prove that dysfunctions are actually useful functions. Pragmatism is the norm, the undisputed meta-value, and plausible social alternatives must prove their usefulness through compliance with the efficiency imperative.25

Ellul was skeptical that governments would ever be able to regulate technology. In itself, regulation can only work by promoting the further development of the system of technology: today it is unthinkable to talk about governance without the mobilization of technical mediations.

24 As opposed to Ellul, one could argue that a whole range of functional systems are inefficient, often demonstrably so. The art system does not need to be efficient. We can also argue that the education system is often quite inefficient, and even actively resists the irritations of technology's efficiency principle. The healthcare system seems to resist or subvert calls for greater efficiency. The political and the legal systems can also be quite inefficient. However, Ellul would respond that even if there are pockets of society where other perspectives predominate, their codes or values are still subordinated to technological imperatives. After all, art must be valuable, healthcare must deliver „positive” health outcomes (a technically measurable value), the law must produce legal decisions, and politics has to resolve disputes. Efficiency, Ellul would argue, has a way of insinuating itself into the inner workings of all areas of society.

25 Inefficiency and dysfunction are treated by Ellul as being synonymous, but they are not necessarily so. A system or process may be inefficient but still functional, just as it may be dysfunctional but efficient. For example, the QWERTY keyboard layout is inefficient but not dysfunctional: it was not designed with the primary goal of maximizing typing speed or efficiency.


Be it the magical „whitening” of the economy through digitalization or mobile apps tracking the movements of infected individuals, the effective implementation of government regulations depends on integrated technological solutions. If the state attempts to implement "purely political decisions" without taking into account the interests of the technical milieu, "this always causes disasters at the technological level” (Ellul, 1980 [1977], p. 137). Of course, it does not follow that the self-regulation of technology is free from blunders, as exemplified by such purely technological accidents as Chernobyl, Bhopal, the oil rig explosion in the Gulf of Mexico or Fukushima, not to mention longer-term and slower-paced complex ecological apocalypses, such as car-centric American city planning implemented along purely technological interests. Only at the cost of considerable civilizational regression would government be able to regain meaningful autonomy from technology. Shutting down nuclear power plants or destroying America's highways would be political suicide. According to Ellul, there is an asymmetry between the technological system and other areas of society. While other social functions are forced to adapt to the evolution of technique, the reverse is never the case.

The obvious inability of the political sphere to meaningfully regulate or limit technology demonstrates the autonomy of the third milieu. Technology „has (...) become not only the determining fact but also the "enveloping element," inside which our society develops” (Ibid, p. 161). Earth has been integrated into the planetary system of technology. In a totalizing way, Ellul asserts that technology as a mode of being „does away with limits,” is "potentially limitless" and strives for endless growth (Ibid, p. 154). Moreover, if we accept Ellul's argument, technology is equivalent to the environment, at least as far as Planet Earth is concerned. No functional system can work outside of technology. Although remnants of the natural ecosystem already showed signs of crisis at the time Ellul's volume was written, technology always finds ways to circumvent its limits. We must be very skeptical regarding the possibilities and realities of regulation along political lines.26 The system of technology is self-regulating, in that it fosters the political sensibilities and ideologies that facilitate its self-organization. Otherwise, inefficiency and dysfunction will occur.


26 Ellul, in a relatively late volume, The Technological Bluff, refers to Luhmann’s ideas on politics as a self-regulating functional system: „For Luhmann it is a peripheral and provincial idea to think that individuals can influence the state in areas which are beyond their competence and in which the process is autonomous and contingent. Luhmann rightly refers to the growing autonomy of the apparatus of government. Habermas talks about the dependence of the government on the interests of the better-organized groups, the weightiest of these being the technicians and scientists. Thus the combination of political administration and the technostructure results in the total elimination of individuals. But the character of the combination and of the power of the state is strange, for the state less and less directs the economy. Its planning is for itself alone” (Ellul, 1990 [1988], p. 303). Ellul’s point is that politics, being preoccupied with its own binary code, is no longer in a position to regulate technology.


That being said, dysfunction and inefficiency are related concepts, but they do not necessarily imply one another. Inefficiency refers to a suboptimal allocation of resources, or to the use of more resources than necessary to achieve a particular technical outcome. Dysfunction, on the other hand, refers to a system or process that fails to operate or to deliver the intended results effectively. In some cases, inefficiency can lead to dysfunction, especially if the inefficiency becomes so severe that it hinders the overall functioning of a system or process. For example, if a manufacturing process is highly inefficient, with excessive waste and delays, it may become dysfunctional and unable to meet production targets or deliver products on time. However, it is possible for a system or process to be inefficient while still functioning. An inefficient system may still achieve its goals, albeit at a higher cost or with more resources than necessary. In such cases, the system may be considered suboptimal, but it is not necessarily dysfunctional. Similarly, dysfunction can occur even in the presence of efficiency. A system may be highly efficient in allocating resources or executing tasks, but it may still be dysfunctional if it fails to achieve its intended goals or produces unintended negative consequences. For example, a company might have a highly efficient internal communication system, but if that communication leads to decision-making based on incorrect or misleading information, the system could still be considered dysfunctional. An example of a process that is inefficient without being dysfunctional is the traditional office work environment in which employees are required to work from 9 am to 5 pm, Monday to Friday. While this schedule has been the standard for many years, it may not be the most efficient way for employees to work; yet the system is not dysfunctional, as it still achieves its intended purpose of providing a consistent work schedule. Thus, dysfunction and inefficiency are related but distinct concepts. While they can sometimes coexist or be causally linked, one does not necessarily imply the other.

An explicitly anti-technology social policy is impossible to implement so long as society is embedded in the system of technology. Human agents apparently make the most important decisions in the social sense, but the human subject is inherently conditioned by "human techniques" such as propaganda and mind-altering drugs. We have become inherently technological beings.27

27 Naturally, one could venture the claim that to be human is to be technological from the get-go. According to the homo faber concept, an idea first introduced by Henri Bergson and with which Ellul too was demonstrably familiar, humans make things and are in turn „made” by their own technologies. To be human means, on this view, to be both the producer of oneself and to be produced by one’s own products in a recursive loop. Homo faber is an inherently technologically engaged mode of being (Ihde and Malafouris, 2019).


Following the futurologist Yona Friedmann, Ellul envisions that the human nervous system will be chemically adapted to life in an artificial environment (Ibid, p. 260). The widespread use of antidepressants in our time seems to underscore the importance of human resource management techniques. The mental health crisis of the early 21st century does lend plausibility to Ellul's predictions. Homo sapiens was not created for a technological environment, so the latter chemically transforms the human being. After proper conditioning, the human element becomes suitable for existence in large urban conglomerations. Whether it is closure or reopening, technology wins in both alternatives, as the unfolding of the technological system continues unhindered in all directions. Humanity, multiplied into billions, is unimaginable without technology. "Self-organization" is the primary fact of technology, and only through this can its operation and presence be summed up (Ibid, p. 141). For Ellul, autonomy also leads to uncontrollability. Growth is built into the system. If innovation becomes temporarily impossible, sooner or later a way around the bottleneck is invented. While even in Ellul's time there was much communication about stagflation, the limits of growth, slowdown and so on, from the early ecological movements to the 1972 report of the Club of Rome (The Limits to Growth), in retrospect it can be concluded that all these semantics had no effect whatsoever on technological growth.28 The élan technique has remained intact through all of its crises.29

Infinite digital memory complements and then replaces organic memory. Drawing on Edmund Husserl's terminology, Hui states that “tertiary [technological] retention complements the finitude of organic memory (the latter of which can be characterized as a duality of protention and retention)30 with the infinite range of memories made possible by digitalization. But tertiary retention also becomes a source of primary retention, and secondary retention is also a source of protention” (Hui, 2012, p. 80). Couched in the language of systems theory, the technological system has become a tertiary observer that supersedes all other modes of observation in a temporal sense as well. Even more drastically, we could say, using Luhmann’s language, that algorithmic differentiation has obliterated functional differentiation (Tække, 2022). An advanced manifestation of technological alienation is the separation of the mind from the organic sources of memory. Uploaded into digitality, intersubjectivity becomes a source of raw material that can be endlessly re-edited, transformed and mined. There is no longer any boundary between real and simulated memory.

28 One may refer here for instance to the „degrowth” and Slow Food movements.

29 Truly, it seems technology is the sole hegemonic form of „striving” today, at least on Earth. I borrow this expression, a reference to Henri Bergson’s concept of the élan vital (life force, vital striving) from the following paper: Winkler, 2007.

30 Protention is expectation relating to the future, while retention is memory of the past.


Since we are unable to access any direct, unmediated organic level of memory, the technologically integrated subject has no hope of regulating the system as a whole. Pol Pot, who demolished cities, or Ted Kaczynski, who advocated rebellion against technology, become Internet memes in the same way as Bill Gates or Steve Jobs. The arrival of the third milieu underlines the "self-regulating nature of technology" and "normalizes" its own functioning to such an extent that all other functions become impossible (Ellul, 1980 [1977], p. 153). If we accept the autonomy of technology as a reality or at least a real tendency (élan technique), it becomes very difficult to take seriously the legitimacy of technocratic or anti-technological tendencies. These are pure fantasies, manifestations of wishful thinking. It is not necessary to accept Ellul's fatalism in order to recognize the validity of the technological autonomy thesis. Sometimes Ellul exaggerates. For our part, however, we can maintain that the autonomy of technology entails the consequence that technology cannot be reduced to any human purpose. Whether the degree of imperialism described by Ellul follows from its more than instrumental nature is a matter open to debate. In itself, technology as an independent functional area could imply its equality among other function systems, but we also know that every functional system strives for monopoly. Ellul's philosophy of technology is therefore grounded on two statements: 1) technology has acquired functional independence, and 2) it dominates other areas of society. It is an empirical question whether the autonomy of technology vis-à-vis humans and society entails an all-pervading determinism or not.

Conclusion

We must recognize that there is no corner of the Earth unaffected by technology to some extent. Every change that takes place in the system of technology is a "self-conditioned" modification (Ibid, p. 275). Technology can only encode events to the extent that it processes them and translates presence into a technical question along the lines of its own binary code. Looking ahead, Ellul envisions three possible scenarios for the future of the technological system. The first is an intentional slowdown forced by external administrative means. The second is a sudden, catastrophic collapse. The third is a gradual but spontaneous shutdown, which would result from internal dysfunctions of technology itself. According to Ellul's apocalyptic vision, collapse is inevitable. As "technology grows until it reaches its limits,” by the time the gravity of the situation becomes apparent and "the cumulative effects come into play, it will be too late – catastrophe will occur through a massive decline in technology and population” (Ibid, p. 288). These two ultimate phenomena, technology and population, are inseparable from each other. All of us are artificially kept alive by human techniques.


Technological autonomy merely makes clear that the „differentiation of society has always been alien to human life” (Tække, 2022, p. 19). Society never was an intentional product of human planning. The recent COVID-19 pandemic also points to a fourth possibility, namely, the uncontrollable proliferation of the few remaining spontaneous, unplanned natural forces, severely disrupting complex social infrastructures, logistical chains and institutions. The big question is which alternative will triumph. It cannot be ruled out that the adaptive capabilities of functional differentiation also allow for building up additional complexity. Apparent disasters can easily become tools for the further self-construction of technology. Crises could very well prove to be building blocks of an even more self-referential social form. Chaos is the most useful building material. In any case, apart from the communicative anthropological substrate built into technology, it is difficult to envision any autonomous human subjectivity or function system outside of technology. In this paper, I hope to have shown, using a variety of primary and secondary literature, that technology can be conceived of as a functional social system that is autonomous from interaction systems. By reading Luhmann’s systems theory through an Ellulian lens, we can shed light on the ways the technological system evades human regulation, while leaving open the question of whether human life outside of or beyond technology is possible. That being said, for my part I am seriously doubtful whether our species as homo faber could survive a non-technological displacement, or whether a „detechnologized” humanity is not in itself an oxymoron.

Acknowledgements

I am indebted to the peer reviewers for their suggestions and comments regarding previous versions of this article.


References

Artieri, G. B. and Gemini, L. (2019) "Mass media and the web in the light of Luhmann’s media system”. Current Sociology: 1-16.

Ascenio-Guillén, A. and Navío-Marco, J. (2017) "Cyberspace as a system and a social environment: a theoretical proposal based on Niklas Luhmann.” Communication & Society 31: 23-38.

Baecker, D. (2006) "Niklas Luhmann and the Society of the Computer." Cybernetics and Human Knowing 13: 25-40.

Baraldi, C. and Corsi, G. (2017) Niklas Luhmann. Education as a Social System. Cham: Springer.

Beck, U. (1992 [1986]) Risk Society. Towards a New Modernity. Los Angeles, London, New Delhi, Washington DC: SAGE Publications.

Bijker, W. E. (2010) "How is technology made?—That is the question!." Cambridge journal of economics 34.1: 63-76.

Blok, A., Farias, I. and Roberts, C. (eds. 2020) The Routledge Companion to Actor-Network Theory. London and New York: Routledge.

Bryce, R. (2020) "How Google Powers Its ‘Monopoly’ With Enough Electricity For Entire Countries." https://www.forbes.com/sites/robertbryce/2020/10/21/googles-dominance-is-fueledby-zambia-size-amounts-of-electricity/

Castells, M. (2003) The Internet Galaxy: Reflections on the Internet, Business, and Society. Oxford: Oxford University Press.

Cérézuelle, D. (2018) "Jacques Ellul: From Technique to the Technological System." The Ellul Forum 62: 59-67.

Clark, C. (2020) "Resonanzfähigkeit: Resonance capability in Luhmannian systems theory." Kybernetes 49.10: 2493-2507.

Elder-Vass, D. (2010) The causal power of social structures. Emergence, structure and agency. Cambridge: Cambridge University Press.

Ellul, J. (1989) What I Believe. Grand Rapids: Eerdmans Publishing Company.

Ellul, J. (1990 [1988]) The Technological Bluff. Grand Rapids: William B. Eerdmans Publishing Company.

Ellul, J. (1981) À temps et à contretemps. Paris: Centurion.

Ellul, J. (1980 [1977]) The Technological System. New York: Continuum.

Ellul, J. (1964 [1954]) The Technological Society. New York: Vintage.

29

Esposito, E. (2022) Artificial Communication. How Algorithms Produce Social Intelligence. Cambridge: The MIT Press.

Esposito, E. (2017) "Artificial communication? The production of contingency by algorithms." Zeitschrift für Soziologie 46.4: 249-265.

Feenberg, A. (1999) Questioning Technology. London and New York: Routledge.

Foerster, H. (2014) The Beginning of Heaven and Earth Has No Name. Seven Days with Second-Order Cybernetics. New York: Fordham University Press.

Foerster, H. (1984) "Principles of Self-Organization - in a socio-managerial context." In: Ulrich, H. and Probst, G. J. B. (eds. 1984) Self-Organization and Management of Social Systems: Insights, Promises, Doubts, and Questions. Cham: Springer, 2-24.

Fuchs, C. and Sevignani, S. (2013) "What is digital labour? What is digital work? What’s their difference? And why do these questions matter for understanding social media?” TripleC: Communication, Capitalism & Critique. Open Access Journal for a Global Sustainable Information Society 11(2): 237-293.

Garrison, K. (2012) "Ellul’s alternative theory of technology: Anticipating the fourth milieu of virtuality.” Explorations in Media Ecology 11(1): 57-72.

Gertz, N. (2016) "Autonomy online: Jacques Ellul and the Facebook emotional manipulation study." Research Ethics 12.1: 55-61.

Goffman, E. (1959) The Presentation of Self in Everyday Life. New York: Anchor.

Hanks, J. C. and Hanks, E. K. (2015) "From technological autonomy to technological bluff. Jacques Ellul and our technological condition.” Human Affairs 25(4): 460-470.

Heidegger, M. (1964) "The Question Concerning Technology.” In: Heidegger, M. (1977) The Question Concerning Technology and Other Essays. New York and London: Garland Publishing, 3-36.

Herrera-Vega, E. (2014) "Relevance of N. Luhmann’s theory of social systems to understand the essence of technology today. The Case of the Gulf of Mexico Oil Spill.” Technology in Society 40: 25-42.

Hui, Y. (2012) "Technological System and the Problem of Desymbolization.” In: Jerónimo, H. M., Garcia, J. L. and Mitcham, C. (eds. 2012) Jacques Ellul and the Technological Society in the 21st Century. Cham: Springer, 73-83.

Ihde, D. and Malafouris, L. (2019) "Homo faber Revisited: Postphenomenology and Material Engagement Theory.” Philosophy & Technology 32: 195-214.

Latour, B. (2007) Reassembling the Social. An Introduction to Actor-Network-Theory. Oxford: Oxford University Press.

30

Leydesdorff, L. (2001) A Sociological Theory of Communication: The Self-Organization of the Knowledge-Based Society. Irvine: Universal Publishers.

Luhmann, N. (2013 [2002]) Introduction to Systems Theory. Malden: Polity.

Luhmann, N. (2013 [1997]) Theory of Society. Volume II. Stanford: Stanford University Press.

Luhmann, N. (2012 [1997]) Theory of Society. Volume I. Stanford: Stanford University Press.

Luhmann, N. (1996). "On the scientific context of the concept of communication." Social Science Information 35.2: 257-267.

Luhmann, N. (2000 [1995]) The Reality of the Mass Media. Stanford: Stanford University Press.

Luhmann, N. (1993 [1991]) Risk. A Sociological Theory. Berlin and New York: De Gruyter.

Luhmann, N. (1990). "Technology, environment and social risk: a systems perspective" Industrial Crisis Quarterly 4: 223-231.

Luhmann, N. (1995 [1984]) Social Systems. Stanford: Stanford University Press.

McLaughlin, P. (2001) What Functions Explain. Functional Explanation and Self-Reproducing Systems. Cambridge: MIT Press.

Mumford, L. (1934) Technics and Civilization. London: Routledge & Kegan Paul.

Pink, A. W. (1949) The Sovereignty of God. https://ccel.org/ccel/pink/sovereignty/sovereignty.toc.html

Pitt, Joseph C. (1987) "The Autonomy of Technology” in: Durbin, P. T. (ed. 1987) Technology and Responsibility. Dordrecht: Springer Business Media, 99-115.

Reichel, A. (2011) "Technology as System. Towards an Autopoietic Theory of Technology.” International Journal of Innovation and Sustainable Development 5(2-3): 105-118.

Roden, D. (2022) "Technological Anti-Holism and the Thinking of the Outside.” [conference paper].

Spengler, O. (1932) Man and Technics. A Contribution to a Philosophy of Life. New York: Alfred A. Knopf.

Simondon, G. (2016 [1958]) On the Mode of Existence of Technical Objects. Minneapolis and London: Univocal - University of Minnesota Press.

Son, W. C. (2012) "Are We Still Pursuing Efficiency? Interpreting Jacques Ellul's Efficiency Principle.” In: Jerónimo, H. M., Garcia, J. L. and Mitcham, C. (eds. 2012) Jacques Ellul and the Technological Society in the 21st Century. Cham: Springer, 49-63.

Spöher, M. (ed. 2018) Analytical Frameworks, Applications, and Impacts of ICT and Actor-Network Theory. Hershey: IGI Global.

Strate, L. (2004) "Ellul and technology studies." Communication Research Trends 23.2: 28-32.

Tække, J. (2022) "Algorithmic Differentiation of Society – a Luhmann Perspective on the Societal Impact of Digital Media.” Journal of Sociocybernetics 18(1): 2-23.

Taschwer, K. (1996) "Science as system vs. science as practice: Luhmann's sociology of science and recent approaches in science and technology studies (STS)—a fragmentary confrontation." Social Science Information 35.2: 215-232.

Valentinov, V. (2019) "The ethics of functional differentiation: reclaiming morality in Niklas Luhmann’s social systems theory." Journal of Business Ethics 155: 105-114.

Valentinov, V. (2014) "The complexity - sustainability trade-off in Niklas Luhmann's social systems theory." Systems Research and Behavioral Science 31.1: 14-22.

Van Dijk, J. (2020) The Network Society. Los Angeles, London, New Delhi, Washington DC: SAGE Publications.

Winkler, R. (2007) "Nietzsche and l’élan technique: Technics, life, and the production of time.” Continental Philosophy Review 40(1): 73-90.

Winner, L. (1978) Autonomous technology. Technics-out-of-control as a theme in political thought. Cambridge: The MIT Press.

Zwart, H. (2020) "Iconoclasm and imagination: Gaston Bachelard’s philosophy of technoscience." Human Studies 43.1: 61-87.

