The Iterative Relationship Between Technology and International Security


Scientific breakthroughs and technological innovations are often subject to public debate about their capacity to affect international security, whether through their military exploitation or their uptake and re-appropriation by non-state actors and terrorists. While the accompanying proliferation and militarisation concerns are not new, the challenge of governing emerging technologies lies as much in their often-unknown technical affordances as in the way they capture the imagination of innovators, policy-makers, and the public.

Technologies affect political and social life in many ways, as we can see in the role of information and communications technology (ICT) in propaganda practices or the uptake of sophisticated drone technology by terrorist and rebel groups. Yet developments in technology also bring new challenges in conceptualising and anticipating what counts as a disruptive technology and how its uses become relevant for security. New approaches to thinking about the scope and practices of the governance of science and technology are consequently needed to define the responsibilities of scientists and innovators in mediating technology’s effect on society on the one hand, and those of states and transnational companies on the other.

The international politics of security technologies – discerning the security potential of emerging technologies

The extensive integration of technology into social, political and economic processes has led to renewed interest in the relationship between technology and the practices and discourses of security. The proliferation of, and access to, highly sophisticated weapons technologies demand new approaches to addressing the capabilities thus gained by non-state actors. The extended reach of state propaganda and the new dynamics of misinformation campaigns enabled by ICTs further indicate how technologies are not only put to hostile use but also shift the frontlines of political contestation and animosity. With former dichotomies and distinctions between public/private and military/civilian blurring further, existing concepts and terminologies seemingly fall short of describing and anticipating technology’s ambiguous effects.

Meanwhile, in the realm of emerging technologies, less methodological attention has been paid to the iterative processes and decisive choices through which technology becomes politically relevant for security. Technology is often conceptualised, in terms of its effect on its socio-political environment, as either sustaining or disruptive, depending on whether its trajectory follows an existing path or opens up new ones. Such essentialist conceptions of technical affordances risk neglecting the socio-political processes through which promissory technologies arrive at their political capacity, as well as their ambiguous effects on international security.

Different proposals for technology governance exist, with approaches, scopes, and practices ranging from prohibition under international humanitarian law to norm development, codes of conduct, and ethical design principles. Whereas international humanitarian law includes general rules and treaty law on the requirements weapons technologies must meet, the more practical obligations placed on states, such as the weapons-review requirement of Article 36 of the 1977 Additional Protocol I to the 1949 Geneva Conventions, lack international harmonisation. While ongoing developments in civilian and military technology underscore the relevance of Article 36 weapons reviews, the Article does not specify the processes through which the legality of weapons, means, and methods of warfare is to be assessed. Likewise, the internal procedures of the legal review mechanism, including the format, working methods, mandate, and level of authority of review bodies, require further harmonisation.

Academic analysis has paid little attention to the working processes, institutional settings, and practices that steer scientific and technological trajectories. Political contention, meanwhile, appears captured by concerns about the pace of innovation, technological affordances, and the challenges these pose for democratic governance; yet, like the academic discourse, it fails to adequately consider the changing actor landscape and institutional settings that drive science and technology. In particular, the growing role of private actors in developing defence capabilities calls into question traditional notions of responsibility and accountability for the development of defence capacities and the legitimate deployment of force. Public-private collaboration in defence research and development is becoming established as a common innovation model, blurring the boundaries between the defence industry, defence policymaking, and the public economy. The recently established European Defence Fund testifies to these changing dynamics and to the demand for developing defence technology capabilities, with large parts of its investment strategically oriented towards promoting private industry, with envisioned spillover effects for the wider economy.

Ethics for military engineering – Innovation practices and norm development

With visions of autonomous weapons becoming realisable, debates have shifted towards the role of innovators and ethical frameworks in governing emerging technologies and encoding desirable political ends into technological design. Against the backdrop of stagnating international negotiations over enforceable legal regimes, innovators and scientists have taken active roles in developing norms for securing “meaningful human control” over weapons systems and the deployment of force. The consequent attention to innovation practices and the role of weapons designers has contributed to a re-conceptualisation of human-machine interaction.

Central points of contention surrounding the allocation of responsibility and accountability in “algorithmic warfare” demand further clarification of when, where, how, and by whom decisions are delegated to automated processes, while equally revealing how, in current everyday military practice, these decisions are part of highly distributed processes formed among multiple individuals and professions. This re-conceptualisation offers a different reading of the challenge of autonomous weapons: not as passively losing control to machines, but as deliberately choosing “to give up control”. By re-locating agency with humans to define their relationship with technology, it also affects the aims and objectives of regulatory attempts and of the law. Instead of targeting the micro-processes of Artificial Intelligence (AI) design, it is humans and their practices of building, training, deploying, and monitoring algorithms that become subject to governance. On this ground, the design of algorithms remains subject to existing normative and legal regimes, while questions of data quality, transparency, and scrutiny become central to its governance.

Considerations about the use of data, its quality, and its labelling for target selection are not new concerns, but they are of increasing importance for the deployment of high-precision weapons. Revealingly, much less is known about the processes, norms, and quality of the intelligence that informs target selection than about the weapons themselves, exposing bottlenecks and ambiguities.

With the rise of ever more precise weapons, the idea of ‘the good war’, and the imagination of war as automated combat between machines, have been debated, in turn raising questions about the social function and significance of war for society. Since technology plays an active role in negotiating the moral evaluation of the deployment of force, for instance by allowing actors to distance themselves further from its destructive effects, legislators as much as the military are asked to define the ethical and strategic objectives according to which autonomous weapons and other technologies should be designed, in order to make them subject to democratic deliberation and control.

The road ahead

The socio-economic effects of technology are ambiguous, and the future challenges tied to emerging technologies are to a large extent linked to technology being used in areas for which it was not originally intended, making developments hard to predict. The landscape of actors involved is changing and blurring previously distinct societal spheres, and existing frameworks, concepts, and terminologies therefore fall short of capturing the full scope of these developments. To meet future challenges and establish a clear division of labour and responsibility in mediating technology’s effect on society, we need new approaches to thinking about the scope and practices of the governance of science and technology.

About the author: Anna Roessing is a Doctoral Researcher at the University of Bath. Her research focuses on socio-technical imaginaries in the innovation governance of biotechnology.

  • This text is based on discussions at the second PRIO TRANSAD Workshop on “Technology, Security, and Warfare”, which took place at the University of Bath on 24 May 2019.
  • The third PRIO TRANSAD Workshop will take place in Barcelona on 21–22 November 2019, under the title “Emerging Technologies and International Security in the Mediterranean Region”.