When we discuss artificial intelligence, the digital technology that makes it possible, and the singularity – the idea that both will exponentially take over the progression of society – we refer to them in the singular. This is no coincidence.
Both science and fiction have portrayed AI as a particular form of reason, digital technology as an autonomous driver of change, and singularity as a unidirectional technological revolution. However, none of them is necessarily as “singular” as it appears.
Rather, the different contexts in which digital technologies come to matter create a broad variety of knowledge and social effects. For example, digital technologies are currently used for predictions of all kinds: from the spread of pandemics to political elections and crime mapping. Not only does each of these predictions produce its own societal effects – they influence whether or not we get vaccinated, whom we vote for, or where we park our car. They also produce more complicated effects, some of which actually make us question their predictive power. Filter bubbles and fake news are just two of them. But what exactly makes these social effects complicated?
While the way in which digital technologies work is no longer intuitive to understand and question, the above-mentioned effects also reveal that humans are still an important part of the game. And this complicates things. Digital technologies and the knowledge they produce are not as singular and independent of social processes as the term “singularity” suggests. After all, it is we who provide both the data and the context knowledge for predictions, and in many cases it is still humans who decide which parameters are included in prediction algorithms. This goes to show that the simplified idea of computers-versus-humans doesn’t really hold. The production of “intelligence” through digital technologies doesn’t happen outside social and political situations, but in relation to them.
In interviews I have conducted on predictive policing methods, it became quite clear that digital technologies are closely linked to social and political situations. Both police officers and programmers decide which crime data to collect, how to feed it into the computer, and how to present the outputs of algorithmic calculations. All of these human decisions help define which kinds of crime the police focus on, even though the actual crime predictions are eventually generated by a computer. This shows that political and social data and context knowledge feed into digital technologies and influence the intelligence they generate. And – vice versa – digital technologies and the intelligence they produce in turn influence political and social situations in specific ways.

One striking characteristic of digital technologies, for example, is that any knowledge they produce has to be calculable and captured in numbers. Even though this seems obvious, it still determines and limits the ways in which digital machines can produce knowledge. For predictive policing this means, for example, that correlations and patterns are the main knowledge tools algorithms use to predict crime. These correlations and patterns then influence actual policing decisions, for example where to mobilize personnel and which locations to focus on. In essence: how digital technologies work is specific to the social situation they are used in, and digital technologies create specific effects on society. This means that humans and machines co-produce the progression of society rather than one dominating the other.
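The pattern-and-correlation logic described above can be made concrete with a toy sketch. This is not any actual predictive policing system – the data, grid cells, and crime categories below are entirely hypothetical – but it illustrates the point: the “prediction” is just a ranking of counted past patterns, and human choices (which incidents get recorded, which crime types to focus on) shape the output before the computer ever runs.

```python
from collections import Counter

# Hypothetical past incident records: (grid_cell, crime_type).
# Which incidents are recorded at all is already a human decision.
incidents = [
    ("cell_A", "burglary"),
    ("cell_A", "burglary"),
    ("cell_A", "theft"),
    ("cell_B", "theft"),
    ("cell_B", "theft"),
    ("cell_C", "burglary"),
]

def predict_hotspots(records, focus_types, top_n=2):
    """Rank grid cells by the count of past incidents of the
    chosen crime types -- a pattern, not an explanation."""
    counts = Counter(cell for cell, ctype in records if ctype in focus_types)
    return [cell for cell, _ in counts.most_common(top_n)]

# The choice of focus_types is itself a policing decision:
# changing it changes where patrols get sent.
print(predict_hotspots(incidents, {"burglary", "theft"}))  # -> ['cell_A', 'cell_B']
print(predict_hotspots(incidents, {"burglary"}))           # -> ['cell_A', 'cell_C']
```

Note how the two calls send personnel to different places even though the underlying data is identical – the social decision about which crime types to count is baked into the “intelligent” output.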
Once we have understood how social situations are reflected in the way we engineer digital technologies and create digital knowledge, it may be more appropriate to explore the many “specificities” of the situations in which digital technologies and society influence each other, rather than presuming a “singularity”.
- My peer-reviewed article on the same topic, Politics and the Digital, is available here.
- Last month Morgenbladet published an engaging special edition on artificial intelligence. I followed up with an op-ed which was published in a later edition of the paper. This blog post provides further critical comment.