Children are becoming the objects of a multitude of monitoring devices—what are the possible negative ramifications in low-resource contexts and fragile settings?
The recent incident of a UNHCR official tweeting a photo of an Iraqi refugee girl holding a piece of paper with all her personal data, including family composition and location, is remarkable for two reasons. First, because of the stunning indifference, and perhaps also ignorance, displayed by a high-ranking UN communications official with respect to a child’s personal data. The more notable aspect of the incident, however, has been the widespread condemnation of the tweet (since deleted) and its sender, and her explanation that the photo was “six years old”. While public criticism has focused on the power gap between humanitarians and refugees and the precarious situation of Iraqi refugees, the incident is noteworthy because it marks the emergence of a new figure in international aid and global governance: that of children’s digital bodies.
Because children are dependent, what technology promises most of all is almost unlimited care and control: directly by parents, but indirectly by marketing agencies and tech companies building consumer profiles. As explained by Deborah Lupton, in the political economy of the global North (and, I would add, the global East), children are becoming the objects of a multitude of monitoring devices that generate detailed data about them. What are the possible negative ramifications in low-resource contexts and fragile settings characterized by deep-seated oversight and accountability deficits?
The rise of experimental practices: Ed. Tech, babies and biometrics
There is a long history of problematic educational transplants in aid contexts, from dumping used textbooks to distributing culturally or linguistically inappropriate material. The history of tech-dumping in disasters is much more recent, but it too problematically involves large-scale testing of educational technology platforms. While practitioners complain about relevance, lack of participatory engagement and questionable operability in emergency contexts, the ethical aspects of educational technology (Ed. Tech) and data extraction—and how the collection of data from children and youth constitutes part of the merging of aid and surveillance capitalism—are little discussed.
Another recent trend concerns infant biometric identification to help boost vaccination rates. Hundreds of thousands of children die annually due to preventable diseases, many because of inconsistencies in the provision of vaccine programs. Biometric identification is thus intended to link children with their medical records and overcome the logistical challenges of paper-based systems. Trials are now ongoing or planned for India, Bangladesh and Tanzania. While there are still technical challenges in accurately capturing the biometric data of infants, new biometric techniques capture fingers, eyes, faces, ears and feet. In addition to vaccines, uses for child biometrics include combatting aid fraud, identifying missing children and combatting identity theft.
In aid, data is increasingly extracted from children through the miniaturization and personalization of ICT. Infant and child biometrics are often coupled with tracking devices in the form of wristbands, necklaces, earpieces and other devices that users carry for extended periods of time.
Across the board, technology initiatives directed at children are usually presented as progress narratives, with little concern for unintended consequences. In the economy of suffering, children and infants are always the most deserving individuals, and life-saving interventions are hard to argue against. Similarly, the urgency of saving children functions as a call to action that affords aid and private sector actors room to maneuver with respect to testing and experimentation. At the same time, the mix of gadget distribution and data harvesting inevitably becomes part of a global data economy, where patterns of structural inequality are reproduced and exacerbated.
Children’s digital bodies
Despite the massive technologization of aid targeting children, little critical thinking has so far gone into the production of children’s digital bodies in aid. The use of digital technologies creates corresponding “digital bodies”—images, information, biometrics and other data stored in digital space—that represent the physical bodies of populations affected by conflict and natural hazards, but over which these populations have little say or control. These digital bodies co-constitute our personalities, relationships, and legal and social personas—and today they have immense bearing on our rights and privileges as individuals and citizens. What is really different about children’s digital bodies? What is the specific nature of the risk and harm these bodies might incur?
In non-aid contexts, critical data researchers and privacy advocates are only just beginning to direct attention to these practices, and in particular to the array of specific harms children may encounter, including but not limited to the erosion of privacy.
The question of testing unfinished products on children is deeply contentious: the possibility that unsafe products may be trialed in fragile and low-resource settings under different requirements than those imposed by rich countries is highly problematic. At the same time, parachuting and transplanting digital devices from the global North and East to the global South without any understanding of local needs, context and adaptation practices is—based on the history of technological imperialism—at best ineffective, disempowering and a misuse of resources; at worst, it could further destabilize fragile school systems.
Very often, in aid tech targeting children, the potential for digital risk and harm to children is ignored or made invisible. Risk is framed as an issue of data security, malfunction and human manipulation of data. Children—especially in low-resource settings—have few opportunities to challenge the knowledge generated through algorithms. They also have scant techno-legal consciousness with respect to how their personal data is being exploited, commodified and used for decisions about their future access to resources such as healthcare, education, insurance, welfare and employment. There is the obvious risk of armed actors and other malicious actors accessing and exploiting data; but there are also issues connected to wearables, tablets and phones being used as listening devices for surveilling the child’s relatives and carers. It is incumbent on aid actors to understand both the opportunities offered by new technologies and the potential harms they may present—not only during the response, but long after the emergency ends.
Conclusion: time to turn to the CRC!
The mainstreaming of surveillance and data extraction from children now taking place in aid, ranging from education technology to infant biometrics, means that critical discussion of the ethical and legal implications for children’s digital bodies is becoming a burning issue.
The do no harm principle is a key ethical guidepost across the fields of development, humanitarianism and global health. The examples above illustrate the need for investment in ethics and evidence on the impact of developing and applying new technologies in low-resource and fragile settings. Practitioners and academics need to be alert to how structural problems are reframed as problems amenable to technological innovation and intervention, in line with the interests of technology stakeholders. But is that enough?
The 1989 Convention on the Rights of the Child (CRC) represented a watershed moment in thinking about children’s rights to integrity, to be heard and to protection of their physical bodies. Article 3.1 demands that “In all actions concerning children, whether undertaken by public or private social welfare institutions, courts of law, administrative authorities or legislative bodies, the best interests of the child shall be a primary consideration.” The time has now come to articulate and integrate an understanding of children’s digital bodies in international aid within this normative framework.
This post originally appeared on the Open Global Rights website.