Anna Leander
Dec 20 2023

Digital technologies and the standards connected to them are changing how the security sector operates. ICoCA therefore increasingly needs to understand and address the effects the digital transformation is having on the security sector. Anna Leander, Professor of International Relations and Political Science at the Geneva Graduate Institute, explores the implications of digital technologies for security providers through her research project in the Brazilian context. Professor Leander highlights the need for fundamental shifts in our approach to security standards and emphasizes the significance of ICoCA’s role in developing guidelines for private security providers adopting digital tools.

The private security sector is varied and constantly morphing. The consequence is that the remit of ICoCA is a moving target. One of the ‘emerging areas’ the association finds itself grappling with is that of ‘digital technologies’. The association is now engaged in a partnership with ICT4Peace to research and reflect on how its ‘Code of Conduct’ should be interpreted, and possibly also updated, in light of the very significant changes that digital technologies entail for practices of security service provision. This blog post contributes to that incipient reflection. It draws on the findings of a research project, now concluding, that focused on the ‘regulatory politics’ of digitally mediated Brazilian democracy.[1] It connects these findings to the incipient debates about standards for private security working with digital technologies. On this basis, I argue that digital technologies call for three fundamental shifts in how we think about standards for security providers: shifts in what we think standards focus on, in how we think they operate, and in why we think they matter for overall regulation. As I conclude, these changes leave ICoCA with rather substantial standards-development work that it has only just begun.

What Standards? From Punishment to Prevention

We tend to think of digital technologies as tools we use, e.g. in politics or for security. Our research on Brazil shows that they are much more than that. Digital technologies make up an ‘infrastructure’ through which things are done, including politics and regulation. Algorithmic code sorts and orders content, encryption grants and withholds access to it, and connecting practices (including those of bots and trolls) define its circulation. Analogously, the platforms we looked at in our study, such as WhatsApp and Facebook, are more than spaces where politics ‘unfolds’. They modulate who and what is included in which conversations and on what terms; that is, they fashion the formation of political subjects and relations. They are involved in what our project terms the ‘infrastructuring of politics’. When we started, we thought the most important part of this ‘infrastructuring’ would be its connection to mis-, dis- and mal-information. We changed our mind. In the course of our research, we came to think that ‘atmospheres’, rather than information, were the core issue. The ‘mood’ of politics was what nurtured polarization, radicalization and violence, or inversely solidarity, constructive engagement and alliances. Moods also permeated the significance of information, including its interpretation, circulation and credibility. Information flows mattered because they nurtured these ‘atmospheres’ more than because of the claims or lies they contained.[2]

The findings that digital technologies are ‘infrastructuring’ and that they matter for democratic politics primarily because of their connections to ‘atmospheres’ have implications for how we think about the standards and regulation of digital technologies more generally. To grapple with the infrastructuring of atmospheres, regulation needs to focus less on punishing existing abuses and more on preventing the formation of atmospheres in which abuses emerge. To be effective, standards informing content moderation, flagging, the closing of accounts and so on need to become less concerned with post-facto measures and more with nudging what is not yet there. While our project focused narrowly and specifically on digital technologies and democratic politics in Brazil, this shift in the focus of standards is of broader relevance. To see why, picture the digital technologies that are part of the integrated security system of a military base, a refugee camp or a mining company, or the assessments underpinning the security surrounding the trip of a diplomat, a ship passing through the Malacca Strait or a researcher heading out for fieldwork. Here too, regulation that centers on punishing abuses of the technologies matters, of course, but standards that prevent those abuses from emerging in the first place are crucial.

In an integrated security system, for example, we obviously need standards for how entry and access are controlled, who is subject to the controls, what form these controls can take, and what kinds of sanctions they might trigger. All of this allows us to control and punish abuses. But, more fundamentally, we need to ensure that the system does not exclude, bias or punish in ways that create systematic problems, including abuse of, and resistance to, the system itself. While standards ‘punishing’ abuse remain important, when digital technologies are involved the emphasis shifts towards standards of ‘prevention’. This is a first general implication of our Brazilian research of pertinence to thinking about standards for private security working with digital technologies.
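For readers who think in code, a minimal sketch may make the punishment/prevention distinction concrete. It is entirely hypothetical: all class names and attributes are my own illustrative assumptions, not drawn from any real access-control system. The audit log supports detecting and sanctioning abuses after the fact; the constraint in the rule’s constructor makes a whole class of abuses, rules keyed on protected attributes, inexpressible in the first place.

```python
from dataclasses import dataclass, field

# Hypothetical attributes a preventive standard might rule out of
# access decisions entirely (illustrative, not from any real system).
PROTECTED_ATTRIBUTES = {"ethnicity", "religion", "nationality"}

@dataclass
class AccessRule:
    name: str
    required_credentials: set[str]

    def __post_init__(self) -> None:
        # Prevention by design: a rule that conditions entry on a
        # protected attribute cannot even be constructed.
        forbidden = self.required_credentials & PROTECTED_ATTRIBUTES
        if forbidden:
            raise ValueError(f"rule '{self.name}' keys on protected attributes: {forbidden}")

@dataclass
class AccessControlSystem:
    rules: list[AccessRule]
    audit_log: list[str] = field(default_factory=list)

    def check_entry(self, subject_credentials: set[str]) -> bool:
        granted = any(rule.required_credentials <= subject_credentials for rule in self.rules)
        # Punishment layer: every decision is logged so abuses can be
        # detected and sanctioned after the fact.
        self.audit_log.append(f"entry {'granted' if granted else 'denied'} for {sorted(subject_credentials)}")
        return granted

system = AccessControlSystem(rules=[AccessRule("staff_gate", {"staff_badge"})])
print(system.check_entry({"staff_badge", "visitor_pass"}))  # True, and logged

# This line would raise at construction time -- the abuse is prevented,
# not merely punished:
# AccessRule("profiling_gate", {"staff_badge", "ethnicity"})
```

The point of the sketch is the asymmetry: the audit layer reacts to what has already happened, while the constructor constraint shapes what can happen at all.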

How Standards? From Surveillance to Design

The shift in what standards are for has implications for how they operate. Starting again from our recent project exploring the digital infrastructuring of democracy in Brazil helps give a sense of why. The project began with the idea that standards operate as major actors negotiate, adopt, implement and enforce them; in our case we were looking at platforms such as WhatsApp and Facebook. Such a view of how standards work makes control at different levels crucial: controls that ensure the widespread adoption, correct interpretation, faithful following and equitable enforcement of standards. ‘Surveillance’ at different levels is the result. However, as the project progressed, it became clear to us that standards also operate in other ways. We studied the connections between the GDPR and regulatory standards in Brazil.[3] The GDPR was a ‘regulation by design’, as ENISA put it. It was to operate by being built into digital infrastructures. (This is why we now constantly click to select and accept the use of cookies.) Facebook also designed it into its infrastructures, albeit in ways that played out in favor of the Brazilian alt right. Our research demonstrated that standards such as these operate ‘by design’. This does not eliminate the significance of standards operating through ‘surveillance’, of course. Even if the GDPR operates by design, it also rests on surveillance. Companies reported and found guilty of non-compliance can be fined up to 20 million EUR (or 4% of global annual turnover, whichever is higher). Analogously, Facebook’s ‘community standards’ operated by design, but surveillance continued and was used to detect and punish abuses. A ‘War Room’ was set up specifically to monitor activities during the Brazilian elections, and subsequently the platform established a permanent ‘Oversight Board’ that Mark Zuckerberg modestly compares to the Supreme Court. Thus, while standards operating through design have become crucial, they work in symbiosis with standards working through surveillance.
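To give a feel for what ‘regulation by design’ means at the level of code, here is a minimal, hypothetical sketch of a consent gate of the kind the cookie banners exemplify: collection without recorded consent is not a violation to be caught later but a code path that fails immediately. The class and method names are illustrative assumptions of mine, not any platform’s actual API.

```python
# Sketch of consent-by-design, assuming a purpose-based consent model
# loosely inspired by the GDPR. All names are invented for illustration.

class ConsentRequired(Exception):
    """Raised when a data-collection path runs without recorded consent."""

class AnalyticsClient:
    def __init__(self) -> None:
        self.consented_purposes: set[str] = set()

    def record_consent(self, purpose: str) -> None:
        # The 'accept cookies' click the post mentions would land here.
        self.consented_purposes.add(purpose)

    def track(self, purpose: str, event: dict) -> None:
        # Design-time gate: tracking without consent raises immediately,
        # rather than being flagged by a later compliance audit.
        if purpose not in self.consented_purposes:
            raise ConsentRequired(f"no consent recorded for purpose '{purpose}'")
        print(f"stored event for '{purpose}': {event}")

client = AnalyticsClient()
client.record_consent("functional")
client.track("functional", {"page": "home"})     # allowed by design
# client.track("advertising", {"page": "home"})  # would raise ConsentRequired
```

Even in this toy version, surveillance does not disappear: a regulator can still audit whether the gate was actually deployed, which is the symbiosis the paragraph above describes.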

The centrality of standards operating through design dovetails with the move towards standards focused on prevention. Designing standards into digital infrastructures is to rely on the infrastructure to shape use and prevent abuse. Such regulation by design is obviously not unique to the GDPR or to the way it links to the infrastructuring of Brazilian politics. Rather, technical standards designed into infrastructures play a crucial role in the regulation of digital technologies more generally. They define not only how a specific aspect of a digital infrastructure operates but also how it connects to the rest of the infrastructure. This is why technical standards governing ‘interoperability’ and ‘systems compatibility’ figure so centrally in regulatory debates surrounding digital technologies. For security providers, regulation through technical standards designed into digital infrastructures also matters fundamentally. For instance, for those working in border security, the ‘interoperability’ of databases, defined by technical standards, will determine what kind of data they can associate with their activities, and so what kind of protection they provide, to which subjects and on what terms. The standards designed into infrastructures, in other words, form the politics and ethics their activities enshrine. Working on the design of these standards is consequently fundamental to preventing involvement in the misuse and abuse of systems, as well as complicity with fundamentally problematic practices, such as racist, religious or otherwise repressive profiling and subjectification of asylum seekers and migrants: a fundamental ethical problem and a reputational risk for security professionals. Digital technologies make design core to the operation of standards while diminishing the role of surveillance. This is the second broader insight I would like to highlight drawing on our research project.
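As an illustration of how an interoperability standard ‘designs in’ politics, consider the following hedged sketch: the fields a shared record format defines are the only facts that can travel between systems, so whatever the standard omits is, by design, inexpressible. The field list is invented for the purpose; real interoperability standards are of course far richer.

```python
# Hypothetical record format two border-security databases agree on.
# The schema itself, not any later policy, decides what can be
# associated and exchanged. Fields are illustrative assumptions.

from dataclasses import dataclass

@dataclass(frozen=True)
class InteropRecord:
    # Fields the (hypothetical) standard defines -- and therefore the
    # only facts that can travel between systems:
    document_id: str
    issuing_state: str
    watchlist_flag: bool
    # A field the standard omits, say 'protection_status', simply
    # cannot be exchanged: its absence is a design decision with
    # consequences for which subjects get what protection.

def exchange(record: InteropRecord) -> dict:
    """Serialize a record for transfer; only standardized fields exist."""
    return {
        "document_id": record.document_id,
        "issuing_state": record.issuing_state,
        "watchlist_flag": record.watchlist_flag,
    }

print(exchange(InteropRecord("P1234567", "BR", False)))
```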

Why Standards? From Norms to Devices

If digital technologies alter the ‘what’ and the ‘how’ of standards, we should also expect them to have a bearing on their ‘why’. If standards are no longer about prohibiting specific activities but about preventing their emergence, and if they therefore no longer operate through practices of controlling activities but through the design of possibilities, this logically should have a bearing on why we adopt them and on what we expect standards to achieve for us. But what kind of bearing? Again, I think the findings of our research project provide a sense of direction. In our research, we showed that the community standards of platforms did not play the role of fixed principles with the same regulatory effects across time and space. Quite the contrary: the standards operated as complex and evolving ‘devices’. Their sharpness, firmness, consistency and connections, or what we might term their ‘texture’, affected how precisely the standards ‘cut into’ a regulatory context. It also had implications for how the standards were themselves recursively re-shaped by the regulatory contexts in which they operated. The ‘standards’, in other words, operated less as firm and fixed principles than as shapeshifting devices, transforming as they scaled into novel contexts.[4] Generalizing this point, standards pertaining to digital technologies operate less as ‘norms’ expressing given ‘values’ than as material (infrastructural) devices that work with and transform these in context.
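A small, purely illustrative sketch may help picture a standard operating as such a ‘device’: the same rule object carries an adjustable ‘texture’ (here a removal threshold) and is recursively re-tuned by the context it cuts into. No real platform’s moderation policy is represented; every name and number is an assumption made for the example.

```python
# Sketch of a 'processual' standard: the rule persists, but its texture
# is reshaped by feedback from the context it regulates.

from dataclasses import dataclass

@dataclass
class ModerationStandard:
    label: str
    removal_threshold: float  # part of the standard's 'texture'

    def cuts(self, toxicity_score: float) -> bool:
        """Does the standard 'cut into' this piece of content?"""
        return toxicity_score >= self.removal_threshold

    def recalibrate(self, appeal_overturn_rate: float) -> None:
        # Recursive re-shaping: a context where many removals are
        # overturned on appeal pushes the threshold up, and vice versa.
        if appeal_overturn_rate > 0.3:
            self.removal_threshold = min(1.0, self.removal_threshold + 0.05)
        elif appeal_overturn_rate < 0.1:
            self.removal_threshold = max(0.0, self.removal_threshold - 0.05)

standard = ModerationStandard("hate-speech", removal_threshold=0.8)
print(standard.cuts(0.85))         # True under the current texture
standard.recalibrate(0.4)          # the context pushes back
print(standard.removal_threshold)  # 0.85: same standard, new shape
```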

By now, hair is standing on end and many heads are shaking in despair. Do ‘standards’ that are constantly evolving and re-negotiated, that lack clear and fixed principles, deserve to be called ‘standards’ at all? Are they standards of anything but whitewashing, or perhaps of invisibilizing the dark and problematic sides of digital technologies (including in private security)? Where did ethical processes and values deserving defense disappear to? And what about the power of platforms and tech giants? If we climb down from the abstract, omniscient view that undergirds these worries and instead think of standards as always being from somewhere and operating in situated contexts, the prospect of working with processual standards seems less threatening. It is actually a defense against the power of platforms and the invisibilizing of problems. Returning again to our Brazil project, the possibility of negotiating and transforming the norms and values informing the forward-oriented standards designed into digital infrastructures by platforms such as Facebook and WhatsApp suddenly seems rather less threatening. In fact, it seems an absolutely necessary condition if such standards are to play a role in regulation, and to do so with the sensibility to context that is necessary if the digital infrastructuring of politics is to support democracy rather than undermine it. By analogy, and more generally, transforming and negotiating standards, such as those of data protection when working in humanitarian settings, is a pivotal part of ensuring that these standards work at all, and that they do so with consideration for the concerns that dominate the situated context. Precisely because digital technologies scale, spanning contexts, a processual approach is the sine qua non of ‘response-able’ regulation. Far from being a problem, a processual approach to standards is one that cares for context, allowing it to play into the principles of the standards at the core of regulatory politics.

Where to with Standards?

So where does the argument that standards pertaining to digital technologies need to focus on prevention, operate through design and be processual leave ICoCA? The short answer is ‘in an uncomfortable place’. More specifically, in a place where a lot of work lies ahead. Standards for private security work involving digital technologies are urgently needed, and they need to take unaccustomed formats. To be effective, standards for working with digital technologies need to be future-oriented (preventive), inscribed in digital infrastructures (designed) and therefore contextually shifting (processual). This is far from the backward-looking, textually formulated and firmly principled picture of standards currently dominating thinking in security and elsewhere. To develop standards that operate in this unaccustomed way will therefore require a giant leap of imagination by ICoCA. It is a leap that will require follow-up efforts to concretize and make real the reimagined standards for security service provision involving digital technologies specifically. Of course, ICoCA is not the only organization entering the unfamiliar, challenging terrain of developing standards for digital technologies. So the good news is that the work ahead might be shared with others, including those who played (and continue to play) a crucial role in the development of standards for Brazil: ‘internet activists’, the EU/ENISA, Facebook and the other platforms, Brazilian regulators (particularly the Supreme Electoral Court) and the many academic researchers who engage the topic. Tackling this rather daunting task collaboratively is probably the best bet for successfully shouldering it. Since this is how ICoCA approaches the task, it has a good chance of moving forward with its ambition of developing standards for private security companies working with digital technologies (basically all of them). Let’s hope it does so.

 


[1] For more, see Infrastructuring Democracy: The Regulatory Politics of Digital Code, Content and Circulation.

[2] Leander, Anna, with Luisa Lobato and Pedro dos Santos Maia. 2022. “‘Vulgar Vibes’. The Politics of Altright Aesthetics.” Under review. For a short version, see Leander, Anna. 2023. “Vulgar Vibes. The Atmospheres of the Global Disinformation Order.” Globe: La Revue de l’Institut | The Graduate Institute Review 31: 24.

[3] The GDPR is the General Data Protection Regulation, adopted by the EU in 2016 and applicable from May 2018, in the midst of the electoral campaign that brought Bolsonaro to power. Our argument is published in Leander, Anna, with Cristiana Gonzales, Luisa Lobato and Pedro dos Santos Maia. 2023. “Ripples and Their Returns: Tracing the Regulatory Security State from the EU to Brazil, Back and Beyond.” Journal of European Public Policy 30 (7): 1379-1405.

[4] Leander, Anna, Deval Desai, and Florian Hoffmann. 2021. “Scaling Digital Infrastructures: The Forms and Edges of Regulatory Devices.” Under review.


The views and opinions presented in this article belong solely to the author(s) and do not necessarily represent the stance of the International Code of Conduct Association (ICoCA).