Covid-19 contact-tracing apps and the public/private co-production of security

This article examines how the smartphone contributes to the co-production of security through an analysis of Covid-19 contact-tracing apps. Building on existing research in security studies that mobilizes the science and technology concept of co-production, the article proposes the notion of ‘appropriation’ as a concrete way of extending our understanding of the public/private co-production of security. Appropriation highlights how consumer technology may be repurposed for security and shows how private-sector actors that own consumer technology not only influence, but actively condition the co-production of security. Bringing new, typically commercial, concerns to bear on security practices, appropriation also has the effect of complicating conventional understandings of the relationship between liberty and security. Focusing on the NHS Covid-19 app and its contentious relationship with Google/Apple’s framework for digital contact-tracing, the article demonstrates how the smartphone enables private-sector actors to gain influence in the security domain. Google and Apple used their control over smartphone technology to compel the British health authorities to adopt a less effective but more privacy-preserving approach than they originally intended, and thus enforced a seemingly liberal response to an exceptional political situation.


Introduction
To restrict the spread of the Covid-19 pandemic, governments across the world developed contact-tracing apps for smartphones. At the same time, Google and Apple teamed up to provide national health authorities with a programming interface that they could use to develop and implement such applications. Contact-tracing apps traced the movements of smartphone owners and alerted them if they had been in contact with other smartphone owners who were or had been infected with the virus. Despite much controversy over both the privacy implications and the effectiveness of such technology (Hern, 2020b; Triggle, 2021), the use of digital contact-tracing by governments to protect the health of their populations became a widespread practice (Johnson, 2020). Digital contact-tracing during the Covid-19 pandemic thus shows how smartphone technology can become central to security practices and draws attention to how smartphones contribute to the co-production of security at the intersection between the public and the private sectors.
This article examines the smartphone as a security device through an analysis of how Covid-19 contact-tracing apps contributed to the co-production of security. Co-production is a concept from science and technology studies that highlights how technology and society come into being through an interactive and co-constitutive relationship (Jacobsen, 2015; Jacobsen and Monsees, 2019). Although still relatively little used in security studies compared to other concepts from science and technology studies (Jacobsen, 2020: 130), co-production has recently gained some traction in the discipline. Notably, security research has mobilized co-production to explore how the public and private sectors intermingle in the production of security. By demonstrating how the state's operation of security technology often depends on aid and expertise from private-sector companies and scientific environments, this research shows how technology contributes to the co-production of security by making the practice of security a public/private enterprise (Allan, 2017; Bellanova and De Goede, 2021; Martins and Jumbert, 2022; Nolte and Westermeier, 2020).
The article contributes to the science and technology studies and co-production literature in security studies by proposing the concept of 'appropriation' as a concrete way of extending our understanding of the public/private co-production of security. Appropriation refers to the process by which consumer devices are repurposed by security actors to become security devices. The concept alerts us to how security is co-produced in the mundane and banal negotiation of relationships between people and their devices. Moreover, it shows how ownership of the technology and control over the process through which it becomes security technology allow private companies not only to influence, but actively to condition the co-production of security (Amicelle et al., 2015: 300-302; see also De Goede and Westermeier, 2022; O'Grady, 2021). Appropriation adds texture and depth to the co-production concept by detailing how a certain and prolific kind of co-production - that which happens through the repurposing of consumer technology for security - forcefully redistributes authority in the security domain. Appropriation thus also offers a more granular understanding of how public/private co-production enables the expansion of private security authority and, more broadly, how international security changes in the face of the galloping digitalization and privatization of society.
The smartphone is a prominent example of an appropriated security device. By connecting closely with the human body, and arguably extending it (Mazmanian, 2019; Park and Kaye, 2018), the smartphone allows uniquely intrusive, intense and ubiquitous surveillance, and is thus a vehicle for the state to secure against external and internal threats (Chambers, 2016; see also Bauman et al., 2014: 123; Ford and Hoskins, 2022; Harcourt, 2015: 10-11, 23; Horbyk, 2022; Lyon, 2015: 76-77). From the perspective of security theory that understands surveillance technology as a means of legitimizing and normalizing illiberal security practices (see Bigo and Tsoukala, 2008; Huysmans, 2014: 22), the smartphone can be seen to co-produce security by offering new ways for the state to manage its population. Considering the distinctly appropriated nature of the smartphone as a security device, however, this understanding can be nuanced, as the process of repurposing consumer technology offers the possibility for interests other than state security to condition security practices. Specifically, Big Tech companies that own and control the appropriated security technology seek to profit from - more than secure - their users (see Hoijtink, 2014; O'Grady, 2021: 235). Thus, private-sector actors' interventions in the practice of security complicate conventional understandings of the relationship between liberty and security.
In examining how smartphone appropriation co-produces security, this article analyses the case of the United Kingdom's National Health Service's (NHS) version of an app for digital contact-tracing, NHS Covid-19, and how it was developed and implemented in interaction with Google/Apple's joint interface for digital contact-tracing. The analysis shows how the UK version of the app was forced to comply with the Google/Apple framework, which in practice meant that government health authorities in the United Kingdom had to adopt a more privacy-preserving approach than they originally intended. Google/Apple's control of the technology required to carry out effective contact-tracing enabled the companies to condition the state's ability to save the lives of its citizens. In this way, the smartphone contributed to co-producing a liberal security politics, as its interaction with security actors through practices of appropriation facilitated the prioritization of interests other than state security. This is especially striking considering the exceptional political circumstances that the pandemic constituted, which could have been expected to foster and legitimize a more unchecked intensification of mass surveillance and curtailment of liberties in the name of security (see Kirk and McDonald, 2021).
The article first introduces the concept of co-production and argues that public/private co-production can helpfully be approached through the concept of appropriation. Next, it discusses the smartphone as a security device and how this device might co-produce security, especially by reworking the relationship between liberty and security. Third, it analyses the development and implementation of the NHS Covid-19 app as a case of how the smartphone contributes to the public/private co-production of security, in particular by complicating conventional understandings of the relationship between liberty and security. The final section suggests that the case points towards a future security politics in which Big Tech companies continue to gain influence in the security domain and do so through the principle of liberty.

Public/private co-production and the appropriation of security devices
In examining the ways in which technology brings public and private actors together in the production and practice of security, co-production is a particularly useful concept. Like other popular science and technology studies-inspired approaches to security studies, such as actor-network theory (Bengtsson et al., 2019; Borg, 2021; Latour, 2005; Schouten, 2014; Toom, 2020) or intra-active materialization (Aradau, 2010; Barad, 2007; Glouftsios, 2020; O'Grady, 2021), co-production assumes that technologies have material agency, meaning they are not passive tools but active participants in the production of security. More than that, however, co-production highlights how technology not only produces social orders by exercising material agency, but is also socially constituted. This means that technology is both constituted by and constitutes social orders. What is co-produced in co-production, then, is science and technology, on the one hand, and society, politics and security, on the other. Thus, co-production stands out from other paradigms in science and technology studies by emphasizing the interplay between the discursive and the material dimensions of production: how technology is constructed discursively and simultaneously loops back to impact the very grounds for this construction (Elbe and Buckland-Merrett, 2019: 127; Jacobsen, 2015: 155; Jacobsen and Monsees, 2019: 26-31).
Through co-production processes, moreover, technologies connect different and sometimes heterogeneous knowledges and interests in the creation of new security practices. In particular, the use of new, advanced technology for the practice of security brings scientific, often private-sector, expertise to bear on security governance. For instance, Bellanova and De Goede (2021) argue that online terrorist content moderation occurs through public-private co-production, as the EU regulates but remains dependent on social media companies to counter radicalization and recruitment online. This case shows, they hold, how EU security policies 'work with and through private platforms' (Bellanova and De Goede, 2021: 1330), and how digital technologies function to connect public agencies with private companies. Similarly, Nolte and Westermeier (2020) demonstrate how the co-production of urban security infrastructures is dependent on the interaction between public and private-sector actors. Through an ethnographic study of an Israeli urban security conference, they show how the securitization of infrastructure draws on public as well as private-sector expertise, even to the extent that the distinction between the public and private sectors blurs. Finally, Martins and Jumbert (2022) show how the use of drones for border management is reliant on expert knowledge in the private sector. In this way, the securitization of migrants and the co-production of border security are animated by the diverse forms of cooperation established by public authorities and private entities. This article contributes to the co-production literature in security studies by further exploring how technology co-produces security in public/private spaces. Moreover, it focuses attention on the role of devices and in particular mobilizes the concept of appropriation as a concrete way of extending our understanding of public/private co-production.
Security devices are the equipment that security practitioners and professionals use to practise security. Since they have agentic capacities, devices are agents that actively - through interaction with human security actors - participate in fashioning security practices (Amicelle et al., 2015: 294). As such, security devices can be seen as co-productive in the sense that their use is simultaneously conditioned by a device's material agency and by security actors' interpretation and social constitution of the device. Moreover, devices are 'not just "things"' but 'techniques and instruments embedded in social practices' (Amicelle et al., 2015: 297). 'An analytics of devices' thus attunes us to how technologies act in security settings, specifically through the ways in which they are used instrumentally by security actors. This means that the performative security effects of devices and their intervention in processes of co-production are linked to specific human/device relations.
One important way in which security devices come into being is through appropriation. Appropriation happens when devices that are originally made for purposes other than security are repurposed to become security devices (Amicelle et al., 2015: 300-301; see also O'Grady, 2018: 111-112, 127-129). This creates space for a wide variety of artefacts to become security devices. Often, banal consumer goods are appropriated for security purposes (Amicelle et al., 2015: 301; see also Grove, 2015; Saugmann, 2020; Singh, 2015; Tanner and Meyer, 2015). However, appropriation of consumer goods for security purposes requires imaginative and innovative engagement with technology (Amicelle et al., 2015: 301-302; see also Chandler, 2018: 146-148, 164-165). Appropriation can thus have unintended and unexpected effects as it provides opportunities for the creation of new and original human/device constellations across sociotechnical environments (Amicelle et al., 2015: 294; see also Hayles, 2012: 89).
In this way, appropriation might lead to the rearrangement of relations not only between state and citizen, but also between different political authorities. Since the banal consumer goods that are often appropriated for security purposes are typically owned by private-sector companies, appropriation allows such companies to exert influence in the security domain. When practising security through appropriation, security governance must operate through and at the mercy of private-sector actors that might - by virtue of their control over consumer devices - come to act as intermediaries in the relationship between the state and its citizens. This can redistribute authority in security assemblages as it enables private-sector companies to complement and challenge the state in the practice of security (Amicelle et al., 2015: 302; De Goede and Westermeier, 2022; O'Grady, 2021: 240-241). Appropriation thus emerges as a kind of micro-political 'strategic game' in which private-sector actors can intervene in security governance and reconfigure hierarchies of power and authority in international security (Amicelle et al., 2015: 297).
Approaching co-production through appropriation therefore adds texture and depth to our understanding of how security is co-produced at the intersection of public and private-sector interests. Where previous security research on co-production has shown that technology produces security by connecting the heterogeneous knowledges of public and private actors, appropriation shows how such public/private interaction can occur through processes of technological transformation that enable private-sector actors to enforce their interests more strongly in the production of security. As security actors aim to appropriate human relationships with their consumer devices, security practices are conditioned by the technological affordances of consumer devices, but also by the ways in which the private-sector actors that control the devices make them available for appropriation.
Appropriation thus also nuances the notion of co-production by showing that the character and trajectory of public/private co-production depends on the kind of technology involved in the co-production process. By calling to our attention the repurposing of consumer technology, appropriation sensitizes co-production to the ways in which security practices exploit consumerist behaviour, and how the management of such exploitation is an effective way for private-sector actors to gain influence in the security domain. More broadly, appropriation also shows how the rapid and comprehensive digitalization and privatization of society, culture and everyday life contributes to the co-production of security, and how co-production of and by banal consumer goods can even enable forceful redistributions of political power and authority.
The next section discusses the smartphone as a prominent example of a consumer device that can be appropriated for security purposes, and how appropriation of the smartphone contributes to the co-production of security, especially by complicating conventional understandings of the relationship between liberty and security.

Smartphones, liberty and security
The smartphone enables the fashioning of new security practices by allowing security actors to mobilize the close human/machine relations that the device creates. By providing permanent internet access (Chambers, 2016; Vordener and Klimmt, 2020), and connecting especially intimately with, and arguably even extending, the human body (Chambers, 2016; Park and Kaye, 2018), the smartphone stands out from other comparable devices such as mobile phones, tablets or laptops. Indeed, it is arguable that the human/smartphone is an emergent hybrid entity that constitutes a new kind of subject for security practices to address (Markussen, 2022; Mazmanian, 2019).
In practical terms, the smartphone's hybridization with the human enables more intensive and intrusive surveillance through location and activity-tracking. Smartphones offer precise GPS signals that, combined with other data on habitual movement gathered from photo, map and social media apps, transmit information that can be triangulated in order to accurately pinpoint motion and even predict future behaviour. Since they follow us around, moreover, the sheer amount of data about locations and activities that smartphones transmit makes them treasure troves for surveillance actors, which can now access, map out and act on the smallest details as well as the largest patterns of individual and collective human behaviour (Chambers, 2016).
In this way, smartphones function as nodes in larger surveillance networks and assemblages, and are especially important for the operation of 'smart cities'. Smart cities is a term that refers to the ways in which densely populated areas are increasingly equipped with digital, often smart, devices that produce large amounts of location and activity data. Precisely because it functions as an extension of the body and follows us around, and thus generates more data than any other device, the smartphone becomes a central machine-to-machine gateway in city-like surveillance networks, in the sense that it functions as a hook-up point or sensor through which citizens are included in surveillance circuits (Kitchin, 2014; see also Wood and Steeves, 2019).
At first glance, the smartphone can be seen to co-produce security by enabling illiberal security policy. Traditionally, and especially in the context of the global 'war on terror', security research has highlighted how the securitization of external threats has legitimized intrusive and violent surveillance and intelligence practices, and also accounted for how the state of exception is normalized and made into a permanent political condition (Huysmans, 2014: 22; Neal, 2010: 8-10). In political discourse and mainstream security and surveillance scholarship, privacy is typically seen as the liberty or right that is at stake when the need for security legitimizes intensified surveillance (Abu-Laban, 2012: 420-422; Amoore, 2014: 108). By connecting closely with the human body and enabling more intrusive surveillance that erodes privacy, the smartphone could thus be argued to be complicit in the intensification of security at the expense of liberty. In particular, the smartphone's provision of near real-time access to locations and activities can be seen as a grave invasion of privacy because it allows public and private-sector actors alike to see and potentially exploit behavioural patterns and intimate details from people's everyday lives. When considering the smartphone as a security device, then, liberty and security can be understood as privacy and surveillance, and the seeming production of illiberal security policy appears in the guise of privacy-invading surveillance.
The smartphone's intervention in the co-production of security, however, is not so straightforward. One reason for this is that privacy is a complex and contested concept. In the study of surveillance, the particular meaning of privacy has been hotly debated and conceptualized in many different ways (Bennett, 2011; Gilliom, 2011; Lyon, 2015: 22, 98-102; Rule, 2012). For instance, privacy can be understood as inherently important or as a means to an end. It can also be understood as divisible, i.e. important for individuals, or as holistic, i.e. important for the collective (Rule, 2012: 65-66). Placing privacy in direct relation to liberty/security, moreover, critical security scholars have problematized the conventional equation of privacy with liberty. Notably, Amoore (2014: 108-109) stresses how contemporary security and surveillance practices divide the subject into bits of data to the extent that speaking of a 'data subject with a recognizable body of rights to privacy' makes little or no sense, while Salter (2019: 361) shows how surveillance technologies' level of precision impacts the extent to which they are perceived as privacy-preserving and accordingly whether their use is deemed to be legitimate. Moreover, De Goede (2012) makes the case that locating privacy on the liberty side of liberty/security is based on the false presumption that more data necessarily gives more security, and overlooks the fact that sacrificing privacy is itself a security risk. Hence, the specific meaning that is ascribed to privacy, and how exactly privacy as a right or liberty is understood to relate to security, matters for how privacy operates discursively to counter or legitimize security policy and practice (see also Bauman et al., 2014; Jacobsen, 2015: 157-158; Leese, 2014: 507; Schouten, 2014; Valkenburg and Van der Ploeg, 2015; Weitzel, 2018: 434).
Another reason why the smartphone's intervention in co-production processes requires a more nuanced understanding of the relationship between liberty and security is that the smartphone - through appropriation - provides opportunities for interests other than state security to condition security practices. Because it is a consumer device that has been appropriated for security purposes, the smartphone makes the state reliant on private-sector companies in its practice of security. As such, it also allows for the interests of private-sector companies to determine the character of security practices (see Amoore, 2013; De Goede, 2012; De Goede and Westermeier, 2022; Hoijtink, 2014; O'Grady, 2021). Given that private-sector companies are primarily driven by the imperative to maximize profit, use of the smartphone for surveillance purposes might not necessarily lead to an infringement of rights and liberties for the benefit of increased security. Instead, smartphone appropriation might bring new and different concerns to the fore in security decision-making and practice.
Although the smartphone certainly facilitates an intensification of surveillance and the invasion of privacy, the way in which the device's appropriation for surveillance purposes contributes to the co-production of security complicates conventional understandings of the relationship between liberty and security. Since smartphone surveillance is not necessarily antithetical to privacy, and is driven by commercial rather than security interests, the smartphone's intervention in co-production processes cannot be interpreted strictly as the privileging of security at the expense of liberty. Instead, smartphone appropriation intervenes in and contributes to the co-production of a complex and contested material/discursive terrain at the intersection between concerns for liberty and security, and public and private interests.
The next section examines the role of smartphone appropriation in co-production processes through an analysis of Covid-19 contact-tracing apps, and in particular the NHS app and its interaction with the Google/Apple interface. The case illustrates how smartphones can be appropriated for security purposes, and more generally how processes of public/private co-production and the relationship between liberty and security look when animated by the security appropriation of consumer devices.

Methodology and research design
The study focuses on the disagreements between British health authorities and Google/Apple over how best to conduct digital contact-tracing during the Covid-19 pandemic, and provides an account of the role the smartphone played in this struggle. In doing so, the analysis examines how the appropriation of the smartphone contributed to the co-production of security by (re)negotiating the relationship between liberty and security, and by rearranging security authority between the public and private sectors in the case of the NHS Covid-19 app. Approaching this problem through the concept of co-production, the analysis focuses on the interplay between the discursive and material levels of production. This means that the study examines how the technology was understood, interpreted and represented discursively, but also how the technology itself took part in and conditioned its own use and discursive representation (Jacobsen, 2015: 154; see also Aradau et al., 2014: 68).
Moreover, drawing on Mutlu's (2012) framework for studying how non-human actors influence security networks at different temporal stages, the analysis examines the co-production process in three steps of material/discursive interplay: emergence, continuity and transformation (Mutlu, 2012: 175). Emergence is the phase in which a device enters a security complex and plays a part in delineating a security problem. Continuity follows, during which a device's role in a security complex is stabilized, and where areas of contention in a security problem are temporarily settled. Finally, in the transformation phase, a device drives the regeneration of a security complex, system or network (Mutlu, 2012: 175-176). In emphasizing the temporal aspect of the discursive/material production of security, this approach attunes us to the ways in which technology impacts security networks in different manners over time. In this way, the framework also highlights the processual becoming of material objects and in particular 'the transformation of their purpose' (Mutlu, 2012: 179), which is especially pertinent when studying the appropriation of devices.
The empirical material analysed consists of official information and documents from the British health authorities and Google/Apple, media reporting and debate, and computer science research. The material from the health authorities and Google/Apple is analysed in order to capture the state's and Big Tech's understandings and representations of the technology, and how they balance concerns for privacy and public health, while the newspaper articles help us further understand the state's and Big Tech's positions, but also capture the public and political controversy surrounding the app's implementation. This material also provides insights into the functioning of the technology itself. The computer science research is analysed to provide a more in-depth understanding of the functioning of the technology. The material was collected through a systematic review of texts published on Google's and Apple's home pages and on British health authorities' various websites, as well as a thorough search and scan of news archives on the topic of the NHS Covid-19 app.
The analysis is divided into four parts. The first part provides some background and justification for the choice of case. The second, third and fourth examine the emergence, continuity and transformation, respectively, of smartphone/human relations in the security complex of the Covid-19 pandemic and, more specifically, the controversy surrounding digital contact-tracing.

Background: The Google/Apple Exposure Notification framework and NHS Covid-19
On 10 April 2020, tech giants Google and Apple announced that they had teamed up to help governments halt the spread of Covid-19, creating what has become popularly known as the Google/Apple Exposure Notification (GAEN) framework (Apple, 2020; Manthorpe, 2020a). The GAEN framework facilitated digital contact-tracing through Bluetooth signals. By constantly sending out signals, a smartphone running an app built on the framework would register when it was in close proximity to another, and the two (or more) phones would exchange anonymous identifiers. If one of the people carrying a phone with the app was infected with Covid-19, they could share that information with the local (typically national) app, which in turn alerted the people who had been in proximity with the infected person (Google/Apple, 2020a: 3; Kessibi et al., 2020: 3).
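The decentralized matching logic described above can be illustrated with a minimal Python sketch. This is not the production GAEN implementation: the `Phone` class and the HMAC-based identifier derivation are illustrative stand-ins (the actual specification derives identifiers from keys using HKDF and AES-128), but the flow is the same — phones broadcast ephemeral identifiers, store the ones they hear locally, and an infected user's voluntarily published keys let every other phone check for matches on the device itself.

```python
import os
import hmac
import hashlib

def derive_rpi(tek: bytes, interval: int) -> bytes:
    """Derive an ephemeral proximity identifier from a temporary exposure
    key and a time-interval number (simplified stand-in for the spec's
    HKDF + AES-128 construction)."""
    return hmac.new(tek, interval.to_bytes(4, "big"), hashlib.sha256).digest()[:16]

class Phone:
    """Hypothetical model of a GAEN-style device."""

    def __init__(self):
        self.tek = os.urandom(16)  # this device's temporary exposure key
        self.observed = set()      # identifiers heard over Bluetooth, stored locally

    def broadcast(self, interval: int) -> bytes:
        return derive_rpi(self.tek, interval)

    def hear(self, rpi: bytes) -> None:
        # Decentralized storage: the contact log never leaves the phone.
        self.observed.add(rpi)

    def check_exposure(self, published_teks, intervals) -> bool:
        """Re-derive identifiers from keys published by infected users
        and compare them against the local contact log."""
        for tek in published_teks:
            for i in intervals:
                if derive_rpi(tek, i) in self.observed:
                    return True
        return False

# Two phones meet during time-interval 42; one owner later tests positive
# and consents to publishing their key. The other phone matches locally.
alice, bob = Phone(), Phone()
bob.hear(alice.broadcast(42))
print(bob.check_exposure([alice.tek], range(40, 50)))      # True
print(bob.check_exposure([os.urandom(16)], range(40, 50))) # False (no match expected)
```

Note that the health authority in this model only ever relays published keys; it never sees who was near whom, which is precisely the design property the decentralized approach was praised for.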
The NHS Covid-19 app is one example of a local app that was developed within the GAEN framework. The road leading to its implementation, however, was rocky. Having first developed its own app outside the framework, the NHS was eventually forced to develop a new app within it (Sabbagh and Hern, 2020). Although many other local apps, such as Norway's Smittestopp, Australia's Covidsafe and Germany's Corona-Warn, were also initially developed outside the framework and later compelled to comply with it, the British case stands out in the sense that British health authorities were particularly reluctant to move their operation into the framework. Furthermore, they continued to test the limits of the GAEN framework after complying with it. As such, the British case reveals how a GAEN-based app can function, but also sheds light on the contentious politics of its development and implementation.
As a case of how security is practised under exceptional political circumstances, moreover, the Covid-19 pandemic, and more specifically the implementation of the NHS Covid-19 app, is well suited for an exploration of how security appropriations of the smartphone negotiate the relationship between liberty and security. Given that the virus constituted an existential threat to the health and lives of entire populations, it would be reasonable to expect it to have triggered and legitimized exceptional securitizing measures, for instance in the form of intensified surveillance (Kirk and McDonald, 2021). On the other hand, the way in which smartphone appropriation provided opportunities for private-sector actors to exert influence in the security domain in this case might suggest that the emergence of exceptional circumstances does not necessarily lead to a liberty/security trade-off that goes strictly in favour of security (see Flood et al., 2020; Ilves, 2020; Kitchin, 2020).

Emergence: Negotiating the meaning of privacy
The emergence of the smartphone in the Covid-19 security complex was characterized by the negotiation of a trade-off between precision and privacy, as increased precision of contact-tracing was seen to compromise privacy. When the GAEN framework was first announced by Google/Apple, it was overwhelmingly received as privacy-preserving. This was emphasized in the information from the Google/Apple team itself, which wrote in an early press release that 'user privacy and security' were 'central to the design' (Apple, 2020). Moreover, media reporting routinely highlighted that the GAEN framework was privacy-preserving by contrasting it with the early British version of the app, which was considered more privacy-invading (Hern, 2020a).
Three main features were highlighted as privacy-preserving, which seems to have contributed strongly to establishing the meaning of privacy in this context. First, there was the use of Bluetooth signals to measure proximity. Each smartphone sent out Bluetooth signals, and their strength, as received by other smartphones, indicated proximity to those other smartphones (Hoepman, 2021: 2; Shen et al., 2020: 1-2). Unlike GPS, Bluetooth cannot pinpoint the exact location and movement of a smartphone (Criddle and Kelion, 2020; Shen et al., 2020: 1). It was therefore criticized for being imprecise in its proximity measurement because it relied on signal strength alone for information (Briers, 2020b; Kelion, 2020b). Second, when Bluetooth signals were sent out to 'shake hands' (Kelion, 2020b) with signals sent from other smartphones, they carried so-called rolling (or ephemeral) proximity identifiers (RPIs) derived from lists of temporary exposure keys (TEKs), a feature that enhanced privacy by ensuring user anonymity. The RPIs were generated anew every 10-20 minutes and the TEKs every 24 hours (Hoepman, 2021: 6-7; Google/Apple, 2020c). Third, there was decentralized storage, as the RPIs and registered handshakes were stored locally on the smartphones that received them. By preventing centralized storage, which would have sent the identifiers to the public health authorities, this approach ensured that governments could not access contact-tracing data unless an infected person voluntarily shared this information through a national app (Google/Apple, 2020a, 2020b; Hoepman, 2021: 3; Shen et al., 2020: 2).
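The identifier scheme can be sketched in code. The following is a simplified illustration, not the GAEN specification itself: the real protocol derives an RPI key from the TEK via HKDF and encrypts the interval number with AES-128, whereas this sketch substitutes truncated HMAC-SHA256 so that it runs with the Python standard library alone, and all function names are hypothetical.

```python
import hashlib
import hmac
import os

EK_ROLLING_PERIOD = 144  # one day's worth of 10-minute intervals


def en_interval_number(unix_ts: int) -> int:
    # 10-minute interval index since the Unix epoch
    return unix_ts // 600


def new_tek() -> bytes:
    # Temporary exposure key: fresh random bytes, regenerated every 24 hours
    return os.urandom(16)


def rpi_for_interval(tek: bytes, interval: int) -> bytes:
    # Rolling proximity identifier for one 10-minute interval.
    # Simplification: the actual spec uses HKDF plus AES-128; truncated
    # HMAC-SHA256 stands in here to keep the sketch self-contained.
    msg = b"EN-RPI" + interval.to_bytes(4, "little")
    return hmac.new(tek, msg, hashlib.sha256).digest()[:16]


tek = new_tek()
i0 = en_interval_number(1_600_000_000)
rpis = {rpi_for_interval(tek, i0 + k) for k in range(EK_ROLLING_PERIOD)}
# A phone broadcasts a different RPI each interval; observers cannot link
# them without the TEK, which is shared only upon a positive diagnosis.
assert len(rpis) == EK_ROLLING_PERIOD
```

The privacy property the article describes follows from this structure: only the daily TEKs of diagnosed users are ever published, and every other phone re-derives the corresponding RPIs locally to check for matches.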
Despite some early speculation that GPS tracking might be used (Kelion, 2020a), and despite criticism that the system might allow for identification even if identifiers were rolling and exposure keys temporary (Kessibi et al., 2020), the NHS quickly opted for Bluetooth signalling and ephemeral identification in its initial version of the app. By offering these solutions as alternatives to more invasive location-tracking technologies, the smartphone itself provided an initial definition of privacy in the context of contact-tracing as a pandemic response. Because the Bluetooth and ephemerality baseline was so swiftly established as the privacy option in a contact-tracing context, GPS tracking and active identification, which could have been helpful for public health purposes, such as for analysis of the social graph and movements in the population (Levy, 2020; Mason, 2020), were deemed inappropriate (Levy and H, 2020).
The decentralization feature, on the other hand, was far more contested, as the initial NHS app took a centralized approach in direct opposition to GAEN standards, proposing to store RPIs with the NHS instead of locally on individual phones (Hern, 2020b). As it developed its app, in close cooperation with the Alan Turing Institute and the National Cyber Security Centre (NCSC), the NHS was aware of the privacy implications of opting for centralized storage. Nonetheless, it insisted that this approach was sufficiently privacy-preserving, and that centralization was necessary from a public health perspective, as it would provide access to the social graph and allow analysis of aggregate data to improve both the functionality of the app and the pandemic response more generally (Martin, 2020). Tellingly, the NCSC insisted in early May 2020 that the centralized approach it was developing was privacy-preserving because it would ensure anonymity in the collected data. Moreover, it argued that a decentralized model would be pointless, as it would not assist the authorities' wider pandemic response if analysis of aggregate data was not possible. By arguing that the 'design of the app must be led by the epidemiological needs of [the] response', and that 'the security and privacy models have to fit around that' (Levy, 2020), the NCSC even explicitly prioritized public health over privacy protection.
Thus, the emergence of the smartphone in the Covid-19 security complex, and more specifically its appropriation for digital contact-tracing, allowed for the construction of different privacy standards. Google/Apple insisted that a sufficiently privacy-preserving approach would have to involve decentralized storage in addition to Bluetooth-tracking and ephemeral identification. The British health authorities, on the other hand, argued that centralized storage would still preserve privacy when combined with Bluetooth-tracking and ephemeral identification. Moreover, the disagreement over privacy standards is an indication that Google/Apple and the UK had different interests in using digital contact-tracing to tackle the spread of the virus. Google/Apple argued for the need to establish a privacy-preserving framework for app development and then find effective ways to halt the spread of the virus within that framework. The British health authorities, on the other hand, argued for the need first to build an effective app and only then to ensure that it preserved privacy as much as possible. This indicates that Google/Apple's primary interest was to preserve the privacy of smartphone users, while the state's primary interest was to save the lives of its citizens.

Continuity: Establishing the meaning of privacy
However, the different understandings of privacy quickly converged. Maintaining that any centralized approach would be too invasive, Google and Apple, which run the overwhelming majority of smartphones through their respective operating systems, Android and iOS, refused to support the initial NHS app. By pushing the NHS app to the background when phones were not in use, they made it difficult for the app to detect Bluetooth signals and identify contacts. Tests suggested that an app within the GAEN framework would be able to detect 99% of smartphones, while the NHS app outside the framework detected only around 4% of iOS phones and 75% of Android phones. As a result, the NHS was forced to abandon the app and comply with the GAEN framework (Kelion, 2020c).
This move was largely understood as a case of the UK's being forced to bow to the will of the Big Tech companies (see, for example, Sabbagh and Hern, 2020). UK Health Secretary Matt Hancock accused Apple of 'intransigence' and argued that 'our app won't work because Apple won't change their system' (quoted in McGuiness, 2020). Expressing similar frustration, founder of the Government Digital Service Tom Loosemore complained that 'Google and Apple have given governments an abacus in an era of machine learning' (quoted in Manthorpe, 2020b).
At the same time, however, the British health authorities also framed their sudden compliance with the GAEN framework as a joining of forces, and thus represented the relationship between the UK and Google/Apple as collaborative and symmetrical. The Department of Health and Social Care (DHSC), for instance, seemed happy with the GAEN framework's privacy regulations once they had been adopted by the NHS (DHSC, 2020a; 2021: 22-23; Sidhu and Appleton, 2020). Moreover, the NCSC, which had previously explicitly prioritized public health over privacy protection, suddenly changed its tune. Now, following the move to comply with the GAEN framework, the NCSC stated that the 'app is designed to protect privacy while providing useful information' and that it must 'minimise data collection to what's necessary to support the app's declared functions' (Levy and H, 2020).
Around the time of the launch of the new NHS Covid-19 app, however, and long after the British authorities had decided to comply, Google/Apple also launched an update of its Exposure Notification Express (ENX) interface, which allowed the health authorities to aggregate and analyse some anonymized data, primarily based on notification numbers by region and over time (Briers, 2020a; DHSC, 2021). Although this update amounted to a slight centralization of approach, it was still far from the fully centralized approach that the NHS first opted for, and so cannot be understood as a concession on Google/Apple's part (Briers et al., 2021; Hern, 2020b; Levy and H, 2020).
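A minimal sketch can illustrate what such aggregate analysis permits. The records and region names below are hypothetical: the point is that ENX-style analytics expose notification counts over place and time, but no identifiers from which individual contacts or movements could be reconstructed.

```python
from collections import Counter
from datetime import date

# Hypothetical anonymized records: one (region, day) tuple per exposure
# notification shown, with no user identifiers attached.
notifications = [
    ("South East", date(2020, 10, 5)),
    ("South East", date(2020, 10, 5)),
    ("London", date(2020, 10, 5)),
    ("London", date(2020, 10, 6)),
]

# Aggregation by region and day: enough for trend analysis, but not for
# reconstructing any individual's contact history.
counts = Counter(notifications)
counts[("South East", date(2020, 10, 5))]  # returns 2
```

This is the kind of coarse, aggregate signal the article contrasts with the social-graph analysis that the fully centralized NHS approach would have enabled.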
This shows how continuity in human/device relations through appropriations of the smartphone enabled Big Tech companies to determine the meaning of privacy and thus establish the possible conditions for the practice of security. As owners and controllers of smartphone technology, Google/Apple could decide what kind of surveillance would be sufficiently privacy-preserving and even prevent the state from taking a different position. The active obstruction of the British state's attempts to implement an app outside the GAEN framework demonstrates how ownership and control of the technology the state needed to practise security enabled Big Tech companies to steer the co-production of security knowledge. By equating privacy with Bluetooth signalling and, more consequentially, decentralized storage, and subsequently forcing the state to adopt the same position to comply with the framework, Google/Apple mobilized the state's reliance on smartphone appropriation to decide what privacy should mean and how security should be practised.
In this way, the case also confirms the opposing interests of Google/Apple and the state discussed above, by showing how interests other than conventional state security can enforce the privileging of privacy at the expense of public health. Indeed, both Google/Apple and the state demonstrated an interest in saving lives and preserving privacy. Google/Apple illustratively wrote in one early press release that, in the 'spirit of collaboration', it would 'enable the use of Bluetooth technology to help governments and health agencies reduce the spread of the virus' (Apple, 2020), while the NHS stated that 'everyone is in agreement that user privacy is paramount' (Hern, 2020b). As this analysis has shown, however, smartphone appropriation became a way for Google/Apple to decide how concerns about privacy and public health should be balanced. By forcing the NHS to comply with its framework, Google/Apple used smartphone appropriation not only to establish the meaning of privacy in the context of contact-tracing, but also to co-produce security in this context as subject to fairly strict concerns about the civil liberty of privacy. This is striking, given the exceptional political circumstances of the pandemic, which might normally be expected to foster intensified surveillance at the expense of civil liberties.

Transformation: Practising security through privacy
As outlined above, the NHS Covid-19 app was developed within the GAEN framework, and therefore adopted Google/Apple's method: Bluetooth signals from separate phones shook hands, were stored ephemerally and locally, and alerted users in case of infection and proximity. Within this framework, however, the British version of the app did develop its own distinctive functionality. First and foremost, the NHS Covid-19 app was programmed to alert people when they (or, more precisely, their phones) had been in 'close contact', which generally meant within two metres for more than 15 minutes (NHS, 2021a, 2021b). This was a public health consideration that did not have any privacy implications, and so although the GAEN framework provided the algorithm that combined distance and time with infectiousness to determine occurrences of 'close contact', the interface placed no direct limitations on government decisions in this regard (Briers, 2020a; DHSC, 2021: 25). Thus, the NHS decision on when to trigger alerts was a first manifestation of the national health authorities' ability to leave a mark on digital contact-tracing as a security practice through processes of appropriation, even when cooperating with Google/Apple and limited by the technological solutions it provided.
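The alerting rule can be illustrated with a small sketch. This is a simplification under stated assumptions, not the NHS implementation: real GAEN risk scoring works on Bluetooth attenuation buckets and configurable weights, whereas here distance estimates and durations are taken as given, and all names and thresholds are hypothetical stand-ins for the 'two metres, 15 minutes' rule described above.

```python
from dataclasses import dataclass


@dataclass
class ExposureWindow:
    # One run of handshakes with a single (anonymous) contact.
    est_distance_m: float  # inferred from Bluetooth signal attenuation
    duration_min: float


# Hypothetical thresholds mirroring the NHS 'close contact' rule:
# within two metres for more than 15 minutes in total.
DISTANCE_M = 2.0
MINUTES = 15.0


def is_close_contact(windows: list[ExposureWindow]) -> bool:
    # Sum time spent within the distance threshold across all windows
    # and trigger an alert only when the total exceeds the time threshold.
    total = sum(w.duration_min for w in windows if w.est_distance_m <= DISTANCE_M)
    return total > MINUTES
```

Because the GAEN interface leaves these thresholds to the national authority, changing the constants above is exactly the kind of decision the article identifies as the state's room for manoeuvre within the framework.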
On top of providing its own standards for when to alert, the app ran two unique contact-tracing features: local area alerts and QR venue check-ins. The local area alerts feature allowed the app to alert users if their local area had become a high-risk area for spread of the virus. When registering on the app, users were asked to provide the first part of their postcode, and would receive a notification if the area they lived in became 'high risk' (DHSC, 2020a). The QR venue check-in feature allowed app users to scan a QR code with their phone when they entered a bar, gym, shop and so on, so they could be alerted if there was a chance that they had been in the same venue at the same time as someone infected with the virus (DHSC, 2020a, 2020b). Just as with the definition of 'close contact', it was the national health authorities that decided when an area was to be deemed high risk or when a venue should be considered a hotspot for infection (Hicks et al., 2020; Kelion, 2021a).
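How such on-device matching might work can be sketched as follows, with hypothetical venue identifiers and risk windows. The design point is that the check-in history stays on the phone: the app downloads a published list of risky venues and time windows and performs the comparison locally.

```python
from datetime import datetime

# Local, on-device log of QR check-ins: (venue_id, check_in_time).
checkins = [
    ("venue:cafe-42", datetime(2020, 10, 1, 12, 5)),
    ("venue:gym-7", datetime(2020, 10, 2, 18, 30)),
]

# Hypothetical published list of risky windows, as a health authority
# might distribute it: venue_id -> (window_start, window_end).
risky_windows = {
    "venue:cafe-42": (datetime(2020, 10, 1, 11, 0), datetime(2020, 10, 1, 13, 0)),
}


def venues_to_alert(checkins, risky_windows):
    # Matching happens on the phone, so the venue history never leaves it.
    hits = []
    for venue, t in checkins:
        window = risky_windows.get(venue)
        if window and window[0] <= t <= window[1]:
            hits.append(venue)
    return hits


venues_to_alert(checkins, risky_windows)  # returns ["venue:cafe-42"]
```

The later dispute over uploading the entire venue history to a central server, discussed below, turned precisely on abandoning this local-matching arrangement.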
By introducing local area alerts and QR venue check-ins, features that are unique to the NHS Covid-19 app, the NHS was innovative in shaping how the smartphone could function as a security device in an active process of appropriation. These features also show how the technology itself conditioned appropriation. Development of the alert system in general, but especially the QR venue check-in feature, which relied on pocket presence and self-extension, was enabled by the smartphone's unique and intimate connection to the human body and thus also by its continuing attractiveness as a consumer device.
However, the strict compliance of these features with the GAEN framework's privacy standards also demonstrates the limited nature of such appropriation. It was only when public health policy was in accordance with the framework's privacy regulations that the state could innovate in this case. Even if unique to the NHS Covid-19 app, local area alerts and QR venue check-ins did little to provide aggregate data that could be used for analysis of the social graph. The UK's failed attempt to recentralize its approach demonstrates this further, as it highlights not only what the state could do within these parameters, but also what it could not do. As part of its plan to ease lockdown restrictions in the spring of 2021, the NHS tried to update the app to allow users to consent to their entire venue history, based on check-ins, being automatically uploaded to a central server in the event of their becoming infected. The DHSC insisted that the move would be privacy-preserving. Google/Apple deemed it too intrusive, however, not only on the grounds of centralization but also because it introduced a form of location-tracking to the app, and promptly blocked the update (Hern, 2021; Kelion, 202).
In this way, it appears that the distinct features of the NHS Covid-19 app contributed to the co-production and even transformation of security by further consolidating the meaning of privacy. Enacting GAEN-style contact-tracing, local area alerts and QR venue check-ins confirmed that privacy protection meant Bluetooth-tracking, ephemeral identification and decentralized storage. Attempts by the NHS to recentralize data storage were a bid to renegotiate the initially established meaning of privacy; this ultimately had little effect due to Google/Apple's swift rejection of the update. Moreover, the practical appropriations that operated in compliance with the GAEN framework show how establishing the meaning of privacy as a form of security knowledge conditioned the practice of security. Given the specific meaning that was ascribed to privacy, in which decentralized storage was the most important element, only practices that complied with and further enacted that prescription could be considered sufficiently privacy-preserving and thus seen as legitimate security policy.
This further demonstrates how smartphone appropriation enables private-sector actors to enter and exert influence in the security domain. Such actors not only participate in the collaborative construction of security expertise, but also contribute to deciding how the state can practise security. The case analysed here is particularly striking since it shows that appropriation can enable Big Tech companies to compel the state to adopt liberal security policies in the face of exceptional political circumstances. It is clear from the above analysis that the UK health authorities wanted to implement more intrusive and comprehensive surveillance measures. Having become reliant on commercially owned technology through appropriation, however, they were not allowed to do so.

Conclusion
This article has examined the smartphone as a security device and explored how this device contributes to the co-production of security. It has done so by approaching co-production processes as practices of appropriation, and by analysing the development and implementation of the NHS Covid-19 app. The science and technology studies concept of co-production has been mobilized to great effect in security studies to illuminate the ways in which public and private-sector actors come together in the production of security. I argue in this article, however, that approaching co-production through appropriation, the repurposing of consumer devices for security, can extend our understanding of public/private co-production by highlighting how security practices are not only enabled by the technological expertise residing in the private sector, but often actively conditioned by private-sector actors. These actors own the consumer devices that are appropriated for security and as such control the process by which consumer devices become security devices. Appropriations of the smartphone mobilize the device's extension of the body and access to location and activity data. While this provides space for an intensification of surveillance, especially in exceptional times, it also provides opportunities for commercial rather than security interests to dominate security practices, even in exceptional times.
The analysis shows that, in the case of the NHS Covid-19 app, security appropriation of the smartphone enabled the Google/Apple fusion to decide how the state should practise security through digital contact-tracing. The UK health authorities tried to develop an app outside the GAEN framework using their own interface, but were quickly compelled to comply with the framework. This can be explained by Google/Apple's forceful construction of privacy as Bluetooth-tracking, ephemeral identification and, most importantly, decentralized storage, which had the effect of representing the state's centralized approach as weak on privacy. It can also be explained by the fact that Google/Apple owned and controlled the operating systems that ran on most of the phones to which the state needed access for contact-tracing purposes. This allowed the companies to actively sabotage the functioning of the UK's independent app.
The smartphone contributed to the co-production of security in this context by enabling the privileging of concerns about privacy at the expense of public health, and by offering an effective way for Big Tech companies to enter and gain influence in the security domain. This is especially striking since the Covid-19 pandemic was an exceptional political situation that might be expected to foster illiberal security practices, for instance by sacrificing privacy at the altar of public health. In many cases and places, responses to the pandemic did indeed come in the shape of illiberal security policy (see, for example, Greitens, 2020; Hajnal et al., 2021; Mukherji, 2020). In the case of digital contact-tracing, however, commercial interests came to the fore in security decision-making, which had the effect of co-producing a liberal security politics even under exceptional political circumstances.
Although Google/Apple offered a framework that was privacy-enhancing compared to the approach that the state initially took, a case can be made that the GAEN framework harbours the potential to foster illiberal practices. The framework arguably constitutes a 'dormant mass surveillance tool' (Hoepman, 2021: 11) in the sense that it is technology that can be used to control and shape future pandemics or other crises and social upheavals. Even if individual data does not leave the individual phone and cannot be used to identify individuals, Google and Apple can access aggregated microdata stored in their operating systems and use this to analyse, shape and control communities and populations (Hoepman, 2021: 9-10, 12; Veale, 2020). The framework is privacy-enhancing in the sense that it anonymizes data but can nonetheless be a powerful surveillance tool (see also Dubal, 2020; Thomas, 2020).
This can help explain why Google/Apple opted for a contact-tracing model that restricted their data access, counter to their normal modus operandi of maximising data access to maximise profits. By taking social responsibility and building credibility as a guarantor of privacy early in the pandemic, at a time when Big Tech companies were facing intense scrutiny over privacy issues, Google and Apple performed the role of serious, trustworthy and legitimate political actors. This allowed the companies to continue aiding state authorities in the provision of public service goods, and thus to keep harvesting personal data going forward (see Braddock, 2022). In this way, the privacy focus that dominated debates about Google/Apple's involvement in the global pandemic response even worked as a distraction from the wider transgression whereby Google/Apple increased their authority in the security domain (Nature Machine Intelligence, 2022; Sharon, 2021).
Looking ahead, then, there is a possibility that Google/Apple could further consolidate power in the security domain through so-called function creep, which 'refers to the tendency for a project to exceed its original purpose' (Jacobsen, 2015: 156) and can have unintended and potentially harmful effects even when successful. From this perspective, the co-production of a liberal security politics enabled by smartphone appropriation, as seen in the case of the NHS Covid-19 app, might entail a longer-term intensification of corporate and state surveillance. In this way, the present study indicates that Big Tech companies' future security enactments are likely to be done in the name of privacy and thus to operate through the principle of liberty.