“Talk to strangers!” Omegle and the political economy of technology-facilitated child sexual exploitation

This article examines how technology-facilitated child sexual exploitation has flourished within the laissez-faire regulatory frameworks of neoliberalism, and argues that political economy should play a more central role in theorising about child sexual abuse. Drawing on the case study of Omegle, a livestreaming website that matches strangers via webcam, the paper illustrates how deregulatory trends have produced an alignment between the sexual interests of child sexual abusers and the economic interests of some online service providers. The paper suggests that intersecting political ideologies and economic structures have increased opportunities for child sexual exploitation and decreased formal and informal controls, while recruiting paedophilic desires and exploitative subjectivities within processes of capital accumulation. The paper explores the implications of political economy for theories of child sex offending, which have typically focused on the psychological, social and legal dimensions of child sexual abuse while overlooking the role of capitalist structures and imperatives.


Introduction
This article addresses the link between political economy and technology-facilitated child sexual exploitation, and examines how the cultural force and economic potential of paedophilic desire has been mobilised by a largely unregulated technology sector. The last quarter century has been characterised by annual increases in reports of technology-facilitated child sexual exploitation, well beyond the capacity of law enforcement and online safety agencies to address (Bursztein et al., 2019). Technology-facilitated child sexual exploitation includes a range of offences in which networked and camera technologies facilitate the sexual abuse of a child, including online grooming, the live streaming of abuse, sexual extortion, and the distribution and consumption of child sexual abuse material. A recent nationally representative survey of young American adults found that 15.6% reported at least one incident of online child sexual abuse, such as the non-consensual distribution of a sexual image of them as a child (Finkelhor et al., 2022). Surveys that use broader definitions of online sexual harm, including online sexual solicitation by a stranger, suggest that the majority of children will experience some form of technology-facilitated sexual abuse before the age of 18 (WeProtect, 2021).
There is increasing recognition of the ways in which online services are disincentivised to intervene in online abuse and harassment, since the resultant spikes in use, traffic and attention are profitable (Salter, 2018). However, recent media coverage and civil suits have advanced a darker prospect, namely that sexualised content depicting children is being actively promoted and facilitated by online platforms. For example, the popular adult site Pornhub has been accused of hosting videos of children being sexually abused that are also monetised through advertising, with victims complaining that the site has not responded expeditiously to requests for removal (Kristoff, 2020). Pornhub's parent company, Mindgeek, is now being sued by dozens of people who allege that it has profited from videos of their sexual abuse and exploitation (Iati, 2021). Twitter, the influential microblogging platform, has faced years of accusations that it has enabled the circulation of child sexual abuse material while permitting communities of child sex offenders to communicate through its service (Salter & Hanson, 2021).
These examples suggest that theoretical conceptualisations of technology-facilitated child sexual exploitation need to take seriously the role of the technology sector, and its embedded values and practices, in facilitating the abuse of children. To explore this point, this paper offers a case study of the live-streaming platform Omegle. Live streaming services feature in a range of patterns of online exploitation. The most well studied has been the live streaming of child sexual abuse from low-income countries (particularly in South-East Asia) into high-income countries for profit (Napier et al., 2021); however, live streaming is also used to lure children into sexual acts online (which are often recorded) and to broadcast child sexual abuse material to other offenders (WeProtect, 2021). A number of child suicides have been linked to sexual extortion facilitated by live streaming applications (Nilsson et al., 2019).
Launched in 2009, Omegle offers few features beyond text or video chat with randomly paired strangers. After the site declined in popularity throughout the 2010s, its "retro" appeal and the novelty of its random video chat caught the attention of influencers on the social media platform TikTok during the COVID-19 pandemic (Lorenz, 2020). In 2021, the BBC reported that videos tagged with "Omegle" on TikTok had garnered 9.4 billion views, driving up to 65 million visitors to Omegle per month (Tidy, 2021). This explosion in popularity led police forces in a range of jurisdictions, including the UK, US, Norway, France, Canada and Australia, to issue warnings about the risk of child sexual abuse on the site (Tidy, 2021). After reviewing the implications of new technologies for theories of child sex offending, the paper presents a "walkthrough" analysis (Light et al., 2018) of Omegle, documenting key features of the site's design, administration and user experience. The analysis focuses on the contradictory vision and governance of the site, in which a superficial presentation of Omegle as a "great way to meet new friends" glosses over a sexualised and unmonitored environment that includes a significant number of child sex offenders. At the time of analysis, pornographic advertisements targeting adult men were Omegle's only apparent source of income, and yet the site encouraged use by children aged 13 and above.
This paper reflects on the ways in which the economic imperatives built into the contemporary internet, unopposed by state enforcement of the public good, can activate paedophilic desire for financial gain and transmute children's bodies into high-demand and profitable sexual opportunities. Recent shifts in Omegle's operations and business model appear to have been prompted by public pressure and mounting civil suits, underscoring the regulatory vacuum and the failure of governments to endow themselves with the power to ensure that online services are not catering, inadvertently or otherwise, to sexual interest in children.

Political economy and child sexual exploitation
Theories of child sex offending are diverse, ranging from psychological and psychosocial scholarship on the characteristics and motivations of perpetrators (Seto, 2019), through rational choice and routine activity theories focused on the contexts of child sex offending (Brown, 2022) and sociological analyses of the social, symbolic and legal constructions of child sexual abuse (Pilgrim, 2018), to feminist accounts of the role of gendered norms, structures and practices in sexual offences against children (Cossins, 2000). While not discounting or supplanting these theoretical perspectives, recent scholarship has highlighted the role that political economy (that is, the role of markets and the state) plays in shaping the contexts of, opportunities for, and responses to child sexual abuse. For example, in her analysis of child protection history in the United Kingdom, Campbell (2023) argued that the rise of neoliberal politics and government austerity in the 1980s resulted in underfunded welfare and health services that struggled to accommodate the increased number and complexity of child sexual abuse reports. Missteps in the investigation and prosecution of child sexual exploitation during this period were linked to a lack of adequate resources and support infrastructure as a result of government austerity (Campbell, 2023). Similarly, Cheit's (2014) analysis of criminal justice investigations and media coverage of major child sexual exploitation cases in the United States in the 1980s and 1990s points to the influence of government investment in child protection responses. Like Campbell (2023), Cheit (2014) identifies a lack of training and resources in the difficult trajectory of sexual exploitation cases through the courts during this period. Their work points to the intersecting economic, political and cultural forces at work in shaping understandings of, and responses to, child sexual abuse.
In his account of the relevance of political economy to criminology, Reiner (2018) notes that criminal opportunity, as well as formal and informal controls, are all shaped by political and economic forces. Such considerations have become particularly acute with the rise of the internet and the technological facilitation of child sexual abuse. The sociotechnical apparatus of the internet has enabled and expanded opportunities for child sexual abuse, but this apparatus, and the technology sector that administers it, are reflective of the economic and political conditions of their emergence (Keen et al., 2020). It was during the mid-1990s, at a time of neoliberal ascendancy, that the internet was commercialised. Within the regulatory zeitgeist of the time, legislators were eager to unburden the fledgling technology sector of any "red tape" that might hold back innovation and economic growth (Kosseff, 2019). This was a period in which the dominant media framing of child sexual abuse in the United States and United Kingdom was via the lens of "false allegations" and "moral panic", legitimised by extensive academic commentary (Beckett, 1996; Kitzinger, 2004). It was with some shock that Jenkins (2001), who had previously claimed that online and offline child sexual exploitation allegations were the product of moral panic, began to study online child sexual exploitation in the late 1990s only to find that the problem was far more serious than he had anticipated. He acknowledged that he embarked on this study expecting to debunk claims of widespread online child sexual abuse, and was shocked by the severity of the content and the openness with which it was traded (Jenkins, 2001). He would later go on to ask why the problem of child sexual abuse material had not detonated a "virtuous" moral panic in the form of public concern and demand for more decisive intervention (Jenkins, 2009).
This transition in Jenkins's thought is reflective not only of the prevalence and visibility of child sexual offending online but also of the struggles of culturalist theorists to explain or account for ongoing increases in online offences against children, and the apparent passivity of government, industry and the public in the face of these increases. After all, the foundational assumption of moral panic theory is that the public and authorities tend towards over-reaction to evidence of child sex offending (Pilgrim, 2018). A more politically and economically inflected view was detailed by Haug (2001) in her prescient discussion of the impacts of neoliberalism on child sexual exploitation, in which she warned that "the letting-go of ethically based preventative measures designed to regulate market forces" would provide private sector "entrepreneurs" with opportunities to profit from "child pornography and prostitution" (p. 77). In a significant break from moral panic analysis, Haug (2001) pointed to major child sexual exploitation cases in Europe in the 1990s not as evidence of social hysteria but as indicative of macro-economic trends that were turning the "whole world into a warehouse of raw materials for the benefit of investment-ready capital" (p. 77). In short, she warned that the ascendancy of global free market liberalism in the 1990s, very much symbolised and affected by the commercialisation of the internet, would have the effect of disembedding market forces from legal frameworks that had prevented the open integration of paedophilic desire into the domain of capital accumulation. This description is remarkably apt in light of the capacious commodification of online human interaction that constitutes the underlying business model of social media companies and internet monopolies (Salter, 2017). The opportunity to abuse children and access child sexual abuse material is highly appealing to a significant market segment that, in the absence of regulatory forces,
social media companies and internet service providers have no incentive to disrupt (Salter & Richardson, 2021). The next section examines the integration of paedophilic desire into online capital accumulation with specific reference to the website Omegle.

Child sexual exploitation on Omegle
Omegle is a live streaming website that was launched in 2009 by then 18-year-old American Leif K-Brooks, initially as a text-only service that paired random users for a private chat (K-Brooks, 2009). The site saw rapid growth amongst young people, with 150,000 daily users within a month of launch (Quenqua, 2009). Even in its early text-based iteration, K-Brooks identified abuse and harassment as the "biggest problem" on the site (K-Brooks, 2009). At the time, journalist Jason Tanz commented on the likelihood that it would be misused by child sex offenders. Quoted in the New York Times, he said: "The first person I connected with said, 'let's have cyber [cybersex] right now' … The second was a 14-year-old kid from London. It's not hard to see how this is going to be a problem" (Quenqua, 2009). Despite these warnings, Omegle incorporated a video chat feature in 2010 which paired strangers using their webcams. This new feature led to a dramatic increase in the popularity of the site (Bolton, 2010). Following complaints about nudity and sexualised behaviour, Omegle created "moderated" and "unmoderated" sections in 2013. The ostensible aim of the "moderated" stream was to reduce nudity, harassment and sexual content on the site and to protect users under the age of 18 from exploitation and abuse.
Omegle has always been popular with children. A survey of 1,333 college-aged Americans (median age 20) found that 17% reported using an anonymous chat or video service such as Omegle while a minor (Greene-Colozzi et al., 2020). Despite its stated purpose to facilitate friendly social interaction between strangers, Omegle has become notorious for child sexual exploitation (Citron & Wittes, 2017). A recent study documented anonymous Omegle users acknowledging that they had used the site to watch child sexual abuse material and solicit children for online sexual abuse (Demetis, 2020). International alarm at rates of sexual exploitation on Omegle during COVID-19 led to a formal investigation by multiple United Nations Rapporteurs, who reported that their cyber investigation team was connected through Omegle with children engaged in sexual acts, including "young prepubescent boys masturbating live on the video chat" (Singhateh et al., 2021).
In June 2021, the authors undertook a "walkthrough" analysis of Omegle to explore the underlying ethos and values of the site, as well as to identify those design features that may be contributing to criminal sexual offences against children. The walkthrough methodology provides a structured framework for the exploration of the design features of mobile and internet applications to uncover the intention and purpose of their designers, the impact of the application on user experience and behaviour, as well as the implied sociocultural meanings behind certain functions and features (Light et al., 2018). The walkthrough method involves data gathering on three key elements of the application: 1) the environment of expected use, which examines how the application's owners and developers anticipate that the site will be used, and how they regulate user activity and seek to generate profit. This analysis can be performed simultaneously with 2) the technical walkthrough, which identifies the intentions that lie behind the design and architecture of the service. In this process, researchers may also come across 3) evidence of unexpected practices, in which users are appropriating the service for their own interests in a manner that is not anticipated by the owners and developers (Light et al., 2018). This analysis focused specifically on child sexual exploitation as one such "unexpected practice", although the extent to which such conduct on the site is unanticipated by the owners is called into question by the analysis that follows.
A key point of difference between the walkthrough and other research methods, such as digital ethnography, is that the walkthrough is focused specifically on software design, including its underlying economic and cultural assumptions and implications, rather than the phenomenology of the user experience (Light et al., 2018). The Omegle walkthrough took place while the COVID-19 pandemic was under way, and after over 12 months of rolling lockdowns, which forced an unprecedented number of children online globally for social and educational purposes. The walkthrough analysis was conducted by the second author, and included repeatedly browsing the application over a period of one month, while assessing, observing, and taking notes and screenshots in order to document the interlinked technical and cultural dimensions of the user interface. These notes were shared with the first author and refined through ongoing discussion and repeat visits to the Omegle site, undertaken both independently and jointly. In conducting the walkthrough analysis, the authors did not interact with other users of the site but instead sought to investigate Omegle's affordances and offerings to users, while considering consistencies and contradictions between Omegle's stated purpose and its availability for other activities, including criminal behaviour. Since the project gathered no personal information from human subjects, our Human Research Ethics Committee advised that ethics approval was not necessary.
There are strengths and limitations to our case study approach and the use of the walkthrough methodology. The aim of this paper is to explore and expand on theories of political economy as they pertain to technology-facilitated child sexual exploitation, and case studies provide a strategic research methodology through which the in-depth analysis of paradigm examples can deepen conceptual frameworks (Radley & Chamberlain, 2012). While suited to the testing and expansion of theory, case studies do not necessarily provide generalisable findings. The paper is focused on one website and its specific findings are limited to Omegle. Given the momentum for policy and law reform to promote online child safety (see below), as well as robust disagreement over how child protection measures should be incorporated into online services and products, detailed case studies of other apps and websites would expand our understanding of the response of the technology sector to social, political, economic and regulatory forces. Furthermore, the walkthrough methodology hinges on informed interpretation of software design and does not require interviews or commentary from the designers themselves or from the website owners. Hence, the paper does not offer insights into specific individual motivations, but rather into the broader function and impact of Omegle. The design and function of websites and apps evolve over time, as the analysis below makes clear. The walkthrough method analysed the operations of Omegle at a particular point in time, and there have been a number of alterations to the site since. The paper will note observed changes to Omegle's design and operation where relevant.

Technical walkthrough
Omegle can only be accessed through a website and does not have an official app, although the site can be used on a mobile phone. The overall appearance of Omegle is reminiscent of a website from the early 2000s rather than the more sophisticated appearance of contemporary social media platforms. The simple design of the Omegle website would reduce the amount of server power necessary to run the site, and the fact that the service is self-published and not available through third parties such as app stores further reduces overhead costs. On the whole, the appearance and structure of the Omegle service suggest an overarching concern with the reduction of operating expenses.
In the walkthrough, we entered through an introductory page which exemplified a haphazard approach to design, combining text and low-quality graphics. Omegle's intended purpose centres on the social experience of finding and communicating with strangers on an international scale. This was communicated on the opening page by site banners encouraging users to "Talk to strangers!" as well as an introductory paragraph which explained Omegle's purpose as a website that randomly pairs people together for the purpose of talking to one another (see Figure 1). Omegle's stated vision for friendly social interaction was further evident in its invitation to underaged users aged 13 and above, albeit "with parental permission". Like Omegle, many youth-focused sites claim to disallow children under 13, since the Children's Online Privacy Protection Act, enacted in the United States in 1998, obliges websites and internet services to implement higher standards of privacy protection and parental consent for users aged 12 or below. However, without robust age verification mechanisms, younger children can easily access such sites. Clearly, Omegle's operators were aware of the dramatic spike in the site's popularity that occurred over the course of the COVID-19 pandemic, with the opening page noting that Omegle enabled socialisation with "social distancing". However, the proposition that Omegle is designed for friendly interaction was immediately contradicted by links to pornography sites and by a foreboding notice on the introductory page: "Video is monitored. Keep it clean!".
This warning appeared alongside the claim that the site is moderated, which was immediately contradicted by the option to "start chatting", which stated that video chat is an "unmoderated section" (see Figure 1). There was no moderated video chat option. User uncertainty was amplified on the front page by the depiction of a politically fuelled "meme" comparing Chinese president Xi Jinping to Winnie the Pooh (see Image 1). This meme was quite large in size and presented with no explanation and no apparent connection to the purpose or intended use of the platform.1 The image further contributed to the page's lack of coherence.
Omegle offered limited functions other than text or video chat, and even these features were presented in a puzzling manner. For instance, directly beneath the dark blue "Text" option, an information box contained the words "spy (question) mode" (Image 2). There was no explanation of what this meant for users who select this option. There was an option to chat via a "college chat" by submitting an email address into the textbox, although it was not explained what "college chat" refers to. The overall impression delivered by the opening page was of a site where options have been added, subtracted and changed over time without concern for consistency or ease of use. When the user selected video or text chat, a pop-up box appeared, in which the user was asked to check two boxes. The first box acknowledged that the user is bound by Omegle's terms of service and community guidelines, whereas the second box acknowledged that the user represents that they are either over 18, or aged 13 or older and under parental supervision. There were no requirements for signing up or logging in. Omegle makes no attempt to request personal information. The user was then taken to text or video chat, which had a "lo fi" appearance consistent with Omegle's outdated design (Image 3). Chat commenced immediately, under an overarching banner "STAND WITH HONG KONG AGAINST THE CCP!", calling back to the opening page's mockery of the Chinese regime. Exiting the site is as simple as closing the browser.

Environment of expected use
The precise vision or purpose of Omegle is unclear. On one hand, the site badges itself as an opportunity to "talk to strangers" and engage in "clean" fun. At the same time, the site's owners signal in multiple ways their awareness that a significant proportion of the site's users are engaged in sexual activity. This was particularly evident in the site's reliance on pornographic advertisements. Once the user entered the video or text chat environment, the site aggressively marketed sexual webcam services in the form of "gay cams" and "girl cams". A prominent advertisement in the top left corner depicted a young woman reading a book next to the slogan "Like her? Fuck her!" and a link to a pornographic site. The content of the pornographic advertisements has shifted over time, with considerably more explicit "cam" adverts available on the site earlier in 2021.
Omegle's contradictory vision, caught between "clean fun" and potentially sexual interaction, is further underscored by its governance structure, which is articulated through two key documents: its terms of service2 and its community guidelines.3 Both documents were available on the home page or by selecting them where they are hyperlinked in the introductory paragraph on the Omegle site. Omegle's governance is characterised by what Jarrett (2008) called a strategic denial of authority, in which the site denies any role or responsibility for harms that befall users on the platform. Omegle rejects any responsibility for users, including minors, who access and utilise Omegle's services. The terms of service state "Omegle does not owe you any duty to protect you from the acts of other users or other third parties" and "THE ENTIRE RISK ARISING OUT OF YOUR ACCESS TO AND USE OF THE SERVICES REMAINS WITH YOU".
Like other social media sites, the Omegle terms of service contain a mandatory arbitration clause in which users waive their right to a civil or criminal trial. Of course, the vast majority of social media users do not read these terms of service or the consumer arbitration agreements therein, which have been described as "unfair and deceptive" (Rustad et al., 2011, p. 643). Rustad et al. (2011, p. 644) go on to explain how such clauses are being used by social media companies to create a "liability-free" zone that contravenes basic principles of fair process for consumers. Omegle users who seek restitution for the harms that befall them on Omegle are obliged to seek arbitration through a company selected by Omegle, and in such proceedings there is no right to discovery, no right to appeal and no right to open proceedings.
The community guidelines listed a range of harmful practices that are "prohibited" on Omegle, including violations of the law, violence and threats, hateful conduct and harassment, nudity, pornography and sexually explicit conduct and content, and conduct or content involving minors. However, the guidelines made clear that Omegle has no obligation to proactively enforce its guidelines, stating: "While Omegle is not responsible to you as a user for enforcing these Community Guidelines, reports of violations are helpful to Omegle". Such terms of service rely implicitly on users who encounter unauthorised or illegal activity to report it to the platform (Horsman, 2018). In the context of child sexual exploitation, the absence of proactive compliance measures places the onus on the child victim to report their own victimisation. Rates of disclosure of child sexual abuse are low, and even lower when the child is being threatened or blackmailed (London et al., 2005). Furthermore, even if a report of abuse is made to a live streaming service or other online platform, the adequacy of the response is uncertain. It was unclear whether Omegle has an active safety team. The second author sent an email to the safety email address in July 2021 and had not received a response at the time of writing. Reports of breaches of community guidelines were apparently for internal Omegle use, as there was no suggestion that Omegle will help in any way. Furthermore, the supposed prohibition against pornography and sexually explicit conduct and content is contradicted by Omegle's own history of pornographic advertisements.
The site provided no detail or information on current content moderation practices. Some description of Omegle's approach to moderation was available from court testimony given by K-Brooks in 2018, when he was required to appear during the proceedings for a child sex offender who used Omegle to distribute child sexual abuse material (United States of America vs. Wilbert, 2018).4 In his testimony, K-Brooks indicated that moderation on Omegle was limited to the "moderated" stream. During the first few seconds of the video chat, four screenshots were automatically taken and uploaded for automated assessment by a software system designed by K-Brooks. If the software program identified nudity or sexual content, the chat was queued for assessment by human moderators, contracted through a third-party company. Moderators then flagged any images that appeared to be in breach of the company's terms of service. If an image was determined by a moderator to be suspected CSAM or evidence of child exploitation, the software program developed by K-Brooks automatically compiled a report that was submitted electronically to the National Center for Missing and Exploited Children (NCMEC) in the United States.
This system had glaring deficiencies. Moderation was voluntary and there was apparently no mechanism to detect offenders who selected the unmoderated stream. In the "moderated" option, screen captures were made of only the first few seconds of a video chat, and hence these streams were effectively unmoderated beyond their initial moments. Initial identification of CSAM or child exploitation was dependent upon software of unknown efficacy. K-Brooks has explained that no more than four moderators are employed to scan images generated by up to a million users a day. Furthermore, K-Brooks admitted during cross-examination that not all flagged images are in fact reviewed by human moderators because "things can drain from the end of the queue if it gets too big … not all images end up getting viewed. We try to review most of them, but sometimes they don't".

Discussion
While the stated purpose of the service is to connect its users with others for the purpose of friendship, Omegle's operating model and governance strategies do little to support this vision. Contradictions are evident in Omegle's claims to moderate its video chat despite the apparent unavailability of a moderated video chat option; statements that the site is not for sexual activity while delivering pornographic advertisements to users; and the welcoming of minors to the site despite this pornography, and while it is apparent that the site is being used for a sexual purpose by many or most users. The site included no basic safety measures such as user registration and verification, or a responsive safety team. Its design features and governance documents were confusing. boyd (2013) has argued that websites with an abandoned and sexualised appearance can signal a general lack of care and oversight and appeal to a more suspect audience.
Internet governance scholarship has rejected portrayals of online services as neutral intermediaries, emphasising the ways in which online services shape and craft online interaction (Gillespie, 2015). However, Sadowski (2020) goes further to challenge even the commercial nomenclature of online "platforms", which he argues retains a rhetoric of neutrality in which services are mere "platforms" upon which users interact. Pointing to the commercial imperatives of online services, he argues that a more fitting metaphor is that of the "shopping mall", which also provides a space for social interaction within a broader commercial and retail environment. Omegle's narrow focus on pornographic advertising suggests that its physical equivalent is not a shopping mall but, more specifically, a pornography store, albeit one that welcomes children into the store to mix with adults, and provides unmonitored rooms for those adults to interact with children in private. Such a store would undoubtedly prove very popular with people with a sexual interest in children.
In response to increasing government and public concern about user safety, a number of social media companies have voluntarily implemented governance and oversight initiatives that aim to provide, at the very least, some appearance of oversight, accountability and corporate concern (Gorwa, 2019). While Omegle has not implemented any such governance reforms, in the wake of accumulating public attention and civil suits it has made alterations to the design and operation of the site. In 2022, Omegle updated its terms of service to stipulate that the site is now for adults only. At the time of writing, pornographic advertisements have been removed from Omegle, and it is unclear how Omegle is generating income. The Omegle privacy policy states explicitly that "Omegle does not exchange your personal information for money"; however, it is unlikely that Omegle is providing a livestreaming service to millions of customers per month for free. In 2016, a security researcher reported that Omegle was permanently saving and archiving all text chat logs (Ehrenkranz, 2016). In the same year, K-Brooks co-founded Octane AI, a technology start-up that creates chatbots (computer programs based in messenger applications that emulate human conversation) (Roof, 2016). The computational model underlying such chatbots needs to be trained on a "diverse high quality training data set" of text-based conversations; however, finding such a data set "is not an easy job, even for big companies" (Bapat et al., 2018, p. 2). Omegle would evidently generate a large and valuable amount of conversational data for such an exercise.
There is evidence that Omegle has improved its moderation practices. In 2019, Omegle made 3,470 reports to NCMEC, which increased to 20,265 in 2020 and 46,924 in 2021 (NCMEC, 2020, 2021, 2022). In 2022, Omegle filed 608,601 reports of child sexual exploitation to NCMEC (NCMEC, 2023), a 1197% increase on the previous year. This figure is higher than the number of reports made by very popular social media applications, including TikTok (288,125) and Snapchat (551,086) (NCMEC, 2023). When queried by a journalist about this increase, an Omegle spokesperson reiterated the website's ethos of personal responsibility but indicated that its moderation efforts had been augmented.
They stated:

Although users are solely responsible for their behavior while using the website, Omegle has voluntarily implemented content moderation services that use both AI tools and contracted human moderators … Content flagged as illegal, inappropriate or in violation of Omegle's policies can lead to a number of actions, including reports to appropriate law enforcement agencies. (quoted in Otis, 2023)

Under current legislative arrangements, not only is Omegle immune under criminal law for sexual offences that occur through the website (Citron & Wittes, 2017), but governments are unable to oblige it to change its design to protect child users. Given the shifts in Omegle's design and business model over time, it can be inferred that public scrutiny, accumulating civil suits and new business opportunities have prompted shifts in the site's policies and practices. However, Omegle's statement that the site is for use only by adults is not enforced by any age verification mechanism, while apparent changes to its business model away from pornographic advertising lack transparency. Omegle can be understood as a member, artefact and product of a largely unregulated technology sector with a shared apathy in relation to child protection concerns. Noisy denunciations of child sexual abuse by technology company representatives have not been followed by effective action, while governments have not endowed themselves with the power to enforce such action (Salter, 2023). The retreat of government from the regulation of online content and interaction has enabled the proliferation of business models in which private companies profit, inadvertently or otherwise, from the opportunities offered by their services to sexually abuse children (Salter & Richardson, 2021). Such developments only affirm Haug's (2001) warnings that the deregulatory trends of neoliberalism have facilitated the ascendancy of otherwise submerged cultural and economic forces that recruit
paedophilic desire to capital ends. Bray (2011) locates this "paedophilic libidinal economy" in the contemporary demand for children's bodies online, fuelled by the reluctance of governments to intervene in such perverse market dynamics. Nonetheless, as public concern about online harms escalates, governments are proposing a raft of reforms to incentivise or require technology companies to prioritise user safety.5 These shifts coincide with significant reconsideration of the neoliberal economic and policy frameworks that were delegitimised in the aftermath of the 2008 global financial crisis and, more recently, by the necessity of large-scale government interventions during the COVID-19 crisis. Indeed, the proliferation of COVID-related medical misinformation and conspiracy theories on social media has in many ways accelerated public and political concern about the regulatory vacuum and the lack of accountability for the actions of the technology sector.

Conclusion
Drawing on the case study of Omegle, this paper has illustrated how current endemic levels of technology-facilitated child sexual exploitation can be understood in relation to political economic concerns, specifically the legal, economic and moral regulatory dimensions of neoliberalism. The internet was commercialised in the 1990s during a period of anti-regulatory zeal and significant scepticism about the scale of child sexual exploitation. Haug (2001) suggests that these two developments are not coincidental, but rather that the liberation of market forces from pre-existing moral and legal frameworks resulted in the normalisation of child sexual exploitation, enabling the incorporation of children's bodies into processes of capital accumulation. Technology-facilitated child sexual exploitation provides a particularly acute example of this convergence, in which a largely unregulated technology sector has produced infrastructure and services that can profit from the sexual abuse of children. This scenario lies outside criminological theories of perpetrator motivation and characteristics, or situational crime prevention approaches, and calls attention to the commercial and specifically capitalist configuration of online services and infrastructure.
Scholarly work in cultural criminology and cultural studies has examined the commodification of concern and voyeuristic interest in child sexual abuse by the mass media (Schofield, 2004), as well as the fetishisation and surreptitious eroticisation of childhood "innocence" (Faulkner, 2011). However, the key theoretical apparatuses of both critical and cultural criminology were generated prior to the media convergences described in this paper and the extraordinary exemption of online service providers and technology companies from the legal obligations visited upon other sectors. Established media regulation and civil and criminal law frameworks ensured that the capacity of the mass or "old" media to profit specifically from child sex offenders as a potential audience base was very limited. No such constraints have been placed upon the technology sector. In addition, a political economy approach troubles culturalist approaches to child sexual abuse. The technology sector and social media platforms have an outsized role in public and media discourse, and thus the very sector that shapes public discussion and debate on child sexual abuse is itself a site of child sexual abuse.
The walkthrough analysis of Omegle's design and administration reveals a service that has minimised costs and maximised profit within the expansive discretion granted to online businesses by governments that have shown little interest in internet regulation. Omegle's active cultivation of a user base of children has not coincided with any investment in child protection; to the contrary, the site clearly devolves responsibility for online harms to minors and has actively served them pornographic advertisements. The site's significant number of child users and absence of active safety mechanisms have proven appealing to child sex offenders, who in turn constituted a potential income source through advertising referrals to pornographic sites. Formal remedies are largely unavailable, and pressure on services such as Omegle to address their vulnerabilities is limited to civil suits, media exposure and public inquiries.
As Campbell (2023) and Cheit (2014) have demonstrated, political economic factors are always at work in the resourcing of child protection responses, the opportunities available for the sexual abuse of children, and the deterrence provided by formal law enforcement as well as by informal controls, including social norms and values. Technology-facilitated child sexual exploitation provides a salient example of the relevance of political economy to analyses of child sexual abuse. Hall and Winlow (2018) advocate for the recontextualisation of crime and social harm within the "broad structures and processes of neoliberal capitalism and its attendant culture of consumerism and hyper-individualism" (p. 46), in which mutating forms of ruthless exploitation and profit extraction are culturally normalised and legally protected or invisible. The victims of the pervasive harms of such an economic order may well be denied standing within a justice system that has empowered those who exploit them. Such a focus on political economy in the development of general theories of crime has prompted a range of criticisms that are beyond the scope of this article (e.g., Wood, 2019). However, the facilitation of child sexual exploitation by the technology sector calls for a politically and economically orientated criminology that is sensitised to the balance of forces at work within sociotechnical processes and infrastructure.
It is pertinent to ask what impact websites and apps that are indifferent to the risks of child sexual abuse (and indeed may owe some proportion of their audience share to this risk) have on the subjective dispositions and behaviours of users. Does the very existence of Omegle signal to men sexually interested in children that their interests are legitimate and to be acted upon? What meaning is transmitted through the ongoing toleration of Omegle by governments and the lack of effective action? To what extent does paedophilic desire map onto the "subterranean values" (Matza & Sykes, 1961) of a technology sector that has pursued the commodification of any and all human activity, including online harm and criminal activity? This case study signals the need for ongoing exploration of the role of political economy, and of the moral impacts of economic structures and prerogatives, in sexual offending against children.

Declaration of conflicting interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
1. …was initially spread by users on the Internet as a way to mock the Chinese President (Haas, 2018). However, it gained popularity with Western users when Chinese censorship authorities banned Winnie the Pooh in July 2017 (Haas, 2018). There is no obvious explanation for the prominent appearance of this meme on the Omegle homepage other than the apparent opposition of Omegle administrators to Xi Jinping's position on the political status and independence of Hong Kong (Restar, 2019).
2. Terms of service available at: https://www.omegle.com/static/terms.html
3. Community guidelines available at: https://www.omegle.com/static/guidelines.html
4. Omegle's moderation practices were explained by K-Brooks in a 2018 court case involving Scott Wilbert, a previously convicted sex offender who in 2014 was investigated by police in New York for broadcasting child sexual abuse material via Omegle. Wilbert subsequently pled guilty to these charges. However, in 2018, he filed a motion to suppress the evidence against him on the grounds that it was improperly obtained, and hence K-Brooks was obliged to testify as to Omegle's moderation practices. This testimony provides the only publicly available information about Omegle's user safety procedures. Technology companies are not required to declare their moderation or safety practices, although a peak technology industry body has created voluntary transparency codes (Tech Coalition, 2022).
5. In the United States, the proposed Eliminating Abusive and Rampant Neglect of Interactive Technologies Act of 2022 (EARN IT) would remove civil and criminal immunity for online service providers if they are found to be hosting or facilitating the distribution of child sexual abuse material (McKinnon, 2022). In the United Kingdom, under the proposed Online Harms Bill, platforms accessed by children would have a specific duty of care to young users and would need to ensure strong protections from online abuse. Providers who publish or provide access to pornographic content would be required to prevent children from accessing that content (Woodhouse, 2022). The European Union Commissioner for Home Affairs, Ms Ylva Johansson, has announced her intention to introduce legislation that will make it mandatory for companies to proactively detect, report and remove CSAM (Child Rights Intergroup, 2021). In Australia, the Online Safety Act 2021 includes shifts towards the proactive responsibility of online services to predict and prevent harms against users.
It is unknown what proportion of Omegle's NCMEC reports are actionable; that is to say, what proportion of the Omegle interactions reported to NCMEC involve child sexual abuse and exploitation, and what proportion are false positives. Importantly, such reports are necessarily post hoc and do not prevent child sexual exploitation through the site. The very fact that Omegle is reporting over 600,000 potential incidents of child sexual abuse and exploitation per year, and that this rate is rising so steeply, is cause for concern.