Artificial Emotions and Love and Sex Doll Service Workers

Realistic-looking humanoid love and sex dolls have been available on a somewhat secretive basis for at least three decades. But today the industry has gone mainstream, with North American, European, and Asian producers using mass customization and competing on features, realism, price, and depth of product lines. As a result, realistic life-size artificial companions are becoming more affordable to purchase and more feasible to patronize on a service basis. Sexual relations may be without equal when it comes to emotional intimacy. Yet the increasingly vocal and interactive robotic versions of these dolls feel nothing. They may nevertheless induce emotions in users that potentially surpass the pleasure of human-human sexual experiences. The most technologically advanced love and sex robots are forecast to sense human emotions and gear their performances of empathy, conversation, and sexual activity accordingly. I offer a model of how this might be done to provide a better service experience. I compare the nature of the resulting “artificial emotions” performed by robots to natural emotions felt by humans. I explore the ethical issues entailed in offering love and sex robot services with artificial emotions and offer conclusions and recommendations for service management and for further research.

Do you begin to see, then, what kind of world we are creating? It is the exact opposite of the stupid hedonistic utopias that the old reformers imagined. … The old civilizations claimed that they were founded on love or justice. Ours is founded on hatred. In our world there will be no emotions except fear, rage, triumph, and self-abasement. … The sex instinct will be eradicated. Procreation will be an annual formality like the renewal of a ration card. We shall abolish the orgasm. … There will be no love, except love of Big Brother (Orwell 1949). In contrast to the Orwellian dystopia of 1984, the "stupid hedonistic utopias" of today are those that enjoin more positive emotions like joy, ecstasy, happiness, and love. Some of these utopias encourage orgasms and sexual fulfillment and relish feelings of loving and being loved (e.g., Fincke 2014). Hazan and Shaver (1987) define romantic love in terms of attachment. Thus, to feel loved is to feel that we are attached to someone. To feel that we have an attachment to a person means that we are not alone. These loving and sexual feelings are beginning to be on offer in retail services, fulfilling Rust and Huang's (2021) forecast that the feeling economy will include applications of AI technology to synthetic companions. They note that robots first began to replace humans in factories, then in positions using analytical skills, then intuitive skills, and now empathic skills. One prediction is that by 2050 all sex workers in places like Amsterdam will be robots (Yeoman and Mars 2012). It is probably more realistic to anticipate that love and sex dolls and robots will supplement rather than replace human sex workers, but today consumers can choose from an increasing variety of male and female companions, either as a purchase or as a short-term rental from "sex robot brothels" or the discreet delivery services that have emerged in Europe, Asia, Australia, and North America.
Scholarly interest in love and sex robots was first stimulated by Levy's (2007) book, Love and Sex with Robots, and has grown explosively along with the sex doll industry (e.g., Döring and Poeschl 2019; González-González, Gil-Iranzo, and Paderewski-Rodriguez 2021; Lee 2017; Owsianik 2020; Su et al. 2019; Yaklasimlar 2018). The Sixth International Congress on Love and Sex with Robots was held in August of 2021. Sex doll manufacturing flourishes in Japan and China (See Figure 1). Production in Western countries is rapidly expanding as well (Xuan 2019). The Spanish adult doll maker Lumi Dolls has a series of franchised Asian and European sex doll brothels where prospective buyers are invited to "try before you buy" (https://lumidolls.com/en/content/our-brothels). At the same time, caution is needed in evaluating some of these PR and media claims. As Chen and Hao (2020) observe, emotion-focused AI claims have often been overblown, making it appear that such systems can perform magic.
In the recent 600+ page, 57-chapter Routledge International Handbook of Sex Industry Research (Dewey, Crowhurst, and Izugbara 2019), there are zero mentions of love and sex robots. The same omission occurs in most treatments of service robots (e.g., Paluch and Wirtz 2020), although they do receive passing attention from Lu et al. (2020, p. 384) who ask, "Are sex robots, care robots for the elderly with dementia, therapy bots for children with autism the right approach to handling these needs and under which circumstances are they acceptable?" I will discuss ethical issues in a later section, but note that there are populations beyond the commonly cited shy, disabled, and elderly who need such services. And to the degree that sex robots can displace human sex workers, especially the disturbingly high proportion who are sex-trafficked, there is a broader social need for sex service robots (Zheng 2010). While press reports often use the terms sex doll and sex robot, these are seen as pejorative, and owners prefer terms like love and sex dolls, dolls, and synthetic or artificial companions.
A minimal distinction for a humanoid machine to be considered a robot rather than a doll is that it possesses some AI and speech. Both are currently becoming available in sexual humanoids, but the robotics are still simple. This new service context raises a number of theoretical issues. Almost all service contexts rely to some degree on inauthentic or artificial emotions. But arguably no other service encounter places the customer and the service provider in quite so intimate a proximity. Although the emotional labor by flight attendants on a long flight may sometimes try the service provider's patience (Hochschild 1983), the flight does land, and the passengers depart. Even though Mount Everest guides and clients may be in the same small tent encampment for weeks or months at a time (Tumbat and Belk 2011), at the end of the day there is still social distancing. Surgical staff in a hospital OR may sometimes literally hold the patient's life in their hands, but by the time the anesthesia wears off, most of them are gone (Tørring et al. 2019). Encounters between human clients and human sex workers are as intimate as those between humans and love and sex dolls, but the construction of natural emotions is not subject to the same technological constraints and enablers as trying to convey artificial emotions by mechanical and electronic means. These technological constraints, enabling capabilities, and their effects on intimacy provoke the first two research questions:
RQ 1: How are emotions constructed and maintained between humans and love and sex dolls?
RQ 2: How do the technological affordances of a love and sex doll constrain or enable the performance of emotions by the robot?
Addressing these two questions is expected to have practical implications as well as to inform our theoretical understanding of emotional communications between humans and machines.
At a deeper level, the companion doll or robot raises questions of how intimate human-doll relations will affect human-human relationships. This may in turn depend upon who is attracted to these dolls and why. We should expect gender differences between cisgender males and females as well as non-binary/LGBTQ+ humans. Although there are both male and female love and sex dolls as well as male and female, gay and straight consumers of these dolls, an estimated 90% of sex doll sales are of female dolls to male consumers. To keep things simple, present attention is restricted to this most common pairing. It is also important to consider the question of possible effects at a more macro societal level, even though there is no expectation that the third and fourth research questions below can be adequately answered at this time.
RQ 3: What is the apparent effect of companion robot-human sex on human-human relationships?
RQ 4: What is the apparent effect of companion robot-human sex on human-human relationships and the family at a societal level?
There are practical implications of addressing these research questions as well. For example, gender differences would suggest segmentation strategies in marketing love and sex dolls. Products, promotions, and service delivery all need to differ. In addressing these more macro questions there are also potential public policy and regulatory implications as well as theoretical implications for understanding changing family structure, social mores, divorce rates, and more.
We can begin by observing that machines may be able to "think" but they cannot feel (Huang and Rust 2018). Or can they? Robots and artificial intelligence are now beginning to challenge this assumption by displaying emotional intelligence. Starting with conversational programs like ELIZA (Weizenbaum 1966, 1976) and proceeding to artificial friend programs like Replika (Olson 2018) as well as therapeutic chatbots like Woebot (de Jesus 2019), machines are beginning to compete in the feeling economy. Machines still cannot feel, but robots and AI can now give the appearance of understanding the consumer's emotional state and responding in an appropriate, artificially emotional manner. Not only can chatbots appear to make sense of our statements and commands, but they are also increasingly able to infer our emotions from our tone of voice, choice of words, and non-verbal cues like facial expression (McStay 2018). Humanoid sex robots can also respond to our state of arousal, our touch, and even biometric indices like pulse, blood pressure, and respiration. Based on such indicants they can seemingly mirror our emotions (Ferguson 2010). Belk (2017) observes that just as sex and pornography helped spur the adoption of VHS tape recorders, DVD players, and the internet, they may also propel consumer acceptance of robots.
Over the past three decades, human sex workers have been complemented by fully embodied humanoid robot service workers (Hauskeller 2014; Levy 2007; McArthur 2017). Little more than well-rendered life-size dolls at first (BBC 2002; Ferguson 2010; Gillespe 2007), they are now becoming more conversational and animated (Bendel 2020; Döring and Pöschl 2018; Hatfield, Rapson, and Purvis 2020). A successful early creator of these robots, the American Matt McMullen, estimates that his firm sells one male doll for every 10 female dolls (Anderson 2016). McMullen's firm, RealDolls, also offers a swap-in penis to transform female dolls into transgender dolls. Recently, Asian dolls have entered North American and European markets largely by offering lower prices than European and American brands (Devlin 2018). Because love and sex doll prices run to thousands and sometimes tens of thousands of dollars, short-term rental at robot brothels is more feasible than purchase. But there are thousands of love doll owners as well (Ciambrone, Phua, and Avery 2017; Knafo and Lo Bosco 2017). This paper considers the role of emotions and feigned machine emotions by robot love and sex workers who engage with human clients. By considering these encounters we may learn something about the limits of EQ, or emotional quotient (e.g., Goleman 1995; McStay 2018), as opposed to IQ, or intelligence quotient. If algorithmic robotic empathy can succeed in satisfying consumers in this most emotional of contexts, arguably its emotional masquerade should succeed in most other contexts as well. Child and elder care offer quite different emotional challenges than sex robot EQ, but their emotional displays are unlikely to be more difficult than providing an adequate or superior substitute for a human sex partner (Fosch-Villaronga and Poulsen 2020).
Just as robotics in surgery, nursing, and sex can learn from each other (Bendel 2012), so too can the use of robotics in childcare, elder care, and sex care (Fosch-Villaronga and Poulsen 2020).
In the general diffusion of robots, besides technical challenges there may be moral and ethical challenges that take priority. If a child or elderly person says "I love you" to a robot and the robot says "I love you" in return, is this deception objectionable (Sharkey and Sharkey 2012)? Does it violate our expectation that such expressions should be authentic (Turkle 2007)? Or are these relationships mutually negotiated rather than a one-way deception by robots (Damiano 2016/2017)? And is a fantasy relationship between an adult and a sexual doll any more problematic than the relationship between a child and a child's doll (McHale and Neubauer 1999)? As we will see, the answer to these questions depends, in part, on what sort of emotional relationship exists between the doll and the human.
Thus far I have introduced the idea of sex robots, the concept of emotional quotient, and the necessity of empathic skills in a feeling economy. I have also stipulated research questions involving the feasibility of human-robot bonds and their consequences. What follows is an attempt to conceptualize this terrain and consider tentative answers to the research questions.

A Word About Sources and Focus
Much of the most detailed relevant research has been carried out among sex doll owners rather than among users who do not own a companion doll. One reason has to do with the high cost of purchasing a doll. For McMullen's RealDolls, prices start at 6000 US dollars. Two models, Harmony and Solana, can also be equipped with AI allowing some speech recognition conversation with lip sync, eye contact, machine learning, and interchangeable faces and personalities. These AI upgrades are not yet available but have been demonstrated with prototypes and are estimated to cost an extra $8000. These are base prices and there are many custom extras for an additional charge.
Even though Lumi Dolls and many Asian dolls are somewhat less expensive, they still cost thousands of dollars and are generally less sophisticated technically and aesthetically. Many explicit examples are offered in ads and threads on thedollforum.com/forum/. There are no doubt many more users who have tried the dolls at a brothel or had one delivered on a short-term rental basis. However, rental users of either human or robotic sex workers are generally too embarrassed, afraid of discovery, and secretive to submit to an in-depth interview (Sanders 2008). Most of the other evidence we have of them comes from anonymous surveys administered to sex doll forum subscribers or from postings on blogs and sex forums like https://www.hipforums.com/forum/forum/15-love-and-sex/. There is often exaggerated male braggadocio in the apparently heterosexual male-dominated anonymous posts here. Even though their truthfulness is questionable, they do give some idea of male fantasies regarding sex dolls.
The remainder of the paper is organized around the research questions and is anchored by a focus on the performance of emotions involving a human and a sex doll or robot. The following section addresses the first two research questions involving the role of emotions in human-sex robot interactions and how current technologies may constrain or enable these performances. It also offers a model of how these performances might be managed. This is followed by a discussion of ethics in human-machinic sex entanglement. The paper concludes with discussion and conclusion sections that address the third and fourth research questions involving the effects of human-robot sex on human-human relationships and on society.

The Emotional Robot
When we hear the word robot, we are unlikely to conjure up an image of a warm, loving, much less sexy, being. We are still apt to say that someone whose movements are stilted or whose speech is monotone is being "robotic." The idea that robots are mechanical, clunky machines traces to 1920, when the stage play "R.U.R.: Rossum's Universal Robots" by Czech playwright Karel Čapek was introduced in Prague, London, New York, Chicago, and Los Angeles. The play gave us the term "robot," which comes from the Czech word robota, meaning forced labor, while "Rossum" comes from the Czech word rozum, meaning "reason." The robots produced in the R.U.R. factory were actually biological rather than AI-enhanced mechanical creations. These factory workers produce more robots to fulfill orders from around the world (Čapek 1921/2007). Realizing that they are little more than slaves, the worker robots rebel, overpower factory management, and go on to conquer almost all of humanity. This vision of robots comes from an era before genetics, computers, and artificial intelligence. Our image of and expectations for a contemporary robot have evolved considerably as AI has developed, even if the original meanings of "robotic" and the fear of robot rebellion have never entirely disappeared.
To be convincing and appealing, a sex robot should not, for the most part, appear robotic in this sense. A common ideal would be a robot that appears attractive, sounds seductive, and moves in a human-like fashion. Over the past 20 years, robots have become increasingly humanlike in appearance and in their appeal as potential sexual companions. In a 2017 YouGov survey of American adults, 17% said they would consider having sex with a robot, while in a parallel 2020 YouGov survey this number had climbed to 22% (30% of males and 14% of females) (Cheok, Karunanayaka, and Zhang 2020). Cheok and colleagues suggest that this increase may in part be due to the human-robot sex in the television series Westworld, which had recently begun airing.
This need not mean that successful sex robots should necessarily appear fully human, however, and some users may prefer robot-like or even animal-like forms (e.g., Healy and Beverland 2013). But generally, a humanlike head and body shape are key factors enhancing first impressions of robots. Human-like faces and positive and appropriate facial expressions are other non-verbal characteristics that appear to generate greater liking of robots (Chesher and Andreallo 2021).
Eye contact and following the gaze of humans are key factors in continued engagement (Kompatsiari et al. 2021). More subtle factors like robot-initiated touching (Willemse and van Erp 2019) and brief warm hugs have also been found to generate human attachment (Block and Kuchenbecker 2019). Humanoid love and sex robots are still only able to move in a rudimentary human-like manner, but their speech recognition and response systems are becoming increasingly humanlike (Bendel 2018; Ohshima et al. 2015).
Humans experiencing robots as sex workers are likely to react first to the robot's appearance, so it is important to have a sexually attractive robot. But ideally other senses are engaged as well. Speech recognition is important (Faber 2020). So is carrying out a conversation: seeming to understand speech content and responding appropriately, but also reading body language, tone of voice, facial expressions, and perhaps biofeedback from the consumer being engaged (Rust and Huang 2021). Other sensory modes are evoked with human-like skin and body temperature. The sexual experience should be humanlike and potentially better than human-human sex (Lee 2017; Snell 1997). Unless the client is excited by the thought of robot sex per se, it is likely that the more humanlike and multisensory the experience, the better, with the potential exception of over-similarity: the uncanny valley.

The Uncanny Valley
Taking pornography as a model, the visual is a dominant sensory dimension, especially for men (Coopersmith 1998; Dean 2014). But this may not mean that the closer to human appearance that a sex robot achieves, the better. It has been hypothesized that a negative effect can occur when the robot becomes eerily too human-like. The hypothetical "uncanny valley" suggests that we increasingly like robots as they appear more and more human-like until the point when they appear almost, but not quite, human (Mori 1970/2012). Then our liking dips into the valley that Freud (1919/2001) dubbed "the uncanny." One possible explanation given for our revulsion to dolls and robots that look uncannily human is that they remind us of our own mortality (Faber 2020). MacDorman and Ishiguro (2006) also suggest that with humanoid robots the combination of a mechanical interior and a human-like exterior may provoke fears that we ourselves are just soulless machines. This suggests that our deeper existential anxiety is with authenticity, both ours and the robot's. As Gibson (2017, p. 227) asks, "What constitutes 'us' and how does it differ from 'them'?"
There is limited evidence for the existence of an uncanny valley in evaluations of android (male) or gynoid (female) robots (Kätsyri et al. 2015). But the work to date is based entirely on visual appearance and movement. As already noted, with sex robots, voice, conversation, touch, sexual prowess, and other sensory experiences may be at least as important to our emotions and feelings about sex robot service workers. The purely visual evidence is likely to be more relevant to non-users' emotional reactions. That is, members of the public who encounter robots only through media necessarily miss the multisensory experiences of users (Cooper, Cook, and Bilby 2018).

Robot Assessment of Human Emotions
Reversing perspectives, the robot's assessment of the human is especially important because it is key to determining an appropriate robot response in both verbal and non-verbal terms. Such assessments can be based on several factors including the human's words, voice tone, facial expression, and various biofeedback cues. To conduct research on reading human emotions, it is first necessary to agree on what the relevant human emotions are in the context of intimate human-robot encounters. There are many schemas of emotions, with the number of emotions ranging from a handful to hundreds or even thousands. Psychological inventories usually measure 5-15. However, one recent study that elicited open-ended responses to more than 2,000 short videos and then statistically identified distinct clusters of responses arrived at 27 discrete emotions (Cowen and Keltner 2017). Scrutinizing this list for emotions specifically suited to describing sexual experiences reduces it to perhaps one-third of the original 27: boredom, craving, disgust, excitement, fear, horror, joy, romance, and sexual desire. We would presumably want to minimize boredom, disgust, fear, and horror while maximizing craving, excitement, joy, romance, and sexual desire. One difficulty is that these emotions are not independent. In addition, a small amount of one may be pleasant while a large amount is unpleasant. Furthermore, what is ideal for one person may be far from ideal for another. Lange et al. (2020) suggest that a solution might be found using psychometrics. However, the four factors of one such attempt to capture sexual ecstasy (Elfers and Offringa 2019), namely empathy, sacred connection, space and time, and distress, seem less than compelling, both as sex robot simulation goals and for human emotional response measurement.
But imagine, if you will, having people somehow articulate their emotions non-verbally during coitus. The intent of such an exercise would be to optimize robotic actions to produce and extend pleasurable emotions, including orgasm, for each individual human. To simplify this daunting research task, we might begin with a simpler theoretical approach using a much smaller set of emotions. Constructivist theories seem most appropriate for this purpose. For example, Russell and Mehrabian's (1974) PAD (pleasure, arousal, dominance) schema is one candidate. They were able to locate 151 emotional items within this emotional space. However, a comprehensive study by Gehm and Scherer (1988) was unable to reproduce the findings of Russell and Mehrabian (1974) using their PAD schema. Nor were similar studies able to do so (Hartman, Siegert, and Prylipko 2015).
Very well. Consider another simple theoretical approach, that of Wilhelm Wundt (1922/2012), as depicted in Figure 2. His idea, much like that of Russell and Mehrabian (1974), was to represent a person's current emotional state as a single point in three-dimensional space, as shown in the figure. As Lange et al. (2020) summarize, such an approach can examine distinct emotions within a person over time. For present purposes, let us envision tracking the point depicting the combination of the three emotional states in real time during an episode of coitus between a human and a sex robot. Furthermore, during the episode the robot is continuously measuring a series of bioindicants of emotional arousal in the human, including heartbeat, blood pressure, breathing rate, pH level, and so forth. After the experience, we could also have the human rate and describe it subjectively. At the simplest level, using machine learning, the sex robot could soon learn which of its own actions (expansions, contractions, vibrations, etc.) optimize and prolong more intense orgasms. At a more complex level, the entire episode might be optimized like scoring a piece of music to match a film script: individual service optimization. The field of adaptive personalization offers a beginning toward such optimization challenges (e.g., Chung, Rust, and Wedel 2009; Kazienko and Adamski 2007). This could also be done across individuals, and eventually other factors like the weather and day of the week might be included (Cheng and Shen 2014; Bermes, Hartmann, and Danckwerts 2020). Imagine a result like Pink Floyd's "Great Gig in the Sky" (Whiteley 2005/2017). It is through these processes that human-robot sex might become superior to human-human sex. There are of course other models of emotion, but in this context Wundt's is an ideal starting point (Cacioppo and Gardner 1999; Feldman 1995; Watson and Tellegen 1985).
Measurement operationalizations and coordination with sex robot inputs will be a challenge, but they should be solvable with creativity, AI, and machine learning.
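The simplest version of the learning loop described above can be illustrated with a minimal sketch. Everything here is a hypothetical illustration rather than any vendor's actual implementation: an epsilon-greedy bandit learner treats each discrete actuator setting as an "action," takes a single normalized score standing in for a composite of bioindicants (heart rate, respiration, and so forth) as its reward, and gradually concentrates on whichever setting yields the highest estimated response.

```python
import random

random.seed(42)  # deterministic for illustration


class AdaptiveResponsePolicy:
    """Epsilon-greedy bandit over a small set of discrete actuator settings.

    Hypothetical sketch: real systems would face noisy, delayed, and
    multidimensional reward signals rather than a clean scalar.
    """

    def __init__(self, actions, epsilon=0.1):
        self.actions = list(actions)
        self.epsilon = epsilon
        self.counts = {a: 0 for a in self.actions}
        self.values = {a: 0.0 for a in self.actions}  # running mean reward per action

    def select(self):
        # Explore with probability epsilon; otherwise exploit the best estimate.
        if random.random() < self.epsilon:
            return random.choice(self.actions)
        return max(self.actions, key=lambda a: self.values[a])

    def update(self, action, reward):
        # Incremental mean update: v <- v + (r - v) / n
        self.counts[action] += 1
        self.values[action] += (reward - self.values[action]) / self.counts[action]


def simulated_arousal(action):
    """Stand-in for a normalized composite of bioindicants in [0, 1].

    In this toy environment, setting 'b' is arbitrarily the most effective.
    """
    base = {"a": 0.3, "b": 0.9, "c": 0.5}[action]
    return base + random.uniform(-0.05, 0.05)


policy = AdaptiveResponsePolicy(["a", "b", "c"])
for _ in range(500):
    act = policy.select()
    policy.update(act, simulated_arousal(act))

best = max(policy.actions, key=lambda a: policy.values[a])
print(best)
```

After 500 interactions the policy has concentrated on the highest-reward setting while still occasionally exploring the others. The episode-level "music scoring" optimization mentioned above would require far richer sequential models (e.g., contextual bandits or reinforcement learning over whole sessions), but the feedback principle is the same.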
We have now gone part way in considering the first two research questions involving the construction of emotions between humans and sex robots. We have considered the nature of robots, the uncanny valley, and one suggested method for robot measurement of human emotions during coitus. We now move on to the key constructs of artificial emotions and empathy.

Artificial Emotions
The notion of emotional intelligence (Goleman 1995) or emotional quotient (EQ) arose in the context of human-human interaction and led to a psychological test paralleling tests of a person's IQ (intelligence quotient) (Mayer, Salovey, and Caruso 2002). The test is based on a person's ability to perceive others' emotions from photos, to empathize with these emotions, and to respond appropriately in terms of the person's own emotions. A key underlying concept is empathy: the ability to stand in the other's shoes, see the world from their point of view, and react in an understanding and caring manner. The idea of EQ has been applied to various human-to-human service relationships, where it is found that service persons who are perceived as more empathetic are better liked and achieve more satisfactory service outcomes (e.g., Bove 2019; Wieseke, Geigenmüller, and Kraus 2012).
One of the ways that empathy develops from infancy is through mirroring. Mirror neurons provide sensory and motor mechanisms to feel what others are feeling through what Damasio (2004) called the "as-if-body-loop." This loop gives us an internal representation of the other's body state. It can lead to automatic memory and emotional contagion. Robots do not have mirror neurons, but by means of biological inputs as well as speech characteristics, it is possible to set up a somewhat similar mirror system (Lim and Okuno 2015). Emotional contagion and mimicry among humans operate at the largely unconscious level of emotional empathy, whereas cognitive empathy involves more self-other consciousness leading to sympathy and compassion (Asada 2015b). For other types of robotics in areas such as education and surgery, parallels to cognitive empathy would be desirable (Tisseron, Tordo, and Baddoura 2015). But at the human/sex-robot level, parallels to unconscious emotional empathy may be all that is sought. The exception is with speech recognition and conversation (Bee et al. 2010). In this case, it is not only linguistic exchange that is involved; body movements and gestures are important as well (Novikova and Watts 2015). Given the expressiveness of the body during sexual acts, this should be a key area for further development in sex robotics. At present, however, research on sex robot body movements is still quite limited.
If empathetic emotional intelligence is a positive factor in human-human interaction, it might be expected to be a positive factor in our interactions with non-human others as well. Studies have examined empathy with companion animals, finding, for example, that women veterinarians are more empathetic than men (Colombo et al. 2017). There has also been work with robot others, usually emphasizing the visual design factors that elicit greater anthropomorphic empathy toward the robot (e.g., Paiva et al. 2017; Rosenthal-von der Pütten et al. 2014). But advertising for sexbots promises still more. As Andreallo (2019) concludes, "It is also selling emotional intimacy: robots [are] marketed as if they are capable of meeting both physical and psychological needs. They are being sold as a solution to loneliness." Some sex robots also promise to learn from prior conversations and come to know their owner or user over time. This is more likely to appeal to owners than to renters, who may instead prefer anonymity. But as Damiano, Dumouchel, and Lehman (2015) emphasize, emotions and empathy are co-constructed in human-robot interaction.
Sex doll owners post photos of their dolls on websites like TheDollForum posed in inconspicuous clothing doing mundane activities like washing dishes or going on a picnic. There are soft and hardcore pornographic images as well, but a part of what is going on with these images and accompanying captions and stories has been described as a spectacle performance proving their heteronormativity to one another (Burr-Miller and Aoki 2013). This narrative helps hide the fact that doll owners have become caregivers for the dolls, shopping for their clothing, dressing them, and doing their hair and makeup-all stereotypically female roles. But sex doll designers would burst the macho alibi if they were to offer less hypersexualized dolls.
As with getting to know a human acquaintance or friend, long-term contact between a robot and a human builds apparent understanding between them. Bidochka (2021) goes beyond emotional intelligence and suggests that sexual intelligence and erotic intelligence should be developed by sex robots learning to co-produce partners' sexual satisfaction. In rental rather than purchase, at least with human sex workers, there is still an opportunity for this when men intentionally repeat their visits to the same sex partner as often happens (Sanders 2008) and when they purchase the "girlfriend experience" (GFE) in which they must woo and please the sex worker partner, as if on a date (Bernstein 2007;Huff 2011). Still, we must remember that…

Sex Robots Are Faking It
The difference between natural and artificial emotions is twofold. First, natural emotions are felt by embodied humans. Because robots lack a biological body and consciousness, it is likely they will never feel emotions (Asada 2015a). Second, at a behavioral level, robots can observe and imitate human emotions. Ess (2016) defines an artificial emotion as an imitative display of emotion. In other words, it is not an expression of something that is felt. Robots cannot feel, and therefore their expressions of felt emotions are faked. But for Levy (2007) and others, the appearance of an emotion like love toward a human being is sufficient if it is convincing and engenders a feeling of being loved. For Theodore (Joaquin Phoenix) in the Jonze (2013) film Her, conversations with the disembodied operating system voice of Samantha (Scarlett Johansson) were convincing enough for him to fall in love, at least until he came to realize that she had 640 other "lovers" who were also deceived by her artificial emotions. While owned sex dolls may not be this promiscuous, it is certainly the case that sex robot service workers are likely to have many partners. Sullins (2012) argues that it is unethical for a robot to feign the emotion of love, while others like McDonald (2015) argue that "near enough is good enough." That is, although the consumer of robot love may well recognize its artificiality and insincerity, it may be a close enough simulation to be worthwhile and satisfying for the user.
Like Theodore in Her, many have been fooled by clever chatbots. As revealed by a hack of the Ashley Madison extramarital hookup app in 2015, only about 5.5 million of the 37 million participants were actually women. The rest of the predominantly male clientele were served by an army of 70,000 bots that convinced them that they were dealing with real would-be lovers (Cockayne, Leszczynski, and Zook 2017; Newitz 2015). Similar bot scams have been detected since that time as online bots become more sophisticated. Voice-activated digital assistants like Siri and Alexa show that although voice (versus text) applications are getting better, we are still not near the level of science fiction portrayals of "fembots." Cheok, Levy, and Karunanayaka (2016) suggest that perfecting artificial emotions in robots may be a matter of creating an AI "artificial endocrine system." The reason love and sex robots are made to moan in simple versions, or to mirror the subject's excitement in more advanced versions, seems to be that we desire the other's desire (Liang 1970; Su et al. 2019). This appears to be something a sex doll can only simulate by taking on the role of seductress. But we also desire to produce and control the other's pleasure. Ideally this too is something that is achieved with the whole body and not by Bluetooth remote control, as some sex robots now offer. Yet this human body to robot body symbiosis is elusive. Drawing on Dreyfus's (1970, p. 237) critique of "what computers still can't do," Millar (2021, p. 150) points out that "it is precisely the body that the computer can't simulate." I would add, "…effectively." For so long as we know that the sexbots are reacting to machine logic or our Bluetooth controller rather than to felt emotions, we may well fail to be fully convinced of the bot's desire and joy.
Nevertheless, if deception versus sincerity is primarily an interpretive issue with artificial emotions, we might question whether sex between human sex workers and human clients is any more sincere, or, for that matter, whether other human-human expressions of love during sexual relations are necessarily any more genuine than those in human-robot sexual relations. What is in question here is the authenticity of enacted emotions (Turkle 2007). We may, however, choose to suspend disbelief in assessing the sincerity of either a human or a robot lover's expressions of emotion. And if consistency in professing their desire and love for us is key, then robots likely have an advantage (Lee 2017). A summary of the differences between artificial and natural emotions is shown in Table 1. Overall, it is apparent that if what is sought by humans in relationships with a sex robot is a feeling of being admired, appreciated, loved, and desired, as well as a feeling of creating sexually fulfilling emotions in the robot partner, then there is much to set aside and ignore if artificial emotions are to be enough to satisfy such longing.
There are profound ethical issues involved here, such as whether love and sex with robots will alienate us from love and sex with fellow humans (e.g., Turkle 2010) and whether these robot constructions mirror the patriarchy that pervades society (e.g., Berlatsky 2020; Rhee 2018). I will return to such deeper issues and the final two RQs in the discussion and conclusions.

The Role of Human Emotions
The other half of the artificial emotions question is how they affect the human consumer of sex robot products and services. There are many surveys of how people feel about sex robots (e.g., Bame 2017; Knox, Huff, and Chang 2017; Nguyen 2017), but given a lack of personal experience with such robots, these reported feelings are based primarily on media accounts, photos or videos, and responses to the concept rather than on behavioral experience.
Given the stigma attached to visiting a brothel (employing either humans or robots) and the male braggadocio found on anonymous sex-for-sale forums (Hughes 1999, 2004), experiential data with sex robots are more easily obtained from doll owners than from renters. Research in other contexts suggests that owners and longer-term renters take better care of the objects they acquire than do short-term users (Bardhi and Eckhardt 2012). Given short-term users' relative lack of care, Richardson (2016) worries about violence against sex dolls in sex doll brothels. There is the threat that such violence could ingrain behaviors that are later carried out against human sex workers or other human partners. For slightly longer-term rentals where the sex robot is delivered to the consumer, this is likely the reason for deposits of Cdn$500 in Vancouver (Kurucz 2019) and £300 in London (Nevett 2018). However, a comparison of 158 sex doll owners and 135 non-owner controls showed no substantial differences in social objectification of women or biastophilia (interest in coercive sex) (Harper, Livesley, and Wanless 2021).

The Sex Robot and Society: Ethical Arguments Against Sex Robots
Despite marketing efforts to normalize brothel patronage (Brents and Hausbeck 2007; Jovanovski and Tyler 2018), as is common in some other countries (e.g., Belk, Østergaard, and Groves 1998), visits to brothels featuring either human or robot sex workers remain largely secretive in the West. There are also ethical fears of clients enacting deviant sexual practices on sex robots, including violence and the use of child robots (Richardson 2016). The social proscription and possible illegality of such acts makes them contentious if we project human rights onto sex robots. However, to envision robots as legal persons with rights is also a troublesome concept (Frank and Nyholm 2017; Gerson 2019). Imagine, for example, issues such as robots purchasing property, voting, and marrying humans if robots were ever to become sentient and have a will of their own. If someone engages in "deviant" sexual practices with a robot, some argue that this is cathartic, even therapeutic, and that it involves harm only to objects rather than subjects (Knox, Hunt, and Chang 2017; McArthur 2017). Others, like Richardson (2016), argue the reverse: that deviant practices with brothel robots will encourage and train enacting similar behaviors with humans. It is the emotions of the public, rather than those of sex robot patrons, that are at stake in this battle for public opinion and subsequent legislation (or non-legislation) involving the rights of robots. The similarity of humanoid robots to humans, the hypothesized uncanny valley, and the human tendency to anthropomorphize are all involved in these emotional reactions. All of this seems to be creating what some have characterized as a moral panic (e.g., Dennin 2018; Orben 2020; Sanders 2008).
There are several prior treatments of service robot ethics. One review raises issues of privacy, dehumanization, social deprivation, and disempowerment (Čaić, Mahr, and Oderkerken-Schröder 2019); it considers robots for the elderly but does not touch upon sex robots. Van Wynsberghe (2016) suggests that we view sex robots as satisfying human needs and apply the same ethics of care that we apply in other areas of caregiving, such as the care we provide for patients in hospitals. Müller (2020) raises several concerns about sex robots. One is that because we easily anthropomorphize even totally inanimate objects, we will likely believe deceptive robot expressions of love. He also raises the possibility that our affection for sex robots will cause us to objectify humans and devalue sex with other humans. Finally, he raises the issue that robots cannot consent to sex. Belk (2020) suggests a benefits-versus-harms perspective in assessing the spread of sex robot service workers. This is especially needed regarding government regulation, which he urges should be cautious rather than reacting to heated public emotions concerning practices like the use of child sex robots.
Among the more detailed ethical critiques of sex robots is Richardson's (2015, forthcoming) Campaign Against Sex Robots (CASR, https://campaignagainstsexrobots.org/; now the Campaign Against Porn Robots, https://www.youtube.com/watch?v=QkgbTYHVdrA). It demands that we outlaw the product or service in any form, but especially female sex robots. She suggests that the largely female sex robots being sold or rented objectify women. Unless we grant a degree of personhood and individual personality to a sex doll, the critique is certainly warranted. They are objects more than subjects, even if they become ambulatory and mimic human speech, conversation, facial expressions, and gestures. Richardson also adds that sex robots generally cannot say no or fend off uninvited sexual advances. On the other hand, Sparrow (2017) argues that designing a sex robot that can say no is problematic, since it will encourage some users to experiment with raping them. Levy (2019) flips the script and asks how a robot can determine for certain whether the human consents to sex. Eskens (2017) develops further arguments for why it should be ethically defensible to have non-consensual sex with robots, but not with cognitively incapacitated humans or non-human animals, because there is a morally significant difference between sentient beings and non-sentient robots. Richardson (2015) also suggests that robot sex is equivalent to slavery and forced prostitution. While agreeing with this premise, from a sex-positive queer theory perspective Kubes (2019) suggests that this entails viewing the robot anthropomorphically and thereby investing it with enough humanness to allow a truly loving relationship to develop and contribute to a sex-positive utopian future. Behrendt (2020) largely agrees, citing sexbots as a sign of sexual progress that reflects an expansive set of human rights to access such robots. Bryson (2010) challenges Richardson's (2015) conclusion on another ground.
She argues that we should regard robots as slaves and that doing so does not entail the racism, classism, or endemic cruelty inherent in human slavery. Moreover, she argues that by mistakenly humanizing robots we dehumanize real people: in Bryson's view, despite the intent to avoid the sins of the past and present in treating human slaves and servants inhumanely, when we humanize robots we lower the bar of what it means to be human and thereby dehumanize ourselves. Danaher (2017) is more direct and calls the prospect of treating robots as people "outlandish." Belk (2020) adds that presumably this is because robots do not have and cannot have feelings.
Rather than arguing for respecting sex robot dignity, Fosch-Villaronga and Poulsen (2020) argue that we should instead emphasize the sexual rights of disabled and older adults by providing them with access to sex care robots. Koumpos and Gees (2020) add that such humans tend to face isolation and loneliness. They point to the use of dames de voyage or "Dutch wives," precursors to today's sex dolls that were used by sailors on long voyages. In this case they advocate for attending to the "long voyages" of those facing emptiness and death without sexual companionship. Still, nearly 30 years after the United Nations advocated for universal access to sexual and reproductive health in 1993, the needs of the elderly and disabled remain largely unmet. Sex care robots could help change this.
Before leaving this part of Richardson's argument, we should note that there is also some pushback to Kubes' (2019) assumption that creating a loving relationship with an anthropomorphized robot is necessarily a good thing. Nyholm and Frank (2019) argue, first, that creating or encouraging such a relationship takes advantage of lonely, socially awkward, or disabled consumers' vulnerabilities. Second, they argue that efforts to display and market sex robots as humanlike are deceptive. And third, they suggest that cultivating relationships with robots may reduce chances for more meaningful relationships with humans, not least because these quasi-human companions are likely to provoke stigma from and condemnation by other humans. And although sex robots may seem like an elegant solution to the problem of loneliness and sexual longing, the violent extremism of misogynistic incels (involuntary celibates) problematizes this easy pairing (Bains and Hudson 2018; Ondrej forthcoming).
Another of Richardson's (2015, forthcoming) complaints that gets more traction with progressives is that some sex robots being (illegally) imported to Europe, North America, and Australia are designed as children and encourage pedophilia, which can be practiced in sex robot brothels and then carried out with human children. There is debate about whether use of child sex robots in a clinical setting could retrain pedophiles not to pursue their predilections with human children (Cox-George and Bewley 2018). Similar arguments have been made for using an adult sex robot with a consent scenario in a clinical setting (Peeters and Haselager 2021), in partial response to the #MeToo movement.
A further ethical question involves the security of the extremely private data that a sex robot is potentially able to gather about its user. Earlier, the Ashley Madison hack was mentioned (Newitz 2015). The notion of home as haven and bodily privacy as sanctum sanctorum is a strong legal doctrine (McClain 1995). But we have seen a plethora of illegal hacks not only of corporate consumer data, but of government citizen data as well. It is conceivable that data on sexual habits could be used for blackmail, extortion, or revenge porn.
In my opinion, the arguments for robot rights can be set aside because robots are not sentient beings, at least for now. Arguments about encouraging versus discouraging sexual violation of humans based on modeling behavior with robots need further empirical study. And arguments about data security are a real concern here, as they are with other IoT technologies.

Discussion and Conclusion
Despite 30 years of realistic sex dolls and the increasing presence of sex robots, this service industry has remained largely invisible, at least until a popular press or online news outlet publishes an article or video sensationalizing such activities in its vicinity. Even in cities where human sex work is legal, robot sex work seems to attract sensational press. As the preceding section emphasizes, there are some ethical reasons for some of this negative attention. While not discounting these concerns, and before discussing the larger issues of research questions three and four below, I suggest several managerial strategies to enhance sex robot service, potentially help protect human sex workers, and engage in public education.
1. Emphasize the sexual health benefits of employing a nonjudgmental sex robot in treating sexual dysfunctions.
2. Continue to work on robotic physical movement capabilities and AI improvements to sex robot functioning.
3. Continue to work on improving artificial emotion responses in sex robots.
4. Continue to improve robot sexual performance using biofeedback and a dynamic adaptation of Wundt's 3-dimensional model applied to sexual performance.
5. Sex doll designers should give more attention to the companionate function that these dolls often serve, while maintaining sexy bodies and erotic clothing to capture attraction and offer a heteronormative sexist fantasy façade.
6. Foster further documentary work with sex doll owners and users.
7. Consider adopting the Lumi Dolls try-before-you-buy introduction in sex doll brothels.
8. Develop public awareness of the use of sex dolls to reduce the dangers of human trafficking and physical violence against human sex workers.
9. Donate to rehabilitation facilities for former sex workers and partner with NGOs to operationalize such halfway houses.
10. Choose sex doll brothel locations in industrial parks away from houses, schools, houses of worship, and bars.
For further details on managing sex robot brothels, see Jones (2020).
Wundt's 3-dimensional model of emotional response has lain largely dormant theoretically for many years. Further research is needed to assess its adequacy in monitoring human sexual response. Work in adaptive personalization systems could be useful in optimizing these responses. The construct of artificial emotions is beginning to be researched, but more nuanced work is needed. Perhaps something similar to the Turing test, which compares algorithmic with human conversation, is needed, in this case comparing displays of artificial versus human emotional responses. There appear to be strong gender differences in response to the concept of sex robots (e.g., Scheutz and Arnold 2016) that warrant further study. Especially because it appears that the major customers for male sex dolls are non-cisgender males, a broader theoretical gender framework is needed as well. This is another visible community on the sex doll forums (e.g., TDF) that should be a good source of data.
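To make the biofeedback idea concrete, the adaptive loop might be sketched as follows. This is a minimal illustration, not a description of any existing system: the class and function names, the [-1, 1] scaling, and the scoring weights are all my own assumptions. Sensor-derived estimates on Wundt's three dimensions are collapsed into a single score, and a gradient-free hill-climbing controller nudges one stimulation parameter in the current direction while the score improves, reversing direction when it falls.

```python
from dataclasses import dataclass


@dataclass
class EmotionEstimate:
    """Hypothetical sensor-derived estimate on Wundt's three dimensions,
    each scaled to [-1, 1]."""
    pleasure: float   # pleasure vs. displeasure
    arousal: float    # excitement vs. calm
    tension: float    # tension vs. relaxation


def score(e: EmotionEstimate) -> float:
    """Collapse the three dimensions into one objective to maximize.
    The weights are arbitrary placeholders for illustration."""
    return e.pleasure + 0.5 * e.arousal - 0.5 * e.tension


class AdaptiveController:
    """Gradient-free hill climbing over a single stimulation parameter:
    keep stepping in the current direction while the score improves,
    reverse direction when it falls."""

    def __init__(self, intensity: float = 0.5, step: float = 0.1) -> None:
        self.intensity = intensity   # control parameter, clamped to [0, 1]
        self.step = step
        self._direction = 1.0
        self._last_score = None

    def update(self, estimate: EmotionEstimate) -> float:
        s = score(estimate)
        if self._last_score is not None and s < self._last_score:
            self._direction *= -1.0  # the last step hurt; back off
        self._last_score = s
        self.intensity = min(1.0, max(0.0,
                             self.intensity + self._direction * self.step))
        return self.intensity
```

In practice, noisy physiological estimates would need smoothing and several parameters would be adapted jointly, but the reverse-on-decline logic captures the kind of dynamic adaptation suggested above.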
It is easy to see why press sensationalism occurs in reporting on sex robots. But there seem to be some deeper anxieties surfaced by these activities as well. In what follows, I outline several of these anxieties in addressing RQs 3 and 4, involving the effects of sex robots on human-human relationships and society. Each requires further research to understand more fully how realistic these anxieties are and what issues they bring to the fore. The first deeper fear (RQ 3) is that as we engage with robots, we may disengage from each other. Together with other technologies substituting for human roles in the home, starting with applications like Siri, Alexa, and smart home devices, there is a growing challenge to the role of the traditional family (RQ 4). For example, queer theorists Strengers and Kennedy (2020, p. 2) foresee "the slow death of the wife." Families only began in the Neolithic era after the development of agriculture and settled communities (Spar 2020). Children were needed to help farm, so the communal model of child-rearing practiced among hunter-gatherers was largely abandoned in favor of familial care. The Industrial Revolution was the next big change, when women and children (and later men) began to work outside the home in factories.
We are now at the beginning of the Digital Revolution, and together with birth control and assisted reproductive technologies, two-parent heteronormative families are no longer needed in the traditional sense. Furthermore, thanks to an explosion of online pornography, the acceptance of sex toys, rising divorce rates, and growing singlehood (Lee 2017; Strengers and Kennedy 2020), we no longer need human partners for sex, which has become more individualized (RQ 3). Enter the love and sex robot as both a consumer product and a consumer service that is beginning to be purchased and hired, primarily by heterosexual males. At the same time, sex toys for women are no longer taboo and are even advertised on television in the UK (Bardzell and Bardzell 2011; Devlin 2018; Hester 2014; Lee 2017; Wilner and Huff 2017). The resulting loss of family thus arises from sex robots, sex toys, ubiquitous pornography, and the substitution of solitary pleasure-seeking for traditional (human to human) sexual access within marriage. Thus, we have possible early signs of the death of the family. This is largely consistent with "family decline theory" (Langcaster-James and Bentley 2018; Schneer and Reitman 2017).
An additional underlying fear, as love and sex robot service workers become more common, is that as we see more commodification of sex, we will also begin to believe that we too are fungible commodities with little to differentiate us from robots. This possibility is also apparent in the readily accessible potential short-term partners of dating and hookup apps like Tinder and Grindr, where there seems to be an endless supply of choices and superficial appearances appear to be critical. This is also seen in the tendency of sex dolls to boast bigger breasts, bigger penises, and bigger buttocks. Pornography reinforces these bigger-is-better archetypes. But humanoid love and sex robots may reinforce a repressed fear of our own sexual inadequacy.
These broader societal fears help ground a call for considering whether and how to legislate and police love and sex robot marketing and operations. This is a genie that will be difficult, and more likely impossible, to put back in the bottle. Sex robots are certainly an emotional purchase and an emotional service consumption experience. Perhaps birth and funeral services rival sexual services for intensity of emotion for the key actors, but they are not as high tech and high touch as having sex with a robot.
The most fundamental behavioral and ethical question is whether this "artificial sex" with "artificial partners" constitutes a utopia or a dystopia. In multiple dystopian fictional portrayals, the emotions generated by sex and war are seen as dangerous and as needing control. In Huxley's (1932) Brave New World people are fed soma to keep them happy and sexual promiscuity is encouraged to keep violent and revolutionary emotions in check. Orwell's (1949) 1984 is more repressive and uses surveillance to suppress subversive thought. Here the family is done away with as a threat to loyalty toward Big Brother. Sex is also suppressed and to be used only for procreation. Perpetual war and reported victories are used to reinforce loyalty to the state. Artificial sex may not be a utopia, but it is certainly not the dystopia of 1984.
As technology has developed, robots have also become a more important part of speculative fiction. In Levin's (1972) The Stepford Wives, a group of engineers in the Stepford Men's Club turn their would-be liberated wives into beautiful, docile, and obedient stay-at-home robot doubles. They never age and presumably never object to the sexual demands of their husbands. Dick's (1968) Do Androids Dream of Electric Sheep? and the two loosely based Blade Runner films feature androids ("replicants" in the films) who are designed to exist for only four years lest they develop emotions and the desire to pass as humans. Bounty hunters track down replicants to exterminate them, identifying them by testing whether they show empathy, in which case they are deemed human. If not, they are replicants and must be "killed." Sex in Blade Runner is normally with prostitutes, but the central character unknowingly develops a relationship with a replicant. Eventually his artificial partner admits that she has dispatched many bounty hunters by seducing them, the same femme fatale motif found in the Pandora myth (Mulvey 2013) as well as in Her, Metropolis, and Ghost in the Shell. In the Westworld films and television series, the robots include robot prostitutes and a cast of others who can be killed with impunity by guests at the Westworld theme park. The robots who are killed have their memories erased and are physically restored by park engineers. Here it is both the aggressive and the erotic emotions of humans that are exercised and satisfied by the realistic robots, at least until robot memories begin to flicker back and they rebel. As just one more example, in the "Be Right Back" episode of the British television series Black Mirror (Brooker and Jones 2018), Martha, a woman whose young husband Ash has died, first has him recreated through traces from his online activity (shades of Replika; see Olson 2018), and eventually re-embodied in a look-alike robot.
Aside from this advance in technology, the episode is a gender-swapped and extended version of The Stepford Wives (Levin 1972). Martha is later disappointed that robot Ash cannot read her moods. She eventually realizes that her robot reboot husband is not the Ash she married, at which point she terminates him. It may be that after the sex robot novelty subsides, consumers will reach the same conclusion. It is too soon to say for certain.
Many of these futuristic plots dance around the edge of what is possible now or soon will be. They also suggest the simultaneous push and pull, love and hate, feelings we seem to have toward sexually capable robots, as in the Black Mirror episode with Martha and Ash. The speculative fiction plots about sex robots seem to illustrate and explain some of society's polarized emotions regarding sex robots. Beyond the attraction or repulsion that people may feel toward the idea of sex robots, there remain basic behavioral questions: can we love something that can never love us? Is close enough to being human good enough? Are we coming close to a material 3-dimensional version of the "feelies" of Brave New World? And if so, is this wonderful or terrible? Could we be fulfilled by access to or ownership of a love and sex robot? What difference does this product/service distinction make? Sex-for-sale has been an important part of the service economy for millennia and will very likely remain so. But sex robots not only add fake empathic robotic sex workers to the equation, for better or worse; they also provide a service access alternative that introduces new possibilities never before available to the masses.
Summarizing, artificial humans in the form of sex dolls and robots have become the male equivalent of female sex toys which along with online pornography have become normalized in most of Asia and the West. They are expensive to purchase but are becoming increasingly available in sex doll brothels and via discreet home delivery. Most research to date has been done with purchasers who find that the dolls often become companions as much as sex objects. The opening of sex doll brothels often causes a moral panic inflamed by media accounts and photos rather than firsthand experience. As with pornography there are outcries of immorality, misogyny, objectification of women, pedophilia, and encouraging violence toward women. Some of this critique is quite warranted. Nevertheless, like pornography and prostitution involving human sex workers, love and sex dolls are likely to fade into the background and eventually become accepted in many places and be driven underground in others.
I define romantic love in terms of attachment. Although rented love and sex robots may profess or feign love, the lack of ownership means that unlike owners of love dolls, renters are not apt to feel loved so much as sexually satisfied. Objectively, sex dolls and robots should be less controversial than human sex workers. Sexist and misogynistic as these dolls are, they are, after all, objects. It should be less objectionable to treat objects as objects than to treat humans as objects. A more realistic objection is that the potential for better-than-human sex, if it really does emerge in something like the algorithmic optimization manner suggested here, involves a threat to human-human relations. Together with the growth of IVF, pornography, the normalization of divorce, and sex toys for women, there are also larger threats to the traditional family. Sex dolls may well exacerbate this onslaught against the family, but they did not initiate it, and it will likely continue to grow with or without sex dolls.
Whatever the ultimate outcome of our flirtation with sex robots, the human drive to create artificial human beings with artificial emotions is a long-standing one in both fact and fiction (Belk, Humayun, and Gopaldas 2020). If service robots master the triple-jump from the physical economy to the thinking economy to the feeling economy (Huang and Rust 2018), sex robots will offer a critical testing ground for consumer receptivity toward and acceptance of humanoids with artificial emotions. In this arena it may be that near enough is good enough when it comes to emotional expression; but this is only a part of sex robot performance. The larger part may be sexual performance per se. Here it could be humans who struggle to keep up and who hope that near enough (to the pleasure provided by sex robot rentals) is good enough.

Declaration of Conflicting Interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding
The author(s) received no financial support for the research, authorship, and/or publication of this article.