Logics, tensions and negotiations in the everyday life of a news-ranking algorithm

This article attends to tensions and negotiations surrounding the introduction and development of a news-ranking algorithm in a Swedish daily. Approaching algorithms as culture, composed of collective human practices, the study emphasizes socio-institutional dynamics in the everyday life of the algorithm. The focus on tensions and negotiations is justified from an institutional perspective and operationalized through an analytical framework of logics. Empirically, the study is based on interviews with 14 in-house workers at the daily: journalists as well as programmers and market actors. The study shows that logics connected to both journalism and programming co-developed the news-ranking algorithm. Tensions around these logics, and their negotiation, contributed to its very development. One example is the labeling of the algorithm as editor-led, allowing journalists to oversee some of its parameters. Social practices in the newsroom, such as Algorithm-Coffee, were also important for its development. In other words, different actors, tensions between them, and how these were negotiated co-constituted the algorithm itself.


Introduction
This article takes us back to 2015, when a Swedish daily (the Daily henceforth) introduced an algorithm to automatically rank and mix news on the front-page (the Algorithm henceforth). The problem for the Daily was profitability. The number of visits to the website was down, advertising was down, and there were even rumors that its owners wanted to sell the newspaper. To become profitable, the Daily was given one year to launch a new site. For this purpose, a rather loose team of programmers and journalists was put together. The Daily needed to do more with fewer people, use its journalist resources better, and "not having us to drag news stories up and down their online front-page," as one journalist phrased it (all quotes translated into English by the author). In other words, automating the ranking and mixing of news on the front-page would imply less journalistic labor and thus fewer expenses. The Algorithm is thus an example of algorithmic automation and its imagined benefits.
This article attends to the humans behind algorithmic automation: how journalist, market, and tech actors jointly made sense of the Algorithm and negotiated tensions in connection with its introduction and development. The article provides an in-depth, empirical, qualitative, behind-the-screen account of algorithmic automation in context. In the next section, I situate the study in relation to academic discussions on automation. By approaching both journalism and algorithms as institutions, tensions and negotiations around algorithmic automation in the newsroom are foregrounded. One way to study this is through different, clashing and sometimes overlapping, logics connected to both journalism and algorithms. Logics thus constitute the article's analytical framework. Following this, the research questions revolve around logics, tensions, and negotiations. Before the findings, the method is presented.

Background: Automation
Within the academic literature, automation is sometimes approached as a complete tech take-over, something that cannot be avoided, a fait accompli. In journalism research, this has been referred to as a technological fetish (Comor and Compton, 2015), the belief in technology as inherently powerful. Milosavljevic and Vobic (2019b) lament what they label an algorithmic sublime, downplaying automation's degrading prospects and elevating its revitalizing visions for journalism. In a recent book, Andrejevic (2020) argues that we live in an era of a cascading logic of automation (p. 9), referring to how automated data collection leads to automated data processing, which leads to an automated response. This bias of automation (p. 21) preempts human decision-making by operationalizing large datasets and their collection, through which it is believed that everything is captured from every possible angle. According to Andrejevic (2020: 30), there is a post-social bias as automation attempts to displace social processes with machinic ones, replacing humans, human judgment, and decision-making. In journalism research, news automation is sometimes labeled robot journalism (Clerwall, 2014). This has led to an automation anxiety among journalists (Akst, referred to in Lindén, 2017: 124).
In contrast to Andrejevic's post-social account of automation, others have forwarded algorithms as both social and material processes (Bucher, 2017a; Neyland, 2019). Already in 1986, Noble argued that automation, more than merely a technological advance, is a social process. As such, algorithms should be studied in their particular socio-institutional situations. They should be approached as unstable and as enacted through the people who engage with them, rather than as constrained and procedural formulas (Seaver, 2017; Neyland, 2019). Bucher (2017a) has, for example, shown that algorithmic imaginaries are important for, in her case, the molding of the Facebook algorithm itself. By forwarding the idea of algorithms as culture, Seaver (2017) similarly underlines that algorithms are enacted by practices that blend the technical and the social. Algorithms are cultural, not because they influence cultural practices and artefacts (like a newspaper) or are objects of concern (when, for example, automating journalists' work), but because they are composed of collective human practices. Following this, it becomes important to attend to socio-institutional dynamics around algorithms in their everyday life. Christin (2020) has, for example, shown how cultural differences between French and American journalists influenced how they approached and understood metrics and algorithms at work. She reveals a dialectical relationship between algorithms and the institutional contexts in which they are used and concludes that we need to understand algorithms as "contested symbolic objects" (Christin, 2020: 153). As such, they can be negotiated, contested, and used in different ways depending on their institutional context (Christin, 2020: 12). Therefore, it becomes important to attend to how algorithms participate in the everyday, compose the everyday, and thus also become the everyday (Neyland, 2019).
Disenchanted with the algorithmic drama-popular concerns about the supposed power and opacity of algorithms-Neyland argues that "the everyday, humdrum banalities of life are somewhat sidelined" (Neyland, 2019: 3, with reference to Ziewitz).
One way to study people and everyday practices behind algorithms is through an institutional perspective. Institutional theory, however, is characterized by definitional ambiguity and interpretive inconsistency (Napoli, 2014: 343). This article follows Jepperson's (1991) definition of institutions as routines, norms, rules, and behavioral guidelines. Journalism is often approached as an institution as it is constituted by shared beliefs and norms, rules, and routines (Vos, 2019), even though institutional contexts (the national setting, professional fields, and organizational structures in which journalists are embedded) differ between cultures (see Christin, 2020). Furthermore, journalism is also interconnected with other institutions, which may then become sources of tensions over, for example, journalism's legitimacy, destabilizing the beliefs, norms, and rules that have constituted the institution (Vos, 2019). Algorithms could be thought of as one such other institution. Napoli (2014) argues here that institutional theory can be useful for understanding algorithms in media production and shows that an institutional analytical frame is valuable in those increasingly common instances in which algorithms serve in capacities that intersect with those of traditional media institutions. The process via which practices, norms, and cognitions become accepted and adopted (i.e., institutionalization), how institutions develop, thus becomes an important line of inquiry (Napoli, 2014: 357).
Newsrooms' use of algorithms has indeed created a stir within journalism. Scholars have shown that algorithms are increasingly involved in automating the production and consumption of media (Napoli, 2014), sometimes replacing editorial decision-making (DeVito, 2016), and even journalists themselves (Clerwall, 2014; Diakopoulos, 2019). Indeed, what Christin (2020: 5) labels click-based versus editorial-based evaluations of news value often clash. Algorithms question journalistic rules and values (Singer, 2003), and create "ambiguity and uncertainty for actors who need their practices, assumptions, values, and beliefs to be grounded in some widely legitimated institutional order" (Lowrey, 2017: 380). Ferrer-Conill and Tandoc (2018: 437) therefore warn us that journalism risks being decoupled from its civic duty.
At the same time, some studies have concluded that humans are still important in the newsroom and that many jobs in journalism remain despite ongoing digitalization. Lindén (2017) argues that this is so because journalism's imagination of a social contract, providing readers what they need to function as informed citizens in a democracy, mitigates what Andrejevic (2020: 9) labels a cascading logic of automation. Nonetheless, the influx of tech actors into the heart of the newsroom has brought with it tensions, as their norms of personalization, rules of trending, and data-driven development are particularly unsettling to the institution of journalism with its one-way publishing mind-set (Lewis and Westlund, 2014).

Analytical framework: Logics
One way to study tensions when institutions intersect is by attending to the logics that accompany the different actors in the newsroom. The concept of logics is also connected to institutional theory (at least since the 70s, see Thornton and Ocasio, 2008). Logics will thus function as this article's analytical framework, the lens through which tensions and negotiations between different actors in the newsroom may be studied and made sense of. As Seaver underlined already in 2013, algorithms are systems that sometimes have hundreds of hands reaching into them and we need to examine the logics that guide these hands.
But what is meant by logics? The term has come to imply thought and reason and has thus been connected to the study of forms of argument. In this article, logics are understood as sense-making/meaning-generating frameworks for practices and experiences (Lowrey, 2017: 380). Logics thus organize work and provide stability to social behavior (Asp, 2014: 259). Anonymized for peer review define logics as rules of the game, meaning the ideals, commercial imperatives, and technological affordances behind, in their case, how information is produced, distributed, and used. This study departs from this definition and operationalizes logics in terms of rules, values, and imaginations. This also resonates with how institutions are studied in institutional theory as regulative (i.e., rules), normative (i.e., norms), and cultural-cognitive (i.e., imagination, see Scott, cited in Napoli, 2014: 343).
Within journalism research, a logics framework is frequently used. Asp (2014), for example, argues that news organizations are Janus-faced as they simultaneously follow a normative logic (serving a democratic function) and a market-driven logic (making money). Sjövaag (2010) connects journalism's democratic function to the idea of a social contract, journalists being the ones to fulfill a watchdog function in mature democracies. As Asp (2014) shows, the idea of serving citizens with news may clash with the market imperative to make money. On top of this, Lewis and Westlund (2014) argue that digital technology constitutes a third face on news organizations' Janus bust. Discussing challenges to journalism in a digital age, Westlund (2017) adds a logic of participation and a logic of automation. The logic of participation values citizens' possibilities to participate in journalism by, for example, (co)creating and discussing news. The logic of automation values technology's possibilities for journalism in, for example, reducing costs and streamlining newsroom work (Westlund, 2017). To this can be added Lewis and Usher's (2013: 607-608) study of open source in journalism, in which they underline the normative values of iteration (continuously releasing unfinished code for beta-testing) and tinkering (privileging play and participation). Lowrey (2017: 382) suggests that new logics introduced into the institution of journalism may clash with more traditional ones. As Asp (2014: 261-262) argues, journalism's values of public service, objectivity, autonomy, immediacy, ethics, and its social contract are challenged by algorithms. Story selection on Facebook, for example, values friend relationships and expressed user interest (DeVito, 2016) rather than a relevant and balanced news mix in the service of an informed citizenry. Personalization, in the sense of serving readers what they want, does not fit easily with journalism's social contract.
That democracy cannot be personalized is a widespread idea among staff in Scandinavian news organizations (Bucher, 2017b). What Deuze (2005: 449) labels editorial autonomy, i.e. having a designated publisher that can legally be held responsible, is an important rule in journalism. This rule is confronted online. Indeed, newsrooms need to find a balance between a click-based imperative and editorial ambitions (Christin, 2020: 122). In other words, when logics connected to programmers and algorithms enter the newsroom, they create tensions (see also Deuze, 2005: 455; Milosavljevic and Vobic, 2019b: 6). But how do these tensions play out in context? This leads me to the research questions.

Research questions and contribution
Following the background underlining the people and everyday practices behind algorithmic automation, an understanding of both journalism and algorithms as institutions, and an analytical framework around logics, the research questions are as follows: 1) What logics were at play during the introduction and development of the Algorithm? 2) What tensions did these logics give rise to, or contribute to, in the Daily's newsroom? 3) In what ways were these tensions negotiated?
Within journalism studies, algorithms are not a novel topic. There are accounts of data journalism (Appelgren and Nygren, 2014), computational journalism (Bucher, 2017b), computer-assisted reporting (Coddington, 2015), and algorithmic journalism (Dörr, 2016), just to mention a few. Indeed, attempts to define new forms of journalism in the wake of digitalization have produced a "cacophony of overlapping and indistinct definitions" (Coddington, 2015: 322). This study is, however, not easily placed within any of these definitions. For example, the Algorithm deals with the automation of work processes rather than the distribution of news (as with Dörr's concept of algorithmic journalism). Diakopoulos's (2019) recent work concerns automation of content rather than the ranking and mixing of news on the front-page. My study rather follows Christin's (2020: 4) call to study algorithms in practice and conceptualize them as symbolic resources that can be negotiated, contested, and used in different ways. But in contrast to her, this article focuses on tensions and negotiations between different actors within the newsroom rather than with algorithmic publics (see Christin, 2020: 6). Furthermore, her study focused on news outlets that were exclusively online. This article attends to a leading Swedish legacy daily and focuses on the introduction and evolution of algorithmic automation of work processes rather than on audience metrics. Here, I follow Westlund's (2017) call for research on negotiations between different actors within the newsroom. Similarly, Napoli (2014) has called for an inquiry into the processes via which practices and norms become accepted and adopted.
It is here this article contributes, with an in-depth, qualitative, and empirical snapshot of the negotiated and institutional character of algorithmic automation in a newsroom setting, by zooming in on the logics, tensions, and negotiations between in-house tech, market, and journalist actors behind the front-page of the Daily.

Methodological considerations
This study is part of a larger project researching humans, cultures, and practices behind algorithms (see Anonymized for peer review). Preparing for a larger project application, I contacted acquaintances working in tech. From there, I snowballed my way forward and came in contact with a programmer working on a short-term contract at the Daily. He put me in contact with a manager who agreed to be part of an in-depth study. The manager has been my main point of contact, a gate-opener, and has also organized interviews, especially during the first visit to the Daily. Learning more about the work with the Algorithm, I became increasingly independent in arranging interviews with actors I deemed necessary for this study (see Table 1).
Focusing on logics, tensions, and negotiations behind the Algorithm, a qualitative methodology seemed appropriate. For this article, the majority of the data has consisted of interviews. Apart from general questions about the participants themselves, their background, the introduction of the Algorithm, and practices around its development, the interviews were semi-structured into themes revolving around logics in terms of rules, values, and imaginations at play and how these were navigated and negotiated. With the reservation that rules, values, and imaginations intersect, questions regarding rules revolved around what regulated their workday and their work with the Algorithm. To approach values, I asked questions about rituals, role models (heroes), and symbols. Concerning imaginations, I asked questions about how they viewed the future, their greatest hopes as well as fears. These questions led participants to formulate imaginations of what ideal journalism, ideal programming, or a sustainable newspaper market should look like. The main part of the interviews revolved around tensions and negotiations connected to different rules, values, and imaginations. All the interviews followed the same themes but differed in which directions they meandered and where emphasis was put (i.e., they were semi-structured).
Apart from the introductory interviews (accounted for above), 19 interviews were conducted with 14 participants, lasting on average 50 min each. Conducting these interviews, it became apparent that participants had held different positions in the organization (see Table 1) and could thus reflect upon my questions from different perspectives. The study reached saturation in that nothing novel emerged during the last interviews.
The interviews took place during 2018 and all but three took place at the Daily. The other three were conducted at cafés or lunch restaurants. Follow-up interviews were conducted digitally during April 2021 with interviewees 1, 3, 5, 6, and 12. The reasons for getting back to participants were twofold. First, as algorithms are never stable and are in constant development, I wanted to ask about the current development of the Algorithm and its parameters. Second, as the initial interviews were conducted some years ago, I wanted to make sure my results were still valid. The interview transcriptions were analyzed vertically (interview by interview) and horizontally (theme by theme across all interviews). I have also spent time in the Daily's newsroom before and during interviews, sitting in on meetings and observing the different actors and sections in the newsroom. One limitation of an interview-based study is that it relies on reflective accounts from participants asked to remember the introduction of the Algorithm. I have had little direct access to any decision-making moments, only to accounts of them. How decisions were made would have been interesting to observe, to understand whose voices and which arguments resonated the most. Furthermore, external tech actors hired on a consultancy basis have not been included in the study. One interesting observation, though, is that many of the interviewed in-house programmers worked on a consultancy basis during the introduction of the Algorithm in 2015. The results should also be viewed in light of the fact that only one algorithm in one news organization was studied, as well as its Swedish context with its relatively high circulation and extensive readership. National context and professional culture indeed matter (see Christin, 2020).
Concerning ethics, I have been open about the research and who I was during the data-gathering. Due to a request from the manager, participants have been anonymized, as has the Daily itself. This does not entail full anonymity, as people with insight into the Swedish news ecology will be able to figure out which newspaper this study is about. Following Kozinets (2011: 210-211), this level of anonymization falls into middle-masking. This middle-masking has been explained to all participants, and informed consent has been secured. A confidentiality agreement with the Daily has also been signed, which among other things prohibits me from directly revealing the name of the newspaper and media group. Following this agreement, it will not be possible to disclose how many worked for the Daily, or any details about the organizational structure. This is considered a minor limitation, as this is not primarily a case study of a news organization, but a study of how tech and journalist actors navigated and negotiated logics connected to introducing and developing the Algorithm.

Findings
Logics. The findings revolve around four different (while intersecting) logics that were apparent in the empirical material. Some of these are well-known in journalism research and some are perhaps more novel. But as Thornton and Ocasio (2008: 102) argue, individual and institutional behavior must be located in context. Therefore, I will start the presentation of the findings by briefly outlining these different, while overlapping, logics that guided actors in the Daily's newsroom (see Table 2).
First, a market logic was apparent in the Daily's newsroom. The Algorithm was introduced because the Daily needed to make a profit. To save money, the Daily needed to do more with fewer people, to use its journalist resources better, and "not having us to manually drag news stories up and down their online front-page," as one journalist phrased it. In terms of values, a market logic prioritizes profit, connected to rules of making money (at the Daily, in terms of advertisement and subscription) and an imagination of an ever-increasing market.
Second, a democratic logic of journalism was also prominent at the Daily. This logic values freedom of and the right to information, together with an imagination of a social contract in which journalists give readers what they need. It also comes with rules of publishing deadlines and having a responsible publisher. As one of the programmers phrased it: We have an algorithm that controls which news reaches the readers (…) It is an important aspect of a democracy that no matter who you are, you should get the same news mix.
These logics are not novel and have created conflicts within the institution of journalism, as Asp (2014) has discussed. But with the influx of programmers into the Daily's newsroom, a programming logic also became apparent. This logic values progress and change, follows rules of data-driven development, and carries an imagination of solving all kinds of problems through algorithmic automation (here, saving labor and hence money, see also anonymized for peer review).
What Westlund (2017) discusses as a logic of participation was not apparent, as the Daily's readers/visitors to the front-page have not been included in this study. But what I label a programming logic has similarities with Westlund's logic of automation, solving the Daily's profit problem by reducing costs and streamlining news publishing. This logic's rule of data-driven development also resonates with Lewis and Usher's (2013: 607-608) normative value of iteration.
Finally, a logic of personalization was apparent, with values of user experience, rules of engagement and a popularity principle, together with an imagination of giving users what they want through an algorithm that is, if not unbiased, at least less biased than humans. "We have to give readers what they want," as one of the programmers phrased it, adding "without personalizing too much." Indeed, personalization has challenged journalists for some time now (see Bucher, 2017b; Christin, 2020).
The logics at play at the Daily are thus not a surprising finding. But with these four logics established, it is possible to attend to the other two research questions, concerning tensions and negotiations. "New," or rather hybrid, logics take form as rules, values, and imaginations are negotiated by actors trying to make sense of change. Let us therefore first turn to the tensions that the introduction and development of the Algorithm gave rise to.

Tensions
The most apparent tension during the introduction of the Algorithm was that between tech and journalist actors, automating something "that had been someone's baby, to do the front-page" (as one programmer phrased it). At the Daily, programmers talked about journalists "not being used to technology having a big impact on their everyday work." Journalists themselves mentioned their fears, "having built a career and pride manually controlling the front-page," a front-page that was now run by "IT-boys without editorial experience." One journalist described this automation anxiety (Akst 2013; in Lindén, 2017: 124) as a feeling of your work being "worthless when you can just be replaced by technology." Below is an illustrative example from a UX designer reflecting on herself when she was new at the Media Group: You must think that those who worked at the newspaper were old in the business, came from print and knew basically everything, and here comes a girl straight from university, not even finished (talking about herself, author's remark) … I remember, there was a very stubborn editor that I cleaned up after (online, author's remark) (…) and at a meeting, this editor, he was so angry, he sat and bit on an ink pen, and finally it broke so there was ink dripping from his mouth Author: He was angry with you? Yes, though he did not take it out on me, more that "I have been working for 30 years and they are doing as she says" (mimicking the editor she is talking about, author's remark, emphasis in original).
The rules of programming and journalism also differ and thus created tensions in the newsroom. Programmers developed their services by rolling them out a little at a time, so-called data-driven development (see also Anonymized for peer review), or tinkering and iteration, to use Lewis and Usher's (2013) terminology. "A news article you can write in less than an hour, a new web function can take up to 3 weeks to develop," as one programmer phrased it. While journalists worked along strict rules of deadlines, when everything must be finished and ready to publish, programmers seemed to prefer to test, to roll out little by little, to consider feedback data and improve the calculations accordingly. To release a whole new webpage all at once was "terrifying," as one programmer explained, or in the words of a tech developer: In journalism, there are no long perspectives, you know what is going to happen next week, but then it ends, then you have no idea. From an editorial perspective, it is perfect, but from a product development and technology perspective, it is a bit more difficult because you want a long-term perspective.
Tensions connected to journalists' expectations of tech were also apparent at the Daily. Among programmers there was a feeling that journalists underestimated how difficult it would be to program certain functions. Some programmers complained about journalists expressing wishes for technological solutions that were completely unrealistic, as in the example below (emphasis in original): If someone who is not technically knowledgeable should think about a solution that … "it should be a button there and when you press the button it should come up exactly the stuff you like" (mimicking a journalist, author's remark). … and then you are like … from where does the stuff you like come? And he answers, "from pressing the button" … but if there should be a button that selects articles you like, then there must be a model behind it, some really advanced AI.
Journalists' requests for difficult functions can be linked to the programming logic's imagination of solving all kinds of problems through algorithmic computation, the belief in technology as inherently powerful (what Comor and Compton, 2015, label a technological fetish). Interviews suggest a kind of jump-on-the-bandwagon mentality: the Daily had to "have these systems," "build on these techniques" and "think in these ways" (illustrating Milosavljevic and Vobic's, 2019b, concept of the algorithmic sublime). This can in turn be linked to the programming logic's value of progress and change as something positive in itself.
The introduction of the Algorithm also accentuated old tensions between journalists, ad, and subscription departments. The Algorithm regulated the front-page. Here, the market logic pushed for ranking subscription content high, to signal that if you pay you get more. Advertisements should also be highly visible on the front-page, since they still account for 80% of the Daily's profit. And the democratic logic pushed for journalistic content, the reason why readers go to the Daily's website in the first place. Profit is also the main reason for the Daily's presence on social media (Facebook, Instagram, and Twitter) and for having an assigned social media editor to work with these platforms. Her mission is to find readers in their social media flows and direct them to the Daily's front-page, for subscription conversion and for generating page views and ad-clicks.
There was also a tension within the market logic: profit through subscription collided with promoting clicks onto the site (i.e., profit through advertisement). The rules of the market logic, that is, how to make a profit, seemed to be changing, and this impacted the work with the Algorithm. So-called native advertising (integrated into the news flow like news stories) was gaining ground in relation to display advertising (based on page views, still the major source of income in terms of digital ads). The Daily had gone from 95 to 80% ad-financed, with digital subscriptions going up. Material for paying subscribers was thus becoming increasingly important, as was locking news stories to convert readers into paying subscribers. During the time of study, the Daily started to experiment with adding a parameter to the Algorithm that automatically locked news stories, mainly on the basis of how much traffic the story had generated. Again, at play here is the programming logic's imagination of problem-solving through algorithmic automation.
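The traffic-based auto-locking experiment described above can be illustrated with a minimal sketch. The article does not disclose the Daily's actual implementation; the threshold value, field name, and function name below are hypothetical assumptions for illustration only.

```python
# Hypothetical sketch of a traffic-based auto-locking parameter.
# The threshold and the page_views field are illustrative assumptions,
# not the Daily's actual implementation.

LOCK_THRESHOLD = 10_000  # page views; assumed value for illustration

def should_lock(story, threshold=LOCK_THRESHOLD):
    """Lock a story behind the paywall once its traffic suggests it can
    serve as a subscription-conversion incentive."""
    return story["page_views"] >= threshold

# Usage: a high-traffic story gets locked, a low-traffic one stays open.
print(should_lock({"page_views": 25_000}))  # True
print(should_lock({"page_views": 500}))     # False
```

The design choice worth noting is that the lock decision is driven by an automated signal (traffic) rather than editorial judgment, which is precisely the programming logic's problem-solving imagination at work.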
Being present on social media meant that the Daily was constantly adapting to a logic of personalization. The idea of enhancing the visibility of quality journalistic content on social media platforms is an example of how the personalization logic's popularity principle intersected with journalism's democratic logic. But these logics were also in opposition to each other: journalism's imagination of serving readers/citizens what they need stood against the imagination of serving users what they want through personalization (see also Bucher, 2017b; Christin, 2020). Within the Daily, a web developer phrased it like this: "we are not Facebook, we produce journalistic content, and it is important to challenge our readers." This is the idea that the Daily needs to challenge its readers by giving them the right material; "we have a democratic mission, and we cannot provide filter bubbles as in other places," as one programmer put it. At the same time, giving users what they want did intersect with the market logic's value of profit-making. Christin (2020: 6) even argues that this chase for clicks is part of a longer trend in journalism towards a market logic. This was evident in the following interview with a web developer: We have the demands to deliver revenue and success in the business. And we see how other actors out there work. Netflix had received 1 million new users. We see how digital development takes place and there we must keep up even if we are a super small organization. But the customers have the same expectation on us as on Netflix.
Hence, there were not only tensions between different actors in the Daily's newsroom. The empirical material also shows how different logics co-existed. Rather than clashing into each other (Lowrey, 2017: 382), logics not only co-existed, but also intersected and sometimes could feed off each other. This leads me to the next section on negotiations.

Negotiations
The quest for profit is a core value according to the market logic. The Daily needed to do more with less people, to use their journalist resources better. What is interesting here is how a market logic value of profit intersected with an imagination of saving labor (and hence money) by automation (i.e., a programming logic). The imagination that algorithmic automation is "labor-saving, making it possible to do more things without employing more people" and that "journalists should do journalism and not drag things up and down the front-page," was prominent. At the Daily, it became apparent that the Algorithm itself stood for change (a value connected to programming logic) and that it could solve the Daily's problem with profitability (a value connected to market logic) by automating tasks (an imagination connected to programming logic) that had previously been done manually by journalists. And since this task was not directly connected to journalism's democratic purpose, such automation could be justified.
In order to negotiate tensions between tech and journalist actors, the Algorithm was labeled editor-led. Journalists often referred to their gut feeling, their tacit knowledge and internalized rules. In the development team, it was therefore decided that journalists should manually put in some parameters of the Algorithm. "We were very responsive, journalists had the final say," as one programmer explained. This is interesting in itself, because referring to the Algorithm as automating news-ranking is only partially true. Not all parameters of the Algorithm's computations were automated.
When arguing for the Algorithm in 2015, programmers worked with an automated front-page side-by-side with the front-page manually rendered by the editor. In this way, by showing similarities between the two pages, programmers could eventually convince the editors of the Algorithm's merits. Interesting here is how programmers used the brand, the Daily, its reputation and claimed niche in the Swedish news ecology. Because the side-by-side comparison allowed programmers to show how different editors' gut feelings (or instincts, see Bucher, 2017b) differed among them. Indeed, algorithms do come with a promise of objectivity, being detached from human bias (see Author removed for peer review), or in the words of a web developer: Before you could notice who had had the editing shift. If there was a lot of international news on the front-page, then we knew who the editor had been. Now it is a more coherent product.
Still, what Deuze (2005: 449) labels editorial autonomy and Christin (2020: 5) editorial-based evaluation, was important. "The Algorithm cannot become like magic, we need to have control, especially in a newspaper with a publisher," as one editor put it. And here, the fear was that technology would extend beyond editors' comprehension. Even programmers expressed that if the Algorithm was to be editor-led, the editors needed to understand the Algorithm and be perceived as the ones in control of its calculations. Labeling the Algorithm editor-led was one way to negotiate editorial autonomy in this context of algorithmic automation.
Tensions connected to different expectations of algorithmic automation were a topic for Algorithm-Coffee, a weekly meeting where programmers, journalists and often also data analysts met to discuss the Algorithm over a cup of coffee. After some time, journalists learned how the Algorithm functioned and could also outsmart it, or "massage it" (as one journalist phrased it), for example by putting in news values that would rank their story at the top of the front-page. Getting feedback on the Algorithm, "disciplining" journalists, and dealing with tensions between different groups at the Daily were also purposes of Algorithm-Coffee.
Algorithm-Coffee seems to have worked: according to my interviews, tensions between tech and journalist actors became less pronounced over time. The Daily now has a "digital first thinking," as one editor phrased it. "I love the Algorithm," exclaimed another editor, adding that "we get more time to be editors now." Programmers say they feel journalists trust them more, and one journalist told me she is not "afraid any longer." These are all examples of how journalists have accepted the intersection of market and programming logics in the Algorithm, with its imagination of freeing up resources for them to do more important (journalistic) work. Allowing the Algorithm to be editor-led (and thus only semi-automated), and having journalist and tech actors work together tweaking and maintaining the Algorithm during Algorithm-Coffee, were underlined in the interviews as factors behind journalists' acceptance of the Algorithm.
But there were other tensions in the newsroom to be negotiated, especially those between journalist and market actors. Interesting here is how journalists could use the personalization logic's value of user experience to argue against advertisements on the front-page. "The revenue has trumped users' experiences" and "no reader wants a giant ad before you can reach your news story," as two journalists argued. Indeed, the value of user experience informed arguments for prioritizing subscriptions over advertisement. As one programmer explained, "subscription needs an attractive product to sell to subscribers, advertisement needs space to sell ads, but then the product becomes less attractive." This is an example of how a market turn towards subscription was justified in terms of the democratic logic of journalism, as also explained by a web developer below: The business has been driven by a click hysteria and we want to get away from that. It has not promoted the quality of the content, and this is connected to the brand. For the quality of content, it is much better for us to have a subscriber than an advertiser. However, journalists' urge to be read sometimes manifested itself in them not wanting their stories to be locked for subscribers only. Similarly, the social media editor fought to keep material open so that links to news stories could be freely shared on the Daily's social media accounts. Not knowing what stories would be locked, and when they would be locked, was a problem for her. She wanted as many freely available news stories as possible. To navigate this tension, the social media editor constructed side-stories, or trailers, to news stories she suspected would be locked, in order not to disappoint readers drawn to the Daily's front-page from its social media accounts.
Concerning tensions around personalization, interviewees articulated ideas about how to personalize the news mix without compromising too much of journalism's democratic purpose, keeping a balance between what readers want to know (i.e., personalization) and what they need to know (i.e., democracy). At the time of the study, subscribers could get personalized newsletters. On the web page, readers could choose to follow a topic or a particular journalist. These practices were labeled by some journalists as personalization light. However, the ad department was eager for the Algorithm to "help people to see the right material in order to optimize engagement in the product, to stay longer on the site and read more". Web developers also talked about making the site different for those who had paid for a subscription, since they knew these users were regular visitors to the site. The balance then, as one journalist expressed it, became how much to personalize "before you betray your journalistic promise to your readers"? Most important here seemed to be that personalization should not lead to filter bubbles, as a programmer explained to me: I definitely think we should try to personalize, but not as Netflix, to just feed you with what you want. It may be more to facilitate ... we know you read this article series and now a new article has come out ... or you have read half this article last time, do you want to continue where you were? It can be something like that. One thing is that those who were visiting an hour ago might want to find something new, and those who were in last week should find the big news, our prestige jobs. If you could make a site for different people, those who were in last week and those who were in yesterday, everyone could have a better user experience, without in any way being a filter bubble.
If there are ways to provide news that personalize the reader's experience without creating filter bubbles and clickbait, it seems likely that a logic of personalization will continue to intersect with journalism's democratic logic and create new personalized online news experiences. While this study did not include readers directly, it was nonetheless apparent that both journalists' and tech actors' imaginations of the readers were important (what Christin, 2020, labels algorithmic publics). The increasing use of audience analytics and metrics feeds into the popularity principle and the importance of "trending", as exemplified by the monitors looking down at the news workers in the Daily's newsroom. These displayed the news stories that had led most readers to subscribe (subscription conversion), as well as the news stories most read (i.e., clicked on) in the last minute and the last hour (which is connected to how much advertising could be sold in connection with those news stories). These screens were constantly looking down on the news workers (in a panopticon kind of way), reminding them of the importance of making a profit. There are reasons to believe that this influenced the journalists. For example, if a news story was doing well, this was taken as an indicator to do a follow-up the day after.

Concluding discussion
Having outlined logics at play, the tensions these gave rise to and how they were negotiated in the Daily's newsroom during the introduction and development of the Algorithm, I will in this concluding discussion underline how the Algorithm was constituted by its institutional setting, how tensions and their negotiations contributed to the development of the Algorithm itself. The Algorithm is an example of how different logics informed each other, of how new/hybrid logics take form as rules, values and imaginations are challenged and re-negotiated.
One example is the practice of letting editors lead the Algorithm. Editors were allowed to oversee some of its parameters. Parameters around time (in terms of latest news and the longevity of a news item) and news value (on a scale of 1-5) were manually reported by the editors, while subscription conversion (if a news item converted many readers into paying subscribers, it was pushed higher) and popularity (in terms of clicks/page views) were automated. In other words, the news-ranking on the Daily's front-page was only partly automated. Journalism's democratic logic, and its imagination of giving readers what they need, was coded into the Algorithm through the news value parameter. Journalism's rule of breaking news was coded in through the time parameters. The market logic's value of profit was coded in through the parameters of subscription conversion and popularity, the latter a parameter that also intersects with the logic of personalization and its popularity principle. The Algorithm and its calculations were thus co-produced by logics, tensions, and negotiations between different actors within the newsroom. In this way, actors in the Daily's newsroom also navigated relationships with each other through the Algorithm. Not only do algorithms mediate the relationship between journalists and their publics (as Christin, 2020, has underlined), but also workplace relationships. Old conflicts in the newsroom were recast and negotiated through the Algorithm.
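The division of labor between editor-set and automated parameters can be illustrated with a minimal scoring sketch. The weights, field names, and formula below are my own assumptions for the sake of illustration; the article does not disclose the Algorithm's actual computation:

```python
import time

# Hypothetical sketch of the semi-automated ranking described above.
# Editors manually set news_value (1-5) and longevity; subscription
# conversions and clicks are collected automatically. All weights,
# fields and formulas are illustrative assumptions, not the Daily's code.

WEIGHTS = {"news_value": 2.0, "recency": 1.5, "conversion": 1.0, "clicks": 0.5}

def rank_score(item, now):
    age_hours = (now - item["published_at"]) / 3600
    # Editor-set longevity slows how fast a story decays off the front-page.
    recency = max(0.0, 1.0 - age_hours / item["longevity_hours"])
    return (WEIGHTS["news_value"] * item["news_value"]     # manual (editor-led)
            + WEIGHTS["recency"] * recency                 # manual time parameters
            + WEIGHTS["conversion"] * item["conversions"]  # automated
            + WEIGHTS["clicks"] * item["clicks"] / 1000)   # automated

now = time.time()
items = [
    {"id": "breaking", "news_value": 5, "longevity_hours": 6,
     "published_at": now - 1800, "conversions": 2, "clicks": 800},
    {"id": "feature", "news_value": 3, "longevity_hours": 48,
     "published_at": now - 86_400, "conversions": 1, "clicks": 4000},
]
front_page = sorted(items, key=lambda i: rank_score(i, now), reverse=True)
```

Even in this toy version, the point made above is visible: two of the four inputs pass through editorial judgment before any computation happens, so the "automated" front-page remains only semi-automated.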
Allowing the Algorithm to be editor-led underlines how tech actors appreciated news journalism. Programmers at the Daily were largely sympathetic towards journalism and its democratic mission. The imagination of journalism's social contract, with its higher democratic purpose, was alive and kicking also among the tech actors themselves. There were constant reminders that the Daily should be journalism-controlled, and that the programmers got their instructions from the editors and should support their requests. And most programmers had chosen their workplace because they valued journalism. This appreciation for journalism among tech actors has been highlighted in other studies in different settings (see for example Lewis and Usher, 2013). Hence, we are not talking about a tech takeover. On the contrary, at the Daily, it was apparent that in-house tech actors should primarily serve journalists and their requests. Not only are human journalists still in the loop (Milosavljevic and Vobic, 2019a); at the Daily they were the dominant actors, embedding the Algorithm within a democratic logic of journalism. This resonates with Lindén's (2017) argument that journalists have a strong capacity for adaptation and mitigation of new technology, not least through using arguments of journalism's democratic function. It thus seems that warnings about journalism being decoupled from its civic duty (Ferrer-Conill and Tandoc, 2018: 437), or its professionalism being challenged (Singer, 2003), are somewhat exaggerated.
What this study has shown is how different actors, the tensions between them, and how these were negotiated, co-constituted the Algorithm itself. The study thus empirically illustrates Seaver's (2017) argument of algorithms as culture, as constituted by both material and socio-institutional practices. Furthermore, it responds to Neyland's (2019) call for attending to their everyday life. Because the weights of the Algorithm's different parameters were often tweaked and fine-tuned, it was in constant development. If we put novelty, the sublime, progress and other terms associated with technological innovation and automation aside, and instead emphasize the organic, incomplete and situated nature of algorithms, it becomes apparent that they are constituted by everyday socio-institutional practices. Meeting over coffee to discuss the workings of the Algorithm and how to improve them from a range of different perspectives within the news organization is an example of this. The conclusion, then, is to question automation and to tease out what is meant by automation. To do this, we should direct attention to the everyday socio-institutional practices in which algorithms are put to use. Because, as this newsroom study has underlined, journalist, tech, and market actors enacted the Algorithm through their often joint sense-making practices. Herein lies a potential for future research, especially as the composition of the workforce within the newsroom continues to change with an increasing influx of programmers, tech workers, and data analysts.

Declaration of conflicting interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.