The “Dark Side” of Community Ties: Collective Action and Lynching in Mexico

Lynching remains a common form of collective punishment for alleged wrongdoers in Latin America, Africa, and Asia today. Unlike other kinds of collective violence, lynching is usually not carried out by standing organizations. How do lynch mobs overcome the high barriers to violent collective action? I argue that they draw on local community ties to compensate for a lack of centralized organization. Lynch mobs benefit from solidarity and peer pressure, which facilitate collective action. The study focuses on Mexico, where lynching is prevalent and often amounts to the collective beating of thieves. Based on original survey data from Mexico City and a novel lynching event dataset covering the whole of Mexico, I find that individuals with more ties in their communities participate more often in lynching, and municipalities with more highly integrated communities have higher lynching rates. As community ties and lynching may be endogenously related, I also examine the posited mechanisms and the causal direction. Findings reveal that municipalities exposed to a recent major earthquake—an event that tends to increase community ties—subsequently experienced increased levels of lynching. Importantly, I find that interpersonal trust is unrelated to lynching, thus showing that different aspects of social capital have diverging consequences for collective violence, with community ties revealing a “dark side.”

question did not differ significantly, which is why we abstained from including the list experiment in the final questionnaire, to reduce its complexity.
The same vignette, including three key questions of the questionnaire (agreement with lynching, participation in lynching, and knowing about lynching in the community), was fielded in November 2021 in a face-to-face Omnibus survey representative of the Mexican adult population (N = 1,019), yielding very similar proportions to those recovered in the Mexico City survey.

Note on the reliability of the participation-in-lynching question
Readers may be concerned about the reliability of asking about participation in lynching. To examine this aspect, I did the following: First, I conducted two anonymous online pilot surveys. Online surveys are less susceptible to social desirability bias (Kreuter, Presser, and Tourangeau 2008). Using the Qualtrics platform and restricting the sample selection to Mexico City (N = 522 and N = 300), 7.3 and 8 percent, respectively, answered that they had participated in a lynching-style incident, slightly below the 9.6 percent in our face-to-face sample. Second, an identical vignette-based participation question was included in a survey representative of the adult population in the whole of Mexico in November 2021 (N = 1,019), conducted by the same survey firm as the Mexico City survey. In this survey, 10.3 percent of participants stated they had participated, close to our estimate for Mexico City. Given that the majority of respondents in the Mexico City survey agree with the perpetrators (71.3 percent), social desirability seems less problematic for our measurement.

Sampling
The survey targeted a high number of primary sampling units (colonias). The targeted number of 340 units was a compromise between sufficient dispersion across the city and a sufficiently large number of interviewees within each unit (at least six). The six individuals in each unit were block-randomized to the three treatment conditions (an element of the survey not used in this study).
The 340 colonias were selected by the principal investigators using probability proportional to size (PPS) sampling (Skinner 2016) without replacement, implemented in Stata 16. There is no readily available information on colonia-level population. We therefore created a colonia-level indicator of population size based on census information at the street-block, or "manzana," level (see Colonia-level dataset below) (Vilalta, Muggah, and Fondevila 2020). We also selected a secondary set of 70 replacement colonias using the same procedure, to allow for quick replacement in case the enumerators faced security problems or could not access buildings. In 19 colonias, interviews had to be complemented in directly adjacent areas, or colonias had to be replaced completely (using colonias from the same alcaldía, or borough). The main reason for this was restricted access to gated communities in wealthy neighborhoods. One colonia had to be replaced due to security concerns for the enumerators, a much lower number than initially expected.
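The PPS draw can be sketched as follows. This is a minimal illustration with hypothetical colonia names and population counts, not the authors' Stata 16 code; it uses the common systematic (cumulative-size) method, which avoids duplicate selections as long as no unit's size exceeds the sampling interval.

```python
import random

def pps_systematic(units, sizes, n, seed=2022):
    """Systematic PPS sampling: place n equally spaced selection points
    on the cumulative size scale and select the unit each point falls in.
    No unit repeats as long as every size is below the interval total/n."""
    random.seed(seed)
    total = sum(sizes)
    interval = total / n
    start = random.uniform(0, interval)          # single random start
    points = [start + i * interval for i in range(n)]
    selected, cum, j = [], 0.0, 0
    for unit, size in zip(units, sizes):
        cum += size
        while j < n and points[j] <= cum:
            selected.append(unit)
            j += 1
    return selected

# Hypothetical example: 2,000 colonias with block-based population indicators
colonias = [f"colonia_{i}" for i in range(2000)]
pops = [random.Random(i).randint(50, 500) for i in range(2000)]
sample = pps_systematic(colonias, pops, n=340)
```

Larger colonias get proportionally higher inclusion probabilities, matching the rationale of interviewing a fixed number of respondents per selected unit.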
The selection of households within colonias was identical to the procedures used by the widely tested Latin American Public Opinion Project. First, enumerators identified the northeastern corner of a randomly chosen starting street block (manzana) within the selected colonia. From this point, the enumerators followed a random walk procedure, contacting every fifth household around the street block. Then, they selected a further street block using the same procedure until they reached the targeted six interviews, reflecting an equal number of men and women and an age distribution in line with census information. Given that a team of enumerators was working within the same colonia at the same time, certain colonias have slightly more than six interviews, but all have at least six.

Survey fielding
The survey was implemented face-to-face due to the increased difficulty of reaching individuals by phone during the Covid-19 pandemic and the limited possibilities for geographically focused sampling in online surveys. Overall, the quality of a face-to-face survey was considered superior for our purposes.
A total of 61 previously trained enumerators conducted the survey between February 9 and March 1, 2022. The non-response rate was 44 percent. After introducing themselves, the enumerators began with a short description of the survey and handed out a consent sheet, including contact information for the study organizers and for psychological counseling (see Ethical considerations below). The average duration of the interview was 19 minutes. The last block of questions was directed to the enumerators, who filled them out without the respondents present.
In the survey firm's field report, enumerators noted the following difficulties: encountering male participants, who are often away from the household, and gaining access to gated communities. Also, some questions were difficult to understand for participants with low levels of education. Advantages included the fact that enumerators could present themselves as working for a Swiss university, and the topic of the survey (insecurity), which is of general interest to residents of Mexico City.

Survey monitoring
The survey implementation was monitored by 12 supervisors, three auditors, and two team coordinators, including in-person supervision of 15 percent of the sample and remote supervision of another 12 percent. The survey platform Survey To Go automatically recorded audio files of two short segments of the interview, without the enumerators' knowledge, to guarantee the effective application of the survey. Enumerators had to provide geographic and photographic information (not of the study participants) to show that they conducted the interview where and when they were supposed to. Interviews with evidence that enumerators did not conduct the survey appropriately were removed before starting the analysis. Unrealistically short interviews and interviews with overly fast progress from one question to the next were also removed. Identical responses to a series of consecutive questions were taken as a potential warning sign.

Links to questionnaire and consent sheet
Due to the length of the full survey questionnaire, I include a link to the original and translated versions.

Ethical considerations
In addition to the short references in the main text, here I report in more detail how I dealt with ethical challenges for the Mexico City survey, focusing mainly on risk mitigation strategies. The Ethics Board of ETH Zurich approved the proposed procedures.
This study did not create health risks for any participants, as we only collected information about their attitudes, beliefs, and narratives. We also do not foresee any societal-level or political risks stemming from this study. Similar research endeavors have been carried out in the past, for example, a survey on lynching in Mexico (CNDH 2019). Thus, the wider public is aware of the topic and direct political consequences are unlikely.
Our project is about collective violence. To better understand why collective violence occurs, we asked participants about their experience with and participation in collective violence. This is an important undertaking, as it can potentially help us identify violence prevention tools, but it also has implications for ethically sound research. We are highly aware of potential risks and abstain from exposing participants to any excessive risks. We specifically focus here on psychological distress, legal implications, security risks for researchers, risks from the Covid-19 pandemic, and data protection, and we discuss our strategies to mitigate these potential risks.
Psychological distress. After extended conversations with our local partners (Data OPM, a legal counselor, and academic partners), we assessed the risk of re-traumatization during face-to-face interviews to be minimal. While the term lynching evokes grisly associations among a Western audience, who might think of the brutal lynchings targeting African Americans in the United States in the nineteenth and early twentieth centuries, Mexicans are constantly exposed to acts of self-justice, which usually amount to beatings of thieves, either in the news or through their own experience. This is also how we represent these events in our vignettes; we abstain from using the term lynching or "linchamiento." For Mexicans, these events are normalized, as are surveys and studies about security-relevant questions. Hence, the risk of re-traumatization is minimal.
However, there may be rare cases in which study participants suffer emotional distress as a result of our surveys and interview questions. As a general stance, we avoid using value-laden language in our surveys and interviews. We also avoid talking about the specific forms of violence that occurred during a lynching, as this is not the main focus of our inquiry. This should generally help limit the risk of psychological distress.
In preparing the surveys, we made a further series of decisions to limit potential emotional distress. Our questions were asked in a non-intrusive way. Instead of asking directly about lynching, we presented respondents with a typical incident of a thief being punished by neighbors. The term lynching was avoided, and other forms of extreme violence (e.g., burning, hanging) were not mentioned. Also, participants received contact details for free psychological counseling in case they felt uneasy after responding to our questions (see consent sheet below). These counseling options were independent from us, which local partners indicated was preferable to connecting respondents with a specific service, which might be viewed with suspicion.
Legal risks. Our survey asked about experiences with and participation in events similar to lynching. Identifying the drivers of lynching violence is a key contribution of our project. However, questions about experiences and participation are sensitive. In particular, the Mexican penal code requires citizens who have first-hand evidence of crimes to report them. This legal obligation did not affect our work, as we only received information about crimes via third-person referral.
We understand the tension between, on one side, the moral obligation to report crimes when becoming aware of them and, on the other side, safeguarding the privacy of study participants. However, if this tension (reporting crimes versus privacy) were always resolved in favor of the moral obligation to report crimes, many of the standard practices in studying violence would be impossible, including the very common practice of qualitative interviews in conflict contexts and victimization surveys, which are used around the globe. Given that a core goal of science is to contribute to the understanding and solution of important societal problems (such as violence), we argue that resolving the tension in favor of protecting the privacy of study participants should be possible under appropriate conditions, to allow for research on the drivers and consequences of violence.
However, this has to be done responsibly, as described, for example, in the Manual on Victimization Surveys developed by the United Nations Office on Drugs and Crime (UNODC 2010). We devised a series of strategies to mitigate potential concerns. We developed these strategies in collaboration with the Mexican lawyer Alberto Abad Suárez Ávila, from the university UNAM, who has experience in public opinion surveys. In accordance with UNODC guidelines and local guidance from our Mexican partners, our strategy for collecting information about the experience and commission of violent acts was designed to avoid receiving any specific information on concrete crimes.
We did this by referring to hypothetical scenarios and asking general questions. With the information we received, it would not be possible to support the prosecution of crimes unless we asked for additional information from our study participants, which was not in our interest. The information we did receive in no case identified a specific act of violence that could be reported to police or judicial institutions. In this sense, we avoided the tension between a moral obligation to report crimes and safeguarding respondents' privacy, as we are unable to report a concrete crime to the authorities.
We took a specific approach with regard to the survey question "Have you ever participated in such an event? Yes, No, no answer." This question refers to the aforementioned vignette: "A thief assaults a lady on the street. Using a knife, he takes her belongings and escapes. After the robbery, a passer-by manages to take away the thief's knife and subdues the thief. In this moment, a large number of people gather, insult, and punish the thief." We understand the sensitivity of this question, which is why we thought at length about how to formulate it. Before describing our strategy, the key point is that research on self-justice has exclusively focused on support for such violence, not participation in it. We believe that support and participation cannot be equated, and that empirical research on this important societal problem would be greatly improved if we had a more direct measure of participation.
Therefore, we developed a strategy (together with our Mexican legal counselor) to ask this question in a way that would not create a legal or moral obligation to contribute to prosecution. First, our participation question does not refer to an actual crime but to a hypothetical scenario. An affirmative answer (meaning the surveyee participated in a similar event) does not make it possible for us to contribute to the prosecution of a crime, as we are not informed about any actual crime. We would have to ask additional questions to receive information on an actual crime that could be reported to the authorities.
Second, the scenario does not necessarily describe a crime, as the outcome remains legally indeterminate.Citizens are described as insulting and punishing the thief, terms that have no clear legal implication.
Third, we do not specify the kind of participation.Hence, we do not know whether an affirmative answer to the participation question implies involvement of the surveyee in an actual crime or whether it describes active bystanding.
To sum up, given this framing of the question, it would be impossible for us to contribute to the prosecution of a crime.
In addition, no identifying information about the person who witnessed the hypothetical event or who participated in it was shared with us. Hence, we would not be able to report the participation of a given individual. Securing anonymity provides an additional layer of protection in our study set-up and has to be seen as part of a broader strategy to make it "impossible to draw any conclusions from the answers regarding the commitment of such crimes," as demanded by the Ethics Board. Also, in line with the recommendation of our legal adviser, we included a note in the consent sheet and offered the possibility of psychological counseling (as mentioned above).
Safety risks for researchers and study participants. When studying violence, researchers may be exposed to risks emanating from their research sites. We devised a series of strategies to minimize risks for the researchers, in accordance with guidelines from our university's safety unit.
With regard to the surveys, we debated the mode of survey application at length. Phone and online survey modes have certain advantages when asking sensitive questions and are less costly, but face-to-face surveying has clear advantages in terms of sampling and data quality. The pilot survey was conducted online, but we decided to conduct the main survey face-to-face. This decision was the result of an extended conversation with our survey partners at DATA OPM. Given our questionnaire and sampling procedures, they estimated the risks for their enumerators to be minimal (in line with most other surveys they have conducted in the past). They also noted that phone survey participants became less responsive during the Covid-19 pandemic, making phone surveys in Mexico extremely burdensome.
The face-to-face survey had safety implications, especially for the enumerators, so our survey partner DATA OPM created a plan to ensure safety (Protocolo de Seguridad). For example, during the sampling procedure, locations known to be highly insecure were replaced. Furthermore, when enumerators encountered safety problems at a new location, they abandoned that location right away. This was not ideal from the perspective of random sampling, but we weighed the safety of enumerators higher than minor methodological concerns.
Risks from the Covid-19 pandemic. The special situation of the Covid-19 pandemic necessitated additional safety measures. Researchers adhered to the guidelines of local authorities in Mexico and only traveled to field sites if the health risks to their study participants and themselves were manageable.
DATA OPM had their own additional guidelines to reduce risks for their enumerators ("Protocolo Sanitario por Contingencia COVID"); for example, no "high-risk" individuals were part of the enumerator team. Enumerators measured their temperature each day before starting the interviews, disinfected the screens of their mobile devices regularly, used face masks, maintained their distance from each other and from the study participants, did not shake hands with participants, did not enter participants' households, and used private rather than public transportation.
The survey and field visits started no earlier than November 2021, as Mexico was on track to fully vaccinate its adult population by then.

Data protection. Data protection is particularly important in the context of our study, given that we accessed sensitive information. In principle, all information was anonymized before storage, safely stored on password-protected computers, and only accessible by the authors for the duration of the study.
All data from the pilot online surveys (conducted with Qualtrics) were fully anonymized; hence, it was not possible for us to attribute any responses to specific individuals once we received the data from Qualtrics. This corresponds to Qualtrics' standard practice, which fully complies with the EU's General Data Protection Regulation (GDPR). Data from the face-to-face survey (conducted by DATA OPM) were also anonymized. Our survey partner followed the guidelines of the Mexican Federal Law of Personal Data Protection, and for this project, they also complied with the EU's GDPR. Furthermore, their standard operating procedures comply with the ethical codes of conduct and best practices of the most important public opinion research organizations (including the World Association for Public Opinion Research and the European Society for Opinion and Market Research). If DATA OPM receives information that allows for the identification of specific interviewees (e.g., the exact location of residence), this information is (a) not shared with the client, and (b) stored separately from the dataset that contains the survey responses. Specifically, DATA OPM creates an anonymized key for each respondent, to which only their authorized employees have access. Secure file transfer services are used to transfer only the anonymized data at the end of the survey; the full information on respondents is stored encrypted on DATA OPM's own server and erased after at most six months.
Fully anonymized survey data (including questionnaires, codebooks, and replication code) will be made publicly available once the respective studies are published, as is standard practice in the social sciences and following the FAIR principles for scientific data management (Wilkinson et al. 2016). This is also in line with the spirit of our university's initiative on research data management. Publication of the data facilitates replication of our analysis and helps prevent inappropriate data usage.

A2. Lynching data
To identify and categorize lynching events, we primarily relied on Factiva, the most comprehensive global news database, containing almost two billion news articles from more than 33,000 news sources in 200 countries and around 28 languages (Nussio and Clayton 2024). This includes news networks, such as Reuters and the Associated Press, as well as local radio, television, and newspaper reporting in local languages. Factiva allows researchers to search for specific keywords and specify the countries of interest. We pilot tested several search strings, ultimately settling on a specification that included common terms for lynching in English and Spanish (e.g., lynching and linchamiento), a number of related colloquial terms in Spanish (e.g., justicia por mano propia), and terms relating to mob violence in both Spanish and English (e.g., lynch mob and vigilantes), while excluding a number of common terms unrelated to our concept (e.g., the bank Merrill Lynch).
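The logic of such a search string can be illustrated with a simple keyword filter. The terms below are only the examples named above, not the full Factiva query, and the regular-expression matching is a rough approximation of database keyword search.

```python
import re

# Illustrative include/exclude terms (examples from the text, not the full query)
INCLUDE_TERMS = [r"\blynch\w*", r"\blinchamiento\w*", r"justicia por mano propia",
                 r"lynch mob", r"\bvigilante\w*"]
EXCLUDE_TERMS = [r"merrill lynch"]

include_re = re.compile("|".join(INCLUDE_TERMS), re.IGNORECASE)
exclude_re = re.compile("|".join(EXCLUDE_TERMS), re.IGNORECASE)

def keep_article(text):
    """Keep an article if it mentions a lynching-related term and
    none of the excluded false-positive terms (e.g., the bank Merrill Lynch)."""
    return bool(include_re.search(text)) and not exclude_re.search(text)
```

In practice, articles passing such a filter would still go to human coders, since keyword matches alone cannot distinguish actual lynching events from, say, commentary about the phenomenon.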
Next, we limited the geographic scope of our search to Latin America. Given that our data collection was mainly based on newspapers, we limited our focus to Spanish- and Portuguese-speaking countries in Latin America for reasons of language comparability. This includes Argentina, Bolivia, Brazil, Chile, Colombia, Costa Rica, the Dominican Republic, Ecuador, El Salvador, Guatemala, Honduras, Mexico, Nicaragua, Panama, Paraguay, Peru, Uruguay, and Venezuela. We excluded Jamaica, Belize, Haiti, Guyana, Suriname, and a series of smaller Caribbean states. We also excluded Cuba due to limited newspaper reporting. The temporal focus for the whole of Latin America was 2010 to 2019 (for Mexico only, we collected data ranging from 2000 to February 2022). This approach produced a corpus of around 80,000 news articles.
Human coders then reviewed each article to determine whether it identified a lynching event. Identified cases of lynching were coded to capture key details such as the date of the event, the location, and the number of targets and perpetrators, as well as a series of descriptive variables, such as the form of violence, the alleged form of wrongdoing, and the physical consequences suffered.
Following best practices identified in prior projects collecting violent event data (Davenport and Moore 2015), all coding sheets were then checked by one of the more experienced coders. Any disagreements or contentious issues were either discussed in monthly coder meetings with the project leaders or resolved bilaterally between the coders in the event of a clear error. Therefore, all lynching events included in the LYLA data were checked by at least two persons, and unclear cases were reviewed by at least one project leader.
We set a low bar for events to enter the dataset, including cases that some may consider not lynchings but "attempted" lynchings. The boundary condition for inclusion was a clear threat of lynching violence. This allows researchers who use the dataset to set their own threshold, for example, including all LYLA cases, only cases resulting in injury, or, even more restrictively, only cases resulting in death.
The newspaper-based approach we adopted was the best suited to gathering systematic cross-national lynching data, but this method has well-known limits when collecting violent event data. Reporting bias affects the collection of data on all types of violent events (Godoy 2006:26; Mendoza 2008:5; Weidmann 2015). News editors prioritize more newsworthy events, which means that more violent, urban, and spectacular lynchings involving unusual protagonists are reported more often (Miller et al. 2022; Odartey-Wellington and MacRae 2020). Our approach therefore risks introducing systematic bias (e.g., urban bias, bias toward bigger events). Similar problems also afflict well-known lynching datasets from the United States (Spilerman and Gerratana 2009). By relying on the local media sources included within the Factiva database, we hope to mitigate some of these problems. Prior research has shown that, despite these challenges, this type of data can be instructive (Sundberg and Melander 2013).
On this note, while perhaps obvious to most readers, it is important to stress that the LYLA data capture reported lynchings. A significant number of lynchings go unreported in newspapers. Hence, we are not blind to potential biases in data collection and try to be as transparent as possible in our presentation. We also conducted additional validation checks using locally coded national datasets, which largely support the validity of our data. We provide a separate 50-page document including individual country reports comparing our data with other datasets. Importantly, our dataset recovers similar over-time variation and magnitudes of the phenomenon of lynching as other previously published datasets in Mexico. Figure A2.1 shows the development of reported lynchings in the LYLA dataset (black line) along with lynchings registered in other sources. We observe a generally increasing trend in the LYLA data. Another notable aspect of Figure A2.1 is the similarity in the pattern between the different sources. Between 2006 and 2011, Gamallo (2015) and Godínez (2017) have comparable observations. BBC Mundo's (2016) research collected 56 events in 2015, the CNDH (2019) report records 43 lynchings, and the LYLA dataset 98. Rodríguez and Veloz (2019) collected data for the period from 1988 to 2018. CEC records 150 lynching events for the year 2020.

Figure A2.1 Yearly evolution of lynching in different data sources
We opted to rely on newspaper reports after carefully considering alternatives, including crime statistics, social media, and surveys. First, lynching is not defined as a crime in the penal code of any Latin American state. A lynching incident may enter crime statistics as a homicide or injury, but given the large number of homicides and injuries unrelated to lynchings, these forms of violence do not provide a meaningful proxy. Hence, there is no readily available official information on lynching. Second, we decided against using social media. News reports provide a relatively consistent corpus of data that can be analyzed systematically and retrospectively, and whose biases are relatively well understood (Miller et al. 2022). Social media is harder to study systematically, and its biases are less clear and possibly quite different across contexts and time. Furthermore, social media entries on brutal violence tend to be quickly deleted from platforms. Using social media would also have risked exposing our coders to considerable psychological harm (Bellingcat 2018). We made sure our coders were only exposed to text, rather than to the potentially more harmful visual material about lynching shown on Facebook and other platforms. Third, we opted against using surveys for the systematic collection of lynching event data. Achieving sufficient coverage across time and space would have been prohibitively expensive and would likely have revealed only scattered and geographically limited information about lynching events. Instead, we ran a survey to validate the LYLA data at the level of Mexico City neighborhoods. We also explored whether existing surveys might provide a workable source of data. However, we found possible indicators, such as expressed support for self-justice, to be relatively poor proxies for the prevalence of lynching events (Agostini and van Zomeren 2021).

A3. Mexico municipality data
The municipality data used for the independent variables and covariates (along with the lynching data) were pulled from different sources.
The most important source is the ENVIPE surveys (Encuesta de Victimización y Percepción de Seguridad) conducted by INEGI, the national statistics institute, a technocratic state institution producing high-quality data, including the national census. We aggregated all ENVIPE surveys from 2011 to 2020 to generate mean municipality-level indicators. Given the large sample size of ENVIPE (N = 854,720), this provides good coverage of most municipalities in Mexico. Of the 2,457 Mexican municipalities, our indicators cover more than 1,500. The municipalities not covered are mostly micro-municipalities. For example, in Oaxaca alone, 399 extremely small municipalities are not covered for one of our key indicators (whether neighbors collaborated to deal with problems of public lighting). This is a result of the multi-stage random sampling procedure applied by INEGI. The surveys were usually conducted between February and April each year. For each variable, I also recorded how many observations were used to generate the municipality mean. This makes it possible to run the analysis on municipalities with particularly good coverage. I ran the main analyses with models including municipality means based on at least 10 survey respondents. In robustness checks, I ran the same analyses on a restricted sample of municipalities with at least 50 and 100 respondents, hence on a sample of generally larger municipalities but with higher data quality. Results remained very similar.
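The aggregation step can be sketched as follows, here with pandas and hypothetical column names (`municipality` plus one survey item); the actual processing pipeline is not specified in the text.

```python
import pandas as pd

def municipality_means(df, item, min_n=10):
    """Collapse individual survey responses to municipality means,
    keeping the respondent count and dropping municipalities with
    fewer than `min_n` respondents for the given item."""
    agg = (df.dropna(subset=[item])
             .groupby("municipality")[item]
             .agg(mean="mean", n="count")
             .reset_index())
    return agg[agg["n"] >= min_n]

# Hypothetical toy data: a binary item (neighbors organized: 1/0)
toy = pd.DataFrame({
    "municipality": ["A"] * 12 + ["B"] * 3,
    "cooperation": [1, 0, 1, 1, 0, 1, 1, 1, 0, 1, 0, 1] + [1, 0, 1],
})
result = municipality_means(toy, "cooperation", min_n=10)
```

With `min_n=10`, municipality B (three respondents) drops out, mirroring the 10/50/100-respondent thresholds described above.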
For validation of the aggregate measures, I correlate a variable that exists in both the ENVIPE surveys and repeated national censuses. Pearson's correlation between the two measures is 0.87, suggesting that the ENVIPE surveys are a valid reflection of municipality averages.
The analyses' main indicators based on ENVIPE are the following:
• Neighborly cooperation to deal with lighting: share of respondents whose neighbors organized to deal with lighting problems
• Problem with lighting: share of respondents who had problems with a lack of lighting in their neighborhood
• Victimization: share of respondents who suffered one of the situations on the card [different types of crime] in the year before the survey
• Interpersonal trust: share of respondents who trust their neighbors
• Trust in army: share of respondents who trust the army
Other variables used in the municipality analysis come from different state organizations:
• Population: INEGI

A4.2 Community participation and lynching participation
Instead of names known in the colonia, here I use the level of community participation to indicate community ties. The community participation indicator is the first principal component of participation in community meetings and participation in religious events.
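As a sketch of how such a first principal component can be computed (NumPy on z-standardized items; the authors' exact implementation is not specified, and the item values below are hypothetical):

```python
import numpy as np

def first_principal_component(X):
    """Scores on the first principal component of the item matrix X
    (rows = respondents, columns = items), after z-standardization."""
    X = np.asarray(X, dtype=float)
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    eigvals, eigvecs = np.linalg.eigh(np.cov(Z, rowvar=False))
    loading = eigvecs[:, np.argmax(eigvals)]  # eigenvector of the largest eigenvalue
    return Z @ loading

# Hypothetical items: meeting participation and religious participation (0-4 scales)
items = np.array([[0, 1], [1, 1], [2, 2], [3, 2], [4, 4], [2, 3]])
scores = first_principal_component(items)
```

With two positively correlated items, the first component is essentially the (standardized) average of the two, so it summarizes both forms of participation in a single index.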

A4.4 Community ties and lynching participation without outliers
Some respondents mention very high numbers of people known by name in their colonia. In the analysis below, I exclude outliers who say they know more than 200 people by name in their colonia.

A4.8 Community ties and lynching participation, adjusting for previous lynchings in the colonia

A4.9 Parents' siblings and lynching participation
These models use parents' number of siblings as a temporally prior proxy for community ties.

A4.11 Alternative mechanisms
Here, I examine whether community ties and trust are related to alternative mechanisms that may explain participation in lynching. I focus on whether individuals feel threatened by delinquency (1 to 4 Likert scale), which should increase participation in lynching; whether they trust the government (1 to 7 Likert scale), which should decrease participation in lynching; and whether they hold binding or collectivist values, measured with the moral foundations questionnaire (Graham et al. 2011), which should increase participation in lynching. Community ties are not systematically related to any of these alternative mechanisms. Interpersonal trust is, as expected, positively related to trust in government. In summary, none of these alternative mechanisms can account for the relationship between community ties and lynching participation.
Different from the models reported in the main paper, Model 4 does not adjust for trust in government, as trust in government is the dependent variable.

A4.15 Community ties, trust, and lynching rate (log)
In these models, I use the first principal component of three items of neighborly cooperation dealing with lighting, water, and water leakage problems. This procedure reduces the number of covered municipalities, but results remain consistent.
The three items used for the principal component analysis have a scale reliability of alpha = 0.85.
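A scale reliability of this kind (Cronbach's alpha) can be computed directly from the item variances and the variance of the sum score. The sketch below uses simulated three-item data; the actual ENVIPE cooperation items are not reproduced here.

```python
# Minimal sketch of Cronbach's alpha for a k-item scale, on synthetic data
# (the real three ENVIPE cooperation items and their alpha of 0.85 are not
# reproduced here; the generated items are hypothetical).
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: (n_respondents, k) matrix of item responses."""
    k = items.shape[1]
    sum_item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)         # variance of the sum score
    return (k / (k - 1)) * (1 - sum_item_vars / total_var)

rng = np.random.default_rng(1)
common = rng.normal(size=500)                         # shared "cooperation" factor
items = np.column_stack(
    [common + rng.normal(scale=0.5, size=500) for _ in range(3)]
)
alpha = cronbach_alpha(items)
```

Items that load strongly on one common factor, as here, yield an alpha close to one; uncorrelated items would push alpha toward zero.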

A4.20 Community ties, trust, and lynching rate (log), limited to municipalities with at least 100 ENVIPE individual observations per municipality
NOTE: For this analysis, I do not have a reliable measure of time-varying community ties. Using the same ENVIPE survey wave data as for the cross-sectional analysis above, most individual municipality-year observations would be based on a very small sample of survey respondents, and the analysis would thus be based on high levels of random measurement error. For comparison, the median number of survey respondents per municipality-year is 14 and the mean 38, whereas the median number of survey respondents per municipality used in the cross-sectional analysis above is 61 and the mean 217. This allows me to restrict the cross-sectional analysis to only those municipalities with precise measures (50, 100, and 300 respondents per municipality; see A4.16), but the same restriction would create an unbalanced and highly restricted sample for a municipality-year analysis.
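The precision comparison in this note amounts to counting respondents per municipality-year cell versus per pooled municipality. A toy version, with made-up respondent counts rather than the real ENVIPE figures:

```python
# Sketch of the precision check above: respondents backing each
# municipality-year cell vs. each pooled municipality mean.
# Counts are synthetic and only mimic the ENVIPE pattern.
import pandas as pd

rows = []
for mun, yearly_n in [("A", 5), ("B", 20), ("C", 60)]:
    for year in range(2011, 2021):
        rows += [(mun, year)] * yearly_n      # yearly_n respondents per wave

df = pd.DataFrame(rows, columns=["municipality", "year"])

per_mun_year = df.groupby(["municipality", "year"]).size()
per_mun = df.groupby("municipality").size()

median_year_cell = per_mun_year.median()      # small cells -> noisy yearly means
median_pooled = per_mun.median()              # pooled cells are far larger
```

In this toy example, as in the ENVIPE data, pooling across waves multiplies the respondents behind each municipality mean and thus shrinks measurement error relative to year-by-year cells.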

A4.23 Earthquake exposure and newspaper reporting
An increase in lynching reports may be due to additional attention to areas exposed to the earthquake. Therefore, I compare total news reporting in areas exposed and not exposed to the earthquake. The dependent variable reported below is the number of news reports in the 40 largest municipalities of Mexico in each year. Collection of this information was restricted to the 40 largest municipalities of the country, as subnational news searches based on smaller municipalities would have generated too much noise. This explains the lower N for the analysis. The 40 municipalities are spread across the country's geography, including areas exposed and not exposed to the earthquake. If anything, news reporting decreases in earthquake-exposed municipalities, although the coefficients are not significant. In any case, increased lynchings are not a result of increased reporting.

A4.24 Earthquake exposure and lynchings, adjusting for newspaper reporting
In this analysis, I adjust for the number of news reports per municipality. This analysis is restricted to the 40 largest municipalities. Results point in the same direction as in the main text, despite the limited number of observations.

A4.32 Distance to active volcano and lynching rate
Different from a one-time shock, long-term exposure to active volcanos can affect the social structure of a community over time, as individuals adapt after experiencing disaster or in anticipation of future disaster (de Mauleón 2023).The advantage of this approach is that I can examine the effect of disaster exposure on community ties in a cross-sectional set-up using the same survey-based measure of community ties as above.
I use distance to volcanos that were active in the past 100 years. Does this variable satisfy the conditions for a plausibly exogenous proxy? First, active volcanos can be seen "as if random" because there is no direct relationship between volcanos and lynching. Second, I can examine whether this proxy is related to community ties, as I have a measure for neighborly cooperation in this cross-sectional dataset (see Models 1, 3, and 5). With increasing distance, there is less neighborly cooperation, as expected. This means distance to volcano predicts community ties. Third, does the proxy influence lynching only through community ties? This question remains open, as volcano exposure could have led, through migration, state attention, or soil fertility, to a distinct path for social behavior. This is a central limitation of using volcano exposure for this analysis and needs to be weighed against the evidence collected in the remaining analyses.
The negative coefficients in the table below suggest that exposure to active volcanos (less distance) is associated with both more neighborly cooperation, an indicator of community ties, and higher lynching rates. The coefficients of Models 2, 4, and 6 remain in a similar range despite an increasing number of control variables and improving model fit (R²).
Given that distance to volcano may not be linearly related to community ties, I also use the squared distance and the square root of distance. Both produce similar results.
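The logic of this robustness check can be sketched with simulated data: if the underlying association between distance and the (log) lynching rate is monotonically negative, the slope stays negative whether distance enters linearly, squared, or as a square root. All numbers below are simulated and do not reflect the paper's estimates.

```python
# Hedged sketch of the functional-form check: regress a simulated log lynching
# rate on distance to an active volcano, its square, and its square root,
# using bivariate OLS via numpy. Coefficients are illustrative only.
import numpy as np

rng = np.random.default_rng(2)
distance = rng.uniform(1, 300, size=400)          # km to nearest active volcano
# Simulate the paper's negative association: lynching declines with distance.
log_rate = 2.0 - 0.004 * distance + rng.normal(scale=0.3, size=400)

def ols_slope(x: np.ndarray, y: np.ndarray) -> float:
    """Slope from a bivariate OLS fit of y on x (with intercept)."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

slopes = {
    "distance": ols_slope(distance, log_rate),
    "distance_sq": ols_slope(distance ** 2, log_rate),
    "sqrt_distance": ols_slope(np.sqrt(distance), log_rate),
}
```

Under a monotone negative relationship, all three transformations recover a negative slope, which is the sense in which the squared and square-root specifications "produce similar results."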

A4.21 Community ties, trust, and lynching rate (log), limited to municipalities with at least 300 ENVIPE individual observations per municipality
Standard errors are in parentheses. Linear regression with cross-sectional municipality data and different specifications. The number of municipality-level observations is smaller than in other models, because I only include municipalities with a larger sample of ENVIPE individuals. *p < 0.05; **p < 0.01; ***p < 0.001. Two-tailed test.