Research Review

Image-Based Abuse: A Threat to Privacy, Safety, and Speech

Abstract

Commonly misnamed “revenge porn,” image-based abuse is the non-consensual creation or distribution of private images. Much more than a series of “one-off” attacks that cause hurt feelings, image-based abuse has become a broad effort to silence and shame people in public spaces. This literature review brings together research from law, communication, psychology, and public health to explain how image-based abuse has become an ever-present threat to privacy, safety, and speech.

What is Image-Based Abuse?

Image-based abuse is the non-consensual creation or distribution of a person’s image. Its wide-ranging forms include activities as distinct as sharing images of a child who is not part of the sharer’s immediate family and sharing a video of someone’s private conversation at a doctor’s office (Powell & Henry, 2017). Losing control over one’s image is not a trivial matter. In many cases, it can be life-threatening. The distribution of private images without consent, on social media or through other means, has caused victims severe psychological harm, even leading to suicide (Powell & Henry, 2017).

Image-based sexual abuse is a specific behavior that fits within this broader category: the creation, distribution, or threat to distribute sexual images or videos of someone without their consent (McGlynn et al., 2019). Much more than a scorned ex-boyfriend sharing a woman’s nudes, image-based sexual abuse covers a wide set of harms. Abusive intimate partners, child sex abusers, rapists, and sex traffickers have all used the threat of exposure to maintain power and control (Maddocks, 2018). Hackers unknown to victims have also broken into photo storage accounts to expose and extort them for money. Victims’ phone numbers and addresses are often shared alongside their intimate images, opening the floodgates for offline abuses including stalking and physical violence (Eaton et al., 2017). The phenomenon is a global one: South Korea, for example, faces a nationwide “spy cam” problem, with nude content from thousands of public restrooms streamed online every day (Kim, 2021). Image-based sexual abuse now pervades public life in many different guises. Because sexual forms of image-based abuse are the most prevalent and the most severe, this review focuses predominantly on image-based sexual abuse (IBSA).

A Brief History of Image-Based Abuse

Image-based abuse existed long before the internet era. In 1953, nude images of Marilyn Monroe were published in the first issue of Playboy without her consent, and in every decade since, high-profile women have been victimized (Grasso, 2017; Hills, 2017). Beauty queens, for example, have become common targets: Miss America, runner-up Miss Lebanon, Miss Korea Universe, Miss World Zimbabwe, and Miss France contestants have all been victimized over the last four decades. As media technologies advanced in the late twentieth century, image-based abuse became more widespread. The arrival of home video in the 1980s and the rise of the internet in the 1990s empowered anyone with a digital camera and an internet connection to non-consensually create and distribute images. Online spaces that have become central to our daily lives are also increasingly “toxic,” with platform design and algorithms facilitating the spread of misogynistic discourses and abusive content (Massanari, 2016).

With advances in AI technology in the 2010s, perpetrators no longer need real images of their victims. Instead, they can create “deepfakes”: forgeries of sexual content made using AI face-swapping technology (Ajder et al., 2019). When Indian journalist Rana Ayyub investigated a case of child rape in 2018, she received virulent online abuse, including a fake pornographic video that appeared to depict her. The video was circulated widely online, leading Ayyub to reduce her public profile to protect her physical safety (Ayyub, 2018; Citron, 2019). Earlier that year, actress Bella Thorne was also targeted by a “deepfake” pornographic video after advocating for survivors of sexual violence (France, 2018). Pornographic deepfakes are the newest iteration of IBSA, and they disproportionately target high-profile women (Ajder et al., 2019; Chemaly, 2019).

As stay-at-home orders and social distancing swept across the globe in the spring of 2020 due to the Covid-19 pandemic, many of us began relying on digital technologies to work, socialize, and connect. Stress, unemployment, and confinement all contributed to a significant increase in violence against women, and a concurrent increase in online sexual harassment and image-based sexual abuse (UN Women, 2020; Goldstein, 2020; Taddeo, 2020; Robinson et al., 2020). The UK-based Revenge Porn Helpline experienced a 98% increase in cases in April 2020 compared with April 2019 (Davies, 2020). Adolescents were also at increased risk of IBSA during the pandemic because both they and potential perpetrators were spending more time online (Davies, 2020). It appears that people also distributed their own intimate content more freely during the pandemic, most notably through the spread of “dick pics” (Mishna et al., 2021). It is unclear whether IBSA will increase at the same rate after social distancing measures are relaxed, but it is reasonable to predict that this harm will continue to spread in the coming years. Historical indifference to violence against women, combined with the new features of digital technology, has created the conditions for IBSA to flourish indefinitely. The rise of “incel” culture has inflamed discourses around IBSA: instead of dealing with a single perpetrator, victims now face “cyber mobs” of like-minded men who congregate on digital platforms to collectively harass women (Citron, 2020; Mikoley, 2021; Citron, 2014).

Key Drivers of Image-Based Abuse

People who create or share private images without consent report a range of motivations. Many commit this harm to scare, punish, or control their victim (Harder & Hasinoff, 2021). These are the same motivations that drive intimate partner violence. According to employees at shelters for those escaping abusive relationships, almost all new clients report experiencing IBSA or a similar form of technology-enabled harassment (Maddocks, 2018). In a recent study, psychologists at Florida International University found that IBSA encapsulates every aspect of the “power and control” wheel, a common tool used to identify intimate partner violence (Eaton et al., 2021). But the motivation to scare, punish, or control is not limited to those who know their victim personally. Many perpetrators target public figures they don’t know but seek to punish for their views, their actions, or simply their existence in public space. Whether the victim is a former girlfriend or a famous actor, IBSA is often motivated by a desire to control and punish (Langlois & Slane, 2017). A classic example of this dynamic is a Canadian case in which an ex-husband threatened to distribute nude images of his former wife to her new partner and his family if she did not agree to divorce terms related to their shared property (Dodge, 2021a).

Beyond power and control, perpetrators also report committing this crime because they find it entertaining, or because they consider the images their “property” to share (Batha, 2021; Harder & Hasinoff, 2021). Again, these motivations lead us back to patriarchal cultural discourses: that women’s bodies are entertaining objects of male consumption and ownership. IBSA is structured by these familiar patriarchal norms: it is sexual in nature, involves the forced exposure of individuals, and disproportionately targets women and LGBTQ individuals (Eaton et al., 2021; Franks, 2019). For men who experience IBSA, perpetrator motivations still work to inscribe, structure, and reinforce gender inequalities through the emasculation of victims (Powell & Henry, 2017, p. 182). Those who target heterosexual adult men are often driven by financial motives. Referred to as “sextortion,” this form of IBSA involves filming a sexual encounter and then blackmailing the victim for financial gain. Although driven by economic motives, sextortion remains deeply gendered: its male victims are not subject to the same sexual double standards as women, nor are they targets of the cyber mobs that harass female victims (Salter & Crofts, 2015). To truly grasp the implications of image-based abuse, we must recognize it as a form of abuse that contributes to the broader societal shaming, silencing, and controlling of women and sexually diverse individuals (Eaton et al., 2021; Franks, 2019).

The Scope of the Problem

One in 25 Americans has either been threatened with IBSA or experienced it (Duggan, Smith & Caiazza, 2017, 7). A 2017 study of 4,122 Australians found that one in ten respondents had experienced this harm (e-Safety Commission, 2017). Victimization rates likely fall somewhere within this range across most contexts. However, the risk of victimization is not distributed equally, and IBSA is perpetrated in ways that exacerbate existing inequalities. Women of color, immigrant women, and indigenous women are more likely to experience IBSA, as are younger single women and adolescent girls (Lenhart et al., 2016; Henry, Powell & Flynn, 2017). Bisexual women and gender-nonconforming individuals have also been found to be at increased risk of harassment (Duggan, Smith & Caiazza, 2017, 7). More research is needed to understand how this harm affects other minority groups, including trans people and sex workers, who anecdotally appear to be at increased risk. Men, white people, older people, married people, and wealthy people are less likely to experience IBSA and more likely to have the resources to navigate its consequences. Moreover, those at increased risk of IBSA are also more likely to experience other forms of sexual abuse. Through such polyvictimization, the experience of multiple forms of victimization, many people from marginalized populations encounter IBSA as just one aspect of a broader pattern of abuse (Finkelhor et al., 2009).

In the school environment, several studies indicate that girls and lesbian, gay, or bisexual adolescents are at increased risk of IBSA (Ouytsel et al., 2021). Girls between the ages of 12 and 18 report constantly navigating repeated requests for nude images from their male peers (Thomas, 2018). This extends to college campuses, where research indicates that 10% of students have experienced IBSA (Branch et al., 2017). Adolescent boys also experience IBSA in school environments (Berndtsson, 2021). Across the gender spectrum, adolescents must also navigate the flip side of disseminating intimate images: receiving unwanted ones, which youth likewise experience as traumatizing and invasive.

Research consistently finds that perpetrators of IBSA are mostly male (Karasavva, 2021; Powell et al., 2022; Ringrose et al., 2022; Mishna et al., 2021). Being heterosexual and holding victim-blaming attitudes toward women are also associated with perpetration (Powell et al., 2022; Karasavva, 2021). Those who perpetrate IBSA are more likely to hold sexist views of women and to possess “dark triad” personality traits such as Machiavellianism, a tendency toward manipulation and a desire for power. These individuals are also more likely to be disengaged from accepted moral and ethical standards (Pina et al., 2017; Pina et al., 2021). Both male and female perpetrators use IBSA to “fit in” with prevailing sexist norms. In two studies of youth perpetrators, researchers found that boys commit this harm as a marker of masculinity and heterosexual prowess (Ringrose et al., 2022; Naezer & van Oosterhout, 2021). Because much bullying in school is rooted in homophobia, boys also share images to prove their heterosexuality and to avoid harassment (Bailey, 2015, p. 33). Girl perpetrators, on the other hand, distribute images non-consensually to align themselves with “normative chaste femininity” by separating themselves from the “slutty” girls who take nude images (Naezer & van Oosterhout, 2021).

Sexist and homophobic norms often stop male victims from identifying with their victimhood (Berndtsson, 2021), and cases involving men are less likely to reach criminal prosecution (Dodge, 2021a). These stereotypes cast heterosexual men as powerful “non-victims,” while women and sexual minorities are seen as less powerful and less credible in comparison. These stereotypes have real-life consequences. For example, gay men are the most likely of any group to consensually take and share nudes, and victims from this demographic have taken their own lives after experiencing IBSA (McGlynn et al., 2017).

What Does Image-Based Abuse Do to Victims?

Maintaining a “good” online identity has become essential to participation in contemporary society. Ordering a taxi, applying for a job, or looking for a romantic relationship demands the use of web applications that depend on one’s ability to craft a good online identity. By exposing victims’ most intimate moments, image-based abuse eviscerates their “good” online identities. When images are sent to their employers or posted publicly, this harm reaches every aspect of victims’ lives. The phrase “seeing is believing” speaks to the assumed truthfulness of visual information. Although the rise of deepfakes undermines this norm, images continue to be used—in courtrooms and in the media—as indicators of truth. The assumed truthfulness of visual information, combined with the pervasive shaming of sexually active women, makes it especially difficult for female victims to rebuild good online identities after their intimate photos have been leaked. With their addresses and cell phone numbers posted alongside their content, often the only option for victims is to reduce their online activity or disconnect completely (Bates, 2017). Victims have described this as a shattering social rupture that devastates all aspects of their lives (McGlynn et al., 2019).

At the psychological level, IBSA produces symptoms similar to those experienced by survivors of offline sexual abuse. Post-traumatic stress disorder is a common experience among victims, who often cannot reach a place of psychological safety because their images are never fully deleted (Zaleski, 2022). Even after victims left violent situations, prosecuted their perpetrators, or changed their names, their content continued to appear in search engine results, on Google Maps, and on popular social networking sites associated with their new identities (Maddocks, 2018). For adolescent victims, these outcomes are particularly harrowing (Franks, 2019). Even when adolescent girls don’t send nudes to their peers, the constant barrage of requests to do so leaves them feeling exhausted, hopeless, and stressed (Mishna et al., 2021). This harm is often depicted as a one-off incident, when in fact it is experienced by many as a relentless stream of attempts to silence, shame, and delegitimize. In a recent study of gender-based online harassment, victims explained that even in moments when they weren’t being harassed, they were “waiting for the shoe to drop” (Sobieraj, 2020, p. 27). For victims of IBSA, this often means searching porn sites and blogs every day to see if their images have been reposted.

Cumulative Impacts of Image-Based Abuse

As a group, women endure the most sustained, severe, and sexualized forms of harassment on the internet (Chemaly, 2019, p. 151; Lenhart et al., 2016; Duggan et al., 2014). Women who experience multiple forms of identity-based oppression, women who are perceived as feminist or gender-noncompliant, and women who speak out in male-dominated spaces are at the highest risk of harassment, both IBSA and other forms of abuse (Sobieraj, 2020, p. 11). For example, Black women on Twitter are 84% more likely than white women to be targeted in abusive tweets (Glitch UK, 2017). For women working in “hegemonically masculine” fields like media, politics, sports, and finance, digital harassment has become “part of the workday” (Chemaly, 2019, p. 151). This reality is driving girls and women out of public spaces at an alarming rate. Around 41% of women between the ages of 15 and 29 now “self-censor” online to avoid harassment (Sobieraj, 2020, p. 131). In her study of women journalists and public figures, Sarah Sobieraj found that they often avoid writing and speaking on “high-risk” subjects and watch public debates play out without participating, even when they wanted to (Sobieraj, 2020, p. 116). An Amnesty International study of women across eight countries found that 63% to 83% of women have experienced digital harassment and that 31% stopped posting content altogether in response. Image-based abuse mimics many of the dynamics of offline sexual harassment in the public sphere (Hayes & Dragiewicz, 2018). The result is a “hostile speaking environment” online that is comparable to the hostile workplaces women navigate offline (Thakur & Hankerson, 2022; Sobieraj, 2020, p. 38).

Power, Trust, and Credibility

The threat of image-based abuse must be understood through the lens of gender. Image-based sexual abuse is disproportionately perpetrated by men and disproportionately victimizes “women, feminized bodies, and other marginalized communities” (Shokooh Valle, 2021, p. 1; Franks, 2019). While men are also victims, this harm works to maintain masculine and heterosexist norms regardless of the gender of individual victims and perpetrators (Powell & Henry, 2017, p. 182). These norms facilitate the differential treatment of male and female victims. Across multiple studies, female victims of IBSA are assigned more blame than male victims (Attrill-Smith et al., 2021; Zvi & Bitton, 2021). These studies also reveal that women who originally took the images that were later distributed are seen—by both men and women—as responsible for the harm they experienced. As a result, women who express their sexuality are demonized and deemed less deserving of protection. This is the same victim-blaming discourse that emerges in response to offline forms of gender-based violence. In both cases, the body becomes a tool of shame used to reduce the credibility of, and trust toward, women (Duggan et al., 2014; McGlynn et al., 2021).

This process is clearly illustrated by the rise of deepfakes. The vast majority of deepfakes circulating on the internet today are pornographic, yet public attention is almost exclusively focused on political deepfakes (Paris & Donovan, 2019; Ajder et al., 2019; Parkin, 2019; Watts & Hwang, 2020). Political deepfakes are fake videos that simulate someone’s likeness, making it appear as if they are saying something they didn’t say. Charged with spreading disinformation, polluting the information environment, and triggering political instability, they are depicted as a democracy-destroying blight. Yet their pornographic counterparts have received little public concern, even as they silence politicians, journalists, and public figures. Forged sexual images have a long history, as we see in the case of Sukarno, the late President of Indonesia, about whom the CIA produced fake pornographic images in the 1960s (Nutter, 1999).

Fifty years later, we find the example of Hillary Clinton. Fake pornographic imagery of Clinton has been circulating on Twitter since 2012, but it was not until 2018 that news broke about a Russia-linked agency promoting such fake sex videos of Clinton (Maddocks, 2020; Collins, 2018). Image-based abuse is disinformation: the spreading of false information designed to mislead. Despite targeting the most high-profile women in the world, it is not at the center of fact-checking initiatives, detection software, or efforts to fight “fake news,” perhaps because it doesn’t affect the predominantly male policymakers who shape these priorities, or because it is seen as “sexual” rather than political.

When we center pornographic deepfakes as a fundamental concern for building a resilient democracy and a healthy, safe internet, we see that the rise of deepfakes may not mark a shift towards a “post-truth era” of false images and statements that undermine the credibility of public figures and political discourse. Deepfakes that falsify political speech are seen to undermine trust in a straightforward way: the viewer is deceived by the content and can’t ascertain which political speech is authentic. What remains is political confusion. By contrast, deepfake porn appears to reinforce existing hierarchies of legitimacy that portray women as less believable than men. People who see deepfake pornography usually know that it is not authentic, yet the circulation of deepfake porn featuring high-profile women nevertheless contributes to those women’s shaming and silencing in public discourse. Unlike deepfakes that falsify political speech, pornographic deepfakes produce a fraudulent sense of certainty rather than confusion. At the societal level, the rise of IBSA and the acts of self-censorship it prompts add up to serious “democratic disturbances”: fomenting disinformation about women, reducing their likelihood of running for office, and reducing their involvement in public discourse (Sobieraj, 2020, p. 115).

Consistency Across Cultures and Contexts

There are remarkable similarities in image-based abuse perpetration across cultures and geographical contexts. Patriarchal norms, victim-blaming, and sexual censorship structure perpetration, victimization, and governmental responses in similar ways around the world. In Uganda, for example, individuals whose nude images are released can themselves be held responsible under strict obscenity laws like the Anti-Pornography Act 2014 (Amnesty International, 2014, p. 17). Similarly, in the United States, victims have been criminalized whether their intimate content was distributed consensually or non-consensually: adolescent victims, as well as those who sext consensually, have been charged with creating child pornography (Hasinoff, 2015; Franks, 2014). Across these contrasting contexts, the same blaming and shaming practices operate.

Research indicates some differences in the way IBSA affects minoritized groups across the globe. In New Zealand, this harm disproportionately targets Asian individuals and sexually diverse individuals, whereas in Jamaica it is weaponized against working-class victims, with many choosing to migrate after experiencing this harm (Pacheco et al., 2019; Haynes, 2018). Experiences of IBSA also differ across rural and urban spaces. Research in a Spanish village found that rurality exacerbates the negative effects of this harm because people in smaller communities are more likely to know about each other’s personal lives (Pavon-Benitez et al., 2021). While image-based abuse is pervasive worldwide, the same victim-blaming rhetoric results in deadlier consequences in some countries in the Global South. In 2011, a video of four girls singing in a village in Pakistan while a boy danced was circulated online without their consent, and three of the girls were murdered by their family members in an “honor” style killing (BBC News, 2019). This example illustrates how women’s morality, sexuality, and privacy are differently constructed across contexts. 

In India, far-right politicians have argued that young women’s use of social networking sites has made them “sluts” (Arora & Scheiber, 2017). This discourse paints social networks like Facebook as extensions of immoral Western public spaces that women should not enter (Arora & Scheiber, 2017; Shah, 2015, p. 4). In countries where women experience increased vulnerability to violence in public spaces and have fewer private spaces in which to express their sexuality, the non-consensual distribution of private images can become normalized (Kaya, 2009). When it comes to minors, it is a commonly upheld norm that children should be protected from harm, but this protection is often constructed in terms of preserving girls’ virtue in line with the moral norms of the local context (Arora & Scheiber, 2017). This focus on virtue has shaped the legislative landscape in Nigeria, where women are re-victimized when they seek help: they are blamed for “allowing” the images to be taken in the first place (Aborisade, 2022).

The definitional boundaries of image-based abuse vary across cultural, religious, and geographical contexts because what is considered private, intimate, or transgressive also varies. For example, image-based abuse can involve distributing images of Muslim women and girls without their hijabs (Huber, 2022). In the United States, a recent case involved a police department forcing women to remove their hijabs for mugshots. Privacy and intimacy are not stable constructs that can be applied across contexts, but consent is. Ultimately, it is the non-consensual release of private images that distinguishes these acts as image-based abuse.

Big Tech: Problem or Solution?

When image-based abuse cases began reaching courts around the world, lawyers advocating on behalf of people who had experienced this harm came up against fierce objections from technology companies, social networks, and search engines alike. These companies consistently refused to remove or de-index nonconsensual content on the grounds that doing so would interfere with free speech (Citron, 2019). In the United States, legislators and civil liberties organizations have taken strong action in recent years to protect the privacy of medical records, financial information, and cell phone data, but they have not been as willing to extend the same efforts to removing non-consensually released private images (Franks, 2017). And while Silicon Valley subscribes to the ideology that information “wants” to be free and that any constraint on it is a free speech issue, in reality platforms exert significant control over their content (Hasinoff, 2015).

Uniquely underregulated compared with other large industries, social networks bear no legal liability for abusive content posted on their platforms. If a company facilitated IBSA offline by providing rooms for the non-consensual filming of women or offering training on how to secretly record a spouse, it would be inundated with lawsuits, public pressure, and critique. But in the United States, Section 230 of the Communications Decency Act grants blanket immunity to social networks, absolving them of responsibility for abusive content uploaded to their sites and spread globally (Citron & Wittes, 2017). After ten years of lobbying by activists and increasing public pressure, the immunity that “shields online platforms from liability for illegality that they enable” is being reevaluated in earnest (Citron, 2022). The “EARN IT” Act, currently being debated in the Senate, would reduce the protection afforded to platforms under Section 230. Several other bills seeking to reduce immunity were introduced in Congress last year, and President Joe Biden has called for removing Section 230 altogether (Kern, 2022; Konkel, 2022). In Europe, the recently enacted Digital Services Act will require that pornography platforms—on which IBSA is often distributed—provide thorough human content moderation and reporting procedures (Woods & McGlynn, 2022).

The most high-profile examples of US legislation seeking to limit the spread of sexually abusive content online are FOSTA (Fight Online Sex Trafficking Act) and SESTA (Stop Enabling Sex Traffickers Act). Passed under the Trump administration, these laws remove Section 230 immunity from platforms that publish advertisements for sex. By removing sex workers’ ability to safely advertise and vet clients, however, the legislation has put them at increased risk of harm (Romano, 2018). To avoid legislation that pits the needs of vulnerable groups against each other, legal scholars recommend implementing a duty of care, under which platforms can access immunity only if they can prove that they “took reasonable steps” to address abuse, with no immunity extended to platforms deliberately designed to promote abuse (Citron, 2022).

Beyond regulation, companies could pilot a range of tools to reduce abuse. Experts have suggested “hybrid systems” that combine flagging mechanisms with preventative approaches. For instance, rather than simply flagging abusive content after it has been posted, platforms could apply targeted pre-moderation to individuals who have posted such content in the past, to ensure they aren’t repeating past behaviors (Sobieraj, 2020). The design of new features, prevention tools, and reporting strategies should be conducted in consultation with victims of online harassment. A well-funded, victim-centered approach could significantly reduce the prevalence and impact of image-based abuse.

Several activists and experts in Latin America and Asia have criticized social networks for only addressing IBSA as it manifests in the Global North. These activists warn that they are “left to their own devices” to deal with content moderation and online violence. They also describe experiences of “tokenism” and “gender washing,” in which technology companies claim to work with victims in the Global South but fail to provide concrete support (Maddocks, 2018). Inconsistent responses to IBSA across contexts create spaces where online harms are more prevalent and help is harder to access. What activists call “balkanization” and the “splinter internet” leaves the most vulnerable victims with very little support.

Without much-needed reform, innovation will continue as a “race to the bottom,” and new technologies will further threaten users’ privacy, safety, and speech. Beyond formal regulation and legal liability for content posted to their sites, activists have suggested that companies become more transparent and responsive: releasing regular incident reports, deepening their support mechanisms, and engaging with activists at regional levels. Consent standards and data protection rules should center informed consent and be developed in consultation with survivors of online sexual violence (Peña & Varon, 2018; Hasinoff, 2015). Activist Seyi Akiwowo also suggests a global ranking of social media companies that incentivizes ethical innovation (Akiwowo, 2022).

How Effective Are Legal Remedies?

While some jurisdictions have ignored IBSA, others have been quick to legislate against it. The Philippines became one of the first countries to ban “revenge porn” in 2009. Japan and Germany amended their existing privacy laws to encapsulate the practice in 2013, and in the following years “revenge porn” laws were passed around the world (Haynes, 2016). Since 2006, the European Union and Argentina have developed legislation around “the right to be forgotten.” Distinct from the right to privacy, the right to be forgotten enables people to have public information about themselves removed from websites and search engines. In the United States, the number of states that have legislated against “revenge porn” has increased dramatically since 2013, from three to forty-six. In 2022, the Violence Against Women Reauthorization Act created a private right of action for IBSA victims (Killion, 2022).

Lobbying for legislative change has been an essential goal of many activists worldwide. While they recognize the limitations of the law, they argue that new laws will deter potential abusers, enable victims to seek justice, and punish perpetrators. In Zimbabwe, research indicates an “urgent need” for a clear legal and policy framework to address IBSA (Mafa et al., 2020). While the law can be a powerful deterrent, it is also a blunt instrument. Many activists are wary of slipping into a “carceral feminism” that supports the disproportionate incarceration of the most marginalized (Bernstein, 2007). They argue that any focus on new laws must not divert attention from efforts to fairly enforce existing legislation. Abusers commonly use their real names and leave obvious digital traces, yet too often police do not investigate them. In response, some activists focus on training, awareness raising, and capacity building in law enforcement agencies, while other advocates seek alternatives to law enforcement that avoid carceral approaches to addressing IBSA.

Several legal remedies further threaten victims’ human rights. Introduced under the pretense of protecting Internet users from non-consensual image sharing, Canada’s anti-cyberbullying law (Bill C-13) gave law enforcement new powers to obtain information about Internet users (Coburn et al., 2015). Meanwhile, portions of Pakistan’s Prevention of Electronic Crimes Act facilitated the surveillance, intimidation, and even abduction of Internet activists, all while claiming to protect women online (Hussain, 2017). While “bad” laws are being made everywhere, those made by countries in the Global North are particularly dangerous because they set a precedent for countries in the Global South to implement similar laws that claim to promote online safety while silencing women journalists and human rights defenders (Hussain, 2017; Dad, 2022).

Legal remedies and their judicial interpretations also vary vastly across contexts. Since the implementation of new legislation, IBSA has been taken seriously by the courts in Canada; by contrast, a multi-country study of New Zealand, the United Kingdom, and Australia reveals that these countries’ piecemeal legislation fails to “capture the real essence” of IBSA because it treats the offense as a communication offense, not a form of sexual abuse (Dodge, 2019; Dueck-Read, 2020; Rackley et al., 2021). Consequently, victims are not granted automatic anonymity, which leads to the publishing of their names in news reports and, of course, magnifies their abuse. While some jurisdictions have implemented laws that cover “deepfake” pornography (Chesney & Citron, 2019), others have not (Rackley et al., 2021). Similarly, some legislation fails to include the threat to distribute images non-consensually, which is a central component of this harm. UK law also requires that images be explicit, sexual, and involve nudity, which excludes many non-sexual intimate images. This creates a bias towards white female victims and reinforces “dominant norms” around race, ethnicity, and religion, further marginalizing certain victims (Dodge, 2021b; Rackley et al., 2021). If it were more affordable, pursuing civil claims would be a powerful avenue of redress: much more victim-centered, civil complaints span breach of confidence, defamation, copyright, and invasion of privacy (Haynes, 2018).

Legal remedies must be grounded in the needs and experiences of victims. A “sense of justice” can be found in rehabilitating offenders, holding platforms accountable, and creating support systems that uphold victims’ anonymity and autonomy (Rackley et al., 2021). Instead of piecemeal laws that capture only parts of this offense, legislative responses should be built from a clearer understanding of the breadth of this harm and the motives that drive it. To achieve a more joined-up approach that connects law enforcement, social services, and educators, several jurisdictions have developed national offices: Australia’s eSafety Commissioner and South Korea’s Advocacy Centre for Online Sexual Abuse Victims have become central hubs for support, training, and prevention.

Future Directions for Research

Research across a range of contexts is building a coherent picture of IBSA as a gendered harm, rooted in a desire for control, that disproportionately targets minoritized victims. Studies of victimization rates, perpetrator motivations, and stakeholder attitudes have enriched public understanding of this harm. To ensure policy responses are intersectional, further research is needed on victimization levels across social groups. Sexual orientation, gender identity, race, and ability are significant risk factors that require further study (Chun & Friedland, 2015; Karaian, 2014). This research should be conducted in partnership with affected individuals, centering their needs and expertise. Efforts to decry and stop image-based abuse often center on the treatment of heterosexual, cis, white female victims, which obscures how racism and misogyny work together to increase victimization and limit access to justice for Black women (Crenshaw, 1991; Sokoloff & DuPont, 2005; Dodge, 2021a).

Among youth, more research is needed to better understand the impact of receiving unsolicited “dick pics,” unsolicited “female nudes,” and non-consensual “explicit video” (Berndtsson, 2021). More research is also needed to understand the effects of coercive sexting and the complexity of refusing requests for explicit content (Mishna et al., 2021). Beyond victims, more research is needed on bystanders. When we scroll through our newsfeeds or timelines, we all have the potential to become IBSA bystanders: people who view non-consensually released images. Around 65% of bystanders take some action when they witness online harassment, whether reaching out to the victim, reaching out to the perpetrator, or reporting the content to a social network moderator (Lenhart & Zickuhr, 2016). Attitudes appear to be moving in a positive direction among younger bystanders, who are more likely to act against harassment online (Lenhart & Zickuhr, 2016). More research on the role of bystanders who view IBSA specifically could illuminate ways that all users can promote safety in online spaces.

Prevention is a key area for future research. Research already indicates that punitive and abstinence-focused approaches are ineffective among youth (Strassberg et al., 2017). Instead, educational interventions focused on explicit consent, interpersonal respect, and healthy relationship practices should be delivered and evaluated longitudinally to understand their impact on youth and adult offending rates. Prevention requires that lawmakers, law enforcement, social services, and schools reverse practices that criminalize victims and instead center the needs of victims, advocates, and marginalized community members (Survived and Punished, 2017; Hasinoff, 2015). Researching best practices in countries that have developed their own eSafety offices could provide important insight into what works best when addressing IBSA at the national level. Research should also explore the efficacy of existing laws: Which victims have utilized new laws? Who has gained access to legal redress? Which perpetrators were targeted? And how did law enforcement respond? Social networks are also a key site for prevention, and research is urgently needed on the efficacy of new approaches to platform regulation. From filtering algorithms to warning pop-ups to account freezing, what works to deter perpetrators? In answering these questions, research will provide essential evidence about which policies and practices promote healthy online spaces.

Image-based abuse is a public health crisis that affects the physical and psychological well-being of one in 25 Americans (Duggan, Smith & Caiazza, 2017). It is also a deeply gendered and racialized harm used to silence and shame women who speak out in public spaces. As new technologies open new avenues for repression, research is urgently needed to understand how digital abuse spreads and how it can best be prevented (Robinson et al., 2020). Crucially, this research must be rooted in the reality that digital abuse reflects and exacerbates offline inequalities (Noble & Tynes, 2016). Research should begin from the premise that white male supremacist norms such as misogyny, racism, rape culture, and victim-blaming now circulate in newly networked ways that drive repression both on- and offline. From this starting point, scholars can begin to identify the practices that work best to address this severe social ill.

Works Cited

Aborisade, Richard A. 2022. “Image-Based Sexual Abuse in a Culturally Conservative Nigerian Society: Female Victims’ Narratives of Psychosocial Costs.” Sexuality Research & Social Policy 19 (1): 220–32.

Ajder, Henry, Giorgio Patrini, Francesco Cavalli, and Laurence Cullen. 2019. “The State of Deepfakes: Landscape, Threats, and Impact.” Amsterdam: Deeptrace.

Akiwowo, S., and S. Maddocks. 2022. “Image-Based Abuse: Future Directions for Research and Policy.” In Report on the Center for Media at Risk’s 2021 Annual Symposium.

Arora, P., and L. Scheiber. 2017. “Slumdog Romance: Facebook Love and Digital Privacy at the Margins.” Media, Culture & Society 39 (3): 408–22.

Attrill-Smith, A., C.J. Wesson, M.L. Chater, and L. Weekes. 2021. “Gender Differences in Videoed Accounts of Victim Blaming for Revenge Porn for Self-Taken and Stealth-Taken Sexually Explicit Images and Videos.” Cyberpsychology: Journal of Psychosocial Research on Cyberspace 15 (4).

Ayyub, Rana. 2018. “I Was the Victim of a Deepfake Porn Plot Intended to Silence Me.” HuffPost.

Bailey, M. 2021. Misogynoir Transformed: Black Women’s Digital Resistance. New York University Press.

Bates, S. 2017. “Revenge Porn and Mental Health: A Qualitative Analysis of the Mental Health Effects of Revenge Porn on Female Survivors.” Feminist Criminology 12 (1).

Batha, Emma. 2021. “UK Revenge Porn Nearly Doubled in Two Years, Survey Finds.” Reuters, March 25, 2021, sec. Media Industry. https://www.reuters.com/article/us-britain-women-revengeporn-trfn-idUSKBN2BH00U.

BBC News. 2019. “Kohistan Video Murders: Three Guilty in ‘Honour Killing’ Blood Feud,” September 5, 2019.

Berndtsson, K.H. 2021. “‘Something You Just Don’t Talk About’: An Analysis of Teenage Boys’ Experiences of Non-Consensual Sexting in Lower Secondary School.” The Journal of Men’s Studies 30 (2).

Bernstein, E. 2007. “The Sexual Politics of the ‘New Abolitionism.’” Differences 18 (3): 128–51.

Branch, K., C.M. Hilinski-Rosick, E. Johnson, and G. Solano. 2017. “Revenge Porn Victimization of College Students in the United States: An Exploratory Analysis.” International Journal of Cyber Criminology 11 (1): 128–42.

Chemaly, S. 2019. “Demographics, Design, and Free Speech.” In Free Speech in the Digital Age, edited by Susan Brison and Katharine Gelber. Oxford: Oxford University Press.

Chesney, R., and D.K. Citron. 2019. “Deep Fakes: A Looming Challenge for Privacy, Democracy, and National Security.” California Law Review 107: 1753.

Chun, W., and S. Friedland. 2015. “Habits of Leaking: Of Sluts and Network Cards.” Differences 26 (2): 1–28.

Citron, Danielle Keats. 2022. “How To Fix Section 230.” SSRN Scholarly Paper. Rochester, NY. https://papers.ssrn.com/abstract=4054906.

Citron, D.K. 2014. “How Cyber Mobs and Trolls Have Ruined the Internet—and Destroyed Lives.” Newsweek.

———. 2020. “Cyber Mobs, Disinformation, and Death Videos: The Internet as It Is (and as It Should Be).” Michigan Law Review 118: 1073.

Citron, D.K., and M.A. Franks. 2014. “Criminalizing Revenge Porn.” Wake Forest Law Review 49: 345.

Citron, D.K., and B. Wittes. 2017. “The Internet Will Not Break: Denying Bad Samaritans § 230 Immunity.” Fordham Law Review 86: 401.

Coburn, P., D. Connolly, and R. Roesch. 2015. “Cyberbullying: Is Federal Criminal Legislation the Solution?” Canadian Journal of Criminology and Criminal Justice 57: 566–79.

Collins, Ben. 2018. “Russia-Linked Account Pushed Fake Hillary Clinton Sex Video.” NBC News, April 10, 2018.

Crenshaw, K.W. 1991. “Mapping the Margins: Intersectionality, Identity Politics, and Violence against Women of Color.” Stanford Law Review 43 (6): 1241–99.

Dad, N., and S. Maddocks. 2022. “Image-Based Abuse: Future Directions for Research and Policy.” In Report on the Center for Media at Risk’s 2021 Annual Symposium.

Davies, Sophie. 2020. “Revenge Porn Soars in Europe’s Coronavirus Lockdown as Student Fights Back.” Reuters, May 5, 2020.

Dodge, A. 2019. “Nudes Are Forever: Judicial Interpretations of Digital Technology’s Impact on ‘Revenge Porn.’” Canadian Journal of Law and Society 34 (1): 121–43.

———. 2021a. “Trading Nudes Like Hockey Cards: Exploring the Diversity of ‘Revenge Porn’ Cases Responded to in Law.” Social & Legal Studies 30 (3): 448–68.

———. 2021b. “‘Try Not to Be Embarrassed’: A Sex Positive Analysis of Nonconsensual Pornography Case Law.” Feminist Legal Studies 29 (1): 23–41.

Dueck-Read, A. 2020. “Judicial Constructions of Responsibility in Revenge Porn: Judicial Discourse in Non-Consensual Intimate Image Distribution Cases – A Feminist Analysis.” Manitoba Law Journal 43 (3): 357–90.

Duggan, Maeve. 2014. “Online Harassment 2014.” Pew Research Center. https://www.pewresearch.org/internet/2014/10/22/online-harassment/.

———. 2017. “Online Harassment 2017.” Pew Research Center. https://www.pewresearch.org/internet/2017/07/11/online-harassment-2017/.

Eaton, A., H. Jacobs, and Y. Ruvalcaba. 2017. “2017 Nationwide Online Study of Nonconsensual Porn Victimization and Perpetration.” Cyber Civil Rights Initiative. https://www.cybercivilrights.org/wp-content/uploads/2017/06/CCRI-1-pager_Final.pdf.

Eaton, Asia A., Sofia Noori, Amy Bonomi, Dionne P. Stephens, and Tameka L. Gillum. 2021. “Nonconsensual Porn as a Form of Intimate Partner Violence: Using the Power and Control Wheel to Understand Nonconsensual Porn Perpetration in Intimate Relationships.” Trauma, Violence, & Abuse 22 (5): 1140–54.

France, Lisa R. 2018. “Fans Rally Around Bella Thorne After Sexual Abuse Revelation.” CNN, January 9, 2018.

Hasinoff, A. 2015. Sexting Panic: Rethinking Criminalization, Privacy, and Consent. University of Illinois Press.

Hayes, R.M., and M. Dragiewicz. 2018. “Unsolicited Dick Pics: Erotica, Exhibitionism or Entitlement?” Women’s Studies International Forum 71: 114–20.

Haynes, J. 2018. “Judicial Approaches to Combating ‘Revenge Porn’: A Multi-Jurisdictional Perspective.” Commonwealth Law Bulletin 44 (3): 400–428.

Henry, N., A. Flynn, and A. Powell. 2018. “Policing Image-Based Sexual Abuse: Stakeholder Perspectives.” Police Practice & Research 19 (6): 565–81.

Hills, M.C. 2017. “How Hugh Hefner Built an Entire Empire without Marilyn Monroe’s Consent.” Marie Claire UK, September. https://www.marieclaire.co.uk/news/celebrity-news/hugh-hefner-marilyn-monroe-541688.

Huber, Antoinette. 2022. “‘A Shadow of Me Old Self’: The Impact of Image-Based Sexual Abuse in a Digital Society.” International Review of Victimology, April. https://doi.org/10.1177/02697580211063659.

Hussain, T. 2017. “The Disappeared: Why Pakistan Is Feeling Spooked Over Missing Social Media Activists.” South China Morning Post, January 23, 2017. https://www.scmp.com/week-asia/politics/article/2064001/disappeared-why-pakistan-feeling-spooked-over-missing-social.

Karaian, Lara. 2014. “Policing ‘Sexting’: Responsibilization, Respectability and Sexual Subjectivity in Child Protection/Crime Prevention Responses to Teenagers’ Digital Sexual Expression.” Theoretical Criminology 18 (3): 282–99. https://doi.org/10.1177/1362480613504331.

Karasavva, V., J. Swanek, A. Smodis, and A. Forth. 2022. “From Myth to Reality: Sexual Image Abuse Myth Acceptance, the Dark Tetrad, and Non-Consensual Intimate Image Dissemination Proclivity.” Journal of Sexual Aggression, 1–17.

Kaya, L.P. 2009. “Dating in a Sexually Segregated Society: Embodied Practices of Online Romance in Irbid, Jordan.” Anthropological Quarterly 82 (1): 251–78.

Kern, Rebecca. 2022. “White House Renews Call to ‘Remove’ Section 230 Liability Shield.” POLITICO, September 8, 2022. https://www.politico.com/news/2022/09/08/white-house-renews-call-to-remove-section-230-liability-shield-00055771.

Killion, V. 2022. “Federal Civil Action for Disclosure of Intimate Images: Free Speech Considerations.” Congressional Research Service Legal Sidebar.

Kim, J. 2021. “Sticky Activism: The Gangnam Station Murder Case and New Feminist Practices against Misogyny and Femicide.” Journal of Cinema and Media Studies 60 (4).

Konkel, Frank. 2022. “Controversial Section 230 Reform Finds More Opposition in Senate.” Nextgov, February 3, 2022. https://www.nextgov.com/policy/2022/02/controversial-section-230-reform-finds-more-opposition-senate/361518/.

Langlois, G., and A. Slane. 2017. “Economies of Reputation: The Case of Revenge Porn.” Communication and Critical/Cultural Studies 14 (2): 120–38.

Lenhart, A., M. Ybarra, and M. Price-Feeney. 2016. “Nonconsensual Image Sharing: One in 25 Americans Has Been a Victim of Revenge Porn.” Data & Society Research Institute Data Memo.

Lenhart, A., and K. Zickuhr. 2016. “Online Harassment, Digital Abuse, and Cyberstalking in America.” Data & Society Research Institute Report.

Maddocks, Sophie. 2018. “From Non-Consensual Pornography to Image-Based Sexual Abuse: Charting the Course of a Problem with Many Names.” Australian Feminist Studies 33 (97): 345–61. https://doi.org/10.1080/08164649.2018.1542592.

———. 2020. “‘A Deepfake Porn Plot Intended to Silence Me’: Exploring Continuities between Pornographic and ‘Political’ Deep Fakes.” Porn Studies 7 (4): 415–23. https://doi.org/10.1080/23268743.2020.1757499.

Mafa, I., S. Kang’ethe, and V. Chikadzi. 2020. “‘Revenge Porn’ and Women Empowerment Issues: Implications for Human Rights and Social Work Practice in Zimbabwe.” Journal of Human Rights and Social Work 5 (2): 118–28.

Massanari, A. 2016. “#Gamergate and The Fappening: How Reddit’s Algorithm, Governance, and Culture Support Toxic Technocultures.” New Media & Society 19 (3): 329–46. https://doi.org/10.1177/1461444815608807.

McGlynn, C., K. Johnson, E. Rackley, N. Henry, N. Gavey, A. Flynn, and A. Powell. 2021. “‘It’s Torture for the Soul’: The Harms of Image-Based Sexual Abuse.” Social & Legal Studies 30 (4): 541–62.

McGlynn, C., E. Rackley, and R. Houghton. 2017. “Beyond ‘Revenge Porn’: The Continuum of Image-Based Sexual Abuse.” Feminist Legal Studies 25 (1): 25–46.

McGlynn, C., E. Rackley, K. Johnson, N. Henry, A. Flynn, A. Powell, N. Gavey, and A. Scott. 2019. “Shattering Lives and Myths: A Report on Image-Based Sexual Abuse.” Durham University. https://dro.dur.ac.uk/28683/.

Mikoley, K. 2021. Cyber Mobs, Trolls, and Online Harassment. Cavendish Square Publishing.

Mishna, Faye, Elizabeth Milne, Charlene Cook, Andrea Slane, and Jessica Ringrose. 2021. “Unsolicited Sexts and Unwanted Requests for Sexts: Reflecting on the Online Sexual Harassment of Youth.” Youth & Society, November, 0044118X211058226. https://doi.org/10.1177/0044118X211058226.

Naezer, Marijke, and Lotte van Oosterhout. 2021. “Only Sluts Love Sexting: Youth, Sexual Norms and Non-Consensual Sharing of Digital Sexual Images.” Journal of Gender Studies 30 (1): 79–90. https://doi.org/10.1080/09589236.2020.1799767.

Noble, Safiya Umoja, and Brendesha M. Tynes. 2016. The Intersectional Internet: Race, Sex, Class and Culture Online. Peter Lang Publishing. https://intersectionalinternet.com/.

Nutter, John Jacob. 1999. The CIA’s Black Ops: Covert Action, Foreign Policy, and Democracy. Prometheus Books.

Ouytsel, J., M. Walrave, L. De Marez, B. Vanhaelewyn, and K. Ponnet. 2021. “Sexting, Pressured Sexting and Image-Based Sexual Abuse among a Weighted-Sample of Heterosexual and LGB-Youth.” Computers in Human Behavior 117.

Pacheco, Edgar, Neil Melhuish, and Jandy Fiske. 2019. “Image-Based Sexual Abuse: A Snapshot of New Zealand Adults’ Experiences.” SSRN Electronic Journal, January. https://doi.org/10.2139/ssrn.3315984.

Paris, B., and J. Donovan. 2019. Deepfakes and Cheap Fakes: The Manipulation of Audio and Visual Evidence. Data & Society Research Institute.

Parkin, Simon. 2019. “The Rise of the Deepfake and the Threat to Democracy.” The Guardian, June 22, 2019. http://www.theguardian.com/technology/ng-interactive/2019/jun/22/the-rise-of-the-deepfake-and-the-threat-to-democracy.

Pavón-Benítez, Laura, Nuria Romo-Avilés, and Pilar Tarancón Gómez. 2021. “‘In My Village Everything Is Known’: Sexting and Revenge Porn in Young People from Rural Spain.” Feminist Media Studies, June, 1–17. https://doi.org/10.1080/14680777.2021.1935290.

Peña, Paz, and Joana Varon. 2018. “Recommendations on Technology-Related Violence Against Women (VAW) for the UN.” Medium (blog). March 1, 2018. https://medium.com/@pazpena/recommendations-on-technology-related-violence-against-women-vaw-for-the-un-5e27b544e6b2.

Pina, A., A. Bell, K. Griffin, and E. Vasquez. 2021. “Image Based Sexual Abuse Proclivity and Victim Blaming: The Role of Dark Personality Traits and Moral Disengagement.” Oñati Socio-Legal Series 11 (5): 1179–97.

Pina, Afroditi, James Holland, and Mark James. 2017. “The Malevolent Side of Revenge Porn Proclivity: Dark Personality Traits and Sexist Ideology.” International Journal of Technoethics 8 (1): 30–43. https://doi.org/10.4018/IJT.2017010103.

Powell, A., A.J. Scott, A. Flynn, and S. McCook. 2022. “Perpetration of Image-Based Sexual Abuse: Extent, Nature and Correlates in a Multi-Country Sample.” Journal of Interpersonal Violence.

Powell, Anastasia, and Nicola Henry. 2017. Sexual Violence in a Digital Age. Palgrave Macmillan. https://link.springer.com/book/10.1057/978-1-137-58047-4.

Rackley, E., C. McGlynn, K. Johnson, N. Henry, N. Gavey, A. Flynn, and A. Powell. 2021. “Seeking Justice and Redress for Victim-Survivors of Image-Based Sexual Abuse.” Feminist Legal Studies 29 (3): 293–322.

Ringrose, J., K. Regehr, and S. Whitehead. 2022. “‘Wanna Trade?’: Cisheteronormative Homosocial Masculinity and the Normalization of Abuse in Youth Digital Sexual Image Exchange.” Journal of Gender Studies 31 (2): 243–61.

Robinson, Laura, Jeremy Schulz, Grant Blank, Massimo Ragnedda, Hiroshi Ono, Bernie Hogan, Gustavo S. Mesch, et al. 2020. “Digital Inequalities 2.0: Legacy Inequalities in the Information Age.” First Monday, June. https://doi.org/10.5210/fm.v25i7.10842.

Romano, Aja. 2018. “A New Law Intended to Curb Sex Trafficking Threatens the Future of the Internet as We Know It.” Vox, July 2, 2018. https://www.vox.com/culture/2018/4/13/17172762/fosta-sesta-backpage-230-internet-freedom.

“Rule By Law: Discriminatory Legislation And Legitimized Abuses In Uganda.” 2014. AFR 59/006/2014. United Kingdom: Amnesty International, International Secretariat. https://www.amnesty.org/en/documents/AFR59/006/2014/en/.

Salter, M., and T. Crofts. 2015. “Responding to Revenge Porn: Challenging Online Legal Impunity.” In New Views on Pornography: Sexuality, Politics and the Law, edited by Lynn Comella and Shira Tarrant, 233–56. Westport: Praeger.

Shah, Nishant. 2015. “Sluts ‘r’ Us: Intersections of Gender, Protocol and Agency in the Digital Age.” First Monday, March. https://doi.org/10.5210/fm.v20i4.5463.

Shokooh Valle, Firuzeh. 2021. “Turning Fear into Pleasure: Feminist Resistance against Online Violence in the Global South.” Feminist Media Studies 21 (4): 621–38. https://doi.org/10.1080/14680777.2020.1749692.

Sobieraj, Sarah. 2020. Credible Threat: Attacks Against Women Online and the Future of Democracy. New York, NY: Oxford University Press. https://doi.org/10.1093/oso/9780190089283.001.0001.

Sokoloff, Natalie J., and Ida Dupont. 2005. “Domestic Violence at the Intersections of Race, Class, and Gender: Challenges and Contributions to Understanding Violence against Marginalized Women in Diverse Communities.” Violence Against Women 11 (1): 38–64. https://doi.org/10.1177/1077801204271476.

Strassberg, D.S., D. Cann, and V. Velarde. 2017. “Sexting by High School Students.” Archives of Sexual Behavior 46 (6): 1667–72.

“Survived and Punished: Survivor Defense as Abolitionist Praxis.” 2017. Survived and Punished. https://survivedandpunished.org/defense-campaign-toolkit/.

Taddeo, Sarah. 2020. “Revenge Porn Is up during the COVID-19 Pandemic. Here’s How to Protect Yourself.” USA TODAY, November 19, 2020. https://www.usatoday.com/story/news/nation/2020/11/19/revenge-porn-rises-during-covid-pandemic-ny-attorney-general-warns/3777569001/.

Thakur, Dhanaraj, and DeVan Hankerson Madrigal. 2022. “An Unrepresentative Democracy – How Disinformation and Online Abuse Hinder Women of Color Political Candidates in the United States.” Center for Democracy & Technology. https://cdt.org/insights/an-unrepresentative-democracy-how-disinformation-and-online-abuse-hinder-women-of-color-political-candidates-in-the-united-states/.

“The National Security Challenge of Artificial Intelligence, Manipulated Media, and ‘Deep Fakes.’” 2019. Washington, D.C.

“The Shadow Pandemic: Violence against Women during COVID-19.” 2020. UN Women. 2020. https://www.unwomen.org/en/news/in-focus/in-focus-gender-equality-in-covid-19-response/violence-against-women-during-covid-19.

Thomas, Sara E. 2018. “‘What Should I Do?’: Young Women’s Reported Dilemmas with Nude Photographs.” Sexuality Research & Social Policy: A Journal of the NSRC 15 (2): 192–207. https://doi.org/10.1007/s13178-017-0310-0.

Woods, Lorna, and Clare McGlynn. 2022. “Pornography Platforms, the EU Digital Services Act and Image-Based Sexual Abuse.” Media@LSE (blog). January 26, 2022. https://blogs.lse.ac.uk/medialse/2022/01/26/pornography-platforms-the-eu-digital-services-act-and-image-based-sexual-abuse/.

Zaleski, K., and S. Maddocks. 2022. “Image-Based Abuse: Future Directions for Research and Policy.” In Report on the Center for Media at Risk’s 2021 Annual Symposium.

Zvi, Liza, and Mally Shechory Bitton. 2021. “Perceptions of Victim and Offender Culpability in Non-Consensual Distribution of Intimate Images.” Psychology, Crime & Law 27 (5): 427–42. https://doi.org/10.1080/1068316X.2020.1818236.