EVENTS
                   
Section 230 Protections: Can Legal Revisions or Novel Technologies Limit Online Misinformation and Abuse? A Workshop
April 22, 2021 | 12:00-6:00 PM ET


An ad hoc committee will organize a workshop to examine whether immunity protections provided to Internet-based technology companies and social media platforms should be reconsidered in light of increasing online misinformation and abuse. The workshop will examine the legal and economic rationale for these protections, review relevant court decisions, discuss revisions to these protections, and explore technological approaches that might allow for more responsible management of third-party Internet content.

Register here
WEEKLY NEWSLETTER

"Automatically" Detecting Online Abuse Requires an Editorial Eye 
Effectively flagging abusive content on social media is hard, but journalists can share their insights, and their data, to improve computational tools.

By Susan E. McGregor 

 

On April 9, 2021, women Twitter users from a range of professions, including law, journalism, academia, and advertising, participated in a one-day boycott to bring attention to the abuse that women regularly experience on the platform. In fact, according to a 2018 Amnesty International report highlighted by participant and startup lawyer Moe Odele, a woman is abused on Twitter every 30 seconds.

Women journalists reading these statistics won't be surprised. In the course of researching my upcoming book, "Information Security Essentials: A Guide for Reporters, Editors and Newsroom Leaders," I spoke with dozens of security specialists at news organizations and found that online harassment, which disproportionately (though by no means exclusively) affects journalists who present as women or people of color, was their most pressing security problem. Manifesting as everything from doxxing campaigns to relentless streams of otherwise "innocuous" messages, online harassment is a huge drain on journalistic resources, both in direct financial costs and in lost productivity. According to recent research by the International Women's Media Foundation (IWMF), nearly one-third of women journalists who experience online abuse consider leaving the profession altogether. As IWMF executive director Elisa Lees Muñoz recently wrote for Nieman Lab, this type of "[o]nline violence has the same impact as violence experienced offline."

In 2015, Twitter CEO Jack Dorsey acknowledged how poor the company was at dealing with abuse on the platform, and since then the platform has developed a handful of new tools (such as restricting who can reply to your posts) in an effort to improve women users' experiences. But addressing online abuse at the scale and speed of social media is not a trivial problem, even for the platforms themselves. Scanning every message posted to a social media platform demands a computational solution, yet tasks that tend to be simple for humans, like interpreting the "gist" or tone of an exchange, are effectively impossible for computational systems. At best, computational tools can be trained to mimic this type of understanding by algorithmically "studying" large volumes of actual human communication that has been qualitatively labeled by humans. But which humans do the labeling also matters, because the language of abuse is not one-size-fits-all: research shows that anti-Semitic speech, for example, taps a different vocabulary than misogynistic or racist speech. This means that efficiently and automatically detecting abuse on social media platforms requires "teaching" computational tools to identify, among other things, the specific linguistic structures of the abuse targeting a particular community, using realistic, expertly labeled examples of the behavior you want to detect.
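The label-then-train loop described above can be illustrated with a deliberately tiny sketch. All of the example messages and function names here are hypothetical, and real systems use large, community-specific labeled corpora and statistical models rather than raw word counts; the point is only to show how human-applied labels become the signal a detector learns from:

```python
from collections import Counter

# Hypothetical toy dataset: each message has been tagged by a human
# annotator as "abusive" or "benign". Real training sets need thousands
# of expertly labeled, community-specific examples.
labeled = [
    ("you are worthless get off this site", "abusive"),
    ("nobody wants to read your garbage", "abusive"),
    ("great reporting on the city council vote", "benign"),
    ("thanks for sharing this thread", "benign"),
]

def train(examples):
    """Count how often each word appears under each human-applied label."""
    counts = {"abusive": Counter(), "benign": Counter()}
    for text, label in examples:
        counts[label].update(text.lower().split())
    return counts

def score(counts, message):
    """Label a new message by which labeled vocabulary it overlaps more."""
    words = message.lower().split()
    abusive = sum(counts["abusive"][w] for w in words)
    benign = sum(counts["benign"][w] for w in words)
    return "abusive" if abusive > benign else "benign"

model = train(labeled)
print(score(model, "your garbage reporting is worthless"))  # abusive
```

Because the model only knows the vocabulary its annotators saw, abuse phrased in a different community's idiom would slip past it, which is exactly why who labels the data matters.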

In my previous research collaborations with computer scientists on this topic, one thing that stood out to me was how poor most of that training data was. So in early 2020, I once again reached out to colleagues in computer science, Julia Hirschberg and Sarah Ita Levitan, to see if we could create better tools for detecting online abuse by collecting better training data. With support from the Tow Center for Digital Journalism and the Brown Institute for Media Innovation, we designed a participatory data-collection process. Rather than scraping data from public Twitter accounts and asking research assistants to make judgments about abusive content, we have been recruiting and paying women journalists to share and label their own Twitter conversations. With enough participants, we are confident that we can build a tool that better recognizes the linguistic characteristics of the online abuse targeting women journalists, and that may also help clarify the broader mechanisms of harassment used in these spaces. Once we can accurately and efficiently detect abuse, we'll continue to work with the journalism community to understand what should be done with it. In some cases, simply blocking or quarantining it will be enough; in others, documenting and reporting it to platforms and/or law enforcement will be important. To that end, we are building relationships with apps like Block Party, which are already beginning to provide some of this much-needed functionality.

Realistically, improving Twitter for women, and especially women journalists, will not be accomplished by any one effort. The goal of the April 9 boycott, according to organizer Heidi N. Moore, was to spur Twitter to action by exerting economic pressure, since the platform depends on user activity to generate advertising revenue. Opening up social media platforms for independent research and development is also key. On April 19, for example, Block Party founder Tracy Chou wrote in Wired about the need for platforms to create application programming interfaces (APIs) that would help companies like hers more easily innovate tools for improving Twitter users' experience. This move would reflect the origin of many of Twitter's most successful interfaces, many of which were originally built by people outside the company. For our part, we hope that our work will demonstrate that creating effective, scalable tools to detect online abuse is well within the realm of technical possibility. Likewise, we hope to demonstrate to other researchers the importance of working in direct collaboration with communities on this type of work, especially to ensure the accuracy and fairness of the tools we produce. And we hope that you, whether you are a woman journalist or one of their colleagues, will consider working with us to build a better, more effective online working environment for us all.

If you are a woman journalist, please consider participating in our research and/or sharing this information with your colleagues. We welcome participants from all types of media and freelance practices, and hope to make our work with women journalists the first of many such collaborations.

STORIES YOU MAY HAVE MISSED
  • When the Facebook Oversight Board announced last week that it would broaden its scope of eligible cases for review by the board, Tow Director Emily Bell tweeted that the move was: “consistent with what is happening within Facebook generally. Policies cannot possibly scale one set of rules for moderation - it doesn’t work. So it has to do what it always said it would not, which is respond on a case by case basis to hot issues.” The Washington Post plugged Bell’s take in its analysis of the new rule, which will allow the board to accept the submission of posts that remain online and are flagged for moderation. The original purview of the board was limited to posts that had already been removed by Facebook, which the board would then review for compliance with the platform’s community standards. This news came just days before the board’s announcement that its determination in the case of former President Donald Trump’s removal will be announced “in the coming weeks.”

  • The United States isn’t the only country with a journalism and media crisis. The U.S. downward trend of both sustainable news operations and jobs in the industry, especially at the local level, is similar to the current media landscape in the Philippines. The pandemic has only exacerbated the problem in both countries. According to a late summer report by Rappler, the Philippines’ decline in community journalism can be attributed to “COVID-19; low sales, revenue; reduced number of circulation, pages; reporting gaps; and job cuts and uncertainty.”
     

  • Speaking of local news, Axios reported last week that, “A slew of new companies are launching platforms for local newsletters, a shift that could help finally bring the local news industry into the digital era.” The companies (Substack, Facebook, Patch, and Axios itself, to name a few) seem to be following the “newsletter-ification” trend with hopes that modernizing local news will help bolster engagement, readership, and sustainability. Platform involvement with news publication, however, has a fraught history. How this marriage between larger media or media-adjacent companies and local newsrooms plays out may indicate what local journalism will look like moving forward.
     

  • After reports last week that “Minnesota local media [were] barred from police shooting press conference,” the Trump-era escalation of violence against reporters continued to ratchet up. In the wake of the fatal police shooting of Daunte Wright in Brooklyn Center, Minn., protests calling for police reforms and accountability spilled into the streets in cities across the country. In Brooklyn Center, the New York Times reported that, “Journalists have been sprayed with chemical irritants, arrested, thrown to the ground and beaten by police officers while covering protests.” The state’s governor said that such alleged attacks “just cannot happen.” In response, veteran reporter and Columbia Journalism School professor Dale Maharidge tweeted, “The governor must invoke sanctions against these cops.” At the same time, multiple outlets reported that journalists also faced violence from protesters themselves over the past week.

Copyright © 2021 Tow Center for Digital Journalism, All rights reserved.


