Despite clear policies, Google labels many shadowy, politically backed websites as news
By Emily Bell and Sara Sheridan


When the Institute for Nonprofit News was approached several years ago by a news organization targeting African American readers, executive director Sue Cross was intrigued. Although the organization did not meet INN's funding-transparency standards for membership, Cross followed its journalism closely over a period of months. While much of the reporting seemed solid, Cross noted that, “Over time you could see a pattern of stories emerging; particularly positive stories about coal and stories about how particular types of new energy have a racist impact.” The undisclosed money funding the site was clearly aligned with the interests of the fossil fuel industry, but as Cross noted, “You had to read the output of a number of months to even detect it.”

Sites like this one were the subject of a recent Tow Center discussion on the phenomenon of covertly partisan money funding local news. Tow Center research into how partisan online news networks operate ahead of elections revealed more than a thousand politically backed sites cropping up across the US, producing largely automatically generated stories. The Metric Media network at the center of the study is the largest, but by no means the only, example of organizations serving lobbying or political interests by producing what appears to be local news content. As Cross points out, the lack of transparency around funding sources is designed to deceive readers by making it difficult to detect political or commercial motives.

Of course, the phenomenon of partisan news is not new. Is the Metric Media network (which insists it is not partisan) substantially different from, say, Sinclair Media, which operates controversial right-leaning local TV news franchises, or blatantly partisan cable news channels like Fox News? In the UK, both left- and right-leaning press chains, from the Daily Mail to the Daily Mirror, have operated local newspapers for decades. Yet for online outlets, the ease of digital publishing and the lack of clarity around funding push us to make tighter distinctions around what a “news source” might look like. And for third-party platforms that aggregate news content, like Google and Facebook, there is an increasing need to flag instances where news production becomes lobbying or advertising.

Facebook and Google classify publishers as news sources in different ways. Facebook relies on self-identification, meaning effectively anyone can register as a “Media/News Company.” Google, on the other hand, proactively identifies news sources by including them in Google News, and has made several statements about the need to support high-quality, original reporting in its search results. However, research by the Tow Center has found that despite clear guidelines about inclusion in Google News, standards for identifying outlets as “news sources” are inconsistently applied.

In a January 2019 post on Google’s Official Webmaster Central, Public Liaison for Search Danny Sullivan outlined a list of best practices for websites looking to be officially classified as “news sources” on Google News. The blog post, entitled “Ways to succeed in Google News,” offered several tips on how to properly format dates and bylines, how to structure data, and how to optimize headlines with SEO-friendly keywords. The article also provided information on what news sites should avoid in order to stay within the parameters of Google’s News guidelines. Sullivan warned publishers against duplicating content, noting that, “Google News seeks to reward independent, original journalistic content by giving credit to the originating publisher, as both users and publishers would prefer.” For Google, duplicate content includes news that relies on scraping, rewriting, or republishing stories. In theory, this would mean that algorithmically generated news would not rank highly on Google News. Yet despite clear policies, research by the Tow Center found that a number of shadowy, politically backed “local news websites” are indexed as news sources by Google News.

The organizational structure of the network of shadowy, politically backed “local news websites,” designed to promote partisan talking points, analyzed in a recent Tow Report

We sampled roughly one third of the more than 1,200 sites identified in the Metric Media network, 90 percent of whose stories are automated. We found that the “news source” label was inconsistently applied in Google News. For example, only one of the 268 Metric Media sites surveyed on Google News was indexed as a news source, yet 35 of the 36 Local Government Information Services (LGIS) sites, also linked to Metric Media, were indexed. In total, 13 percent of the “news sites” sampled were found to be indexed as news sources by Google.

Determining whether a website is identified as a news source is straightforward. When searched for in the search bar within Google News, indexed news publications appear with an italicized “news source” modifier. For example:

The Carbondale Reporter is owned by LGIS, which is in turn owned by Dan Proft, a Republican politician and columnist in Illinois. 

Metric Media and LGIS websites often generate stories through the automated publishing of press releases or public data feeds, meaning there is a high degree of replication within the story headlines across a number of publications: 

According to the official guidelines of the Google News Initiative, the company uses seven standards to rate, rank, and categorize news: relevance, interests, location, prominence, authoritativeness, freshness, and usability. While they explicitly state that “Google does not make editorial decisions,” each of these categories is theoretically designed to prevent a rash of aggregated news when people attempt to search for timely, relevant, and accurate information on the platform. 

What’s more, in the 2019 article by Google that outlined publishers’ best practices, the company explicitly warned would-be news providers to avoid repurposing stories “without adding significant information or some other compelling reason for freshening.”

It is unclear why LGIS sites are predominantly labeled as news sources while Metric Media sites are not, although it could be related to the relatively recent genesis of the latter (most LGIS sites have been around since before 2016, whereas most Metric Media sites were created in the last 12 months). What is clear, however, is that there is a lack of consistency in how Google indexes news sites, which might feed public confusion as to the provenance or value of the news that appears in search results. If the indexing of a publisher as a Google news source translates to increased credibility and more prominent placement in search rankings, it is vital that Google gets it right.

Facebook has also struggled with properly labeling politically backed local news outlets like the ones described above. The company recently announced it was cracking down on news pages with “direct and meaningful ties” to political organizations, effectively removing their publisher privileges and ensuring their promoted posts will be identified as political advertising. As with Google, there are inherent advantages for publishers to be categorized as “news.” But scrutinizing these designations, given to tens of thousands of titles, is laborious and difficult. Facebook does not offer a public register or an API where researchers, journalists, or the public can easily see what is indexed as a “Media/News Company.”

With both Facebook and Google recently stepping up their efforts to financially support local news, it is reasonable to ask whether both companies could do more on their own sites to ensure credible local news sources are privileged over mysterious, politically backed websites that rely on low-cost automated story generation.

  • A 449-page report from House lawmakers calling for sweeping changes to antitrust laws describes Amazon, Apple, Facebook, and Google as “the kinds of monopolies we last saw in the era of oil barons and railroad tycoons.”
  • Facebook banned QAnon across all its platforms this week, days after 17 Republicans declined to condemn the pro-Trump conspiracy movement. “Of course, this all could have been done sooner, before Q factions aligned with militia groups and anti-vaxxers, to curtail the spread of medical misinformation and the mobilization of vigilante groups,” said Joan Donovan, research director of the Shorenstein Center on Media, Politics and Public Policy at the Harvard Kennedy School. 

  • On CJR’s Galley, Tow founding director Emily Bell talked to Matthew Ingram about disinformation ahead of the election: “We have become hypnotised by this idea of 'fighting misinformation' which is such a popular theme, but like the 'war on terror' it is both a fantasy and on the terms under which it is framed, it is not a winnable fight.”

  • In the Los Angeles Review of Books, Tow fellow Anthony Nadler writes about his research on the conservative news ecosystem: “Interviewing conservative news consumers led me to one of the most sensitive nerves that runs through this genre: the portrayal of a demonic left, whose primary goal is to shame conservatives and their families.” 

  • Tow fellow Damian Radcliffe’s new report for Sovrn, the Publisher’s Guide to Navigating COVID-19, “dives into the implications of the coronavirus crisis and highlights what it means for publishers, with a specific focus on revenue and subscription strategies, as well as tactics for building loyalty and generating engagement with audiences.”


Copyright © 2020 Tow Center for Digital Journalism, All rights reserved.
