When university researchers fanned out across the U.S. to gather opinions on internet trustworthiness recently, they heard the same thing wherever they went: People wanted ways to check up on what they’re reading, to know more about how articles, posts and videos are put together, the way teachers once asked you to show your work when you handed in assignments.
The interviews, conducted in 10 cities in the U.S. and abroad, were the first stage of something called the Trust Project, which just launched a fresh way for news organizations to reveal in minute detail what’s behind every article they publish. How do you know what you’re reading is reliable, whether there’s an agenda behind it, and what the ingredients are?
Dozens of projects are now under way across the U.S. to answer these questions, including a year-long set of experiments at McClatchy, which publishes The Kansas City Star. Foundations, universities and think tanks have all waded into this thicket.
In the space of five years, news once delivered on websites, apps and in print has come to rely largely on social media and search to reach you. Facebook, Google, Instagram, Twitter and Snapchat now deliver more than half of many news organizations’ stories, and it can be hard to tell who produced them without following links.
As content is broken apart and reassembled on tech platforms, news and opinion, investigative work and fluff are mixed together. This has made it easy for falsified reports to slip into a flow sorted by a blend of algorithms and reader likes. The technology companies that built these platforms are taking a beating, in congressional hearings and even on their own sites. A string of new books has taken Facebook, Google and Amazon, in particular, to task.
In response, some of these firms are helping to fund the search for solutions and publicly accepting responsibility. Richard Gingras, Google’s vice president, wondered at a panel on credibility at Washington, D.C.’s Newseum recently: “Can we really defeat this monster we’ve created?”
And yet there’s plenty of blame to go around. News organizations, McClatchy included, willingly turned our content over to the technology giants in exchange for readership many times larger than a decade ago. Only now are publishers thinking in earnest about what readers need to navigate this chaos.
An ambitious project called the News Integrity Initiative, funded by Craig Newmark Philanthropy, Ford, Facebook, the Knight Foundation and others at the City University of New York, has launched 10 separate efforts. They seek to fight misinformation, increase reader engagement, strengthen investigative reporting, address political polarization and heighten diversity on news staffs.
The Trust Project, funded by Google, Newmark, Knight and the Democracy Fund, two weeks ago unveiled the results of a two-year effort to build what they describe as a “nutrition label’’ for content. Publishers can post a “Trust’’ symbol on stories in exchange for providing a detailed list of information, such as the author’s background and expertise, along with details on standards, policies, ownership, diversity and correction practices.
McClatchy, the publisher of this and 30 other newspapers and websites, has signed on to a project directed by Arizona State University and funded by Facebook and the News Integrity Initiative that sets up a kind of laboratory to see what really works.
Newsrooms in Kansas City, Modesto and Macon, Ga., will spend the coming year sorting through the best ideas for improving credibility and transparency and seeing how readers and users respond.
At a time when an avalanche of information is available with a swipe or click, editors, publishers and tech companies need to work together to deliver what readers have made clear they want: an easy way to decide if you can trust what you read.
You’ll get to decide if we deliver on this with a simple but powerful message — whether you keep reading or go elsewhere.
Anders Gyllenhaal is a senior editor at McClatchy.