What is "fake news"? Which websites peddle it? How do you know if you’re reading it?
Four college students say they’ve created a browser extension, “Open Mind,” that will display a warning screen if someone goes to a website known to disseminate "fake news," and alert a reader if a story shared on social media is "fake" or biased.
It will also track the stories people read, suggest alternative articles from “valid sources to help balance a user’s news diet,” and produce graphs showing readers how positively or negatively slanted their news is on particular topics.
The students developed it early this month during a 36-hour “hackathon” at Yale University that was put on by the Poynter Fellowship in Journalism and also sponsored by Facebook.
The winning team comprised two students from Yale’s doctoral psychology program, Michael Lopez-Brau and Stefan Uddenberg; Alex Cui, a machine learning undergraduate at the California Institute of Technology; and Jeff An, a computer science student at the University of Waterloo.
Lopez-Brau said he wants Open Mind to help people escape the ideological bubbles they naturally gravitate toward when consuming news via social media.
"Social media sites grow bubbles," Lopez-Brau told the Associated Press. "They make it extremely easy for people to only follow people with similar interests, so often there is no real opportunity for them to be confronted with an opposing viewpoint. They've allowed us to silo people off at a distance."
In addition to making readers more aware of their news consumption habits, the team wants to add a “share verification” feature that would ask people whether they really want to share a story they have not yet read.
Facebook spokeswoman Ruchika Budhraja said the company would like to speak with the students about their work.
“We're building products, many of which are very similar to what the students came up with at Yale," said Budhraja.
Only a few days ago, Facebook announced it was ending the short-lived program of adding a red “disputed flag” to stories suspected of being "fake news." One reason, the company said, was that two fact-checkers had to review an article before a flag could be added. Another was that Facebook suspected the red flag was actually prompting readers to share a “fake news” story.
Facebook will now give readers a Related Articles list, which only requires one fact-checker. The company said readers are less likely to share a “fake news” story when presented with related articles than when the story is red-flagged.
Facebook will also demote a story in the News Feed if its team of fact-checkers deems it “fake news.” That team includes ABC News, the Associated Press, FactCheck.org, PolitiFact, and Snopes — organizations that many consider to have a left-leaning bias.
The creators of Open Mind said they plan to run a research project measuring whether volunteers change their browsing habits after installing the browser extension.
Lopez-Brau said that Open Mind will “stay away from the far left and the far right” when suggesting articles from opposing viewpoints, citing Mother Jones and The Blaze as two examples of outlets it would avoid.