
Determining Reliability in News

Accura is a news credibility tool created to counter the misinformation and fake news currently prevalent on the Internet.

The project was made as part of Tech Media Democracy, a first-of-its-kind program that brings together six of New York City's universities -- Cornell Tech, Columbia University, City University of New York, New York University, Parsons School of Design, and Pratt Institute -- pairing journalism, design, media studies, and engineering students to defend and support journalism and independent news media as they come under increasing threat.


Team: Jenni Wu, Pratheek Irvathur, Pratik Jain, Yijia Wang
My Role: Research, Storyboard, Wireframes, User Experience Design, User Interface Design
Tools Used: Sketch, Marvel


Problem

Tech Media Democracy's first hackathon revolved around addressing threats to the free press, journalism, and the media. My team decided to take on the challenge of credibility and reliability in news. With fake news and discrediting campaigns on the rise, how can we encourage people to think critically about the credibility of the news they consume?

Research

The danger of fake news lies in how well it hides: with over 50% of Americans reading their news on social media platforms like Facebook and Twitter, algorithms and bots sway and influence the political conversation, making it easier to spread factually incorrect stories and harder to distinguish real news from fake. [Source]

My team approached the problem from two angles:

How can we encourage fact-checking and help people be more informed about the news they consume?

How can we determine reliability in the news?

[Image: a journalist's fact-checking spreadsheet]

During our research, we found an example of how rigorous journalists are at fact-checking their articles: a tedious, extensive process of building an Excel spreadsheet of facts to be checked and then verifying each one. This was an a-ha! moment for our team, as we realized that highlighting and fact-checking could go hand in hand in a tool maintained by a community, much like Wikipedia. Next, we had to consider how this system would work.


Brainstorming

[Photo: brainstorming notes]

Through some quick brainstorming, we decided to create a web browser plugin/extension that encourages community assessment of stories and their credibility using three metrics: a journalist/expert rating, a general public rating, and an algorithmic rating. Much like Googling a movie and seeing ratings from IMDb, Rotten Tomatoes, and film critics, showing readers ratings from all three sources has the potential to be transparent and indicative of a news article's credibility.
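
To make the idea concrete, here is a minimal sketch of how the extension might model and display the three metrics. All names, types, and score ranges are hypothetical illustrations, not code from the hackathon:

```typescript
// Hypothetical data model for Accura's three credibility metrics.
interface ArticleRatings {
  url: string;
  expertScore: number | null;    // 0-100, averaged from verified journalists/experts
  publicScore: number | null;    // 0-100, averaged from general readers
  algorithmScore: number | null; // 0-100, produced by an automated credibility check
}

// Show all three scores side by side, like a movie's IMDb, Rotten
// Tomatoes, and critic ratings, instead of collapsing them into a
// single opaque number.
function formatRatings(r: ArticleRatings): string {
  const show = (s: number | null) => (s === null ? "n/a" : `${s}/100`);
  return [
    `Experts: ${show(r.expertScore)}`,
    `Public: ${show(r.publicScore)}`,
    `Algorithm: ${show(r.algorithmScore)}`,
  ].join(" | ");
}

// formatRatings({ url: "https://example.com/story", expertScore: 82, publicScore: 64, algorithmScore: 71 })
// -> "Experts: 82/100 | Public: 64/100 | Algorithm: 71/100"
```

Keeping the three scores separate, rather than averaging them, is what makes the display transparent: readers can see at a glance when experts and the general public disagree about a story.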

Using the browser plugin, a reader can highlight a section of the article and flag it as inaccurate, providing a source for the correction. Other users of the plugin can see the highlighted inaccuracies and upvote or downvote them.
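
One plausible shape for such a flag, again a hypothetical sketch rather than anything we built at the hackathon:

```typescript
// Hypothetical record for a highlighted inaccuracy. Character offsets
// anchor the highlight within the article text; votes come from other
// plugin users.
interface InaccuracyFlag {
  articleUrl: string;
  startOffset: number;  // where the highlighted passage begins
  endOffset: number;    // where the highlighted passage ends
  quotedText: string;   // the passage being disputed
  sourceUrl: string;    // the supporting source the flagger must provide
  upvotes: number;
  downvotes: number;
}

// Net community standing of a flag, so heavily downvoted flags
// can be hidden or deprioritized.
const netVotes = (flag: InaccuracyFlag): number =>
  flag.upvotes - flag.downvotes;
```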


User journeys

[Photo: user journey sketches]

I drew out two user journeys for our browser extension in order to get a better understanding of how the user flow and interactions should work. Since I was concerned about managing identity verification and preventing bots, I decided that one way of identifying experts and journalists would be to send a verification email to their organization address (e.g., @nytimes.com). For the general, everyday user, signing up through an existing account like Google, Facebook, or Twitter and completing a CAPTCHA can help mitigate an onslaught of bots.
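
A minimal sketch of how that two-tier signup could be gated, assuming a hypothetical newsroom-domain allowlist (the domains and function names below are illustrative, not from the prototype):

```typescript
// Hypothetical verification logic: experts are identified by an email
// address at a known news organization's domain; everyone else must
// clear a CAPTCHA to slow down bot signups.
const NEWSROOM_DOMAINS = new Set(["nytimes.com", "washingtonpost.com", "propublica.org"]);

type Role = "expert" | "public";

function classifySignup(email: string, passedCaptcha: boolean): Role | null {
  const domain = email.split("@")[1]?.toLowerCase();
  if (domain && NEWSROOM_DOMAINS.has(domain)) {
    // A confirmation link would still be emailed to this address;
    // the account is only marked "expert" once the link is clicked.
    return "expert";
  }
  return passedCaptcha ? "public" : null; // null = signup rejected
}
```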

Sketches

[Photo: interface sketches]

Wireframes

[Image: wireframes]


Clickable prototype

I built a clickable prototype using Sketch and Marvel.

Reflections

It was a crazy, intense, and rewarding two-day hackathon. I would love to develop this into a functioning prototype and test it by inviting journalists and professors to use it. Next steps would be to think about how to prevent or account for throngs of users with biased or disruptive intentions downvoting verified fact-checks, and how to incentivize a community of people to maintain the plugin, similar to Wikipedia's diligent community of fact-checkers. This was also my first time collaborating with different universities, and working with people from different backgrounds was both enlightening and humbling. After learning so much from my peers, I'm already looking forward to the next hackathon!

Mentors

Our team's mentors included David Carroll, Associate Professor at Parsons; Marc Lavallee, Editor at The New York Times; Ranjan Joy, Founder of The Edge Group; and Mor Naaman, Associate Professor at Cornell Tech.