Shubha Bala / July 24, 2012
Maintaining accuracy when crowdsourcing data
[stextbox id="info" caption="Tips for crowdsourcing" collapsing="false" collapsed="false" mode="css" float="true" align="right" width="225"]
• Have a clear editorial focus for your crowdsourced project.
• Use crowdsourced reports to focus your own reporting, not to replace it.
• Be clear about which crowdsourced reports are verified and which are unverified.
• Attempt to verify reports using multiple sources.
• Take care that multiple sources are not simply repeating the same source.[/stextbox]
Digital media can open reporting to everyone, and crowdsourcing – collaboratively collecting information or analysing data with your audience – can be a powerful editorial technique that taps into the expertise and experiences of that audience. However, crowdsourcing isn’t simply turning your journalism over to the masses. To be successful, crowdsourced projects need clear editorial goals and require oversight to maintain accuracy and integrity. Credibility is what separates independent journalism from mere gossip.
Lauren Wolfe, the director of the Women Under Siege Project, was recently interviewed about the importance of accuracy in the project’s effort to map sexual violence in conflict, specifically in Syria. The project used Ushahidi, a crowdsourced reporting platform, to collect and display the initial accounts. Ushahidi has been used to gather information for a range of stories, from election monitoring to disaster response such as the earthquake in Haiti, and was recently used to map the impact of flooding in Kuban, Russia.
Coverage of Syria has been especially challenging for the media, as well as for campaigners like Wolfe, because the Syrian regime has severely limited international journalists’ access to the conflict. To overcome these controls, Women Under Siege first collected reports using Ushahidi in what Wolfe described as a “live, crowd-sourced mapping project”.
We put together a team of epidemiologists, Syrian activists, and what we’ve done is put it out to the world, with different ways that people can report. They can come directly to the site, fill out a form. They can email us or they can use a Twitter hashtag – #RapeinSyria. What it’s doing is it’s tracking perpetrators, so we have it broken down by is it the Shabiha, is it the government forces, non-government, what kinds of acts.
While Women Under Siege is a project to raise awareness of rape being used as a weapon in conflict, rather than a journalism project, Wolfe, a journalist who has written for the International Herald Tribune and CNN.com, believed she couldn’t sacrifice accuracy for advocacy. After US Senator Joseph Lieberman raised the issue of rape in Syria, Wolfe called his office to find out which sources supported his allegations. She and her colleagues found:
…two out of three sources Lieberman’s office cites are the same source, one that is easily understood to contain information on the rape of men in detention in Syria but no substantiation of “widespread” rape and sexualized violence.
In our daily work of putting together reports of rape and other kinds of sexualized violence for our Syria crowdmap we have become very conscious of an echo chamber. It’s a concept all reporters know (or should): that information gets repeated from source to source until it begins to sound like fact—it’s just the thing that happened, everybody knows that. But when it comes to something as sensitive as rape in war, we have to be particularly mindful of the delicate propaganda war being played by all sides.
She has identified a problem that applies not just to crowdsourced reporting but to traditional journalism as well, and the kind of rigour she employs belongs in all of our journalism, not only in crowdsourced projects.
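The same-source problem can also be checked mechanically, as a complement to editorial judgement. The sketch below is purely illustrative – it is not part of the Women Under Siege workflow, and the field names are hypothetical – but it shows the basic idea of tracing every corroborating report back to its underlying source before counting it as independent confirmation:

```python
# Illustrative sketch (hypothetical field names): guard against the
# echo chamber by counting distinct underlying sources, not reports.

def underlying_sources(reports):
    """Return the set of distinct sources a list of reports relies on."""
    return {report["cites"] for report in reports}

def is_corroborated(reports, minimum=2):
    """Treat a claim as corroborated only if at least `minimum`
    genuinely distinct sources stand behind it."""
    return len(underlying_sources(reports)) >= minimum

# Three reports, but two of them ultimately cite the same original account.
claim_reports = [
    {"outlet": "Outlet A", "cites": "detainee testimony, March"},
    {"outlet": "Outlet B", "cites": "detainee testimony, March"},
    {"outlet": "Outlet C", "cites": "local activist interview"},
]

print(len(underlying_sources(claim_reports)))  # 2 distinct sources behind 3 reports
print(is_corroborated(claim_reports))          # True, but only just
```

Even a check this crude makes the distinction Wolfe draws: three reports are not the same thing as three sources.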
In crowdsourced projects, verification often happens in stages. Some reports will be easy to verify, while others will require more time and effort. Throughout this process, it’s important to tell audiences which reports you have been able to verify and which you haven’t. In an interview with NPR’s On the Media (below), Wolfe talked about balancing the competing demands of advocacy and accuracy, and about “walking the tightrope that is reporting rape in Syria”:
Information is coming out but it’s coming out with agendas attached to it. And there’s nothing more propagandistic than saying, you know what? The opposing side is brutalizing our women. What we have to do is try to triangulate and verify. So I’m working with Physicians for Human Rights and a team of epidemiologists at Columbia. What we’re doing is marking down all the reports as unverified, and the hope is as things calm down, we’ll be able to figure out which ones might have happened and which ones were really just false.
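To make that “marked as unverified” staging concrete, here is a minimal sketch of how a crowdmap might attach an explicit verification status to every report and show it to readers. The statuses and field names are assumptions made for illustration, not Ushahidi’s actual data model:

```python
from dataclasses import dataclass, field
from enum import Enum

class Status(Enum):
    UNVERIFIED = "unverified"   # the default for every incoming report
    VERIFIED = "verified"       # confirmed through independent sources
    DISCOUNTED = "discounted"   # later shown to be false or unsupportable

@dataclass
class Report:
    summary: str
    channel: str                           # e.g. web form, email, hashtag
    status: Status = Status.UNVERIFIED     # every report starts unverified
    sources: list = field(default_factory=list)

    def mark_verified(self, corroborating_sources):
        """Promote a report only once independent corroboration exists."""
        self.sources.extend(corroborating_sources)
        if len(set(self.sources)) >= 2:
            self.status = Status.VERIFIED

# An incoming report defaults to UNVERIFIED and is labelled that way on the map.
report = Report(summary="Reported assault near a checkpoint", channel="web form")
print(report.status.value)  # "unverified"
```

The point of the default is editorial rather than technical: nothing appears on the map as fact until someone has done the work to verify it.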
In the same interview, Wolfe goes on to say that in a region such as Syria, where there are many reasons for a woman not to report rape, it is important to create an environment in which women feel comfortable coming forward. Crowdsourcing reports from people who might be at risk if it were known they had participated requires care and an assurance that their information and identity are secure.
Collaborating with your audience can bring new information and sources to your reporting and build a stronger bond with that audience, but collaborative projects have to be run in a way that protects one of your most precious assets: your credibility. Make sure you develop and maintain an editorial process for crowdsourced projects that is consistent with your editorial values.
Wolfe’s interview with On The Media: