For criminal investigators, seeing is not believing. The keys to their work are skepticism, multiple hypotheses, and guarding against bias. It takes specialized training to apply that mindset in a digital world where a trillion photographs and videos are uploaded every year. Teaching students how to rigorously verify open source material found on social media is the mission of the UC Berkeley Human Rights Center’s Investigation Lab at Berkeley Law. This fall, a diverse group of undergraduate and graduate students drawn from the journalism, public policy, and law schools are learning those tricks and techniques in the weekly seminar Open Source Investigations at Boalt Hall.
While national media outlets work on fact-checking tools to combat the spread of false information, the objectives of the HRC lab are different. “Communication has increasingly shifted to digital platforms,” says Alexa Koenig, HRC’s executive director since 2012. “The ability to investigate such networks is key to documenting tangible links between a high-level perpetrator and the person on the ground who pulled the trigger or committed the crime.” Koenig oversaw the lab’s creation in 2015 as an offshoot of the Center’s Human Rights and Technology Initiative. Applications for student researchers to receive training and work on program initiatives began in Fall 2016. The response was overwhelming. “It felt like a startup,” Koenig laughs. Last year, 42 students worked in the lab. This year, the number jumped to 60.
The common thread for all projects? “We want to establish a rigorous, transparent process,” says Koenig, “that raises this material to a level where it can be legally admissible in court as evidence.”
In addition to their project work, participants are required to take the formal class, Open Source Investigations, team-taught by Koenig; Andrea Lampros, the Center’s communications director and lab manager; and HRC faculty director Eric Stover. After decades working as a journalist and investigator of human rights abuses in Latin America, Stover joined the two-year-old Center in 1996.
Describing the Center’s work as the intersection where law, journalism, science, and technology meet, Stover says, “Our mantra is that facts matter, and verified facts matter even more.” The need to apply “digital forensics” to human rights work, he says, arose from “a tsunami of material from ‘accidental or citizen journalists’ capturing events with a smartphone in their hands.” This material is valuable, but it must meet chain-of-custody requirements to hold up in court. “Because when you go to court,” he says, “you have the opportunity to hold people accountable and clarify the historical record of disputed events.”
An International Impact
On a recent Friday morning, Lab students gathered in Boalt Hall to hear a special guest lecturer. Sam Dubberley, manager of Amnesty International’s Digital Verification Corps, leaned on a table as the students filed in, his laptop open to satellite images from Google Earth, projected on large monitors around the lecture hall.
The British-born Dubberley, who had flown in from Berlin for three days of student training, distributed a curated list of URLs for public websites offering advanced search tools for geographic mapping, weather history, aviation and transportation data, and more. Students were already familiar with browser extensions that conduct reverse image searches, such as TinEye, which checks whether an image has already been uploaded to the Internet. If it has, TinEye reports the date and URL where the photo previously appeared.
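TinEye’s full index is not something students query from code, but the idea behind catching a recirculated image can be sketched in a few lines. The example below is a minimal illustration, not a tool the Lab actually uses; it relies on the Python Pillow and imagehash libraries, and the file names are hypothetical placeholders.

```python
# Minimal sketch: flag images that appear to be recirculated copies of
# previously archived material, using perceptual hashing as a simplified
# stand-in for a reverse image search. File names are hypothetical.
from PIL import Image
import imagehash

# Hashes of images already verified and archived (hypothetical paths).
archive = {
    "protest_2014_archive.jpg": imagehash.phash(Image.open("protest_2014_archive.jpg")),
}

def looks_recirculated(candidate_path, max_distance=8):
    """Return the archived file the candidate most resembles, if any.

    A small Hamming distance between perceptual hashes suggests the
    candidate is a copy (possibly resized or recompressed) of an image
    that has circulated before.
    """
    candidate_hash = imagehash.phash(Image.open(candidate_path))
    for name, known_hash in archive.items():
        if candidate_hash - known_hash <= max_distance:
            return name
    return None

match = looks_recirculated("new_upload.jpg")
if match:
    print(f"Possible recirculation of {match}; check the original date and URL.")
```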
“One of my biggest surprises in doing this work,” says Koenig, “was learning how frequently photographs and videos circulated for human rights purposes are appropriated from different conflicts and different time periods. People may purposefully do that because they think it adds a sense of urgency. But oftentimes, it’s by accident when people don’t realize the sources they’re recirculating are not what they claim to be.”
Dubberley guides the students toward a deeper layer of verification, walking through a series of examples. In one, a bird’s-eye view shows green farmland with plumes of white smoke against a backdrop of mountains. In another, a store’s nighttime surveillance footage captures a dark parking lot and a low retaining wall. In a third, demonstrators and riot police face off in a street overseas.
In each case, he directs the students to examine details within the images and match those elements against other photos or clearly marked maps. He draws their attention to unique features in the landscape, unusual markings along a street or curb, storefront signs, military uniforms and insignia, architectural shapes, unusual buildings, rain versus sunlight, perhaps billboards or signage in a foreign language.
“Think of your task as fitting together the pieces of a jigsaw puzzle,” he says. “You don’t need every piece to understand the image.”
The pieces Dubberley hopes the students will document include whether the image is first-use or original content; who uploaded it; the event’s date and time; the location’s longitude and latitude; any additional corroborating evidence of the event from other sources; and if possible, the motivation for sharing the image.
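Those pieces map naturally onto a structured record. As a hedged sketch, with field names that are illustrative rather than the Lab’s or Amnesty’s actual schema, a simple Python dataclass could hold what Dubberley asks students to document:

```python
# Hypothetical structure for the verification details Dubberley lists;
# the field names are illustrative, not an official schema.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class VerificationRecord:
    source_url: str                      # where the image was found
    uploader: Optional[str] = None       # account that posted it
    is_original: Optional[bool] = None   # first-use content, or recirculated?
    event_date: Optional[str] = None     # e.g. "2017-10-06"
    event_time: Optional[str] = None     # local time, if determinable
    latitude: Optional[float] = None
    longitude: Optional[float] = None
    corroboration: List[str] = field(default_factory=list)  # URLs of supporting sources
    sharing_motivation: Optional[str] = None  # why it was posted, if known
    conclusion: Optional[str] = None     # "verified", "inconclusive", etc.
```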
“Can you draw any conclusions? It’s all right if you can’t,” Dubberley reassures them. “That’s equally important for us to know.”
At the end of the hour, Dubberley lets the class have a turn. A quiz at GeoGuessr.com shows an unlabeled street-view image and asks them to pin the location on a world map. Points are awarded based on how close they get to the actual location.
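The game’s own scoring curve is its own affair, but the quantity it rewards, how far a guess lands from the truth, is simply the great-circle distance between two coordinates. A minimal sketch, with illustrative coordinates:

```python
# Sketch of the measurement behind location-guessing scores: the
# great-circle (haversine) distance between a guessed point and the
# true location. Only the distance calculation is shown here.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    r = 6371.0  # mean Earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Example: a guess in Oakland versus an actual location in Berkeley.
print(round(haversine_km(37.8044, -122.2712, 37.8715, -122.2730), 1), "km off")
```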
Dubberley calls out, “Anyone get over 10,000 points?” Several people raise their hands. “Tell us how you did it,” he prompts.
“And don’t forget self-care,” he adds. Describing how exposure to violent images can lead to feelings of vicarious trauma, Dubberley says, “In our next workshop, we’ll look at how you mitigate those effects.”
An Upsurge in Domestic Hate Crimes
On another weekday morning, one of the teams gathers in the small office quarters of the HRC Investigation Lab. These eight students, including six undergraduates participating through Berkeley’s Undergraduate Research Apprentice Program (URAP), are working on ProPublica’s “Documenting Hate” project. They arrange themselves around a small table with their laptops open as the team leader, Olivia Rempel, sets up the week’s work schedules.
“Discovery is way more difficult than verification,” says Rempel, a native of Vancouver, BC, studying at UC Berkeley’s Graduate School of Journalism. “Working in pairs helps you brainstorm ideas for new search terms. It’s also a form of self-care. When we do find material, it’s better psychologically not to be alone.”
At 10 a.m., Kim Bui, a Los Angeles–based video journalist with NowThis and the coordinator of ProPublica’s student volunteers, checks in with the group via Skype. It is her first call with the Berkeley students, one of five university teams crowdsourcing material for ProPublica. The other four are the University of Miami School of Communication, the CUNY Graduate School of Journalism, the Texas State University School of Journalism and Mass Communication, and Wake Forest University.
After making introductions, Bui gives them an overview of this project. “Statistics on hate crimes are based on FBI data, which is inaccurate,” she explains. “People rarely report these incidents to police. And police stations have the option to inform the FBI. Some do. Some don’t.”
In the wake of the 2016 presidential election, organizations including ProPublica and the Southern Poverty Law Center noticed a sharp uptick in the number of hate crimes and bias incidents. “We wondered,” Bui says, “if this is true or just amplified by the larger role social media plays in our lives in bringing these things to our attention.”
ProPublica, working with other civil rights organizations, newspapers, and media outlets, launched “Documenting Hate” in early 2017. Information collected online from self-reporting individuals is fed into ProPublica’s Collect database, verified, and sent to a larger database, named “Landslide,” used by more than a hundred subscribing newsrooms and data journalists who draw on the material to aid local reporting.
“But not everyone knows or cares about filling out forms,” says Bui. “They go on Twitter or Facebook and vent. That’s where you come in.”
Bui explains how to use TweetDeck to search Twitter, along with advanced tools for combing through Facebook. She, too, has a curated list of search tools for deeper digging. The app Ban.jo can display every geo-tagged social media post in a mapped area during a 24-hour period. Facebook’s Signal app archives live video broadcasts on the platform and displays them on a map.
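Neither Ban.jo nor Signal exposes its internals to the class, but the core operation Bui describes, keeping only the posts whose geotag falls inside a mapped area and whose timestamp falls inside a 24-hour window, can be sketched with hypothetical data:

```python
# Illustration of the filtering Bui describes: keep only posts whose
# geotag falls inside a bounding box and whose timestamp falls inside a
# 24-hour window. The posts and bounding box below are hypothetical;
# tools like Ban.jo do this against live platform feeds.
from datetime import datetime, timedelta

posts = [
    {"id": 1, "lat": 37.8719, "lon": -122.2585, "time": datetime(2017, 10, 6, 9, 30)},
    {"id": 2, "lat": 34.0522, "lon": -118.2437, "time": datetime(2017, 10, 6, 10, 0)},
]

# Rough bounding box around the UC Berkeley campus (illustrative values).
south, north = 37.86, 37.88
west, east = -122.27, -122.25

window_start = datetime(2017, 10, 6, 0, 0)
window_end = window_start + timedelta(hours=24)

in_area_and_window = [
    p for p in posts
    if south <= p["lat"] <= north
    and west <= p["lon"] <= east
    and window_start <= p["time"] < window_end
]

print([p["id"] for p in in_area_and_window])  # -> [1]
```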
“Remember,” Bui adds, “you’re not just trying to find comments or photos. You’re trying to verify an event.” If graffiti appears on a school building, the students should look for a follow-up statement by a school administrator, for example.
“Look at people’s networks, their lists of friends,” she says, showing how easy it is to find the name and telephone number of a person who simply comments on a photo on Facebook. “That’s terrifying,” a student murmurs. “They own us,” another replies.
A student asks about a photo Rempel shared with the group: a series of racial and ethnic insults written in chalk on a distinctive brick walkway. “Are we supposed to figure out who wrote that?” she asks.
“That’s probably impossible,” Bui replies. “But you can investigate that brick pattern to determine the probable location. And you can look at the photo’s metadata to get at least a window on the time and date when it was written.”
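Reading a photo’s embedded capture date is straightforward when the file still carries its EXIF tags (many platforms strip them on upload), which is one reason investigators prize original files. A minimal sketch using the Python Pillow library, with a placeholder file name:

```python
# Minimal sketch: pull date-related tags and the GPS block (if present)
# from a photo's EXIF metadata with the Pillow library. Many platforms
# strip EXIF on upload, so this works best on an original file.
# "graffiti.jpg" is a placeholder file name.
from PIL import Image
from PIL.ExifTags import TAGS

img = Image.open("graffiti.jpg")
exif = img.getexif()

# Print any date-related tags found in the main EXIF directory.
for tag_id, value in exif.items():
    name = TAGS.get(tag_id, tag_id)
    if "Date" in str(name):
        print(f"{name}: {value}")

# GPS coordinates, when the camera recorded them, live in a sub-directory.
gps_ifd = exif.get_ifd(0x8825)  # 0x8825 is the standard GPSInfo tag
if gps_ifd:
    print("GPS data present:", dict(gps_ifd))
```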
In closing, Bui tells the students not to get discouraged. “These are rare events,” she says. “You won’t find something every day. And that’s good, in the larger scheme of things.”
The Next Wave
The product of a lab investigation typically includes a meticulously sourced final report for a Human Rights Center client or partner. Such reports contain the verification work on videos and photos, all URLs, and a contextual narrative that describes what has taken place and how the people and events are linked.
Equally important, in Koenig’s view, is to begin training judges and lawyers in addition to students. “Judges need to see the utility of information derived from social media,” she says, drawing a comparison with the introduction of DNA evidence in the 1980s. “If you don’t understand the underlying technology, there’s a dangerous tendency to write it off completely. Or you embrace it wholeheartedly because you assume there’s an expert out there who got it right.” She sighs. “There needs to be a middle ground. From a defense perspective, we want them to push back.”
Koenig is skeptical that new tools alone will solve the problem of digital verification. “Tools are constantly changing,” she says. “What’s important is training the next generation to understand the problems, context, and value of looking into digital space for their investigations.” She gestures optimistically. “I predict that wave is coming.”
For more tools to help verify digital material, explore the resources available at First Draft, a nonprofit educational consortium affiliated with the Shorenstein Center on Media, Politics, and Public Policy at Harvard University.
Barbara Tannenbaum is a journalist and fiction writer based in San Rafael. Her pieces on science, arts, and culture have appeared in The New York Times, San Francisco Magazine, the Los Angeles Times, and Science Today, the news channel of the California Academy of Sciences.