You’d probably never heard of him last week, but his face has gone viral this week. Dr. David Dao, the man removed from United Airlines flight 3411 on Sunday, was trying to get home to see patients. Instead, he was violently dragged off the plane to make room for United crewmembers.1

If you consume news at all, you’re probably very familiar with Dr. Dao’s face, as are most of the people in the small community where he lives. His attorneys have shared a statement asking for privacy, but thanks to unedited videos of him circulating online, he’s unlikely to get much of it. He will probably be re-victimized repeatedly, by the press and by unwanted conversations about what happened. In fact, very unflattering portrayals of him have already been published in several newspapers (and no, we won’t be linking to those articles here).

But it didn’t have to be that way. Just last February, we wrote about how to use YouTube’s improved custom blur tool to protect people’s identities when uploading videos. Thanks to advocacy from WITNESS, this tool allows any user to selectively blur faces and other identifying information, such as tattoos. The video could have looked like this (warning: this video depicts police violence):

Though I’ve taken hours of footage of the police, I’m new to video editing. Even so, I was able to use YouTube’s tool to create this edited video in ten minutes.

And we need tools for protecting identity now more than ever. Unfortunately, disturbing incidents of racial and religious discrimination and sexism are continuing to proliferate on airlines. And they’re also happening every day all around us, along with horrific violence and other human rights abuses. Whether you’re documenting police violence in Brazil’s favelas or an incident of hate in the US, human rights abuses are everywhere, and we encourage people to film them.

But we also encourage you to protect the identities of the people you’re filming. Police and other government officials who are carrying out abuses don’t require protection. But individuals can face repercussions even as important evidence of what happened to them is shared everywhere, often by very well-intentioned people.

Dr. Dao’s privacy is in shambles. In other cases, the consequences can be even more severe. As we’ve pointed out, some perpetrators of hate crimes use “online video distribution as a threat.” Exposing the identities of survivors of violence can lead to them being targeted for further violence, by the government or by other parties. Videos that “document abuse, [can] simultaneously inflict it.”

WITNESS is going to continue to advocate for features like blurring that make documenting human rights abuses safer and more ethical. We hope to see more platforms adopt such tools. After all, as of last year, “Users watch[ed] 100 million hours of video per day on Facebook,” and video of Dr. Dao initially circulated mostly on Twitter. We hope these companies will see the role privacy-protecting features have to play in such videos.

But in the meantime, we’re glad YouTube has taken the lead in this area. Check out our how-to video on YouTube’s blurring feature, and our material on concealing identities. Film and share with sensitivity. We want to expose human rights abuses, not subject survivors to further abuse.

We’re aware that including Dr. Dao’s name and linking to an article that includes unblurred videos spreads his information further. It was impossible to write this story without doing so, but we strongly considered not linking to any articles because, as we’ve written in the past, “Whether you are a reporter, social curator, or just a viewer, you may come across videos where identities should have been obscured but weren’t, and it’s important to think about privacy before you share on.”
