By Tanya O’Carroll. Tanya is interning with our Cameras Everywhere Initiative. She is a Master’s candidate in Human Rights Studies at Columbia University. Read her previous posts on the authentication of citizen video in Syria and elsewhere.
The latest in the Carnegie Council’s lunchtime workshops for Ethics in Business took place on Tuesday, September 20, 2011 in New York and brought together a range of business stakeholders to discuss the unique challenges that currently face the ICT sector. This intimate session, “Yahoo! and YouTube: Balancing Human Rights and Business,” delved into the grey areas and rapidly evolving implications of the new technology environment for human rights. Fortunately, for those of us who were not on the invite-only guest list, the entire panel discussion was webcast, with a chance to interact and ask questions virtually.
Encouragingly, both panelists and guests were asking questions similar to those we have been thinking about at WITNESS, and touched on many of the themes we covered in our recent “Cameras Everywhere” report.
The purpose of the session was to present an emerging framework for how businesses can navigate the new global reality in information sharing. Panelist Abbi Tatton, Manager of Global Communications at YouTube, posed the question: Who creates the news today, now that a citizen who shoots the disturbing video of a young woman’s death in Iran is as important a news documentarian as established news networks? Her co-panelist, Ebele Okobi Harris, Director of Yahoo!’s Business and Human Rights Program, asked what should guide company decision-making when scenarios emerge that go beyond, and are sometimes even inconsistent with, the originally intended uses of these platforms and services.
It was exciting to see the importance that many participants in the room placed on the vital documentary value of content uploaded by citizens around the world, both for the news and for human rights. There seemed to be a high level of consensus in recognizing that, as one participant commented, “the access of information is changing our world” so that unlike with “the tragedies of the past…the world will know more.” Companies like YouTube and Yahoo!, as major channels for getting that information out, are asking what this means for them, and how they can begin to incorporate a human rights perspective into company decision-making processes regarding the content they allow on their sites.
In order to offer context on the experiences of corporations in this digital age, the panel included Rachel Davis, Advisor on the UN’s Guiding Principles on Business and Human Rights (which were endorsed in June by the Human Rights Council as a global point of reference on standards for business operations), and Susan Morgan, Executive Director of the Global Network Initiative, a multi-stakeholder initiative to integrate human rights assessments in the practices of ICT companies.
The whole session is available as a webcast. It is well worth watching, as it explores some of the real cases that Yahoo! and YouTube responded to in relation to photo or video content uploaded during the revolutions in the Middle East (including examples we cite in the “Cameras Everywhere” report), inviting participation from other companies to flesh out and work through the dilemmas posed from a business and human rights perspective.
For anyone who does not have the time to watch the whole 1.5-hour session, here are a few of my takeaways:
- Businesses need to start adapting and interpreting their community guidelines and rules in line with emerging human rights considerations. A platform like Flickr (owned by Yahoo!) or YouTube was not built with activists in mind. Flickr’s Community Guidelines still reflect that it was a platform built for photographers to share their work, and it is consequently inadequately prepared to deal with moderation decisions that involve the removal of activists’ content. A company’s values statement, such as Google’s “Don’t be evil,” is at best a point of reference, but as Ebele commented, this is not enough when it comes to unforeseen scenarios with unintended harmful consequences for human rights. A reassessment of processes that can help companies to better understand the potential human rights impacts of their business decisions is vital. “If people are using it differently, and it has a human rights impact then we need to be open to discussing it and evolving,” was Ebele’s take on it. Whether this means new Community Guidelines or exceptional terms of service for activist users is a question that requires open and robust conversation with human rights organizations and activists themselves.
- Technology is going to move faster than policy. Companies therefore have to make quick judgment calls about how they operate in the new media environment and about their role as information sharing channels. They may not be legally obligated to keep up a video or photo set that is inconsistent with their Community Guidelines but they do need to think broadly about how their platform is plugging a vital news gap in many countries or is serving as a critical information lifeline for activists under extreme pressure. Abbi explained that YouTube factored in these contextual changes when deciding to keep up the deeply disturbing video of Neda’s death in Iran in 2009 because “a person with a video camera became the news anchor that day,” meaning that YouTube understood this was a video the world needed to see. Companies also have to engage with these social implications because in an environment where, as one participant noted, “access is so open, immediate and global,” activists will keep sharing whether it takes place on Flickr or somewhere else.
- In the words of Ebele, “within the ICT sector this is just so new.” It was interesting to realize that ICT companies see themselves as being in the middle of a learning process that has only just begun. Companies find themselves facing the unintended consequences of their services and are beginning to take seriously the need to integrate human rights processes that, as Rachel suggested, can “continuously scan for new risks.” Ebele echoed Monty Python’s quip by joking, “Nobody expects the Egyptian revolution.” Her point was that unanticipated scenarios surfaced during the Arab Spring that brought to light sharp ethical and business dilemmas for companies like Yahoo!
- Multi-stakeholder dialogue is going to be vital for companies as they move forward. It was particularly encouraging to see how many times this point was reiterated by various voices in the room, and how often it came up in suggestions from the participants about how to deal with a particularly tricky case on content moderation. It is clear that companies are accepting that the social implications of video or photo takedowns reach far beyond the platforms themselves, and recognize that they are not always equipped to make decisions on content without advice and guidance from outside experts, from human rights organizations, and from activists themselves.
One attendee commented to the panelists that “the dramatic effect of the video image is going to have implications we cannot see at the moment…so, I urge you, leave it on because these kinds of things are changing our world.”
The negative human rights consequences of content removals by companies such as Yahoo! and YouTube are not going to disappear overnight, but engaging with the questions is a start. This is where the expertise and insight of human rights organizations and activists themselves will be critical in helping companies anticipate the evolving implications and risks of their services. Equally, human rights arguments must continue to be reiterated strongly and in one voice if ICT companies are to be persuaded to move beyond talking about the issues to making the changes that protect the work and physical safety of activists who use their platforms.
In WITNESS’ own recommendations to technology companies and developers in the “Cameras Everywhere” report we focus on four sets of changes: to policy, functionality, editorial content, and engagement. We believe that making these changes would not only positively affect the entire environment for online and mobile video, but would also free up resources in civil society that are currently dedicated to addressing negative implications of policies. These are our direct recommendations to technology companies:
- Put human rights at the core of user and content policies: Reevaluate current policies using human rights impact assessments, create human rights content categories that are not vulnerable to arbitrary takedowns and highlight key values around context and consent, and ensure content is preserved wherever possible.
- Put human rights at the heart of privacy controls and allow for anonymity: Make privacy policies more visible and privacy controls more functional using principles of privacy by design, and allow for visual privacy and anonymity with the help of new products, apps and services.
- Create dedicated digital human rights spaces: Support curation of human rights videos, facilitate user education and understanding of human rights issues, make takedown and editorial policies transparent, employ Creative Commons licensing, and support users in dealing with ethics and safety issues.
- Engage in wider technology-human rights debates and initiatives: Draw on expertise across companies in order to collaborate on human rights guidelines, participate in multi-stakeholder initiatives, such as the Global Network Initiative, and address supply chain and environmental impact issues.
For more detail, please take a look at the Recommendations to Technology Companies (PDF) section in the “Cameras Everywhere” report.