
Public Scrutiny of Automated Decisions: Early Lessons and Emerging Methods

February 27, 2018

Automated decisions are increasingly part of everyday life, but how can the public scrutinize, understand, and govern them? To begin to explore this, Omidyar Network has, in partnership with Upturn, published Public Scrutiny of Automated Decisions: Early Lessons and Emerging Methods.

The report is based on an extensive review of computer and social science literature, a broad array of real-world attempts to study automated systems, and dozens of conversations with global digital rights advocates, regulators, technologists, and industry representatives. It maps out the landscape of public scrutiny of automated decision-making, both in terms of what civil society was or was not doing in this nascent sector and what laws and regulations were or were not in place to help regulate it.

Our aim in exploring this is three-fold:

1) We hope it will help civil society actors consider how much they have to gain in empowering the public to effectively scrutinize, understand, and help govern automated decisions;

2) We think it can start laying a policy framework for this governance, adding to the growing literature on the social and economic impact of such decisions; and

3) We're optimistic that the report's findings and analysis will inform other funders' decisions in this important and growing field.

The Illusion of Accuracy: How Body-Worn Camera Footage Can Distort Evidence

November 1, 2017

Our research has revealed that many police departments are failing to adopt adequate safeguards to ensure that constitutional rights are protected. In particular, we have discovered that year after year, the vast majority of the nation's leading police departments with body-worn camera programs allow unrestricted footage review – meaning officers are permitted to review footage from body-worn cameras whenever they'd like, including before writing their incident reports or making statements.

This report seeks to illuminate the ways that unrestricted footage review places civil rights at risk and undermines the goals of transparency and accountability. We urge police departments to instead adopt what we call "clean reporting," a simple two-step process: an initial report is recorded based only on an officer's independent recollection of an event, and a second, supplemental report can then be added to the case file to address any clarifications after footage is reviewed. We make the case that, in the interests of consistency, fairness, transparency, and accountability, clean reporting should be adopted as a standard practice for all police departments with body-worn camera programs.
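To make the two-step structure concrete, here is a minimal, hypothetical sketch in Python of what a "clean reporting" case record could look like. The class and field names are our own illustration, not a system described in the report; the point is only the structural property that the initial narrative, written from independent recollection, is frozen, and post-review clarifications can only be appended, never merged in.

    from dataclasses import dataclass, field

    @dataclass(frozen=True)
    class InitialReport:
        """Written from the officer's independent recollection,
        before any body-worn camera footage is reviewed."""
        narrative: str

    @dataclass
    class CleanReportFile:
        """Hypothetical case-file record modeling "clean reporting":
        an immutable initial report plus append-only supplements."""
        initial: InitialReport
        supplements: list[str] = field(default_factory=list)

        def add_supplement(self, clarification: str) -> None:
            # Clarifications made after footage review are appended,
            # never edited into the original narrative, preserving
            # what the officer recalled independently of the video.
            self.supplements.append(clarification)

    # Example usage: the initial narrative stays fixed; clarifications
    # made after footage review are recorded separately.
    report = CleanReportFile(InitialReport("Subject fled north on foot."))
    report.add_supplement("Footage shows the subject turned east at the alley.")

Freezing the initial report mirrors the policy's intent in code: once written from memory, the narrative cannot be altered, only supplemented.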

Police Body Worn Cameras: A Policy Scorecard

November 1, 2017

In the wake of high-profile incidents in Ferguson, Staten Island, North Charleston, Baltimore, and elsewhere, law enforcement agencies across the country have rapidly adopted body-worn cameras for their officers. One of the main selling points for these cameras is their potential to provide transparency into some police interactions, and to help protect civil rights, especially in heavily policed communities of color.

But accountability is not automatic. Whether these cameras make police more accountable — or simply intensify police surveillance of communities — depends on how the cameras and footage are used. That's why The Leadership Conference, together with a broad coalition of civil rights, privacy, and media rights groups, developed shared Civil Rights Principles on Body-Worn Cameras. Our principles emphasize that "[w]ithout carefully crafted policy safeguards in place, there is a real risk that these new devices could become instruments of injustice, rather than tools for accountability."

This scorecard evaluates the body-worn camera policies currently in place in major police departments across the country. Our goal is to highlight promising approaches that some departments are taking, and to identify opportunities where departments could improve their policies.

Data Ethics: Investing Wisely in Data at Scale

September 1, 2016

"Data at scale" -- digital information collected, stored and used in ways that are newly feasible -- opens new avenues for philanthropic investment. At the same time, projects that leverage data at scale create new risks that are not addressed by existing regulatory, legal and best practice frameworks. Data-oriented projects funded by major foundations are a natural proving ground for the ethical principles and controls that should guide the ethical treatment of data in the social sector and beyond.This project is an initial effort to map the ways that data at scale may pose risks to philanthropic priorities and beneficiaries, for grantmakers at major foundations, and draws from desk research and unstructured interviews with key individuals involved in the grantmaking enterprise at major U.S. foundations. The resulting report was prepared at the joint request of the MacArthur and Ford Foundations.

Stuck in a Pattern: Early Evidence on Predictive Policing and Civil Rights

August 1, 2016

The term "predictive policing" refers to computer systems that use data to forecast where crime will happen or who will be involved. Some tools produce maps of anticipated crime "hot spots," while others score and flag people deemed most likely to be involved in crime or violence.Though these systems are rolling out in police departments nationwide, our research found pervasive, fundamental gaps in what's publicly known about them.How these tools work and make predictions, how they define and measure their performance and how police departments actually use these systems day-to-day, are all unclear. Further, vendors routinely claim that the inner working of their technology is proprietary, keeping their methods a closely-held trade secret, even from the departments themselves. And early research findings suggest that these systems may not actually make people safer — and that they may lead to even more aggressive enforcement in communities that are already heavily policed.Our study finds a number of key risks in predictive policing, and a trend of rapid, poorly informed adoption in which those risks are often not considered. We believe that conscientious application of data has the potential to improve police practices in the future. But we found little evidence that today's systems live up to their claims, and significant reason to fear that they may reinforce disproportionate and discriminatory policing practices.