ALGORITHM TOOLKIT RELEASED TO REDUCE BIAS AFFECTING RESIDENTS FROM AUTOMATED DECISIONS MADE BY LOCAL GOVERNMENTS

FOR IMMEDIATE RELEASE: September 17, 2018

Contact: Connie Ress, cress@jhu.edu

Today, the Center for Government Excellence (GovEx) at Johns Hopkins University, DataSF of the City and County of San Francisco, the Civic Analytics Network at Harvard University, and Data Community DC announced the release of a first-of-its-kind algorithm toolkit that will help local leaders ensure that decisions based on algorithms are unbiased and deliver the best outcomes for residents.

Complex algorithms learn from data, identify patterns, and make predictions, sometimes with minimal human intervention, and are used across industries to provide services to residents. Algorithms are used in the criminal justice system, higher education processes, and social media networks. Yet unintended consequences arise when algorithms carry significant bias and decisions are made without careful review or human input.

The main goal of the new algorithm toolkit is to ensure that automated decisions are fair and that unintentional harm to constituents is minimized. The toolkit provides a risk management approach and helps users better understand the risks and benefits associated with algorithm-based decision making in local government. As cities throughout the country work to address inequities caused by algorithms and the negative impact they have on residents, the toolkit helps local leaders proactively ask specific questions to quantify risks and provides recommendations on ways to handle those risks.

“Governments want to ensure fairness and transparency for their residents, but with more and more algorithms being used to make determinations that impact the lives of those residents, trying to mitigate bias has been difficult,” said Andrew Nicklin, Director of Data Practices, who led the project for GovEx. “Government employees do not have a process or tool to evaluate how risky their algorithms are, nor how to manage those risks. That is, until now.”

“Instead of wringing our hands about ethics and AI, our toolkit puts an approachable and feasible solution in the hands of government practitioners – something they can use immediately, without complicated policy or overhead,” said Joy Bonaguro, Chief Data Officer for the City and County of San Francisco.

GovEx provides technical assistance and training to cities in the Bloomberg Philanthropies What Works Cities initiative, which helps local governments across the country use data and evidence effectively to tackle the most pressing challenges and improve residents’ lives. The new toolkit is the latest in GovEx’s compendium of resources.

-30-
