In case after case, courts reshape the rules around AI

AI Now Institute recommends improvements and highlights key AI litigation

Edited by Michael Morisy

When undercover officers with the Jacksonville Sheriff’s Office bought crack cocaine from someone in 2015, they couldn’t actually identify the seller. Less than a year later, though, Willie Allen Lynch was sentenced to eight years in prison after being identified through a facial recognition system.

He’s still fighting in court over how the technology was used, and his case and others like it could ultimately shape the use of algorithms going forward, according to a new report.

His case was one of those discussed over the summer at New York University’s AI Now Institute 2019 “Litigating Algorithms” workshop. The organization researches algorithms and their effects on society and civil rights. Last month, it released a report highlighting key algorithmic litigation and offering recommendations for modernizing rules around legal discovery and the protection of biometric information.

“One part of our research output is research to help build a greater understanding of these issues for legal and policy advocates who are working on the issues. We put out an algorithmic accountability toolkit, and ‘Litigating algorithms’ is a convening of people who have challenged government use of these systems,” said Rashida Richardson, AI Now’s Director of Policy Research. “We are trying to think through where there are vacancies or deficiencies that can help bolster existing work being done by legal and other advocates.”

Litigation involving algorithmic decision systems (ADS) will play an important role in shaping civil rights and privacy protections, the report says, but success should be measured by the structural changes that follow and supported by enough vigilance to keep shady vendors from simply peddling their wares to other communities.

Government agencies have increasingly turned to ADS over the last decade. In addition to facial recognition, pre-trial risk assessment, and criminal justice applications, there has also been increased use of the systems in the allocation of social services and resources.

Idaho’s adoption of a secret ADS formula to allocate benefits to residents with developmental disabilities led to widespread disruption. When the program was enacted, many residents who had previously received benefits found that the amount of assistance they qualified for dropped.

Though the state acknowledged that the cuts were a result of the new formula, it refused to release the formula itself, claiming it was a protected “trade secret.” The ACLU then brought a class action, K.W. v. Armstrong, against the state in 2012 and settled it four years later; the settlement required the state to overhaul the formula to make it less arbitrary and to test it regularly.

“There are tons of cases. In many of them, the government has to abandon the system, like in the Medicaid allocation system cases, because those were class actions. They’re now in the process of trying to figure out how to fix the system, because now they’ve like spent so much money in it not working, then getting sued, and now like they’re obliged by court order to fix it,” Richardson said. “And they’re kind of like, ‘We don’t know how to fix it,’ or they’re realizing that these may not be problems that are best solved using some type of automated decision system, and it’s you only find that out after millions of dollars are spent and a few people die.”

A similar case in Arkansas involved physically disabled individuals whose access to in-home nursing care dropped significantly after the state began using an algorithm to determine how much care they needed. Legal Aid of Arkansas won that case, and the state’s Department of Human Services began work on a new ADS.

In addition to lawsuits targeting particular algorithms, important court decisions are emerging under newer laws that are finally being tested. In particular, the Illinois Biometric Information Privacy Act is the basis for challenges over the collection and protection of highly personal biometric information, such as the children’s fingerprints at issue in Rosenbach v. Six Flags. AI Now recommends that more states adopt similar laws.

Other recommendations from the researchers, lawyers, and activists involved with the workshop include adopting a policy of openness during legal discovery in any case involving automated decision systems and developing training for people in the criminal defense sector, so that they can better understand how these systems affect their work and the due process rights of their clients.

You can read the full report below.


Algorithmic Control by MuckRock Foundation is licensed under a Creative Commons Attribution 4.0 International License.
Based on a work at https://www.muckrock.com/project/algorithmic-control-automated-decisionmaking-in-americas-cities-84/.

Image via the Supreme Court of Arkansas on Facebook.