Blog by Sumana Harihareswara, Changeset founder

18 Sep 2017, 13:35

Supporting Int 1696-2017 for Source Code Transparency in New York City

Hi, reader. I wrote this in 2017 and it's now more than five years old. So it may be very out of date; the world, and I, have changed a lot since I wrote it! I'm keeping this up for historical archive purposes, but the me of today may 100% disagree with what I said then. I rarely edit posts after publishing them, but if I do, I usually leave a note in italics to mark the edit and the reason. If this post is particularly offensive or breaches someone's privacy, please contact me.

The principle at stake in California v. Johnson: due process requires that we be able to examine the evidence used to convict someone. Kern County got a $200,000+ grant and started using closed-source software to perform a new kind of DNA testing for criminal forensics. You are not allowed to audit the software to check for bugs, but the company's founder will fly in and testify in court, attesting to the validity of its results. Uh, no, we need to check, and the ACLU and EFF have just filed amici curiae* briefs before California's Court of Appeal for the Fifth District, saying so.

[Photo, used by permission, by Mike Pirnat: man at lectern in front of a screen displaying "Winning Raffle Numbers: 12345 12345 12345 12345", at the PyCon PyLadies auction in 2017.]

As I've written and even testified, we need more auditability, transparency, and security in software governments use in laboratories and field tests. Heck, we need it in software governments use to make decisions more generally -- lotteries for visas, school assignments, parole and prison sentencing, and so on.

So I was delighted to learn of bill Int 1696-2017, currently before the New York City Council. Summary:

This bill would require agencies that use algorithms or other automated processing methods that target services, impose penalties, or police persons to publish the source code used for such processing. It would also require agencies to accept user-submitted data sets that can be processed by the agencies' algorithms and provide the outputs to the user.

I applaud James Vacca, chair of the council's Committee on Technology, for introducing and sponsoring this bill, and for citing/shouting out to danah boyd, Kate Crawford, and Cathy O'Neil as people whose work has shaped this legislation. The New York Times says: "As a committee chairman, he plans to convene hearings before he leaves office in December." I'm looking forward to attending those hearings.

If you live in New York City, you can contact your councilmember and suggest they cosponsor this bill. If you live elsewhere, consider telling your local elected officials that they oughta introduce legislation like this. When writing or calling, if you're a programmer or other technology expert, say so -- our voice matters.

I have more links in the algorithmictransparency tag on Pinboard.


* Many years ago, Seth Schoen made me an illustration that we still have somewhere. Reconstructed from memory:

[one smiling stick figure, male, near a courthouse] Sum amicus curiae. ("I am a friend of the court.")
[one smiling stick figure, female, near a courthouse] Sum amica curiae. ("I am a friend of the court.")
[many smiling stick figures of various genders, near a courthouse] Sumus amici curiae. ("We are friends of the court.")
[one stick figure, male, holding a finger to his mouth as though shushing you, near a courthouse] Tace! Sum inimicus curiae! ("Hush! I am an enemy of the court!")

Edited Tuesday Sept. 19th to add: The Committee on Technology is holding a public hearing to discuss Int 1696-2017 on Monday, October 16th.