ColorCode: Statement on COMS 4771 “Stop and Frisk” Competition

Photo courtesy of ColorCode

On Thursday, the ColorCode committee learned that Columbia University Computer Science professor Satyen Kale assigned his Machine Learning (COMS 4771) class a competition “to produce the eponymous cyborg law enforcer.” Drawing on data from the NYPD’s “Stop, Question and Frisk” records, students have been asked to create a machine learning algorithm to “decide when a suspect it has stopped should be arrested” based on characteristics ranging from “sex” and “race” to “suspect was wearing unseasonable attire”, “suspicious bulge”, and “change direction at sight of officer”.

Stop-and-Frisk is a violently racist program that allows police to stop, question, and frisk any pedestrian who arouses “reasonable suspicion.” Numerous studies and investigations of the NYPD’s own data have shown that Stop-and-Frisk disproportionately targets Black people. It has torn apart Black communities in the city and contributes to a system of mass incarceration and policing that brutalizes, incarcerates, and kills Black people across the nation. The program has even been deemed unconstitutional by a federal court.
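To make concrete what such an assignment involves, here is a minimal sketch (not the actual course materials; every column name is invented for illustration, echoing the quoted attributes) of the kind of classifier described. Demographic fields such as “race” enter the model as predictive inputs exactly like any behavioral field, and whatever bias exists in the historical records is learned as if it were ground truth.

```python
# Minimal sketch (NOT the actual course materials) of the kind of
# classifier the assignment described. Every column name below is
# invented for illustration, echoing the attributes quoted above.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical stop records: demographic fields sit alongside
# behavioral ones, and the model treats all of them identically
# as predictive signal.
df = pd.DataFrame({
    "race":                [0, 1, 1, 0, 1, 0, 1, 1],  # encoded category
    "sex":                 [1, 1, 0, 0, 1, 1, 0, 1],
    "unseasonable_attire": [0, 1, 0, 0, 1, 0, 1, 0],
    "suspicious_bulge":    [1, 0, 1, 0, 0, 1, 1, 0],
    "changed_direction":   [0, 1, 1, 0, 1, 0, 0, 1],
    "arrested":            [1, 1, 0, 0, 1, 0, 1, 0],  # label from past records
})

X, y = df.drop(columns="arrested"), df["arrested"]
model = LogisticRegression().fit(X, y)

# Whatever bias exists in who was stopped and arrested historically
# is learned as if it were ground truth.
print(dict(zip(X.columns, model.coef_[0])))
```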

That a Columbia professor would ask students to implement a program that reproduces and aids Stop-and-Frisk policing with zero acknowledgement of the violence and harm inflicted by the actual program, and would in fact suggest that machine learning algorithms like this constitute “the future” of machine learning applications, is an egregious example of racist, ahistorical, and irresponsible pedagogy. Data are not apolitical. Algorithms are not objective. To teach technical skills without also teaching anti-racist, anti-oppression development principles is unforgivable, despicable, and dangerous.

For us, as students of color who are also coders, entrepreneurs, and engineers, assignments like this confirm feelings of exclusion and isolation accumulated over many semesters here: being one of only a handful of Black students in a lecture hall, for example, or graduating from SEAS without having had a single Black professor. It confirms the department’s and the university’s disregard for our wellbeing as students of color, which is always intertwined with the wellbeing of our communities.

Moving forward, ColorCode demands that this Machine Learning assignment be revoked and that the professor issue an apology addressing the concerns above. We demand that students in the class be provided with alternate ways to receive credit. We demand that the professor and the department acknowledge these concerns and make significant, structural changes to ensure this does not happen again. Finally, we support the demands of Mobilized African Diaspora/BCSN and, in particular, add our voices to the demand that the School of Engineering commit to hiring more Black professors and other underrepresented professors of color.

ColorCode is a group focused on getting people of color into the technology sector. To respond to this op-ed or submit one of your own, email submissions@columbialion.com.

Comments (5)
  • Rationalist says:

    To build a program that uncovers the inherent biases of stop-and-frisk is far from racist. Perhaps the professor did not give any indication that one should think critically about the results of the algorithm, but the assignment should hardly be revoked. Data is apolitical; what it means and how it is presented is not.

  • Kshitij Lauria says:

    I speak as a person of color who has been unfairly stopped and frisked thrice since moving to New York. I work professionally as a computer scientist, often on machine learning tasks.

    The point of this assignment was exactly the opposite of what the students protesting it seem to think. It is a cutting-edge research topic to design machine learning algorithms that abide by laws against discrimination, or to catch people who hide racist policies behind an opaque machine-learning algorithm (a minimal sketch of such an audit follows this comment). This assignment was a great way for students to see these issues firsthand. This is intersectional research of the best kind: technology meets a pressing social issue. What better forum to discuss this than an Ivy League machine learning classroom, with real data at hand?

    It seems the students who want the assignment taken down have not cared to search Google even once, for if they had, they would have learned how important this issue is in the machine learning community.

    Instead, they jumped to the bad-faith conclusion that the professor is ignorant or, worse, racist. No! He is the very opposite!

    The professor has since posted an explanation that supports my argument: the point of this assignment was precisely to critique policies like stop-and-frisk, while at the same time bringing important and interesting research into the classroom. A knee-jerk, witch-hunt response by the ColorCode Committee has deprived students of the chance to learn about this topic in a classroom setting. What a travesty.

    Think about the missed opportunity here: students who had completed and discussed this assignment would be better prepared to catch instances in the real world where people abuse machine learning technology to enact racist policies. But no, the ColorCode Committee would rather shy away from a clear-eyed discussion of issues in the classroom. It’s a shame.
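As a rough illustration of the kind of audit described in the comment above, here is a minimal, self-contained sketch: it fits a toy classifier on synthetic, invented data in which the sensitive attribute correlates with the label, then compares positive-prediction rates across groups (a demographic parity check). Nothing here is the assignment’s actual data or code.

```python
# Self-contained sketch of the kind of bias audit described above:
# compare a classifier's positive-prediction rates across groups
# (a demographic parity check). All data here is synthetic and
# invented for illustration.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "race": rng.integers(0, 2, n),              # encoded sensitive attribute
    "suspicious_bulge": rng.integers(0, 2, n),  # hypothetical behavioral field
})
# Synthetic labels that correlate with the sensitive attribute,
# standing in for biased historical arrest records.
df["arrested"] = ((df["race"] + df["suspicious_bulge"] + rng.random(n)) > 1.5).astype(int)

features = ["race", "suspicious_bulge"]
model = LogisticRegression().fit(df[features], df["arrested"])
preds = model.predict(df[features])

# Positive-prediction rate per group, and their ratio.
rates = {group: preds[(df["race"] == group).to_numpy()].mean() for group in (0, 1)}
ratio = min(rates.values()) / max(rates.values())
print("prediction rates by group:", rates)
# One common convention (the "four-fifths rule") flags ratios below 0.8.
print(f"disparate impact ratio: {ratio:.2f}")
```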

  • jyc says:

    See the professor’s response to this here: http://www.satyenkale.com/coms4771/statement.html

    You completely missed the point of the assignment. Please, Google shit before you post. The professor clearly knows all this, and that was WHY he created the assignment in the first place. This issue is currently a hotly debated topic in machine learning, and he was trying to bring a bleeding-edge conversation to the class. People are in court right now over stuff like this, and he was trying to give you an introduction to the issue and illustrate how algorithms can be, and are being, used in politicized, problematic ways.

    HE IS ON YOUR SIDE. Shame on him for trying to create an engaging and relevant assignment, right? Great job: now you’ve deprived an entire class of the chance to engage with this issue, and we will graduate another class of ML programmers who are unaware of the possible political ramifications of their work.

    There aren’t many CS professors at Columbia who are willing to touch issues like this, and you can bet there are going to be even fewer after this. And you kept students from being able to work on a cool and illuminating assignment. Congratulations, you played yourself.

  • Kshitij Lauria says:

    I would also like to point out the irony in forcing a brown-skinned professor to take down an assignment that critiques stop-and-frisk. Brown-skinned people, like other people of color, are disproportionately affected by the racist stop-and-frisk policy.

    This professor has probably been stopped and frisked himself! And now his contribution to the conversation has been silenced by the people of this committee, who are overly sensitive to the point of ignorance and so quick to take to their keyboards that they didn’t even Google his name and look at a picture of him. Congratulations!

  • Dexter Callender says:

    All of the demands were met this morning. The assignment was revoked, a sincere apology was given, and all of the social and racial implications of the assignment were addressed. It is unfortunate that this article was written before the professor had a chance to do so, yet it still incriminates him by name.

    I request that his name be removed from this article, if not that the article be retracted altogether.
