This week, ColorCode was pleased to learn that Professor Kale revoked the Robocop competition and issued a full apology for the original assignment, which, as he writes, “failed to provide adequate context” for a data set laden with historical and political racial trauma. We appreciate Professor Kale’s explanation of the assignment’s intended impact, namely to lead students to interrogate the policy implications of ML classifiers trained on racist data, and hope that future assignments can convey this lesson with the clarity that this one lacked. We sincerely applaud Professor Kale’s timely and appropriate correction, and hope that all professors at Columbia can follow his example in responding to student concerns with empathy and accountability.

Since our last statement, some of our peers have questioned whether the assignment’s revocation has deprived the class of an ethics lesson in handling politically challenging data sets. Lessons should not come at the cost of direct harm to the most marginalized groups involved. While we agree with Professor Kale’s professed intentions in assigning the Robocop competition, we stand by our original assessment, with which Professor Kale himself has agreed: the assignment in its original form could not have produced the intended pedagogical outcome, a discussion of data responsibility in machine learning. And while this particular incident has been sufficiently redressed by Professor Kale himself, we think it’s important to locate the Robocop assignment in the context of a larger department and school that exclude and silence Black students and students of color. We are studying computer science in a department with few Black students and no Black faculty, in an engineering school that builds on a legacy of close collaboration with the U.S. military and the NYPD, at a university that is gentrifying Harlem to build its newest science center. From casual remarks about our intelligence by classmates, TAs, and professors, to academic policies not designed to help the most marginalized of us succeed, these experiences contribute to an academic atmosphere that repeatedly dismisses and delegitimizes our pain by “intellectualizing” academic work with horrific, racist implications and impacts. Computer science at Columbia is steeped in a history of racism that persists today. Within this context, an assignment “welcoming” students to a “future” of “cyborg law enforcers” trained on racist, violently collected data is inexcusable.

We therefore point to the Robocop incident as evidence that massive reform is needed within the department to support Black students and other students of color, low-income students, and other marginalized people in STEM. Professor Kale’s swift response gives us hope that change can happen here at Columbia. We will continue to hold professors, departments, and the university accountable for the impact of their academic work. We join Mobilized African Diaspora in demanding greater academic support for marginalized students of color, especially the hiring of Black faculty in Computer Science and SEAS. We also ask that SEAS as a whole reaffirm its commitment to its most marginalized students by expanding course offerings on research ethics and incorporating requirements in African American Studies and Ethnic Studies. We ask this with the recognition that technical knowledge is dangerous without an analysis of race and power. Finally, we urge current professors to build on pedagogy and research that are explicitly anti-racist and anti-oppressive, and that give students the opportunity to work on projects that uplift and liberate communities of color and other marginalized people.

We thank the following groups for their explicit support (running list). Please reach out to colorcodeboard@gmail.com if your organization would like to co-sign:

National Society of Black Engineers – Columbia

The Lion

No Red Tape

Students for Justice in Palestine

Divest Barnard

Photo courtesy of ColorCode

On Thursday, the ColorCode committee learned that Columbia University Computer Science professor Satyen Kale assigned his Machine Learning (COMS 4771) class a competition “to produce the eponymous cyborg law enforcer.” Drawing on data from the NYPD’s “Stop, Question and Frisk” records, students were asked to create a machine learning algorithm to “decide when a suspect it has stopped should be arrested” based on characteristics ranging from “sex” and “race” to “suspect was wearing unseasonable attire,” “suspicious bulge,” and “change direction at sight of officer.” Stop-and-Frisk is a violently racist program that allows police to stop, question, and frisk any pedestrian who arouses “reasonable suspicion.” Numerous studies and investigations of the NYPD’s own data have shown that Stop-and-Frisk disproportionately targets Black people. It has torn apart Black communities in the city and contributes to a system of mass incarceration and policing that brutalizes, incarcerates, and kills Black people across the nation. The program has even been deemed unconstitutional in federal court.
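
To make concrete what this assignment asks students to build: a classifier trained on such records does not discover neutral criteria for arrest; it learns whatever pattern the historical data already contains. The minimal sketch below is our own illustration, not the course’s materials; every feature name and number is invented, and the data is synthetic by construction. It shows how a standard model trained on racially skewed labels reproduces that skew in its predictions.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 20_000
# Hypothetical per-stop features, loosely echoing the quoted fields.
race = rng.integers(0, 2, n)    # 1 = member of the over-policed group
bulge = rng.integers(0, 2, n)   # stand-in for the "suspicious bulge" flag
attire = rng.integers(0, 2, n)  # stand-in for "unseasonable attire"
# Biased historical labels: identical conduct, different arrest odds by race.
p_arrest = 0.05 + 0.10 * bulge + 0.15 * race
y = rng.random(n) < p_arrest
X = np.column_stack([race, bulge, attire])
model = LogisticRegression().fit(X, y)
for g in (0, 1):
    rate = model.predict_proba(X[race == g])[:, 1].mean()
    print(f"group {g}: mean predicted arrest probability = {rate:.3f}")

Run as written, the model assigns the over-policed group roughly two and a half times the arrest probability of the other group for identical observed behavior: the “algorithm” is simply the bias of its training data, automated.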

That a Columbia professor would ask students to implement a program that reproduces and aids Stop-and-Frisk policing with zero acknowledgement of the violence and harm inflicted by the actual program, and in fact suggest that machine learning algorithms like this constitute “the future” of machine learning applications, is an egregious example of racist, ahistorical, and irresponsible pedagogy. Data are not apolitical. Algorithms are not objective. To teach technical skills without also teaching anti-racist, anti-oppression development principles is unforgivable, despicable, and dangerous. For us, as students of color who are also coders, entrepreneurs, and engineers, assignments like this confirm feelings of exclusion and isolation accumulated over many semesters here: being one of only a handful of Black students in a lecture hall, for example, or graduating from SEAS without having had a single Black professor. It confirms the department’s and the university’s disregard for our wellbeing as students of color, which is always intertwined with the wellbeing of our communities.
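
The claim that algorithms are not objective is checkable with the same tools the course already teaches, which is why the omission matters. Below is a minimal sketch of the kind of audit a responsible version of the assignment could have required; it is our illustration, with invented numbers, computing per-group selection rates and the “four-fifths” disparate-impact ratio used as a screening heuristic in U.S. employment-selection guidance.

import numpy as np

def disparate_impact(y_pred, group):
    # Ratio of lowest to highest per-group selection rate;
    # values below 0.8 fail the common four-fifths screening rule.
    rates = {g: y_pred[group == g].mean() for g in np.unique(group)}
    return min(rates.values()) / max(rates.values()), rates

# Invented predictions for two groups of six stops each.
y_pred = np.array([1, 0, 1, 1, 0, 0, 0, 0, 1, 0, 0, 0])
group = np.array([1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0])
ratio, rates = disparate_impact(y_pred, group)
print(rates, f"ratio = {ratio:.2f}")  # 0.33 here, far below 0.8

An audit like this is no substitute for the historical and political analysis we are asking for, but requiring it would at least make the disparity impossible to ignore before a model is graded, published, or deployed.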

Moving forward, ColorCode demands that this Machine Learning assignment be revoked, and that the professor issue an apology addressing the concerns above. We demand that students in the class be provided with alternate ways to receive credit. We demand that the professor and the department acknowledge these concerns, apologize, and make significant, structural changes to ensure this does not happen again. Finally, we support the demands of Mobilized African Diaspora/BCSN, and in particular add our voices to the demand that the School of Engineering commit to hiring more Black professors and underrepresented professors of color.

ColorCode is a group focused on getting people of color into the technology sector. To respond to this op-ed or submit one of your own, email submissions@columbialion.com.
