This week, ColorCode was pleased to learn that Professor Kale revoked the Robocop competition and issued a full apology for the original assignment, which, as he writes, “failed to provide adequate context” for a data set laden with historical and political racial trauma. We appreciate Professor Kale’s explanation of the assignment’s intended impact (to lead students to interrogate the policy implications of ML classifiers trained on racist data) and hope that future assignments can convey this lesson with the clarity that this assignment lacked. We sincerely applaud Professor Kale’s timely and appropriate correction, and hope that all professors at Columbia can follow his example in responding to student concerns with empathy and accountability.
Since our last statement, some of our peers have questioned whether the assignment’s revocation has deprived the class of an ethics lesson in handling politically challenging data sets. Such lessons should not come at the cost of direct harm to the most marginalized groups involved. While we agree with Professor Kale’s professed intentions in assigning the Robocop competition, we stand by our original assessment (with which Professor Kale himself has agreed): that the assignment in its original form could not have produced the intended pedagogical outcome and discussion on data responsibility in machine learning.

And while this particular incident has been sufficiently redressed by Professor Kale himself, we think it’s important to locate the Robocop assignment in the context of a larger department and school that excludes and silences Black students and students of color. We are studying computer science in a department with few Black students and no Black faculty, in an engineering school that builds on a legacy of close collaboration with the U.S. military and NYPD, at a university that is gentrifying Harlem to build its newest science center. From casual remarks about our intelligence by classmates, TAs, and professors, to academic policies not intended to help the most marginalized of us succeed: these experiences contribute to an academic atmosphere that repeatedly dismisses and delegitimizes our pain by “intellectualizing” academic work with horrific, racist implications and impacts. Computer science at Columbia is steeped in a history of racism that still persists today. Within this context, an assignment “welcoming” students to a “future” of “cyborg law enforcers” trained on racist, violently collected data is inexcusable.
We therefore point to the Robocop incident as evidence that massive reform is needed within the department to support Black students, other students of color, low-income students, and other marginalized people in STEM. Professor Kale’s swift response gives us hope that change can happen here at Columbia. We will continue to hold professors, departments, and the university accountable for the impact of their academic work. We join Mobilized African Diaspora in demanding greater academic support for marginalized students of color, especially the hiring of Black faculty in Computer Science and SEAS. We also ask that SEAS as a whole reaffirm its commitment to its most marginalized students by expanding course offerings on research ethics and incorporating requirements in African American Studies and Ethnic Studies. We ask this with the recognition that technical knowledge is dangerous without an analysis of race and power. Finally, we urge current professors to build on pedagogy and research that is explicitly anti-racist and anti-oppressive, and that gives students the opportunity to work on projects that uplift and liberate communities of color and other marginalized people.
We thank the following groups for their explicit support (running list). Please reach out to email@example.com if your organization would like to co-sign:
National Society of Black Engineers – Columbia
No Red Tape
Students for Justice in Palestine