Computing, Ethics, & Society

CS 396: Fall 2021

Assignments > Identity Journal 8

Due on Tue, 11/09 @ 11:59PM.

Pick one of the following prompts to respond to:

Prompt 1: Cognitive Distance

Consider this quote from last Thursday:

“Back in 1976, when AI scientist Joseph Weizenbaum wrote his scathing critique of the field, he …argued that data systems allowed scientists during wartime to operate at a psychological distance from the people “who would be maimed and killed by the weapons systems that would result from the ideas they communicated.” The answer, in Weizenbaum’s view, was to directly contend with what data actually represents: “The lesson, therefore, is that the scientist and technologist must, by acts of will and of the imagination, actively strive to reduce such psychological distances, to counter the forces that tend to remove him from the consequences of his actions. He must — it is as simple as this — think of what he is actually doing.”

Describe a time when you were working with or looking at some data and didn't really consider its context or the underlying phenomenon at all. What made this possible? What practices might you put in place for yourself to get you to really think about the context and mitigate this cognitive distance?

Prompt 2: Algorithmic Bias, Human Bias

Many new technologies have emerged that claim to mitigate human bias (e.g., screening resumes based on a set of quantifiable metrics). However, we have seen many examples of how these technologies still discriminate (and sometimes make things worse).

This raises the question: don’t algorithms and computer-mediated systems also have the potential to lead to more just and fair decisions? Can’t computers help us formalize better rules and procedures to help us act in ways that promote important social values?

What do you think? Can technology help us be less biased and more fair? And if so, what's getting in the way, and what needs to happen to make these systems better?