Algorithms of Oppression

We as technological consumers (and products) have spent so long thinking of technology as a neutral force that will usher us into our Jetson-esque future that it is easy to forget about the human knowledge and effort underpinning technological development. We have been encouraged by tech companies to think more about the products themselves and less about the people who make them possible. It requires a bit of a leap to remember that the tech people (mostly tech bros) who have moved to San Francisco, driven the displacement of longtime residents from marginalized communities (people of color and people within the LGBTQ+ spectrum), and repeatedly behaved like awful human beings are also largely responsible for the technological products that we engage with and use on a regular basis. That level of abstraction is why it’s easy to lose sight of how the biases these people express or hold in their personal lives almost certainly carry over into the work they do in their professional lives.

Ms. Noble’s point about tech companies framing the lack of representation in their workforces as a pipeline issue, rather than a continuation of exclusionary hiring practices, calls to mind last week’s discussion about how treating homelessness as a personal, rather than systemic, issue means that the real problem is never addressed properly. Google’s tone-deaf partnership with Black Girls Code, while it still produces products that portray women of color as sexual objects rather than people, seems especially galling. It’s not as though these companies have not been made aware of their problems; rather, they have taken a position of waiting until an uproar happens to tweak the results. Worse, much like Los Angeles County, they hold those who have suffered most responsible for improving these situations, without taking any steps to dismantle the systems that created the inequalities in the first place. The lack of discussion of these issues in the formal training undertaken by people in the technological sphere ensures that many biases, perhaps unconscious, get transmitted from tech workers into the products they release to the public.

Noble points out something that has struck me throughout this course: the feigned helplessness of tech companies when called out for their products’ deficiencies and problems. Whether it’s their culpability for allowing third parties to access users’ profiles or for the racist and/or sexist results of their search algorithms, the response is always the same: this isn’t our fault. They blame others, or glitches, but they never acknowledge that what they’ve put out into the world is in any way responsible for the newest problem that has been identified. They position themselves as both all-powerful (let Facebook sign you into everything on the internet, let Google manage everything about your online and offline life) and hapless victims (how could we have known that our products would be used this way or would reveal such awful results?). And while it’s likely that at least some of these problems do not spring from intent, the unwillingness of companies like Google to name and then address these issues makes me less interested in the why than the what.

As I read this book, the librarian in me identified several ways in which information literacy skills would benefit people who use products like Google and Facebook (so, pretty much everyone). First, knowing the authority of what you’re reading is key. Dylann Roof’s framing of his initial Google query showed that he already held negative, perhaps unconscious, beliefs about black people, but if he had known how to check the reputation of the sites he got his information from, he might have been more cautious about the beliefs they espoused (or not, but at least he would have knowingly been consuming false and racist information). But my idea again targets the people at the end of the problem, rather than those at the beginning.