Algorithms of Oppression

We as technological consumers (and products) have spent so long thinking of technology as a neutral force that will usher us into our Jetsons-esque future that it is easy to forget about the human knowledge and effort underpinning technological development. We have been encouraged by tech companies to think more about the products themselves and less about the people who make them possible. It requires a bit of a leap to remember that the tech people (mostly tech bros) who have moved to San Francisco, contributed to the displacement of longtime residents from marginalized communities (people of color and people across the LGBTQ+ spectrum), and repeatedly behaved like awful human beings are also largely responsible for the technological products that we engage with and use on a regular basis. That level of abstraction is why it’s easy to lose sight of how the biases these people express or hold in their personal lives almost certainly carry over into the work they do in their professional lives.

Noble’s point about tech companies framing their lack of representation as a pipeline issue, rather than a continuation of exclusionary hiring practices, calls to mind last week’s discussion about how treating homelessness as a personal, rather than systemic, issue means that the real problem is never addressed properly. Google’s tone-deaf partnership with Black Girls Code, even as it produces products that portray women of color as sexual objects rather than people, seems especially galling. It’s not as though these companies have not been made aware of their problems; rather, they wait until an uproar happens to tweak the results. Worse, much like Los Angeles County, they hold those who have suffered most responsible for improving these situations, without taking any steps to dismantle the systems that created the inequalities in the first place. The absence of these issues from the formal training undertaken by people in the technological sphere ensures that many biases, perhaps unconscious ones, get transmitted from tech workers into the products they release to the public.

Noble points out something that has struck me throughout this course: the feigned helplessness of tech companies when called out for their products’ deficiencies and problems. Whether it’s their culpability for allowing third parties to access users’ profiles or for the racist and/or sexist results of their search algorithms, the response is always the same: this isn’t our fault. They blame others, or glitches, but they never acknowledge that what they’ve put out into the world is in any way responsible for the newest problem that has been identified. They position themselves as both all-powerful (let Facebook sign you into everything on the internet, let Google manage everything about your online and offline life) and hapless victims (how could we have known that our products would be used this way, or would reveal such awful results?). And while it’s likely that at least some of these problems do not spring from intent, the unwillingness of companies like Google to name and then address these issues makes me less interested in the why than the what.

As I read this book, the librarian in me identified several ways in which information literacy skills would benefit people who use products like Google and Facebook (so, pretty much everyone). First, knowing the authority of what you’re reading is key. Dylann Roof’s framing of his initial Google query illustrated that he already held negative, perhaps unconscious, beliefs about black people, but if he had known how to check the reputation of the sites he got his information from, he might have been more cautious about the beliefs they espoused (or not, but at least he would have knowingly been consuming false and racist information). But my idea again targets the people at the end of the problem, rather than those at the beginning.

Automating Inequality

I find it sad that automated systems that are supposed to help the most vulnerable people in our society are often used to further discriminate against and disenfranchise those very people. Thinking critically about the results of programs like Los Angeles’ VI-SPDAT and Allegheny County, Pennsylvania’s AFST helps identify the harmful assumptions at the foundation of these tools’ creation. They perpetuate the idea that poverty in the United States is the result of individuals’ inherent weakness or poor decisions, instead of the result of systemic legal, medical, gendered, racial, and educational inequalities that make it difficult for those who are already poor to experience improved circumstances.

Los Angeles’ housing match system has solved some problems, including getting some unhoused people into housing and making it easier for community organizations with similar missions to reach as many people as possible. These are real benefits, but there are large costs as well. The data collected from applicants can be kept for seven years and shared with 168 organizations, as well as several local and federal government entities. Applicants do not get to see what their information looks like before it is distributed, and the algorithmic score their data yields is not shared with them; the flow of information is one way only. Because of this lack of transparency, it’s difficult to understand why some unhoused people find homes with relative ease while others apply several times with no success. Beyond the sheer amount of information required to apply, making applicants responsible for obtaining documentation such as birth certificates is short-sighted, considering that many people experiencing either chronic or crisis homelessness may lack the financial and/or technological resources to get the required documents. The author’s point in her introduction, that not everyone can afford the time it takes to navigate these systems, is so important to keep in mind when reading these stories.

Habeas Data

Tinfoil hat time: The government’s lack of proactivity regarding laws that address the current and future concerns of digital life does not strike me as coincidental. I think there is, at least on some level, an intentionality to the logic that permits law enforcement LPR systems to scan and keep location data on thousands of license plates belonging to vehicles that are not implicated in any crime. The foundational documents of this country were written by people who could never have imagined email, or data centers, or WikiLeaks.

While technology has advanced beyond what anybody could have imagined even in the 1980s, when most households didn’t own a computer, it seems especially troubling that the government has used these advancements to exponentially expand its ability to monitor the populace, and has not acted like an institution that is supposed to exist within a framework of checks and balances. The combination of secrecy and incompetence the government exhibited when trying to get information from Lavabit is just as troubling, and I can’t decide whether it’s a good or bad thing that they’re so bad at this stuff.

Other thoughts: I have a lot of different email addresses, all free. With the professional and academic addresses, I have no expectation of privacy and conduct myself accordingly. With the other addresses, most of which are through Google, I’ve been pretty lax about considering how my data is used. I have browser add-ons that disable ads, so I don’t even remember that I should be seeing targeted ones. The adage that Lavabit founder Ladar Levison cites (“If you’re not paying for the product, you are the product.”) makes complete sense, but it is hard to keep at the forefront of my mind when weighed against the ease of using Gmail for business. Related: Yahoo has to pay $50 million over the breach of mail users’ data.

Facebook is super happy for you to know about its Fort Worth data center

I never know whether to attribute changes that occur between the time something is written and when we read it to cultural/attitude shifts or to mere coincidence. In Jennifer Holt and Patrick Vonderau’s chapter “‘Where the Internet Lives’: Data Centers as Cloud Infrastructure,” published in 2015, the authors discuss how secretive companies often are about the physical spaces of data centers. Google is cited as one exception to this. Has thinking about data centers changed so much in the last three years? Because it seems that Facebook is also happy to attach a certain level of visibility to its data centers.

A blog post on the website of Dallas Innovates magazine includes information such as the Fort Worth data center’s design elements, physical location, and cooling strategies. The post also includes photos of the data room, the massive fans, and the pipes that move water throughout the center, but overall the focus seems oriented more toward the site’s employees; there are more pictures of artwork (much of it tied directly to the region and/or produced by local artists) and employee spaces than of the practical infrastructure needed to power and run a massive data center. The short post does not say whether Facebook directed or limited what could be photographed, or whether the magazine’s photographer/editor chose what to shoot and feature. This visit happened recently, but I found several older articles/blog posts from the North Texas area by others who had been granted access to the data center; Facebook definitely isn’t trying to keep this place under wraps, but its openness with information about its data centers just makes me wonder what it’s misdirecting people’s attention from.

Perhaps unsurprisingly, the data center also has its own Facebook page. Its posts seem evenly split among positive press releases, encouragement to apply for jobs at the campus, and information about Facebook’s charitable contributions in the region. My favorite comments came from disgruntled former employees (top gripe: how out of the way the campus is and the lack of bus service to get there). Other Facebook data centers are linked from that page, leading me to believe that this openness about data centers is the rule rather than the exception.