Data on Free Food Events

I’ve tried to visualize the data I collected using Excel, but I only succeeded in visualizing one section. So I am attaching the visualization and a link to the cleaned data here.

https://docs.google.com/spreadsheets/d/1r5QUq76QCrn2AjU3_s8wY7VpK3eCRY2iJRJHR0Pcvhw/edit?usp=sharing

The data I collected is about the “Free Food Events” I went to during the semester. As a poor international student whose meagre income, earned by working under 20 hours per week plus occasional translation gigs, rarely makes ends meet, I go to free food events to save money on the fundamental costs of living and to ensure the quality of the food I eat, as well as to learn something new about what is happening in academia and elsewhere in New York. Free food events have become a part of my life, and I have gained tremendous food for the body and food for the soul attending them. Therefore I collected data on the free food events I went to during the semester. The metadata are as follows: the names of the events, their starting and ending times, how long they lasted, the dates on which they happened, and what food was served.

I attempted to use Excel to visualize the data, but I only succeeded in visualizing the “hours” column, which shows how long the events lasted. I also attempted to use the “map” function of Excel to visualize the locations of the events, but somehow Excel couldn’t recognize the addresses in the column. I then used Voyant Tools to visualize the data in the “food” column, and below is what I have.

Through my investigation, I think the food I eat is mostly nutritious and the diet I have had through the free food events is balanced. I could lower the hours I spend at free food events so that I have more time for other commitments in life. I also hope to visualize the locations of the events on a map, so I can learn more about the places I go to and spend less time commuting to them.
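Since Excel’s map function couldn’t recognize the addresses, one workaround is to geocode them with a short script and plot the markers yourself. Below is a minimal Python sketch using pandas, geopy, and folium; the file name and the column names (“Event”, “Location”, “Hours”) are my assumptions about the cleaned spreadsheet, so they would need to be adjusted to match the actual headers.

```python
# A minimal sketch of the map Excel couldn't draw.
# Assumes the cleaned sheet is exported as "free_food_events.csv" with
# columns "Event", "Location", and "Hours" (hypothetical names).
import time

import folium
import pandas as pd
from geopy.geocoders import Nominatim

events = pd.read_csv("free_food_events.csv")
geolocator = Nominatim(user_agent="free-food-map")

# Center the map on New York City, where the events took place.
nyc_map = folium.Map(location=[40.73, -73.99], zoom_start=12)

for _, row in events.iterrows():
    # Appending "New York, NY" helps the geocoder resolve campus addresses.
    place = geolocator.geocode(f'{row["Location"]}, New York, NY')
    if place is None:
        continue  # skip addresses the geocoder can't resolve
    folium.Marker(
        [place.latitude, place.longitude],
        popup=f'{row["Event"]} ({row["Hours"]} h)',
    ).add_to(nyc_map)
    time.sleep(1)  # Nominatim's usage policy asks for ~1 request per second

nyc_map.save("free_food_events_map.html")
```

Opening the saved HTML file in a browser shows each event as a clickable marker, which would also make the commuting patterns easier to see.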


school/pleasure reading comparison

As I said in class, I noticed a few things.

1) I should have included more kinds of readings, at least work reading, and quite possibly news reading (which I lumped into pleasure reading, though it’s not really that.)

2) That one spike when I was researching two projects at once has shown me that I need to juggle things better.

3) I tend to read for pleasure at meals (at least while taking classes.) Even when I had free time this term, I didn’t spend much of it reading. I think that was a change from before I started this program, but I’m not sure.

4) I wonder about the time of day I did these readings. I feel like I did most of the school reading in the mornings, but I don’t know for sure.

5) When I was reading for pleasure, I was mostly re-reading books I enjoyed. I don’t know why. Maybe it’s because I didn’t feel like I had enough time to spend on a new book that I might not like.

Overall, taking a look at one of my habits was interesting, and it has made me think not just about how much I read, but about the patterns of my reading.

Algorithms of Oppression

Introduction
• technological redlining and algorithmic oppression
• big data and algorithms are anything but benign, neutral, or objective
• “glitches” don’t suggest that the “organizing logics of the web” are broken, but instead something just going wrong with a “near-perfect system.”
• “Google’s position is that it is not responsible for its algorithm.”
Searching for Black Girls
• lack of diversity framed as “pipeline” issues in hiring, instead of racism and sexism
• “search engine results perpetuated particular narratives that reflect historically uneven distributions of power in society.” (71)
• theories of racial formation vs. theories of structural white supremacy (79)
• “white supremacy as the dominant lens and structure through which sense-making of race online can occur” (84). “just google it.”
• “Google/Alphabet is a broker of cultural imperialism that is arguably the most powerful expression of media dominance on the web we have yet to see.” (86)
• “contextualize information as a form of representation, or cultural production, rather than as seemingly neutral and benign data that is thought of as a ‘website’ or ‘URL’ that surfaces to the top in a search.” (106)
Searching for People and Communities
• “cloaked websites”
• “search results belie any ability to intercede in the framing of a question itself” (116)
• “search engine results also function as a type of personal record and as records of communities” (116)

Resisting Oppression

Oppression can take many forms. Algorithms of Oppression reveals how modern technology exerts a detrimental influence on the oppressed. As other readings have also demonstrated, algorithms are not neutral tools composed of numbers and mathematical formulations; they are full of human biases that harm other people, and when they are used as tools of power, the harm becomes real. This reading focuses on the intersection of two such biases: race and gender. It discusses in detail how algorithms reduce a racial and gender minority group to sex objects and capitalize on that reduction, helping the oppressors keep exploiting the vulnerable oppressed. The reading reveals that the classification of the oppressed as exotic sex objects, deployed by the most powerful search engine in the world, actually comes from pornography in the U.S. Racial and gender prejudice, discrimination, and injustice are at the core of it. It is unbelievable and outrageous that such abhorrent human biases are coded into the algorithms of the most powerful search engine in the world. The invisible power the oppressed are subject to is so overwhelmingly prevalent that social injustice is increasingly exacerbated by the development of technology and an increasingly connected global economy. Resisting such vicious power is an onerous task for everyone in society who cares about the public good.

algorithms of misleading

Search algorithms are not only biased but also misleading.
Compared to Google’s search engine, Baidu, the “Google of China,” is much more revenue-oriented in how it presents search results. Its algorithm is built pretty straightforwardly to make money: it relies on ads, and the ranking of your search results is basically determined by ads. The more an advertiser pays per click on a keyword, the higher it appears in the search results. In 2016, there was the case of a student who, searching Baidu for treatment of his illness, was misled by the top-ranked advertisement; he not only delayed proper treatment but also used up all of his savings on the sham treatment the ad recommended, and he later died. It is incredible that Baidu’s algorithm was so simple: ranking by price. The higher the price paid, the higher the advertisement stays, without considering the correlation between the page and the search keywords.
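To make the contrast concrete, here is a toy Python sketch of pure pay-for-rank versus a relevance-aware alternative. The pages, bids, relevance scores, and the blending formula are all invented for illustration; this is not Baidu’s or Google’s actual algorithm.

```python
# Toy illustration of pure pay-for-rank vs. relevance-aware ranking.
# All pages, bids, and scores are invented for illustration only.

results = [
    {"page": "experimental-clinic-ad", "bid": 9.50, "relevance": 0.15},
    {"page": "public-hospital-info",   "bid": 0.00, "relevance": 0.90},
    {"page": "private-clinic-ad",      "bid": 5.00, "relevance": 0.40},
]

# Pure price ranking (the model described above): whoever pays the most
# per click appears first, ignoring how well the page matches the query.
by_price = sorted(results, key=lambda r: r["bid"], reverse=True)

# A relevance-aware alternative: the bid still matters, but it is
# discounted by how well the page actually answers the search.
by_quality = sorted(
    results, key=lambda r: r["relevance"] * (1 + r["bid"] / 10), reverse=True
)

print([r["page"] for r in by_price])
# ['experimental-clinic-ad', 'private-clinic-ad', 'public-hospital-info']
print([r["page"] for r in by_quality])
# ['public-hospital-info', 'private-clinic-ad', 'experimental-clinic-ad']
```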

Also, in order to attract users, Baidu would even “misunderstand” very innocent keywords. In 2016, a post compared the totally different results of searching for the words “tender and smooth”: on Baidu, the results were porn images, whereas on Google, the results were delicious breakfasts of fried eggs and pudding. The post went viral on the internet, and Baidu modified its algorithm overnight.

Algorithms of Oppression

We as technological consumers (and products) have spent so long thinking of technology as a neutral force that will usher us into our Jetsons-esque future that it is easy to forget about the human knowledge and effort underpinning technological development. We have been encouraged by technology companies to think more about the products themselves and less about the people who make them possible. It requires a bit of a leap to remember that the tech people (mostly tech bros) who have moved to San Francisco, driven the displacement of longtime residents from marginalized communities (people of color and people across the LGBTQ+ spectrum), and repeatedly been awful human beings are also largely responsible for the technological products that we engage with and use on a regular basis. That level of abstraction is why it’s easy to lose sight of how the biases that these people express or hold in their personal lives almost certainly carry over into the work they do in their professional lives.

Ms. Noble’s point about tech companies’ framing a lack of representation in their hiring practices as a pipeline issue, rather than a continuation of exclusionary hiring practices, calls to mind last week’s discussion about how treating homelessness as a personal, rather than systemic, issue means that the real problem is never addressed properly. Google’s tone-deaf partnership with Black Girls Code, while still producing products that portray women of color as sexual objects rather than people, seems especially galling. It’s not as though these companies have not been made aware of their problems, but they have taken a position of waiting until an uproar happens to tweak the results. Worse, similar to Los Angeles County, they are holding those who have suffered most responsible for the improvement of these situations, without taking any steps to dismantle the systems that created the inequalities in the first place. The lack of discussion of these issues in the formal training undertaken by people in the technological sphere ensures that many biases, perhaps unconscious, get transmitted from tech people into the products that they release to the public.

Noble points out something that has struck me throughout this course: the feigned helplessness of tech companies when called out for their products’ deficiencies and problems. Whether it’s their culpability for allowing third parties to access users’ profiles or for the racist and/or sexist results of their search algorithms, the response is always the same: this isn’t our fault. They blame others, or glitches, but they never acknowledge that what they’ve put out into the world is in any way responsible for the newest problem that has been identified. They position themselves as both all-powerful (let Facebook sign you into everything on the internet, let Google manage everything about your on- and offline life) and hapless victims (how could we have known that our products would be used this way/would reveal such awful results??). And while it’s likely that at least some of these problems do not spring from intent, the unwillingness of companies like Google to name and then address these issues makes me less interested in the why than the what.

As I read this book, the librarian in me identified several ways in which information literacy skills would benefit people who use products like Google and Facebook (so, pretty much everyone). First, knowing the authority of what you’re reading is key. Dylann Roof’s framing of his initial Google query illustrated that he already held negative, perhaps unconscious, beliefs about black people, but if he had known how to check the reputation of the sites that he got his information from, he might have been cautious about the beliefs they espoused (or not, but at least he would have knowingly been consuming false and racist information). But my idea again targets the people at the end of the problem, rather than those at the beginning.

Thoughts on Algorithms of Oppression

There are a lot of thoughts swimming through my head after this reading. In one of my other classes, we talk about all the different ways that technologies can perpetuate different systems of oppression in a theoretical/anecdotal way, but this book takes it to another level to show how Google actively perpetuates racist, sexist, and other oppressive systems. Reading about Dylann Roof was terrifying because of what happened. Considering that most of the people who built the internet and are continuously building the technological tools that everyone is either using or beginning to use are white (and increasingly Asian) men, it is not unexpected that the tools they construct have their biases embedded in them. Talking about Google as a tool of American imperialism was something I had never thought about before, either. When I think about the pervasiveness of American technological tools on a global scale, I’m now thinking of all the different ways that the -isms that exist in American and Western cultures are being shared throughout the world.

Something else that was extremely interesting was learning about long-tail keywords. Everything one types on the internet, especially in a search bar, is collected as data. How does this work with search engines that are not Google? In Noble’s work, she writes that search results for “Asian girls” and “Latina girls” still contained a lot of implicitly or explicitly sexual content compared to “Black girls” after Google took some steps to reorganize the results. When I searched these terms on DuckDuckGo (which is the search engine that I use) and Ecosia (another search engine, which uses the money it makes to plant trees), the results that came up were still sexualized images of women.

I went to high school with a lot of people who now work at different tech companies (Facebook, Google, Snapchat, etc.). They’re definitely the problematic Asians that Noble references here and there in the book. I’ve been thinking a lot about how to politicize my peers and how to get them to think critically about the tools that they are building, because technology is never apolitical! It is also not possible to be apolitical if the tools being built have a political agenda attached to them. I don’t know. This is definitely something that I need to think more about. How can we imagine a world past all these different systems of oppression if the people building and constructing them don’t even know that they are perpetuating these systems?