
Data of Free Food Events

I’ve tried to visualize the data I collected using Excel, but I only succeeded in visualizing one section. So I am attaching the visualization and the link to the cleaned data here.

https://docs.google.com/spreadsheets/d/1r5QUq76QCrn2AjU3_s8wY7VpK3eCRY2iJRJHR0Pcvhw/edit?usp=sharing

The data I collected is about the “Free Food Events” I attended during the semester. As a poor international student whose meagre income, earned by working under 20 hours per week plus occasional translation gigs, rarely makes ends meet, I go to free food events to save money on basic living costs and to ensure the quality of the food I eat, as well as to learn something new about what is happening in academia and elsewhere in New York. Free food events have become a part of my life, and I have gained tremendous food for the body and food for the soul attending them. I therefore collected data on the free food events I attended during the semester, and the metadata are as follows: the name of each event, its starting and ending times, how long it lasted, the date it happened, and what food was served.

I attempted to use Excel to visualize the data, but I only succeeded in visualizing the “hours” column, which shows how long the events lasted. I also attempted to use Excel’s “map” function to visualize the locations of the events, but somehow Excel couldn’t recognize the addresses in the column. I then used Voyant Tools to visualize the data in the “food” column, and below is what I have.

Through my investigation, I think the food I eat at these events is mostly nutritious and the diet I have maintained through them is balanced. I could lower the hours I spend at free food events so I have more time for other commitments in life. I also hope that I can visualize the locations of the free food events on a map, so I can learn more about the places I go to and save time on commuting.
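Since Excel’s map function couldn’t recognize the addresses, one possible workaround is a small script. Below is a minimal sketch in Python, assuming the Google Sheet has been exported as a CSV named free_food_events.csv with hypothetical column names (event, date, hours, address, food — not necessarily the ones in the actual sheet); it plots the “hours” column and geocodes the addresses with the geopy library so the locations can be mapped.

```python
# Minimal sketch, not the workflow used above: assumes the Google Sheet
# was exported as "free_food_events.csv" with hypothetical columns
# event, date, hours, address, food.
import pandas as pd
import matplotlib.pyplot as plt
from geopy.geocoders import Nominatim

df = pd.read_csv("free_food_events.csv")

# Visualize the "hours" column: how long each event lasted.
df["hours"].plot(kind="hist", bins=10, title="Event duration (hours)")
plt.xlabel("Hours")
plt.savefig("hours.png")

# Geocode the addresses Excel couldn't recognize, so they can be mapped.
# Nominatim is OpenStreetMap's free geocoder; it rate-limits requests,
# so this is suitable for small datasets only.
geolocator = Nominatim(user_agent="free-food-events")
locations = df["address"].apply(geolocator.geocode)
df["lat"] = locations.apply(lambda loc: loc.latitude if loc else None)
df["lon"] = locations.apply(lambda loc: loc.longitude if loc else None)

# Plot the coordinates as a simple scatter; a library like folium could
# render them on an actual street map instead.
df.plot(kind="scatter", x="lon", y="lat", title="Free food event locations")
plt.savefig("locations.png")
```

The geocoded latitude/longitude pairs could then be fed to a mapping library such as folium to get the actual street map of event locations that the post asks for.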


School vs. pleasure reading comparison

As I said in class, I noticed a few things.

1) I should have included more kinds of readings, at least work reading, and quite possibly news reading (which I lumped in with pleasure reading, though it’s not really that).

2) That one spike when I was researching two projects at once has shown me that I need to juggle things better.

3) I tend to read for pleasure at meals (at least while taking classes). Even when I had free time this term, I didn’t spend much of it reading. I think that was a change from before I started this program, but I’m not sure.

4) I wonder about the time of day I did these readings. I feel like I did most of the school reading in the mornings, but I don’t know for sure.

5) When I was reading for pleasure, I was mostly re-reading books I enjoyed. I don’t know why. Maybe it’s because I didn’t feel like I had enough time to spend on a new book that I might not like.

Overall, taking a look at one of my habits was interesting, and it has made me think not just about how much I read but also about the patterns of my reading.

Resisting Oppression

Oppression can take many forms. Algorithms of Oppression reveals how modern technology exerts a detrimental influence on the oppressed. As other readings have also demonstrated, algorithms are not neutral tools composed of numbers and mathematical formulas; they are full of human biases that harm people. When used as a tool of power, that harm becomes real. This reading focuses on the intersection of two such biases, race and gender. It discusses in detail how algorithms reduce a racial and gender minority group to sex objects and capitalize on that reduction, helping the oppressors keep exploiting the vulnerable oppressed. The reading reveals that the classification of the oppressed as exotic sex objects, deployed by the most powerful search engine in the world, actually derives from pornography in the U.S. Racial and gender prejudice, discrimination, and injustice are at the core of it. It is unbelievable and outrageous that such abhorrent human biases are coded into the algorithms of the most powerful search engine in the world. The invisible power that the oppressed are subject to is so prevalent that social injustice is increasingly exacerbated by the development of technology and an increasingly connected global economy. Resisting such vicious power is an onerous task for everyone in society who cares about the public good.

algorithms of misleading

Search algorithms are not only biased but also misleading.
Compared to Google’s search engine, Baidu, the “Google of China,” is much more revenue-oriented in how it presents search results. Its algorithm is built quite straightforwardly around making money: it relies on ads, and the ranking of search results is basically determined by them. The more you pay per keyword click, the higher you appear in the results. In 2016, there was the case of a student who died after searching Baidu for a treatment for his illness. Misled by the top-ranked advertisement, the young student not only delayed proper treatment but also used up all of his savings on the unproven therapy that Baidu’s search results recommended. It is incredible that Baidu’s algorithm is so simple: it ranks by price. The higher the price paid, the higher the advertisement stays, without considering the relevance of the page to the search keywords.
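To make the contrast concrete, here is a toy sketch in Python (my own illustration with invented ad data and scores, not any search engine’s actual code) of the difference between ranking ads purely by bid price and discounting each bid by its relevance to the query:

```python
# Toy illustration, not real search-engine code: contrast ranking ads
# purely by bid price with weighting each bid by query relevance.
# All titles, bids, and relevance scores below are invented.
ads = [
    {"title": "Unproven miracle clinic",          "bid": 50.0, "relevance": 0.1},
    {"title": "Licensed hospital oncology dept.", "bid": 8.0,  "relevance": 0.9},
    {"title": "Health insurance plans",           "bid": 20.0, "relevance": 0.3},
]

# Pure price ranking: whoever pays most per click appears first,
# regardless of whether the page matches the search keywords.
by_price = sorted(ads, key=lambda ad: ad["bid"], reverse=True)

# Relevance-weighted ranking: the bid is discounted by how well the
# page matches the query, so irrelevant ads can't simply buy the top slot.
by_quality = sorted(ads, key=lambda ad: ad["bid"] * ad["relevance"], reverse=True)

print([ad["title"] for ad in by_price])    # the misleading clinic comes first
print([ad["title"] for ad in by_quality])  # the relevant hospital comes first
```

Even this crude relevance discount keeps an irrelevant advertiser from simply buying the top result, which is exactly the safeguard a pure pay-per-rank scheme lacks.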

Also, in order to attract users, Baidu would even “misunderstand” very innocent keywords. In 2016, a post compared the totally different results for the search phrase “tender and smooth”: on Baidu, the results were porn images, whereas on Google, the results were very delicious breakfasts with fried eggs and pudding. The post went viral on the internet, and Baidu modified its algorithm overnight.

Algorithms of Oppression

We as technological consumers (and products) have spent so long thinking of technology as a neutral force that will usher us into our Jetsons-esque future that it is easy to forget about the human knowledge and effort underpinning technological development. We have been encouraged by technology companies to think more about the products themselves and less about the people who make them possible. It requires a bit of a leap to remember that the tech people (mostly tech bros) who have moved to San Francisco, driven the displacement of longtime residents from marginalized communities (people of color and people across the LGBTQ+ spectrum), and repeatedly been awful human beings are also largely responsible for the technological products that we engage with and use on a regular basis. That level of abstraction is why it’s easy to lose sight of how the biases that these people express or hold in their personal lives almost certainly carry over into the work they do in their professional lives.

Ms. Noble’s point about tech companies’ framing of the lack of representation in their hiring practices as a pipeline issue, rather than a continuation of exclusionary hiring practices, calls to mind last week’s discussion about how treating homelessness as a personal, rather than systemic, issue means that the real problem is never addressed properly. Google’s tone-deaf partnership with Black Girls Code, while still producing products that portray women of color as sexual objects rather than as people, seems especially galling. It’s not as though these companies have not been made aware of their problems; rather, they have taken a position of waiting until an uproar happens to tweak the results. Worse, like Los Angeles County, they are holding those who have suffered most responsible for improving these situations, without taking any steps to dismantle the systems that created the inequalities in the first place. The lack of discussion of these issues in the formal training undertaken by people in the technological sphere ensures that many biases, perhaps unconscious, get transmitted from tech people into the products that they release to the public.

Noble points out something that has struck me throughout this course: the feigned helplessness of tech companies when called out for their products’ deficiencies and problems. Whether it’s their culpability for allowing third parties to access users’ profiles or for the racist and/or sexist results of their search algorithms, the response is always the same: this isn’t our fault. They blame others, or glitches, but they never acknowledge that what they’ve put out into the world is in any way responsible for the newest problem that has been identified. They position themselves as both all-powerful (let Facebook sign you into everything on the internet, let Google manage everything about your on- and offline life) and hapless victims (how could we have known that our products would be used this way/would reveal such awful results?). And while it’s likely that at least some of these problems do not spring from intent, the unwillingness of companies like Google to name and then address these issues makes me less interested in the why than the what.

As I read this book, the librarian in me identified several ways in which information literacy skills would benefit people who use products like Google and Facebook (so, pretty much everyone). First, knowing the authority of what you’re reading is key. Dylann Roof’s framing of his initial Google query illustrated that he already held negative, perhaps unconscious, beliefs about black people, but if he had known how to check the reputation of the sites that he got his information from, he might have been more cautious about the beliefs they espoused (or not, but at least he would have knowingly been consuming false and racist information). But my idea again targets the people at the end of the problem, rather than those at the beginning.

Thoughts on Algorithms of Oppression

There are a lot of thoughts swimming through my head after this reading. In one of my other classes, we talk about all the different ways that technologies can perpetuate systems of oppression in a theoretical/anecdotal way, but this book takes it to another level by showing how Google actively perpetuates racist, sexist, and other oppressive systems. Reading about Dylann Roof was terrifying because of what happened. Considering that most of the people who built the internet, and who are continuously building the technological tools that everyone is either using or beginning to use, are white (and increasingly Asian) men, it is not unexpected that the tools they construct have their biases embedded in them. Talking about Google as a tool of American Imperialism was something I had never thought about before either. When I think about the pervasiveness of American technological tools on a global scale, I now think of all the different ways that the -isms that exist in American and Western cultures are being shared throughout the world.

Something else that was extremely interesting was learning about long-tail keywords. Everything one types onto the internet, especially into a search bar, is collected as data. How does this work with search engines that are not Google? In Noble’s work, she writes that search results for “Asian girls” and “Latina girls” still contained a lot of sexually suggestive or explicit content compared to “Black girls” after Google took some steps to reorganize the results. When I searched these terms on DuckDuckGo (the search engine that I use) and Ecosia (another search engine, which uses the money it makes to plant trees), the results that came up were still sexualized images of women.

I went to high school with a lot of people who now work at different tech companies (Facebook, Google, Snapchat, etc.). They’re definitely the problematic Asians that Noble references here and there in the book. I’ve been thinking a lot about how to politicize my peers and how to get them to think critically about the tools that they are building, because technology is never apolitical! It is also not possible to be apolitical when the tools being built have a political agenda attached to them. I don’t know. This is definitely something that I need to think more about. How can we imagine a world past all these different systems of oppression if the people building and constructing them don’t even know that they are perpetuating these systems?

Turns out “Move Fast and Break Things” is not the same as “Bringing the World Closer Together”. Well, duh.

Just wanted to bring your attention to one of the articles the NYT published this week regarding the internal Facebook emails released by a UK parliamentary committee.

Find the full article here: https://www.nytimes.com/2018/12/05/technology/facebook-emails-privacy-data.html?smid=fb-nytimes&smtyp=cur&fbclid=IwAR0UdDnZaJ65zejZtQ7tDUEUT6fGwFX8ojThJyyq9gBfTyyMaY5I6KQSXlk

In addition to discussing workarounds for collecting data without notifying users, the emails show that Facebook has engaged in some very interesting business practices when it comes to outlasting its competition. In our class, we’ve spent a lot of time discussing the many ethical questions around Facebook’s privacy policies, so reading a bit more about the large-scale ways that Facebook dominates was a useful perspective that has been partially absent from our conversations. In short, Facebook has been making decisions about how to interact with other app start-ups based on their potential to threaten its cornered place in the market. For example, one of the reasons the video app Vine was so successful was its link to Facebook: you’d sign up for Vine, and it would suggest other Vine users to connect with based on your Facebook contacts. Upon realizing that it was the fuel for another company’s success, Facebook not only restricted this connection but also released Instagram video (Facebook has owned Instagram since 2012). This example would suggest that Facebook is interested in being self-contained to maintain dominance, but this isn’t quite true. Ultimately, Facebook decided to grant other apps free rein on the Facebook platform, as long as those apps send the data they’ve collected back to Facebook.

So where does this leave us? In the social media space, Facebook users are merely a product sold to companies seeking data. But other, smaller companies are also part of that product and are also being sold by FB. Thinking more about Facebook’s aggressive business practices, I’m finding that it has become so unimaginably powerful that, in a world divided between businesses and people, its scale renders apps, companies, and people alike as ant-sized products. I’m left wondering what this means for Social Media Economics (if that isn’t already a term, it should be). Is this a whole new layer of our “consumer-based” system? Or does Facebook, perhaps, take the place of a meta-power that already existed?