Transparency of Algorithms

In Weapons of Math Destruction, one of the issues discussed is the transparency of algorithms, especially those deployed to "measure" human beings and that have wreaked havoc on their lives. This reminds me of a vivid example I saw on the New York subway: an advertisement for Seamless, an online food-ordering service. In this amusingly funny campaign, a series of pictures pairs images with captions characterizing New York neighborhoods, based on data Seamless collected through its service and its interpretation of the analysis. For example, one picture reads, "The most tender neighborhood: Fordham, Bronx. Based on the number of orders of chicken tenders." In it, a macho guy holds a chicken tender in one hand and a cute little kitten in the other. Obviously, this interpretation of the data is entirely commercial. It plays on the double meaning of "tender" to achieve a humorous effect, so that viewers form a lasting impression of the service through this unreasonable but funny connection between algorithmically analyzed data and the product. The company is transparent about how data and algorithms were used to draw its conclusions, which may impose a stereotype on a neighborhood; people who disagree with a conclusion can at least see how it was reached. In the same series, Chelsea is named "the most homesick neighborhood" based on orders of a home-style dish, and another neighborhood is named "the neighborhood with the most hot yoga" for placing the most orders of kombucha. Each of these labels could affect the perception and image of a neighborhood.
If such perceptions and images have detrimental effects on the people living there, as with the algorithms in the reading that labeled elementary-school math teachers "incompetent," then the methods by which those images and perceptions were created take on major significance, because they are the keys to addressing the resulting injustice and inequality.

 

However, transparency of algorithms is still at the mercy of big corporations who keep them as top secret in spite of the harm they can cause to individuals who are subject to unfair “measuring” based on such algorithms. Naming or categorizing individuals is of high risk because there are millions of implications such names or categories are associated with that may change one’s life tremendously. A socially responsible approach to algorithms should be adopted by corporations, the government, and individuals to make sure that we are not hurt by what we create for a better life in the first place.