Author Archives: Lisa Gueldenzopf

Turns out “Move Fast and Break Things” is not the same as “Bringing the World Closer Together”. Well, duh.

Just wanted to bring your attention to one of the articles the NYT published this week regarding the internal Facebook emails released by the UK Parliament.

Find the full article here: https://www.nytimes.com/2018/12/05/technology/facebook-emails-privacy-data.html?smid=fb-nytimes&smtyp=cur&fbclid=IwAR0UdDnZaJ65zejZtQ7tDUEUT6fGwFX8ojThJyyq9gBfTyyMaY5I6KQSXlk

In addition to discussing work-arounds for collecting data without notifying users, the emails show that Facebook has engaged in some very interesting business practices when it comes to outlasting its competition. In our class, we’ve spent a lot of time discussing the many ethical questions regarding Facebook’s privacy policies. Reading a bit more about the large-scale ways that Facebook dominates was a useful perspective that has been partially absent from our conversations. In short, Facebook has been making decisions about how to interact with other app start-ups based on whether those start-ups threaten its corner on the market. For example, one of the reasons the video app Vine was so successful was its link to Facebook: you’d sign up for Vine and it would suggest other Vine users to connect with based on your Facebook contacts. Upon realizing that it was the fuel to another company’s success, Facebook not only restricted this connection but also released Instagram video (Facebook has owned Insta since 2012). This example would suggest that Facebook is interested in being self-contained to maintain dominance, but this isn’t quite true. Ultimately, Facebook decided to grant other apps free rein on the Facebook platform, as long as those apps send the data they’ve collected back to Facebook.

So where does this leave us? In the social media space, Facebook users are merely a product sold to companies seeking data. But other, smaller companies are also part of that product and are also being sold by FB. Thinking more about Facebook’s aggressive business practices, I’m finding that the company has become so unimaginably powerful that, in a world divided into businesses and people, apps, companies, and people alike are all just ant-sized products at Facebook’s scale. I’m left wondering what this means about Social Media Economics (if that isn’t already a term, it should be). Is this a whole new layer of our “consumer-based” system? Or does Facebook, perhaps, take the place of a meta-power that had already existed?

Michelle Alexander weighs in on WM(ath)Ds

Michelle Alexander, the author of The New Jim Crow, wrote an op-ed for the Times this week about the way algorithms will reinforce racist policing. Referencing Cathy O’Neil a few times, she claims that some of our recent systemic victories, like Amendment 4 in Florida, are only temporary fixes that disguise the new technology waiting in the wings of the mass-incarceration stage.
Alexander’s book, which was published in 2010, explains how the Jim Crow era never ended; it only changed shape. Through racist police policies, the war on drugs, and private prisons, a price has been put on the heads of people of color (specifically black men), encouraging prosecutors to imprison them. Once they’ve been funneled into these private prisons – often on small drug charges – inmates are given mandatory work for just pennies an hour. Many of our largest, name-recognized companies have produced products through this captive labor force, including Victoria’s Secret.
She argues, alongside O’Neil, that by digitizing criminal enforcement, the US will be setting racism in mathematical stone, beyond challenge because of its opacity. She goes further, discussing e-carceration, a potential upcoming “Newest Jim Crow.”
Read the whole piece here:

https://www.nytimes.com/2018/11/08/opinion/sunday/criminal-justice-reforms-race-technology.html

On Undersea Infrastructure

An important question raised in “Fixed Flow” is that of timing. The average user has no opportunity to voice opinions or preferences about the paths their data takes. A more informed user would likely want to keep all of their information on nearby servers or, if less concerned with privacy, might prioritize high speeds. But they will experience neither, because the media/tech companies control the paths themselves, and it benefits those companies to move data as inexpensively as possible, regardless of the effect on the user. One of the only exceptions may be financial firms that pay for routes fast enough to gain a minuscule head start in stock trades. The user is left out, at the whim of those whose names are on the cables and servers around the world. This is where the opacity of networks becomes relevant. Starosielski recognizes that most people imagine the cloud as a magical, overhead ether. Thinking about some of our other readings this semester, there’s a strong argument to be made that this public perception of the cloud is no accident. But there is no magical ether – if you’re not storing your data on your own hard drive, you’re storing it on someone else’s.
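To make that opacity concrete: an ordinary user can glimpse the route their packets take, but has no say in it. Here is a minimal sketch, assuming a Unix-like machine with the standard traceroute utility installed (the target host is just an illustrative placeholder):

```python
# A minimal sketch of the point above: a user can *observe* the path
# their packets take, but cannot choose it. Assumes a Unix-like system
# with the `traceroute` utility installed; "example.com" is a placeholder.
import subprocess

def show_route(host: str = "example.com") -> None:
    """Print each hop between this machine and `host`."""
    result = subprocess.run(
        ["traceroute", "-n", host],  # -n: show raw IPs, skip DNS lookups
        capture_output=True, text=True, timeout=120,
    )
    print(result.stdout)

if __name__ == "__main__":
    show_route()
```

Every hop in the output belongs to some carrier or cable operator the user never chose – the ownership Starosielski describes, made briefly visible.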
The relationship between this undersea infrastructure and the media passed through it is interesting. Cables are installed to prepare for an expected media influx, while media is simultaneously expanded to make efficient use of these systems. What are the hidden impacts of a media infrastructure used by the masses but quietly controlled by the few? In addition to superficial concerns like video game play, there are huge security and privacy concerns. The user does not know to be concerned about their information passing through the servers of other countries or companies, but they probably should be.
There was one area where I disagreed with Starosielski. She claims inequality of access will grow due to the high cost of cable networks. But most technology gets drastically less expensive over time thanks to continued advancements in the field, so I would argue that it is unwise to draw predictions of future access from current cost analyses.

10-30 Reading Comments

In the introduction to Signal Traffic, Parks and Starosielski define media infrastructure.  They have identified that this is an important focus because the way media is transferred and stored does impact the media itself.  I was initially struck by the elegance of building new technological infrastructure over old industrial infrastructure.  It reminded me of old subway cars being dumped into the ocean to encourage coral reef growth.  One of the first examples discussed is a Google data center that used to be a paper mill.  The authors note that 100 employees work where 650 used to.  I wonder what their hours and wages are like.  There are some other serious downsides to building infrastructure that relies on old infrastructure.  I think about the NYC Subway; even the updated stations still, in part, rely on the old switches they’ve had in use since 1904.  The authors introduce the other pieces in the book, explaining the importance of scale and perspective.

In “Where the Internet Lives”, Holt and Vonderau discuss the physical data storage that occurs around the globe, focusing on Sweden.  Under the guise of transparency, Google releases information about what these places look like, the way the wiring works, and plans for temperature control.  But they do not go out of their way to give any details about the actual storage mechanisms used.  How safe is the private information being stored in Sweden?  Are there backups?  Do the storage drives communicate with one another, or are they each a separate black box?  After reading this chapter, I got the impression that these are the questions Google does not want us asking, which explains why they overshare the more superficial information about their business.  Google’s treatment of their data centers reminds me of politicians who dump thousands of documents on investigators while concealing the important ones.

Post Re: Cathy O’Neil

Before I begin my comments on WMDs, I would like to share with you that on my way home from class last week, I passed a sign outside of a Bank of America advertising their mobile “assistant”.  Her name was Erica.

O’Neil, in her conclusion, says that “big data processes codify the past.  They do not invent the future.”  This neatly sums up the arguments she’s made.  The examples are clear.  In schools, the “codifying of the past” is done wildly inaccurately.  The performance indicators for teachers simply do not measure what they are meant to.  This is the first type of problem introduced by WMDs.  The response to the inaccuracies is not surprising.  In the name of ease or streamlining, or most likely in the name of cost minimization, teachers are held to standards that carry an inherent contradiction.  How can a school measure the value added by a teacher of underperformers with the same algorithm it uses to measure teachers of overachievers?  The outcomes are not what matters here, only the seemingly priceless savings of essentially digitizing employee review.  While these systems pretend that taking humans out of the judgment process will level the playing field, they actually codify the human error.
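To make the inaccuracy concrete, here is a toy sketch of the value-added idea – entirely my own construction, not the actual model O’Neil critiques. It scores a teacher as average measured growth minus predicted growth, and shows that with a realistic class size, test noise alone produces wild year-to-year swings for a perfectly average teacher:

```python
# A toy sketch, not the real model: score = average measured student
# growth minus predicted growth. The noise term stands in for everything
# a single test cannot control for.
import random

random.seed(1)

def value_added_score(class_size: int = 25, true_effect: float = 0.0) -> float:
    """One year's 'value-added' score for a teacher with a given true effect."""
    # Each student's measured growth = teacher effect + test noise
    # (predicted growth is assumed already subtracted out, so it cancels).
    measured = [true_effect + random.gauss(0, 10) for _ in range(class_size)]
    return sum(measured) / class_size

# The same perfectly average teacher (true effect = 0), scored five years:
print([round(value_added_score(), 1) for _ in range(5)])
# The scores swing several points either way on noise alone -- enough,
# in a real district, to mean a bonus one year and a firing the next.
```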

The example of teacher evaluation is the least threatening of the examples O’Neil gives in the assigned reading.  Worse is the outright, blatant codification of existing systems and structures.  Where value-added educator evaluation is an original model of measurement with new flaws, the use of WMDs in the financial industry codifies unoriginal, existing models that already have unfairness baked deep within.  By using existing data, choices are made about the value of individuals without consideration of the data that has not been collected – weighing a zip code against an applicant, for example, while their unmeasured propensity toward frugality counts for nothing.  This arguably more dangerous form of WMD highlights O’Neil’s point about “codifying the past”.
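Here is a toy illustration of that proxy problem – again my own construction, with made-up numbers, not a real scoring system. A lending score that folds in a zip code’s historical default rate punishes an applicant for their neighborhood’s past, while their own unmeasured frugality never enters the model:

```python
# A toy sketch of the zip-code-as-proxy problem; the figures are invented.
HISTORICAL_DEFAULT_RATE = {  # hypothetical neighborhood-level defaults
    "10001": 0.03,
    "10452": 0.18,
}

def loan_score(zip_code: str, income: float) -> float:
    """Score an applicant from income plus their zip code's past defaults.

    Note what's missing: nothing about *this* applicant's thrift or
    payment history -- the neighborhood's past is codified instead.
    """
    return income / 1000 - 100 * HISTORICAL_DEFAULT_RATE[zip_code]

# Two applicants with identical incomes -- and, unmeasured, identical
# frugality -- get very different scores purely by address:
print(loan_score("10001", 40_000))  # 37.0
print(loan_score("10452", 40_000))  # 22.0
```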

Reading these chapters, I thought about our prior conversations about digitization and datafication.  The data had already been collected; vast swaths of information exist about individual insurance risk, policing patterns, and political motivations.  The use of WMDs seems to me a type of digitization of our existing social structures and patterns.  This invites a new perspective.  Why are we looking to data systems to fix the world when we cannot even create data systems that properly express the world as it is?  O’Neil’s answer is this: the mathematical tools discussed can be used for good or for evil, for equity or inequality, to codify or to “create” our society.  It is the human component that decides how to use these tools.  Unfortunately, it appears that the same players involved in codifying, datafying, and digitizing our reality have very little interest in the human component at all – likely underestimating or even devaluing its role.

Post Re: Mayer-Schönberger & Cukier

The first section focuses on the difference between data and technology, two terms that are often conflated.  The anecdote about seafaring is used as a clear lens through which to read the rest of the chapter.  I was interested in the claim that “Amazon understands the value of digitizing content, while Google understands the value of datafying it.”  My initial reaction was that this felt uninformed, as it fails to consider the ways that Amazon does datafy, just not in plain sight.  Yes, I can see bar graphs on Google of other people’s searches, but Amazon has quietly datafied in such a way that the items marketed to me are no coincidence.  In general, this was something I wished had been discussed more in the chapters we read (although it’s probably discussed in other parts of the book).  Data is incredibly useful to us when we are aware of it, but what about when we are not?
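To illustrate the distinction in that quote – with my own toy example, not one from the book – digitized content is the page as bits you can display, while datafied content is structured so a machine can count, sort, and cross-reference it:

```python
# A small sketch of digitizing vs. datafying; the sample records are invented.

# Digitized: a scanned page as one opaque string. You can display it,
# but you cannot query it.
digitized_page = "Moby-Dick, Ch. 1: Call me Ishmael. Some years ago..."

# Datafied: the same content broken into records a machine can reason over.
datafied_page = [
    {"book": "Moby-Dick", "chapter": 1, "word": "Call", "position": 1},
    {"book": "Moby-Dick", "chapter": 1, "word": "me", "position": 2},
    {"book": "Moby-Dick", "chapter": 1, "word": "Ishmael", "position": 3},
]

# Only the datafied form supports queries -- e.g., where does a word appear?
hits = [r["position"] for r in datafied_page if r["word"] == "Ishmael"]
print(hits)  # [3] -- the kind of lookup that powers search and recommendation
```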

Reading these chapters, I was forced to reconsider my preconceptions about human advancement.  The idea that tech is our next “big thing” is undermined by the gap between data and technology.  The “technological age” is not on the same timeline as, say, the stone or industrial ages.  This raises a categorical question: if our tech-crazed culture is not, in itself, a landmark moment of advancement, then what is?  Our capacity for complex communication?  And what does it mean that so many get left behind by these cultural changes?

One of the other topics discussed is very timely.  The authors introduce the idea of social credit, which I’ve seen on Black Mirror and which is literally underway in China right now.  It will be interesting to see how mass data is used to help (or, more likely, hurt) society at large.

I loved the Privacy/Punishment chapter that references Minority Report.  It’s always been one of my favorite movies (don’t judge me), so I’ve thought a lot about this.  The ethical questions need to precede the technical ones, but I suspect they won’t.  Why are people trying to predict crime for punitive purposes?  It’s frightening that this is what we build by default, as opposed to a data-driven system that could prevent crime by preventing the CAUSES of certain crime.

At the end of chapter eight, the authors bring up the precarious future of free will.  This, to me, seemed like the most likely place to find our society’s next large-scale change.  Digitizing collected data makes our world more accessible, but becoming collected data would change our relationships with ourselves and each other.