
Compression and Digital Media Infrastructures

The word compression describes a technical process that renders a mode of representation adequate to its infrastructures. Humanists and engineers still evaluate media in terms of their ability to produce authentic effects. Sterne presented the idea of compression with a historical background, comparing the phenomenon to packing a suitcase. “You have too many clothes to fit in it,” he said, “so you roll them up, you squish them until you run out of space.” I thought this was the best way to describe the phenomenon. If compression transforms representation for the purposes of technical media, it also transforms media to render them adequate to representation. This raises the question of why and how efficiency, effectiveness, and a certain authenticity of experience became driving concerns in the theory of media. Sterne also sheds light on how and why lower-definition experiences are sometimes among the most intense, significant, and meaningful moments in modern life. As new communication infrastructures come into existence, aesthetic representation becomes an engineering problem, specifically where people make representational demands upon infrastructures that exceed their carrying capacity.
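Sterne’s suitcase metaphor has a direct software analogue. As a minimal sketch (the repetitive sample text is my own invention, not Sterne’s), Python’s standard zlib module “squishes” a redundant message until the redundancy runs out, and unpacks it intact at the other end:

```python
import zlib

# A highly redundant "suitcase" of text: lots of repeated clothing.
message = ("sock shirt sock shirt " * 200).encode("utf-8")

packed = zlib.compress(message, level=9)   # roll and squish
unpacked = zlib.decompress(packed)         # unpack at the destination

print(len(message), len(packed))           # compressed size is far smaller
assert unpacked == message                 # lossless: nothing left behind
```

Because the message is so repetitive, the compressed form is a small fraction of the original; like the suitcase, the limit is how much redundancy there is to squeeze out.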

To sum up the media infrastructures readings, I would say they contribute to an understanding of how digital media develop. Their emphasis was on how digital media rest on physical, concrete, tangible infrastructure. It was a good read, shedding light on several kinds of technologies and implications of digital media infrastructures that I would otherwise never have stopped to think twice about, including data centers, the cloud, and digital compression processes. Now we can more easily see the consequences of these infrastructures for society, especially in relation to media and environmental sustainability. It makes one think how much of the world of technology and data does not meet the eye.

Thoughts on Compression

Sterne’s “Compression: A Loose History” discusses the historical and theoretical contexts in which compression of thought and language, audio, and data emerged. With regard to the piece, I am especially interested in the differences between high-definition and low-definition media, especially when thinking about their potential to shape education and knowledge. In the discussion of high-definition files that need to be compressed into low-definition ‘aesthetic’ files, I was surprised to see that the author made no mention of In Defense of the Poor Image, written by artist Hito Steyerl. Steyerl makes several points about the affordances of the poor image, one of them being that poor images are able to spread quickly and easily because their low resolution can be accommodated by a variety of infrastructures, especially in places that do not have access to high-bandwidth connections. She writes: “The poor image is no longer about the real thing—the originary original. Instead, it is about its own real conditions of existence: about swarm circulation, digital dispersion, fractured and flexible temporalities. It is about defiance and appropriation just as it is about conformism and exploitation. In short: it is about reality.”

This brings us to the discussion of verisimilitude. In this context, it means the ability of technology to portray what we perceive to be reality, what we perceive to be true. Sterne introduces his work with the sentence “this is the story of communication as being about the anxiety over the loss of meaning through a succession of technical forms. The assumption here is that progress in technology comes through its ability to produce verisimilitude” (32). However, it is important to think about whose truth technology is perpetuating. As we have discussed in previous classes, technology and data are not all-knowing, apolitical machines with the ability to be objective, because the biases of their creators are built into them. As we produce more and more media and construct more infrastructure to accommodate it, whose truths are we prioritizing?

Low-definition images are powerful because they can be spread more quickly, they can be created by anyone, and they are unusual to the vast majority of media consumers at this point. Writing about “bootleg aesthetics,” Lucas Hilderbrand “finds these videos all the more affectively powerful because of their low definition” (34). I believe low-definition images are more powerful than compression because of the context in which they exist today. When compressing a concept, image, or audio clip, information has to be trimmed, and is therefore lost, during the process of compression and transmission. With low-definition images, what the user captures is what is shared: the user captured their truth in creating that media source. In information compression, if the creator is not the compressor, an outside party determines what is and is not important. How might their biases skew the information being transmitted in unintentional ways?
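The point about trimming is easy to demonstrate in code. As a hedged sketch (the eight-sample “signal” is invented for illustration), lossy compression by quantization rounds each value to a coarse grid; the rounded-away detail cannot be recovered, no matter who does the decompressing:

```python
# Lossy "compression" by quantization: round each sample to the nearest
# multiple of a step size. The rounding error is discarded for good.
signal = [0.12, 0.47, 0.81, 0.33, 0.95, 0.58, 0.26, 0.74]
step = 0.25

compressed = [round(x / step) for x in signal]   # small integers to store
restored = [q * step for q in compressed]        # best possible reconstruction

# The restored signal only approximates the original; everything finer
# than `step` has been trimmed and is unrecoverable.
errors = [abs(a - b) for a, b in zip(signal, restored)]
print(max(errors))   # bounded by step/2, but never exactly zero here
```

Whoever chooses `step` is deciding, in effect, which detail counts as disposable, which is precisely the editorial power the paragraph above worries about.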

Side note: It might be interesting to explore the relationship between coding and cryptography, but I don’t know enough about cryptography to have a coherent statement or opinion about that relationship.

Michelle Alexander weighs in on WM(ath)Ds

Michelle Alexander, the author of The New Jim Crow, wrote an op-ed for the Times this week about the way algorithms will reinforce racist policing. Referencing Cathy O’Neil a few times, she claims that some of our recent systemic victories like Amendment 4 in Florida are only temporary fixes that disguise the new technology waiting in the wings of the mass-incarceration stage.
Alexander’s book, which was published in 2010, explains how the Jim Crow era never ended; it only changed shape. Through racist police policies, the war on drugs, and private prisons, a price has been put on the heads of people of color (specifically black men), encouraging prosecutors to imprison them. Once they’ve been funneled into these private prisons, often for minor drug charges, inmates are given mandatory work for just pennies an hour. Many of our largest, name-recognized companies produce products through this captive labor force, including Victoria’s Secret.
She argues, alongside O’Neil, that by digitizing criminal enforcement, the US will be setting racism in mathematical stone that will be beyond reproach due to its opacity. She goes further, discussing e-carceration, a potential upcoming “Newest Jim Crow.”
Read the whole piece here:

https://www.nytimes.com/2018/11/08/opinion/sunday/criminal-justice-reforms-race-technology.html

On Undersea Infrastructure

An important question raised in “Fixed Flow” is that of timing. The average user does not have the opportunity to voice their opinions or preferences about the paths their data takes. A more informed user would likely want to keep all information on nearby servers or, if they are less concerned with privacy, might desire high speeds. But they will experience neither of these, because media/tech companies control the paths themselves. It is beneficial for those companies to move data as inexpensively as possible, regardless of the effect on the user. One of the only exceptions may be financial companies that plan data routes so fast that they gain a minuscule head start in stock trades. The user is left out, at the whim of those whose names are on the cables and servers around the world. This is where the opacity of networks becomes relevant. Starosielski recognizes that most people imagine the cloud as a magical, overhead ether. Thinking about some of our other readings this semester, there’s a strong argument to be made that this public perception of the cloud is no accident. But there is no magical ether: if you’re not storing your data on your own hard drive, you’re storing it on someone else’s.
The relationship that this particular undersea infrastructure has with the media passed through it is interesting. Cables are installed to prepare for an expected media influx, while media is simultaneously expanded to make efficient use of these systems. What are the hidden impacts of a media infrastructure used by the masses but quietly controlled by the few? In addition to superficial concerns like video game play, there are huge security and privacy concerns. The user does not know to be concerned about their information passing through the servers of other countries or companies, but they probably should be.
There was one area where I disagreed with Starosielski. She claims that inequality of access will grow due to the high cost of cable networks. But most technology gets drastically less expensive over time thanks to continued advancements in the field, so I would argue that it is unwise to draw predictions about future access from current cost analyses.

Network Latency

Network latency is an expression of how much time it takes for a packet of data to get from one designated point to another. It varies with distance and with the medium the traffic runs over, whether optical fiber, copper cable, or wireless links. Generally, the longer the distance, the higher the latency: it takes more time to deliver a packet of data and receive the response. We can use the ping command to test the latency between a PC and a given server.

For example:
It takes roughly 45–71 milliseconds to send a packet and receive a response from amazon.com, hosted in the US, versus 274–337 milliseconds for amazon.cn, hosted in China.

C:\Users\pcadmin>ping amazon.com
Pinging amazon.com [176.32.103.205] with 32 bytes of data:
Reply from 176.32.103.205: bytes=32 time=45ms TTL=232
Reply from 176.32.103.205: bytes=32 time=71ms TTL=232
Reply from 176.32.103.205: bytes=32 time=70ms TTL=232
Reply from 176.32.103.205: bytes=32 time=62ms TTL=232

C:\Users\pcadmin>ping amazon.cn
Pinging amazon.cn [54.222.60.218] with 32 bytes of data:
Reply from 54.222.60.218: bytes=32 time=286ms TTL=225
Reply from 54.222.60.218: bytes=32 time=274ms TTL=225
Reply from 54.222.60.218: bytes=32 time=303ms TTL=225
Reply from 54.222.60.218: bytes=32 time=337ms TTL=225
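The same measurement can also be scripted without ping (which needs raw ICMP privileges on some systems). As a rough, hedged sketch, the time to complete a TCP handshake gives a similar round-trip estimate from plain Python; the hosts and port below are just examples:

```python
import socket
import time

def tcp_rtt_ms(host, port=443, timeout=3.0):
    """Approximate round-trip latency as the time to open a TCP connection."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # connection established; close it immediately
    return (time.perf_counter() - start) * 1000.0

# Example (requires network access):
# for host in ("amazon.com", "amazon.cn"):
#     print(host, round(tcp_rtt_ms(host)), "ms")
```

A TCP handshake is one round trip plus a little kernel overhead, so the numbers should land in the same ballpark as the ping output above.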

Although latency is on the order of tens of milliseconds within one country or continent and hundreds of milliseconds between continents, it significantly affects the user’s experience of network-based applications. For example, I can log in to the Chase app within 2 seconds in the US. In contrast, it takes me 10 seconds to get into my Chase account when I travel to China. The difference in login speed lies in the network latency between my cell phone’s location (US or China) and the Chase app’s server in the US. I can tolerate 10 seconds of waiting to log in to my banking app; after all, I do not use it very often.
In business, however, network latency shapes business models and competitive forces, especially in operations that rely heavily on time efficiency. The book Flash Boys tells the story of constructing a new fiber line between New York and Chicago. The new fiber is straighter and shorter, with fewer nodes, cutting network latency from 17 milliseconds to 13 milliseconds. Those 4 milliseconds mean faster communication between the trading centers in Chicago and New York, creating a large advantage for the high-frequency traders who use the high-speed fiber over everyone else.
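The 13-millisecond figure is close to what simple physics predicts. As a back-of-the-envelope sketch (the ~1,330 km route length and the fiber refractive index of ~1.47 are my assumed figures, not numbers from the book), the speed of light in glass sets a hard floor on round-trip time:

```python
# Propagation delay over optical fiber: light slows down by the
# refractive index of the glass, so signals travel at about 2/3 c.
C = 299_792_458            # speed of light in vacuum, m/s
N_GLASS = 1.47             # assumed refractive index of the fiber core

def fiber_rtt_ms(route_km):
    """Round-trip propagation delay, ignoring switching/repeater delays."""
    one_way_s = (route_km * 1000) / (C / N_GLASS)
    return 2 * one_way_s * 1000

# Assumed ~1,330 km fiber route between New York and Chicago:
print(round(fiber_rtt_ms(1330), 1))  # → 13.0
```

Under these assumptions the straight-line physics alone accounts for almost all of the 13 ms, which is why the only way to go faster is a straighter, shorter path (or a medium faster than glass, such as microwave links through air).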

http://5b0988e595225.cdn.sohucs.com/images/20171011/09988639262a4773835a9f62791b32ee.jpeg

https://gizmodo.com/getting-lost-down-the-rabbit-hole-of-private-infrastruc-1562782580

In wireless networks, there are also large differences among protocols and technologies. The 5G standard was only recently finalized and accepted by all the related associations. Just as Nicole Starosielski describes in “Fixed Flow – Undersea Cables as Media Infrastructure”, the construction, investment, and maintenance of undersea cables are mixed with many factors that are not purely technical, such as political interference, inequality, business competition, and chaos. The same factors were entangled in the process of forming the 5G standard. In any case, the 5G era is coming, so let’s google, facebook, and wechat more and faster…

10-30 Reading Comments

In the introduction to Signal Traffic, Parks and Starosielski define media infrastructure.  They have identified that this is an important focus because the way media is transferred and stored does impact the media itself.  I was initially struck by the elegance of building new technological infrastructure over old industrial infrastructure.  It reminded me of old subway cars being dumped into the ocean to encourage coral reef growth.  One of the first examples discussed is a Google data center that used to be a paper mill.  The authors note that 100 employees work where 650 used to.  I wonder what their hours and wages are like.  There are some other serious downsides to building infrastructure that relies on old infrastructure.  I think about the NYC Subway; even the updated stations still, in part, rely on the old switches they’ve had in use since 1904.  The authors introduce the other pieces in the book, explaining the importance of scale and perspective.

In “Where the Internet Lives”, Holt and Vonderau discuss the physical data storage that occurs around the globe, focusing on Sweden.  Under the guise of transparency, Google releases information about what these places look like, the way the wiring works, and plans for temperature control.  But they do not go out of their way to give any details about the actual storage mechanisms used.  How safe is the private information being stored in Sweden?  Are there backups?  Do the storage drives communicate with one another, or are they each a separate black box?  After reading this chapter, I got the impression that these are the questions Google does not want us asking, which explains why they overshare the more superficial information about their business.  Google’s treatment of their data centers reminds me of politicians who dump thousands of documents on investigators while concealing the important ones.

Facebook is super happy for you to know about its Fort Worth data center

I never know whether to attribute changes that have occurred between the time something was written and when we read it to cultural/attitude shifts or mere coincidence. In Jennifer Holt and Patrick Vonderau’s chapter “‘Where the Internet Lives’: Data Centers as Cloud Infrastructure,” published in 2015, the authors discuss how secretive companies often are about the physical spaces of data centers. Google is cited as one exception to this. Has thinking about data centers changed so much in the last three years? Because it seems that Facebook is also happy to attach a certain level of visibility to its data centers.

A blog post on the web site of Dallas Innovates magazine includes information such as the Fort Worth data center’s design elements, physical location, and cooling strategies. The post also includes photos of the data room, massive fans, and the pipes that move water throughout the center, but overall its focus seems oriented more toward the site’s employees; there are many more pictures of artwork (much of it tied directly to the region and/or produced by local artists) and employee spaces than of the practical infrastructure needed to power and run a massive data center. The short post does not say whether Facebook directed or limited what could be photographed, or whether the magazine’s photographer/editor chose what to shoot and feature. This visit happened recently, but I found several older articles/blog posts from the North Texas area by others who had been granted access to the data center; Facebook definitely isn’t trying to keep this place under wraps, but its openness with information about its data centers just makes me wonder what it’s misdirecting people’s attention from.

Perhaps unsurprisingly, the data center also has its own Facebook page. Its posts seem evenly split between positive press releases, encouraging people to apply for a job at the campus, and information about Facebook’s charitable contributions in the region. My favorite comments came from the disgruntled former employees (top gripe: how out of the way the campus is and the lack of bus service to get there). Other Facebook data centers are linked to from that page, leading me to believe that this openness about their data centers is the rule, rather than an exception.