East Contemporary

Mél Hogan, Joshua Neves: transmediale Marshall McLuhan Lecture 2020 “Expanded Networks“

Berlin, Embassy of Canada, 19:00, January 29, 2020

The Marshall McLuhan Lecture is one of the traditional events opening the lecture program of transmediale. Mél Hogan is an assistant professor in Communication, Media and Film at the University of Calgary. Her research is on server farms and data centers – their social implications and environmental impacts. Joshua Neves is Canada Research Chair and Director of the Global Emergent Media (GEM) Lab at Concordia University (Montréal). His research focuses on digital media, cultural and political theory, and problems of development and legitimacy.

Joshua Neves (L), Mél Hogan (R). Photo: Laura Fiorio, transmediale, CC BY-SA 4.0

Joshua Neves touched on two topics and their intermingling – or rather the lack of it in the social discourse accompanying their spread: on one hand, the proliferation of technological gadgets that promise to augment the user experience and smooth out user interface interaction; on the other, the (I would say recurrent) rise of biotechnological approaches aiming to augment and optimize the human brain and body through new chemical substances or through direct neural stimulation of the brain via electric current.

The first development is commonly known under the term IoT, the Internet of Things. These devices no longer need human input; they continuously collect data and “independently” (according to preprogrammed patterns) act on it. This data harvesting is justified by ease of use and by continuous product improvement based on the data. However, another “use” of the collected (big) data consists of data mining and analytics processes that produce/extract value from it by reselling it in the form of information products. This description hints at the well-known triadic model of “free” (social) media platforms, where the product is in fact the user. But even this approach retains one established assumption: that these tools’ or platforms’ existence is driven by “providing a service to the user”.

Through an anecdote, Neves turned this concept around. He told the story of a P&G marketing executive who visited a store only to find that a number of their products were no longer in stock. No follow-up order to restock these products had yet been placed with P&G either. The reason behind this was simply the “human error” of the shopkeeper, who was too busy with other things to precisely track stock numbers and place orders with the necessary buffer when needed. The marketing executive thought: “wouldn’t it be wonderful if the shelf itself knew it was running out of stock and informed P&G directly, so that there is a constant, uninterrupted supply of products on the shelf?” This wish is, I think, already a reality for technologically up-to-date retailers. The interesting thing, in contrast to the previous paragraph, is that in this case the motivation for implementing an IoT technology is itself to remove the dependency of a commercial process on error-prone human action. A certain conditioned activity (place an order when stock is low, click like to signify your approval, enter your body measurements to calculate your BMI…) is replaced by a frictionless automated process, as sketched below. Seen from this perspective, the question can be posed whether all data-hungry IoT appliances are here to provide additional agency to the user, or if, in fact, these appliances are here to smooth out the unpredictability of human (re)actions in order to optimize the business processes of the entity actually in control of the data collection technology. If the latter is the case, it would suggest that the opposite of “the enhancement of human agency through technology” (something McLuhan also liked to claim – media as extensions of human communicative capacity) is true: that, in fact, the data capture is taking agency away, reducing users to passive suppliers of raw data/energy.
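To make the shift concrete, here is a minimal sketch of such a “smart shelf” rule – all names and thresholds are hypothetical, my own illustration rather than P&G’s actual system – showing how the shopkeeper’s conditioned activity collapses into a single automated trigger:

```python
# A minimal sketch of the "smart shelf" logic: hypothetical names and
# thresholds, not any retailer's actual system. The conditioned human
# activity (check stock, place an order with a buffer) becomes one rule.

REORDER_THRESHOLD = 10   # assumed minimum acceptable stock level
TARGET_STOCK = 50        # assumed level an order should restore

def on_shelf_reading(product_id: str, units_on_shelf: int) -> None:
    """Called whenever the shelf sensor reports a stock level."""
    if units_on_shelf < REORDER_THRESHOLD:
        place_order(product_id, TARGET_STOCK - units_on_shelf)

def place_order(product_id: str, quantity: int) -> None:
    # Stand-in for a call to the supplier's ordering API.
    print(f"order {quantity} units of {product_id}")

on_shelf_reading("detergent-500g", 3)  # -> order 47 units of detergent-500g
```

No human judgment remains in the loop: the human appears in this process only as a source of error to be engineered away.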

The second development discussed by Neves was bio-tech and bio-hacking. Again introduced via a couple of anecdotes (a YouTube video of a guy sending electric shocks to his brain via a self-made headband and reporting that he sees flashes in front of his eyes, but does not feel more intelligent…), Neves pointed out the common denominator of technological devices and biotechnological substances: an ideology of self-improvement and self-optimization. The tagline of one biotech company, “Humans are the next platform.”, seemed fitting here. It signified a shift of the boundary, which so far has been the human skin, towards the interior of the human brain and body itself. Users “self-optimize” for maximum efficiency and ability, but based on which standard? Through continuous cycles of improvement, the human body is set into a constant state of crisis. Biotechnological stimulants, genetic modifications, and neural enhancements are used to counteract this critical state, only to produce the next one. There is no normality, no equilibrium or optimal performance; there is only a continuous drive from crisis to crisis. The human body is dragged along, and control is surrendered to the entities in charge of developing – via technological or marketing means – new (bio)technological products and solutions to the latest state of crisis. Whether this cycle is virtuous or vicious is a matter of ideology. The direction that human agency and freedom are taking should be obvious.

The second lecture was delivered by Mél Hogan. She talked about data centers and the huge amounts of energy they use. The energy impact of data collection and generation for consumer, industrial and scientific purposes is enormous, and appears to be a negative externality not yet accounted for when introducing “innovations” such as 5G networks (which are in turn a prerequisite for the aforementioned IoT solutions). After a while, Hogan turned to the topic of genetic science and DNA decoding/encoding. The main connection with the previous topic was the huge amount of data storage needed to perform DNA mapping. However, her main concern, as I understood it, was the possibility of using DNA itself as a data storage medium. Numerous examples expanded on this endeavor. At the moment it is possible to synthesize artificial DNA with encoded information, but it does not yet work practically as a storage medium, because it decays. Also, current examples showed the ability to synthesize artificial DNA outside of the body, but have so far not addressed the question of its reproduction between cells. Propagation among cells would on one hand decrease the error rate (more copies available), but it would also introduce the issue of mutation: how to ensure the stability of the encoded information? Other examples presented by Hogan went more in the direction of showing the misuse of DNA rhetoric for commercial or ideological purposes (DNA dating, identifying sexual orientation, etc.); according to her, the majority of these claims are false. I found these examples distracting from the main point Hogan was making (DNA as data storage), as their basis was too generic and ideological, and they were on the whole more about reading DNA than writing and storing it. This breadth made me lose track of the main thread she was following.
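To illustrate what “encoded information” in DNA means, here is a minimal sketch – my own illustration, not a scheme Hogan presented – of the common idea of mapping two bits to one nucleotide; real DNA storage schemes add error-correcting codes precisely because of the decay and mutation problems mentioned above:

```python
# A minimal sketch of the idea of DNA as data storage: a simplified
# 2-bits-per-nucleotide mapping, with no error correction. A single
# mutated or decayed base already corrupts the decoded data.

BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    """Map every 2 bits of input to one nucleotide."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    """Invert the mapping, 4 bases back into each byte."""
    bits = "".join(BASE_TO_BITS[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode(b"hi")
print(strand)                 # "CGGACGGC"
assert decode(strand) == b"hi"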

The idea of DNA as data storage itself – and Hogan acknowledged as much herself – is to be questioned. Why is it necessary to store and process such huge amounts of data? (Side note: it just came to my mind that Hogan did not distinguish between storing and processing data. It would be interesting to investigate the relationship/proportion between the two.) The question regarding the need for data storage again links into the topics raised by Neves: isn’t the data accumulation that is taking place in the name of furthering knowledge in fact an amassment of digital capital for purposes of power and profit? The cryptocurrency developments provide a clear example of the latter; energy for data processing is simply burned through the roof in the chase of an elusive vision of independent equality. However, other areas of data storage should be equally questioned, and hopefully (a pipe dream) the data processors and keepers will one day be presented with a bill that includes their environmental impact.

The environmental undertone in Hogan’s lecture was a bit difficult for me because of the increasing ambiguity of this position: as governments and corporations buy into this rhetoric (environmentalism is becoming mainstream), it is harder and harder to distinguish creative marketing and ideological misinformation from honest attempts to create a concrete positive impact. Making blanket statements about a “huge” environmental impact of (something) smells like clickbait on the first page of a fashionable research grant application. Sadly so, of course, because care for our environment and the accounting of negative externalities is a real issue. Here I would have wished for more specific numbers calculating the financial impact (to the corporation – that should be relatively easy) and the environmental impact (much more difficult), and perhaps also a hint at a future direction.

Overall, however, I enjoyed how Neves’ and Hogan’s lectures intertwined and mutually referenced each other. Hogan was perhaps a bit more accessible in her tone, but also a bit more moralizing and activist. Neves came across as more detached and analytic, which made him sound more like an observer keeping his good old scientific “objectivity”. Neves set the stage, and a number of points presented by Hogan could lean on the developments he had explicated. In that sense, the sequencing of the two lectures was the right one. Both lectures approached related topics from different perspectives, and in the end they matched like two pieces of a puzzle.
