Hang the DJ: Designing the Panopticons of our time

“CCTV panopticon” by nicolasnova is licensed under CC BY-NC 2.0

Panic on the streets of London
Panic on the streets of Birmingham
I wonder to myself
Could life ever be sane again?
[…]
Burn down the disco
Hang the blessed DJ
Because the music that they constantly play
It says nothing to me about my life
The Smiths, Panic

Due to my seemingly ever-increasing entanglement with the study of emerging technologies, I have recently been bingeing the Black Mirror series. As narrative fiction can sometimes make us think in a more open and relaxed way than the so-called rational gaze might, I wanted to explore how Charlie Brooker and his team have examined the impact of various kinds of technologies.

In Season 4, episode 4, “Hang the DJ”, we meet people who live in a walled society where an AI match-making technology controls which partners they meet and how long they spend together, all so that the system can find them a perfect match for life. Everyone carries a little round device with an AI that shows whom they should meet, when, and for how long, and that answers all of their questions (within the range the AI understands them). There is more to the episode than this, but this is what matters for this post.

In the beginning, the main protagonists, Frank and Amy, appear new to the system. Slightly hesitantly, they still follow what their little device guides them to do. They meet for a dinner that has been set for them, then go to an assigned location to spend the night and have sex. Although all of this feels somewhat odd to them, they follow the system because it allegedly knows what it is doing (it has a “99.8% success rate!”), and because there is a constant feeling of being watched.

In the morning, reluctantly, they have to say goodbye. They have enjoyed each other's company, but need to part because the time together set by the system has ended (and there should be no violations). After that, although they have barely recovered from the (emotional investment of the) date, the device notifies them of another date with someone new. The same process follows, but with a different companion and a different length of time (allegedly, the system uses this procedure to process their data toward a perfect match, and because with the system, “everything happens for a reason”).

Frank and Amy meet many different people, but none of them feels quite like what they felt with each other. Amy even starts having out-of-body experiences in order to cope with the process. Eventually, the system matches Frank and Amy for a date together once more. At that point, as the synopsis of the episode puts it, they “start to question the system's logic”.

What makes this episode compelling when thinking about human-technology experience is how it portrays the flip that takes place when technology does not serve its user, but instead tries to shape their reality and behavior in ways that feel wrong. Some might not like the word “feel”, but as some in interaction design have suggested, it is precisely this “feel”, experienced through interaction, that users actually care about.

When the system, or interaction with it, stops feeling right, cognitive dissonance might emerge. Surprisingly often in such cases, people blame themselves, assuming there is something wrong with them or their way of interacting with the system, instead of blaming the system itself. Naturally, in many cases, as in the episode, people do not even seem to have the choice to question the point of using the system in the first place (as with the numerous systems at work we need to use these days for aims we do not understand, in ways that do not seem to serve us). Because the system does not seem connected to their reality, or even their ethics, this often drives people into cognitive dissonance and ethical problems, or pushes them to play the system just to be a good sport.

Recently I read an academic article on facial recognition technology and implementing it in education (tweet below).

In short, and without trying to sound too academic here: the authors suggest that this creeping surveillance in schools and higher education institutions might not be that good an idea. There are several good points and concerns in the article; if you want to read it, you can find it open access at https://t.co/HSKUE6gwFk.

The points raised in the article also connect with Bentham's Panopticon prison design, although, as argued in a thoughtful piece in The Guardian, some have suggested that the Panopticon as a metaphor for digital surveillance might not capture quite the same phenomenon (even if it might still lead to somewhat similar behavior).

And this behavior is what matters here. People have a tendency, in public and now online, to learn to behave and appear in ways that present them well in the eyes of technology and surveillance (in a broad sense of the word).

Implementing facial recognition as a new form of surveillance in schools and higher education institutions (as the article also notes, educational institutions have indeed had other forms of surveillance before it) might lead to other, even unwanted, “learning outcomes”. Educational institutions affect learning in various subtle ways that are not always present in the curriculum. Could facial recognition implicitly teach “the ones to be taught” skills, attitudes and behaviors for the ever-increasing surveillance era we live in? If it does, what kind?

The simple outcome of facial recognition schooling will be the same as with any kind of schooling: students will learn new skills for how to behave and appear under surveillance, skills that will inevitably affect their future behavior and attitudes in many unknown ways, either unfolding over time or becoming hidden in the social fabric for some poor researcher to deconstruct.

This is not a difficult thought to agree with. Think about yourself sitting in a regular staff meeting: in general, you know how to, and are (usually) able to, conduct yourself in a way that portrays you in a good light in the eyes of other people, even if inside you are boiling over the stupid agenda and the increasingly long discussions about pointless department matters (disclaimer: this is a hypothetical example and in no way reflects the author's thoughts about staff meetings).

Take this out to the subways with cameras and the schools with facial recognition: when you get familiar with the idea that you are under constant watch, and feel the need to play along, you will. And as we have seen with social media platforms such as Facebook, everyone just seems to have the perfect life, right?

The same behavior can be witnessed with systems you are required to use to report your work activities and the time spent on them. At the core of these systems' design often lies the assumption of an employee working on a task, in person, in a building, for x hours per day, as in a factory – and we all know this is not an entirely accurate picture of the current reality of professional organisations (although it might sometimes feel like it). For example, when you travel to a conference, you do not spend the whole time at the conference; several parallel tasks run at the same time, because we cannot get away from the systems that change the quality of time and space, in short, email and social media. Still, the corporate system might follow a logic where you have to report your activities for, say, 8 hours per day. No one really works all 8 of the 8 hours they report, but the system directs them to report (i.e., lie) that they do. This process changes human behavior and affects how people feel, sometimes causing cognitive dissonance or, even worse, depression.

Such technological developments that shape human experience pose several important questions that we should really start taking into account: how will the Panopticons of our time affect our lives, our behavior, and the authenticity of both? And how much power, and over whom, will we hand to the DJs (the system designers) who increasingly play the music we have to dance to, even when it does not seem to say much about our lives?