The unwanted touch of the digital era
“Our bodies, with the old genetic transmission, have not kept pace with the new language-produced cultural transmission of technology. So now, when social control breaks down, we must expect to see pathological destruction.” –Donna J. Haraway
Last summer (which now seems like centuries ago) I was on the beach with my toddler when I saw another woman playing with her children in the sand. Her face was familiar, but I couldn’t remember exactly who she was. My first reaction was to reach for my phone and ask the Internet, as I normally do when my mind goes blank on something.
The specific technology that allows for such easy violations of our private moments is not yet available for everyday use. Soon enough, though, it may well be. Facial recognition is already mainstream and widespread in the workplace, airline boarding, and law enforcement, even if ethical considerations, privacy laws or the lack of necessary infrastructure have so far protected us from its full potential.
In the not-so-distant future, it may even be a built-in feature of our smartphones: all we will need to do is take a picture of a person, upload it to an app, and be able to identify anyone with a public profile in a second.
At that moment on the beach, the realisation that I could so easily pry on this woman (and that random strangers could do the same to me) gave me the chills, quite literally. I started shaking and sweating, with butterflies in my stomach. I felt dizzy and disoriented. My body had a deeply visceral reaction to this scenario, which, in a certain way, felt like the digital version of unwanted touching.
How can something as abstract as an imaginary digital intrusion be felt so deeply in our bodies?
The psychological impact of being watched
There’s a long history of scholarly thinking on the psychological impact of being watched. In surveillance studies, the Panopticon effect has become a key metaphor for the modern disciplinary society, widely used to describe any system of control in which people cannot tell whether they are being monitored and, as a result, refrain from certain acts. (Most recently, Simone Browne has built on the concept to describe how certain groups, notably Black people, can be excluded from society on the basis of their potential future behaviour, as determined through profiling and surveillance.)
But the pathological effects of being watched do not stop at self-control or discipline.
Psychiatrist and critical theorist Frantz Fanon described how constant monitoring produces alarming physical effects such as nervous tension, insomnia, and fatigue. Randolph Lewis argues that certain forms of dignity can only exist when we are ‘truly alone’, and that modern-day digital societies therefore evoke unprecedented levels of anxiety.
Building on their work, I believe that a potential way to reflect on the impact of everyday digital intrusions is through the lens of bodily integrity.
Broadly speaking, bodily integrity is the inviolability of the physical body: our right to self-determination over our own physical boundaries.
From the perspective of human rights, any violation of our bodily integrity is regarded as intrusive, degrading, or downright criminal.
Such violations range widely: from seemingly harmless acts like piercing a baby girl’s ears, to forms of violence like sexual abuse, to medical treatment administered without a patient’s consent or against their wishes, to abortion bans, and more. Notably, children, women, the LGBTQI community, people of colour, and people with disabilities are disproportionately affected by such violations, either because of their inability to give or refuse consent, or because of their increased vulnerability and exposure to violence.
Whether we are talking about their severe or more subtle forms, violations of bodily integrity always leave a long-lasting psychological mark, even when they do not result in direct trauma.
According to mainstream child psychology, even seemingly harmless acts like forcing a kid to kiss a stranger may have damaging effects on their behavioural development. What these minor bodily trespasses signal to a child is that their agency is limited, that they do not have full control over their body, and that their well-being will always depend on a higher authority, usually an adult. More severe forms of bodily integrity violation create even longer-lasting damage, affecting our ability to respond to danger, to make informed decisions, and sometimes even to recognise abuse and oppression.
Our data bodies
By 2020, technology had become the primary platform for almost all major human interaction, a tendency that has only been exacerbated by the Covid-19 pandemic. And while respect for bodily integrity may have become the norm in our physical world (though it still seems to be a privilege of dominant social groups, notably white people), the digital realm has not followed suit.
In fact, emerging technologies have re-created power dynamics that build on coercion, control, discipline and objectification.
Think of facial recognition at airline boarding, where face-scanning technology is used to verify your identity, or affect recognition in recruitment, which claims to “read” your inner emotions by interpreting the micro-expressions on your face or the tone of your voice. (Notably, experts warn that there is no scientific consensus that such systems produce valid results. Have you ever smiled without actually feeling happy? Right.)
What these technological solutions have in common is their aggressive, coercive and oftentimes non-consensual approach to taking what they want: the digital footprints of our bodies.
And yet we still don’t seem to have the language, or the conceptual framework, to describe how these everyday digital violations affect the human psyche in the long run. We tend to think of whatever happens to our bodies as only loosely related to our feelings, and of our feelings as inferior to our rational faculties. Because our physical boundaries and sensory processes are so difficult to translate into the digital realm, the connection between everyday online intrusions and our emotional well-being often gets neglected. To a certain extent, it escapes our attention that privacy is not just a matter of law and ethics but of our fundamental well-being, at both the individual and societal level.
Data doesn’t belong to us — data is us
Writing for the Deep Dives, Tricia Wang says our primary language for conceptualising the data we produce is through privacy, which “treats our personal information as separate from us, a piece of property that can be measured, negotiated over, sold, and reused”. But in reality, she argues, data doesn’t belong to us in the ways that our physical properties (like a house or car) do — data is us. “It is like a quantum particle that can exist in two places at the same time, as both a representation of who you are and also a commodity that can be sold.”
Wang also emphasises that in our age, privacy and personhood are both mediated digitally.
What I’d add is that because of that, digital footprints of our bodies aren’t simply content anymore either: they are — and should be treated as — an extension of our will and agency. Any intrusion to that integrity (and thus our sense of wholeness) can feel disruptive, regardless of whether it’s happening to our physical bodies, or our ‘data bodies’.
(According to Our Data Bodies, a group of researchers and organisers, the data that is collected about us and stored digitally is a manifestation both of our relationships with our communities and of institutions of privilege, oppression, and domination.)
Reimagining where the self begins and ends
This might explain why, in the world of digital sexual assault, the distinction between the digital representation of the body and the body itself can feel so false. For survivors of non-consensual pornography, as PJ Patella-Rey argues, even just knowing that people are passing their body around can create an unshakable feeling reminiscent of real sexual abuse.
Through their experience, technology forces us to reimagine where the self begins and where it ends.
But if it’s so easy to recreate the unwanted touch in the digital era, how can we expect an entire industry that was built on exploitative and coercive practices to suddenly abandon its underlying business premise? If we agree that the spread of facial recognition technology is the digital equivalent of being touched by a stranger, what kind of education, regulatory change, awareness raising and organising needs to happen so that we can design digital futures built on meaningful consent and respect?
Technology is never neutral
In her book Race After Technology, which examines how emerging technologies recreate (and often deepen) structural racism, Ruha Benjamin argues that if we keep assuming technology is inherently neutral, we will only produce more injustice and inequity.
Using virtual incarceration (like ankle bracelets) as an example, she explains how even well-intentioned digital solutions can create new forms of inequity and oppression. In fact, she argues, there is now a sea of ‘technical fixes’ that have ended up causing more harm than good.
And yet, whether we are talking about benevolent tech running amok or intentional attempts to control and dominate through digital means, the fact that technology isn’t inherently neutral also means that protecting ourselves against existing digital solutions won’t be enough. Instead, we will need to fundamentally challenge our assumptions about the norms that guide technical design in the first place, and ask why, in our current age, coercive, controlling and predatory dynamics are the digital default. If we do not allow those norms to shape our social fabric in physical reality, why do we so loudly embrace them in the digital sphere, all in the name of innovation?
Emerging approaches to our digital future
Luckily, there are a number of emerging approaches that may help us rethink our relationship with technology and data.
In their book Data Feminism, which outlines why data science needs feminism, Catherine D’Ignazio and Lauren Klein argue that if we want to make our data-driven systems more humane, we will need to value multiple forms of knowledge, including the type that “comes from people as living, feeling bodies in the world”. The world of data is often presented to us as something cold, rational, objective and neutral: in the arena of machines and facts, there is no room for feelings and emotions.
But humans, as D’Ignazio and Klein argue, are not just “two eyeballs attached by stalks to a brain computer.” They are multi-sensory beings with cultures, memories, fears and feelings that are primarily lived, processed and mediated through their bodies.
And the list of inspiring approaches is long. Kelly Dobson’s concept of data visceralisation emphasises that data is for all the senses of the body, not just the eyes, providing a powerful alternative to data visualisations, which are often seen as the most objective way to present information. Sasha Costanza-Chock’s design justice approach offers a new way to think about design that is led by marginalised communities, with the explicit aim of challenging structural inequalities. And while not specifically focused on technological design, adrienne maree brown’s book Emergent Strategy provides helpful insight into how systemic change happens and how we can shape the (digital) futures we want to live in.
What these frameworks have in common is that they all — directly or indirectly — advocate for something similar: a fundamental behavioural shift in how our societies approach creation and innovation. They provide us with a radically different take on change and growth, challenging our prevailing assumptions about the very systems that govern our lives.
These alternative approaches offer new and refreshing pathways for how our techno-social systems could be designed in the first place: what considerations should be taken into account, who should partake in the process to avoid harmful consequences, and whose voice should be heard to make sure no one gets left behind.
The end of the unwanted digital touch
What I realised that day on the beach was this: I don’t want my daughter to grow up in a world where the unwanted digital touch is the norm, and I hate the idea that random strangers can take digital footprints of her body whenever they want. I want to raise her so that she can easily say no to digital violations of her bodily integrity, and quickly recognise when such trespasses happen to others.
To put it simply, I want to live in a world where respect for people’s bodily integrity is the digital default, instead of accepting that “moving fast and breaking things” is a necessary side effect of innovation.
When it comes to the design of digital systems that have such a great impact on our lives, it is becoming increasingly clear that a dramatic shift is needed. Or as Ruha Benjamin puts it, technology needs to start “moving slower and empowering” people.
For that to happen, we will need fundamentally different levels of social awareness around technology’s impact on our bodies, and a much more nuanced understanding of how technology impacts our overall well-being — beyond the legislative frameworks that aim to restrict the industry’s predatory impact.
The journey will be long, but it will ultimately be worth it.