Living with Algorithms Summer Workshop @ RHUL

I’m very excited about what looks like a fascinating workshop on living with algorithms, with a great set of people (I don’t believe the list is finalised yet, so I’ll hold off on that), hosted by Royal Holloway, University of London. I’m particularly looking forward to the short 5–10 minute provocations that the call for papers asked for – I’m sure there’ll be some pretty diverse, and contentious, contributions on the day.

Below is my abstract for the workshop:

The kiss of death: an algorithmic curse.

Malicious softwares slither through the noise of systems, of cyberspaces, attempting to avoid the signal, to defy their abnormality within their surrounding ecological condition. It is a parasitical existence, lived to avoid the kiss of death.

Algorithmic detection: The curse of malwares. The curse of humans?

As with similar techniques deployed against humans – the collection of data, its analysis – abnormality breathes from ever-modulating normalised abstractions. Or that is the intention, at least. Modern malware mostly emerges as malicious through the deployment of the detecting algorithm. Circulation and mobility are absolutely necessary for malwares to carry out their deeds of infection, exfiltration and so on. Yet precisely this circulation is their downfall. To be malware, it must move. Yet in moving, it changes its ecological condition. Two cultures emerge at the point of software’s algorithmic detection: one becoming-human, one becoming-malware. Indeed, it is tempting to focus on human responses, looking at things in relation to ourselves. Yet how does algorithmic detection expose the malicious intentionality of otherwise ‘normal’ software? What human–malware normality is required?

We, malwares and humans, are rightly concerned with algorithmic detection. This is where our cultures converge in a more-than-human political project. We are unlikely ever to sense each other in that way, however. Humans and malwares develop everyday practice at certain sites, sometimes technological, other times not. These include anti-virus programs, secure connections to banking credentials, the theft of ‘confidential’ big data, organisational practice, the virtue of the software programmer, and the chatter of politicians. When malware is detected, when it becomes known, it is sealed off, destroyed, deleted. As humans, can similar algorithmic detection mechanisms come from our dividualisation? Can looking to the more-than-human offer potential futures of hope and resistance to the dominance of algorithms? A way for us to slither through our spatial registers?

 
