Rethinking Space in Cybersecurity + ‘Algorithmic Dimensionality’

After initially volunteering to give a ‘lightning’ talk at the CDT in Cyber Security joint conference (Programme) at Royal Holloway next week (3 & 4 May), I was given the opportunity to speak at greater length, for 30 minutes. This has given me the breathing space to consider how I have been conceptualising space in cybersecurity – thinking that is likely to form the basis of the last chapter of my thesis, a subsequent paper I wish to develop out of it, and what I see myself doing post-PhD.

This draws further upon the talk I gave just over a week ago at Transient Topographies at NUI Galway, Ireland, where I explored the formation of the software ⇄ malware object and how this relates to concepts of topography and topology in geography and beyond. There, I also explored how space is thought through in cybersecurity – whether through cartesian representations, cyberspace, or the digital. In re-engaging with material in software studies and new media, I have intensified the political spheres of my (auto)ethnographic work in a malware analysis lab: namely, how we come to analyse, detect, and thus curate malware (in public opinion, in visualisations, in speeches and geopolitical manoeuvres) as something that affects security and society. I do not claim any of this as new, by the way: Jussi Parikka has done this on malware and ‘viral capitalism’ in Digital Contagions, and there are multiple works on the relation between objects and security.

Instead, I wish to trace, through my own engagements with malware and security organisations, how space has been thought of. This is in no way a genealogy that comes anywhere near some contributions on space and security (yet) – but I see it as a start on that path. In particular, how have computer science, mathematics, cybernetics, cyberpunk literatures, the domestication of computing, and growing national security anticipatory action conditioned spatial understandings of malware? This has both helpful and unhelpful implications for how we consider collective cybersecurity practices – whether by government intervention, paid-for endpoint detection (commonly known as anti-virus) offering surveillant protection through scanning and monitoring the behaviours of malware, attribution, senses of scale, or threat actors, among others.

Representation of a ‘deep-learning’ neural network. Each layer is connected to the next with different weights, according to the particular application. Some neurons become activated according to certain features.

This working of space in cybersecurity is tied to what I term ‘algorithmic dimensionality’ in our epoch – where algorithms, and primarily neural networks, produce dimensional relations. By dimensions, I mean the different layers that come together to condition what follows at each consecutive layer, generating relationships that are non-linear and that can be used for malware detection, facial recognition, and a variety of other potential security applications. These dimensions exist beyond human comprehension; even if we can split off individual neuron layers and observe what may be happening, this does not adequately explain how the layers interact. Hence, this is a question that extends beyond, and through, an ethics of the algorithm – see Louise Amoore‘s forthcoming work, which I’m sure will attend to many of these questions – to something that is more-than-human.
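To make ‘dimensionality’ a little more concrete, below is a minimal sketch of the kind of network the figure above depicts. Everything here is an illustrative assumption of mine – the layer widths, the feature count, the idea of a ‘malware sample’ input – rather than any particular detection system:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative layer widths: a 100-feature input (imagine traits extracted
# from a suspect binary) pushed through three hidden 'dimensions' to a score.
sizes = [100, 64, 32, 16, 1]

# Each layer is a weight matrix plus a bias vector -- the differently
# weighted connections between layers in the figure above.
weights = [rng.normal(0, 0.1, (m, n)) for m, n in zip(sizes, sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]

def forward(x):
    """Forward pass, keeping every intermediate representation."""
    layers = [x]
    for W, b in zip(weights, biases):
        # The non-linearity (ReLU here) is what makes the relations
        # between layers non-linear, and hard to read off any one layer.
        x = np.maximum(0.0, x @ W + b)
        layers.append(x)
    return layers

x = rng.normal(size=100)  # one hypothetical sample's features
for i, h in enumerate(forward(x)):
    active = np.count_nonzero(h)
    print(f"layer {i}: {h.shape[0]} dimensions, {active} neurons activated")
```

Each printed layer is a ‘dimension’ in the sense above: a new space of relations the input has been pushed into. We can inspect any one of them, but it is the composition of all of them that decides.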

We cannot simply see ethics as a matter of adjusting bias. As anyone who has written neural networks knows (including myself, for a bit of ‘fun’), weights are required to make them work: algorithms require bias, so reducing bias is an incomplete answer to ethics. We need to consider how dimensionality – something geographers can engage with – is the place (even a cyberspatial one) in which decisions are made. Auditing algorithms may therefore not be possible outside the environments in which dimensionality becomes known and becomes part of the generation of connection and relationality; simply feeding a black box and observing its outputs does not work in multi-dimensional systems. Without developing this understanding, I believe we are very much lacking. I see this as a rendition of cyberspace – a concept much maligned in social science as something to be avoided. Yet dimensionality shows where real, political striations are formed: where race, gender, and sexual orientation become operationalised within the neural network. This has dimensional affects that produce concrete issues – in credit ratings, in the adverts shown, among other variables where discrimination is hard to grasp or prove.
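On the narrow engineering sense in which algorithms require bias: a model’s bias terms are constitutive, not an impurity to be drained away. Here is a minimal sketch (toy data and model of my own, nothing to do with any deployed system) showing that without a bias parameter even a simple offset is unlearnable:

```python
import numpy as np

# Toy data whose 'right' answer needs an offset: y = 2x + 1.
x = np.linspace(-1.0, 1.0, 50)
y = 2 * x + 1

# Model without a bias term: y_hat = w * x. The least-squares w still
# cannot represent the offset, however much data we supply.
w = (x @ y) / (x @ x)
print("no-bias fit error:", np.mean((w * x - y) ** 2))   # stuck near 1.0

# Model with a bias term: y_hat = w * x + b recovers the function exactly.
A = np.stack([x, np.ones_like(x)], axis=1)
w_b, b = np.linalg.lstsq(A, y, rcond=None)[0]
print("with-bias fit error:", np.mean((w_b * x + b - y) ** 2))  # ~0
```

The ethical question, then, cannot simply be ‘remove the bias’; it has to reckon with where, across all those weighted dimensions, the decision is actually made.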

Going back to my talk at Royal Holloway (which may seem far from the neural network), I will attempt to enrol this within the conference theme of the ‘smart city’, and how certain imaginaries (drawing heavily from Gillian Rose’s recent thoughts on her blog) are generative of certain conditions of security. By this I mean: how do the futuristic, clean, bright images of the city obscure and dent alternative ways of living, and of living with difference? The imaginings of space and place, mixed with algorithmic dimensionality, produce affects that must be thought of in any future imagining of the city. This draws not only on insights from my PhD research on malware ecologies, in which I attempt to open up what cybersecurities are and should include (part of an article I am currently putting together), but also on feminist and queer perspectives, to question what the technologically-mediated city will ex/in/clude.

I think space has been an undervalued concept in cybersecurity. Space and geography have been reduced to something of the past (due to imaginaries of the battlefield disappearing), and to something inapplicable in an ‘all-connected’ cyberspace. I wish to challenge this and bring critical analysis to cyberspace, to explore the geographies that are performed and the securities and inequalities that result. This allows for a maturity in space and cybersecurity – one that appreciates that space is an intrinsic component of interactions at all levels of thinking. We cannot abandon geography when it is ever more important to the securities of everyday urban areas, to malware analysis, to geopolitics, and even to the multi-dimensionality of the neural network. Space, then, is a fundamental thing to engage with in cybersecurity – one that cannot be reduced to the distance between two geometrically distant places.


RGS-IBG18: A Critical Geopolitics of Data? Territories, topologies, atmospherics?

Nick Robinson and I have put together a CfP for the RGS-IBG Conference, below. This movement towards geopolitics is becoming far more dominant in my work, and such a session helps to bring together ideas I have been having for many years – particularly since the start of my PhD – on data and its relationship to territory (and particularly after some excellent lecturing from Stuart Elden in my undergraduate days). I look forward to understanding how data is/are constructive of geopolitics, and how this may tie into some of the historical genealogies of the term.

A Critical Geopolitics of Data? Territories, topologies, atmospherics?

Sponsored by the Political Geography Research Group (PolGRG) and the Digital Geographies Working Group (DGWG)

Convened by Andrew Dwyer (University of Oxford) and Nick Robinson (Royal Holloway, University of London)

This session aims to invigorate lively discussions emerging at the intersection between political and digital geographies on the performativities of data and geopolitics. In particular, we grant an attentiveness to the emergent practices, performances, and perturbations of the potentials of the agencies of data. Yet, in concerning ourselves with data, we must not recede from the concrete technologies that assist in technological agencements explicitly partaking in a relationship with data, such as through drone warfare (Gregory, 2011), in cloud computing (Amoore, 2016), or even through the Estonian government’s use of ‘data embassies’ (Robinson and Martin, 2017). Recent literature from critical data studies has supported an acute awareness of the serious and contentious politics of the everyday and the personal, with geographers taking this up around surveillance and anxiety, among other topics (Leszczynski, 2015). In recent scholarship, a geopolitical sensitivity has considered the contingent nature of data, the possibilities of risk and the performances of sovereignties (Amoore, 2013), and the dichotomies found in data’s ‘mobility’ and ‘circulation’ and their subsequent impact upon governing risk (O’Grady, 2017). Here, we wish to draw together insights from affective and more-than-human approaches in their many guises, to experiment and ‘map’ new trajectories that emulsify with the more conventional concerns of geopolitics, expressing what a critical attention to data brings forth.


In this broadening of scope, how we question, and even attempt to capture and absorb, the complex ‘landscapes’ of data remains fluid. How do our current theorisations and trajectories of territory, topology and atmospheres both elude and highlight data? Do we need to move beyond these terms to something new, turn to something else such as ‘volume’ (Elden, 2013), or indeed move away from a ‘vertical geopolitics’ in the tenor of Amoore and Raley (2017)? Do we wish to work with, and make difficult, the complex lineages and histories that our current analogies provide us? Has geopolitical discourse, until now, negated the multitude of powers and affects that data exude? In this session, we invite submissions that offer a more critical introspection of data – its performativity, its affectivities, its more-than-human agencies – upon geopolitical scholarship, and that may even reconfigure what geopolitics is/are/should be.


Themes might include:

  • Data mobilities
  • Data through, across, and beyond the border
  • Data and its reconfigurations upon notions of sovereignty and territory
  • Vibrancies and new materialism
  • Legalities and net neutrality
  • Affectivities and non-representational approaches
  • Visceralities and broader attentiveness to the body
  • Atmospheres
  • Infrastructures
  • Diplomacy
  • Popular geopolitics


Session information

This session will take the form of multiple paper presentations of 15 minutes, each followed by 5 minutes of questions.

Please send a 200-word abstract, along with name and affiliation, to Andrew (andrew.dwyer@ouce.ox.ac.uk) AND Nick (Nicholas.Robinson.2014@live.rhul.ac.uk) by Monday 5th February 2018.

Further information can be found on the RGS-IBG website here.


AAG2017 – Curating (in)security

I am thoroughly looking forward to the AAG, with a session that Pip Thornton and I have put together on ‘Curating (in)security: Unsettling Geographies of Cyberspace’. The programme is above, along with the original session outline below.

Curating (in)security: Unsettling Geographies of Cyberspace

In calling for the unsettling of current theorisation and practice, this session intends to initiate an exploration of the contributions geography can bring to cybersecurity and space. This is an attempt to move away from the dominant discourses around conflict and the state prevalent in international relations, politics, computer science and security/war studies. As a collective, we believe geography can embrace alternative perspectives on cyber (in)securities that challenge the often masculinist and populist narratives of our daily lives. Thus far, there has been limited direct engagement with cybersecurity within geographical debates beyond ‘cyberwar’ (Kaiser, 2015; Warf, 2015), privacy (Amoore, 2014), and examinations from the algorithmic or code perspective (Kitchin & Dodge, 2011; Crampton, 2015).

As geographers, we are ideally placed to question the discourses that drive the spatio-temporal challenges made manifest through cyber (in)securities in the early 21st century. This session attempts to provoke alternative ways in which we can engage and resist in the mediation of our collective technological encounters: exploring what a research agenda for geography in this field might look like, asking why we should get involved, and pushing questions in potentially unsettling directions. It therefore seeks to explore the curative restrictions and potentials that exude from political engagement, commercial/economic interests, neoliberal control and statist interventions. The intention is not to reproduce existing modes of discourse, but to stimulate creative and radical enquiry, reclaiming curation from those in positions of power not only in terms of control, but by means of restorative invention.

Living with Algorithms Summer Workshop @ RHUL

I’m very excited about what looks like a fascinating workshop, with a great set of people (I don’t believe the list is finalised yet, so I will hold off on that), on living with algorithms, hosted by Royal Holloway, University of London. I’m particularly excited by the short 5–10 minute provocations the call for papers asked for – I’m sure there will be some pretty diverse, and contentious, contributions on the day.

Below is my abstract for the workshop:

The kiss of death: an algorithmic curse.

Malicious softwares slither through the noise of systems, of cyberspaces, attempting to avoid the signal, to defy their abnormality to their surrounding ecological condition. It’s a parasitical existence, to avoid the kiss of death.

Algorithmic detection: The curse of malwares. The curse of humans?

Through techniques similar to those deployed against humans – the collection of data, its analysis – abnormality breathes from ever-modulating normalised abstractions. Or that is the intention, at least. Modern malware mostly emerges as malicious through the deployment of the detecting algorithm. Circulation and mobility are absolutely necessary for malwares to carry out their deeds of infection, exfiltration and so on. Yet precisely this circulation is their downfall. To be malware, it must move. Yet in moving it changes its ecological condition. Two cultures emerge at the point of software’s algorithmic detection; one becoming-human, one becoming-malware. Indeed, it is tempting to focus on humanly responses, looking at things in relation to ourselves. Yet how does algorithmic detection expose the malicious intentionality of otherwise ‘normal’ software? What human-malware normality is required?

We, malwares and humans, are rightly concerned with algorithmic detection. This is where our cultures converge in a more-than-human political project. We are unlikely to ever sense each other in that way however. Humans and malwares develop everyday practice at certain sites, sometimes technological, other times not. These include anti-virus programs, the secure connection to banking credentials, stealing ‘confidential’ big data, in organisational practice, in the virtue of the software programmer, in the chatter of politicians. When malware is detected, when it becomes known, it is sealed-off, destroyed, deleted. As humans, can similar algorithmic detection mechanisms come from our dividualisation? Can looking to a more-than-human offer potential futures of hope and resistance to the dominance of algorithms? A way for us to slither through our spatial registers?


Durham Moving Together Postgraduate Conference

I will be presenting at the Durham conference on 4 May 2016 with the paper:

W32.Stuxnet: An Olympic Games.

Sprinting, jumping, throwing, shooting, running, leaping.


Siemens Programmable Logic Controller (PLC)? Siemens SIMATIC Step 7 Industrial Control Software? Yes… Next Step.


Welcome to the most wonderful of Olympic Games. A brilliant new, sophisticated cyber weapon has been created. A game against Iran, against its nuclear enrichment programme in Natanz. Those who played we can only deduce; the USA and Israel. Stuxnet is the name attributed to this multifaceted, modular, updating malicious software(s?). It slithers, propagating between machines, checking, stealthily, hiding, the joker of the system. What a game, to travel with this more-than-human. Enter this cyberspatial ecology, driven by a tension of potentiality, beyond virtual, the real. Collaborations between malware artists and their offspring, malwares, generate peculiar, novel methods of movement. USB sticks, Siemens PLCs, network shares, command and control servers. It is simultaneously divided and yet constituted, materialised. Its mobility disguised, tricking, mimicking normal flows. Through its movement it becomes known. Static analyses neglect the agential vibrancy this malware exudes; it is through flows that it is malicious – to us humans – ultimately it is (simply) software. Experience how Stuxnet entangles with the complex geopolitical interactions of Iran and the USA / Israel, confused engineers at their screens, Windows operating systems, zero-day exploits and modular malware engineering. Let’s explore what our expert human friends tell us of malware, the conflicting narratives of their movement, one that disjoints dominant human action from the ecology within which cyber security develops. Join us on a geographical adventure to experience an ever-incomplete picture of our destructive (productive?) compatriot.

The in(dividual)

Yesterday I went to a reading group within cyber security, where we talked about an interesting paper published in Science in January this year, called “Unique in the shopping mall: on the reidentifiability of credit card metadata” (paid subscription required). Though we talked about several issues with the paper – the reason for its appearance in Science, for a start – it got me thinking about the wider concept of the ‘dividual’ that Deleuze details in a short article (see paper here) published in October in 1992.

In a fairly dense, but easy to read, paper, Deleuze argues that we have moved from Foucault’s disciplinary societies to control societies. For those with a background in this, please skip to the next paragraph. At the risk of doing the work of Foucault a great injustice: Foucault identifies a transformation of society in the transition from the medieval to the industrial period. These periods are not wholly independent, and their mechanisms do not always belong to one or the other; they can operate alongside one another. The growth of institutions such as the school, the hospital, the barracks and the prison marked a transition in which bodies en masse were controlled and disciplined to work for the powerful.

Moving on from that simple explanation, Deleuze (and Foucault himself, in his work on governmentality and biopolitics) identifies a new movement in the development of this thought: one where the body is no longer solely an empty ‘space’ but becomes a ‘place’ in which thoughts and movements are expected to be ever-flowing and monitored. Modulation is the word Deleuze uses to express this new formation, in which we do not simply move between institutions as before but are constantly having to learn and self-police – think of healthcare services moving into the home and the burgeoning market in healthcare products. The emphasis falls on the individual to succeed (with its associated serpent, neoliberal capitalism).

So, why societies of control? Unlike in the past, where individuals were constructed in order to be disciplined, neoliberalism requires free movement; but states (and other stakeholders seeking control – think corporations, gated communities) still require extensive monitoring to maintain their power. This monitoring is aided by technologies that track our movements: passes to enter buildings, touchless payment cards, mobile phone signals. Deleuze coins the word ‘dividual’ to capture the data produced by in(dividuals), where segments of the data are used to control – the ability to access buildings, access to credit according to financial transaction history, et cetera. The concept of the dividual makes most sense if we have discrete datasets. Yet we live in a world of supposedly ‘big’ data, with an increasing ability to cross-reference dividualised data to (re)construct an ‘in(dividual)’.

Returning to the paper that prompted these thoughts: the authors claimed that they could reidentify roughly 90% of unique credit card holders from just four informational points – the location of the shop, the time of purchase, the approximate cost, or the distance from the next purchase, for example. Beyond the issues of privacy and the unicity (the ability to reidentify unique individuals) of data, there is a philosophical question to grapple with that draws on both the societies of control and disciplinary societies. I consider the ‘body’ (in its extension to producing non-human data, movements across space and the like) to be critical to arguing that our current epoch is not one of pure dividuals – and to displaying the geographies this produces.
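The paper’s headline figure is easier to grasp with a toy simulation of unicity. This is a sketch under my own assumptions – synthetic people, shops and days, nothing like the scale or method of the actual study – but it shows how a handful of coarse points can single a person out of a crowd:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 'dividual' traces: 1,000 people with 30 transactions each,
# recorded only as coarse (shop, day) pairs -- seemingly anonymous data.
n_people, n_tx, n_shops, n_days = 1000, 30, 200, 30
traces = [
    set(zip(rng.integers(0, n_shops, n_tx).tolist(),
            rng.integers(0, n_days, n_tx).tolist()))
    for _ in range(n_people)
]

def unicity(k):
    """Fraction of people uniquely pinned down by k known points."""
    unique = 0
    for trace in traces:
        points = sorted(trace)
        idx = rng.choice(len(points), size=k, replace=False)
        known = {points[i] for i in idx}
        # How many traces in the whole dataset contain these k points?
        matches = sum(known <= other for other in traces)
        unique += (matches == 1)
    return unique / n_people

for k in (1, 2, 3, 4):
    print(f"{k} known point(s): {unicity(k):.0%} reidentified")
```

With real data the traces are far richer and the crowd far larger, but the pattern is the paper’s point: a few dividualised fragments, cross-referenced, reconstruct the in(dividual).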

I much prefer to use the ‘in(dividual)’ to describe the current manifestation of our society. The formation of the internet and the ever-increasing sharing of information has enabled disparate information to come together and provide ‘value’ to capitalism. This is epitomised in the valuation of social networks such as Facebook and Twitter, and the giant Google. This value requires these companies to in(dividual)ise. Let me explain what I mean. For ‘big’-data analytics to operate effectively, it needs to dividualise my body(ies)’ movements through its limited collection points: my credit card, my phone signal, my Facebook account, the cookies I leave lying around, and so on. This enables a population-becoming whereby services can be focused on particular ‘groups’(?), reflecting the growing use of statistics in the development of biopolitics (see Louise Amoore’s article ‘Security and the claim to privacy‘ on ‘data derivatives’). Yet personalised advertising requires that I become in(dividual) – that I form a group. I am gay; therefore I get many ‘gay-themed’ adverts across the internet (some to my utter amusement!). A feedback loop arises: if I am classed as forming an ‘at risk’ group, for example, and I were to apply for credit with a ‘poor’ rating, then the in(dividual) would come into play. My in(dividual) body’s movement influences ‘it’, and ‘it’ influences ‘me’.

So how can one work against this? What playful acts can I, working as an in(dividual), perform? I could spend rather large amounts at different places (although probably not), use different cards, use other people’s cards? Or I could change my Facebook ‘likes’, or lay completely false trails everywhere. This is where the power lies; this is where the kink in current society sits. Although I am partially determined by my allocation, what happens if I do not conform to any group? I do not only do it for myself: the data that feed the group are also skewed. This is true play – to circumvent the rules, to refuse a single identity, and to express the multiple identities the body inherently exudes. In(dividual)ising can only work to my detriment so long as I play by the rules; the best play is the play that bends them.

Why is the body critical to this? The body is what carries a truly emancipatory affect (though we must recognise that we live in a period where ‘able’ bodies tend to ‘succeed’ in comparison to less-able bodies). There are only a limited number of collection points (though these are ever-increasing, with sensors in the Internet of Things (IoT)), which means their comprehension of the world is always limited and non-pervasive. Feeding certain nodes incorrect bits of the information our bodies produce (such as hacking a wearable technology to send ‘healthy’ signals to an insurance company) enables small acts of powerful play that distort not only the in(dividual) but the dividualised groupings. We can use the ingenuity of the body (and here, usefully, I refuse the mind-body dualism) to claim the in(dividual) for ourselves, in whatever form ourself may take.