Rethinking Space in Cybersecurity + ‘Algorithmic Dimensionality’

After initially volunteering to give a ‘lightning’ talk at the CDT in Cyber Security joint conference (Programme) at Royal Holloway next week (3 & 4 May), I was given the opportunity to speak at greater length for 30 minutes. This has given me the breathing space to consider how I have been conceptualising space in cybersecurity, which is likely to form the basis for the last chapter of my thesis, for a subsequent paper I wish to develop out of this, and for what I see myself doing post-PhD.

This draws further upon the talk I gave just over a week ago at Transient Topographies at NUI Galway, Ireland, where I explored the formation of the software ⇄ malware object and how this relates to concepts of topography and topology in geography and beyond. There, I examined how space is thought through in cybersecurity; whether through cartesian representations, cyberspace, or the digital. In my re-engagement with material in software studies and new media, I have intensified the political spheres of my (auto)ethnographic work in a malware analysis lab. Namely, how we come to analyse, detect, and thus curate malware (in public opinion, in visualisations, in speeches and geopolitical manoeuvres) as something that affects security and society. This is not something I claim as new, by the way; Jussi Parikka does this on malware and ‘viral capitalism’ in Digital Contagions, and there are multiple works on the relation between objects and security.

Instead, I wish to trace, through my own engagements with malware and security organisations, how space has been thought of. This is in no way a genealogy that would come anywhere near some contributions on space and security (yet), but I see it as a start on that path. In particular, how have computer science, mathematics, cybernetics, cyberpunk literatures, the domestication of computing, and growing national security anticipatory action conditioned spatial understandings of malware? This has both helpful and unhelpful implications for how we consider collective cybersecurity practices; whether that be government intervention, paid-for endpoint detection (commonly known as anti-virus) offering surveillant protection through scanning and monitoring the behaviours of malware, attribution, senses of scale, or threat actors, among a variety of others.

Representation of a ‘deep-learning’ neural network. Each layer is connected to the next with different weights, according to the particular application. Some neurons become activated according to certain features.

This working of space in cybersecurity is tied to what I term ‘algorithmic dimensionality’ in our epoch, where algorithms, and primarily neural networks, produce dimensional relations. By dimensions I mean the different layers that come together to condition what follows at each consecutive layer, generating relationships that are non-linear and that can be used for malware detection, facial recognition, and a variety of other potential security applications. These dimensions exist beyond human comprehension; even if we can individually split neuron layers and observe what may be happening, this does not adequately explain how the layers interact. Hence, this is a question that extends beyond, and through, an ethics of the algorithm (see Louise Amoore‘s forthcoming work, which I’m sure will attend to many of these questions) to something that is more-than-human.
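As a rough illustration of what I mean by layered dimensionality, here is a minimal sketch (my own toy example, with arbitrary layer sizes) of a forward pass through a small network: each layer re-projects its input into a new dimensional space via weights and a non-linearity, and inspecting any one layer alone says little about how the layers interact.

```python
import numpy as np

rng = np.random.default_rng(0)

# Each layer re-projects its input into a new set of dimensions
# via a weight matrix, a bias, and a non-linear activation.
def layer(x, w, b):
    return np.tanh(x @ w + b)  # tanh "activates" some neurons, not others

x = rng.normal(size=(1, 4))                      # a 4-dimensional input
w1, b1 = rng.normal(size=(4, 16)), np.zeros(16)  # 4 -> 16 dimensions
w2, b2 = rng.normal(size=(16, 8)), np.zeros(8)   # 16 -> 8 dimensions
w3, b3 = rng.normal(size=(8, 2)), np.zeros(2)    # 8 -> 2 output dimensions

h1 = layer(x, w1, b1)
h2 = layer(h1, w2, b2)
out = layer(h2, w3, b3)

# Each hidden representation lives in a different dimensional space.
print(h1.shape, h2.shape, out.shape)  # (1, 16) (1, 8) (1, 2)
```

Even in this tiny case, the relationship between input and output is composed across three intermediate spaces; in networks of real scale these intermediate dimensions are what resist straightforward human reading.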

We cannot simply see ethics as a matter of adjusting bias. As anyone who has written neural networks knows (I have, for a bit of ‘fun’), weights are required to make them work. Algorithms require bias. Reducing bias is therefore an incomplete answer to ethics. We need to consider how dimensionality, which geographers can engage with, is the place (even cyberspatial) in which decisions are made. Auditing algorithms may therefore not be possible without the environments in which dimensionality becomes known, and becomes part of the generation of connection and relationality. Simply feeding a black box and observing its outputs does not work in multi-dimensional systems. Without developing this understanding, I believe we are very much lacking. In particular, I see this as a rendition of cyberspace, a term much maligned as something to be avoided in social science. However, dimensionality shows where real, political striations are formed that affect how people become operationalised within the neural network according to race, gender, and sexual orientation. This has dimensional affects that produce concrete issues, whether in credit ratings, in the adverts shown, or in other variables where discrimination is hard to grasp or prove.
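Since the argument turns on the claim that weights and bias are constitutive rather than removable, a minimal sketch may help (entirely my own toy example): a single linear neuron without a bias term has its decision boundary forced through the origin, and so cannot fully separate even trivially shifted data, whatever weight it learns.

```python
import numpy as np

# Label 1 iff x > 2: a threshold that does not pass through the origin.
xs = np.array([0.0, 1.0, 3.0, 4.0])
ys = np.array([0, 0, 1, 1])

def predict(w, b, x):
    return (w * x + b > 0).astype(int)

# Without a bias term (b = 0) the boundary sits at x = 0, so any
# positive weight labels every positive input 1; accuracy is capped.
no_bias_acc = max((predict(w, 0.0, xs) == ys).mean() for w in [-1.0, 1.0])

# With a bias the boundary can shift to x = 2 and separate perfectly.
with_bias_acc = (predict(1.0, -2.0, xs) == ys).mean()

print(no_bias_acc, with_bias_acc)  # 0.75 1.0
```

The bias term here is not a contaminant to be removed but the very thing that lets the model fit the data; which is the sense in which "reducing bias" is an incomplete framing of algorithmic ethics.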

Going back to my talk at Royal Holloway (which may seem far from the neural network), I will attempt to enrol this within the conference theme of the ‘smart city’, and how certain imaginaries (drawing heavily from Gillian Rose’s recent thoughts on her blog) are generative of certain conditions of security. By this I mean: how do the futuristic, clean, bright images of the city obscure and deny alternative ways of living, and of living with difference? The imaginings of space and place, mixed with algorithmic dimensionality, produce affects that must be thought of in any future imagining of the city. This draws not only on insights from my PhD research on malware ecologies, in which I attempt to open up what cybersecurities are and should include (part of an article I am currently putting together), but also on feminist and queer perspectives, to question what the technologically-mediated city will ex/in/clude.

I think space has been an undervalued concept in cybersecurity. Space and geography have been reduced to something of the past (due to imaginaries of the battlefield disappearing), something said to be inapplicable in an ‘all-connected’ cyberspace. I wish to challenge this and bring critical analysis to cyberspace, to explore the geographies that are performed and the securities and inequalities that result. This allows for a maturity in space and cybersecurity that appreciates space as an intrinsic component of interactions at all levels of thinking. We cannot abandon geography when it is ever more important in the securities of everyday urban areas, in malware analysis, in geopolitics, and even in the multi-dimensionality of the neural network. Space is thus an important, indeed fundamental, thing to engage with in cybersecurity, so long as we do not reduce it to the distance between two geometrically distant places.


Topographies and Automation: Directions of my malicious research

So I’m pleased to say that I’ve been accepted to speak at two pretty different conferences, which disseminate different parts of my DPhil (PhD) project and which I intend to work into more formal papers after the events.

The first is ‘Transient Topographies‘, in April in Galway, Ireland, which will help me explore the complex word ‘topography’. This is an exceptionally complicated term for me, coming from the dual lineages of human geography and computer science, which frequently have alternative and conflicting interpretations of what the word means. Yet this conference allows me to explore in some depth the reason I became interested in malicious software in the first place: its spatial implications. For me, malicious software allows us to blend the spatial understandings that come with ecology in both more-than-human geographies and the broad study of technologies or technics. It also allows me to explicitly move on from an awkward experience at last year’s RGS-IBG, when I was just initiating these ideas. I had not given enough thought to the topic, and ultimately the core argument I wished to put forward was lost. To put it clearly: I think our division between human and technic is unsustainable. I hope this will become clearer in this talk.

The second is an exceptionally interesting session put together by Sam Kinsley for this year’s RGS-IBG in Cardiff on the ‘New Geographies of Automation’. This tackles the more historical (and somewhat genealogical) aspect of my work, which I never anticipated doing in my PhD. However, it captured my attention during my ethnographic work in a malware analysis laboratory: the complex array of tools and technologies with which we have come to sense, capture, analyse, and detect malware through both ‘static’ (conventional) and ‘contextual’ (data-based) approaches. On the whole, these are tools that have automated the way we comprehend malware. Even the most basic rendering of malware requires assumptions that are built into the automation of displaying it, such as assuming a certain path that the software follows. Yet these fundamental approaches to malware analysis and the entire endpoint industry remain ever-present in contemporary developments. Though I claim there has always been a more-than-human collective in the analysis of malware, developments in machine learning offer something different. If we look to both Louise Amoore and Luciana Parisi, there is a movement of the decision that is (at least) less linear than we have previously assumed. Automation is thus entering some form of new stage: we not only have more-than-human collectives, but now more-than-human decision-making that is non-deterministic.
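To make the static/contextual distinction concrete, here is a hypothetical sketch; the signature bytes, behaviour names, weights, and threshold are all invented for illustration and do not describe any vendor's actual engine. A ‘static’ check reads the file's bytes in stasis, while a ‘contextual’ check scores behaviours observed as the sample runs.

```python
# Hypothetical static signatures: fixed byte patterns mapped to labels.
STATIC_SIGNATURES = {b"\xe8\x9a\x50\x05\x00": "known_dropper_stub"}

# Hypothetical behaviour weights for sandboxed ('contextual') analysis.
SUSPICIOUS_BEHAVIOURS = {"registry_change": 2, "files_dumped": 3, "bad_access": 5}

def static_scan(file_bytes: bytes) -> list[str]:
    """Match fixed byte patterns against the file without running it."""
    return [name for sig, name in STATIC_SIGNATURES.items() if sig in file_bytes]

def contextual_scan(observed: list[str], threshold: int = 5) -> bool:
    """Weigh behaviours logged while the sample runs in a sandbox."""
    return sum(SUSPICIOUS_BEHAVIOURS.get(b, 0) for b in observed) >= threshold

sample = b"\x90\x90" + b"\xe8\x9a\x50\x05\x00" + b"\x48\xc1"
print(static_scan(sample))                                   # ['known_dropper_stub']
print(contextual_scan(["registry_change", "files_dumped"]))  # True
```

Both modes already encode prior human decisions (which bytes count as a signature, which behaviours count as suspicious, where the threshold sits), which is the sense in which the collective has always been more-than-human.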

Both abstracts are below:

Transient Topographies – NUI Galway


The Malicious Transience: a malware ecology

Keep the Computer Clean! Eradicate the Parasite!



Lab_Check("Software_Form, Tools, Human_Analyst")  ==1
Flag("machine_learning")                          ==1
; Check feeds from vendors and internal reputation scoring
Rep("vendors")        ==1
Rep("age")            ==>5000
Rep("prevalence")     ==>300
; Intuition of human analyst to find matching structure
e89a500500      ; call sub_routine
48c1e83f        ; shr rax, 3fh
84c0            ; test al, al
7414            ; jz short loc_16475f
bf50384f00      ; mov edi, offset dangerous_thing
; Environmental behavioural checks
Env_Chk("bad_access, registry_change, files_dumped")  ==1 detect
; terminate unsuccessfully

; Detect as malware, report infection (Ri) and clean



Contemporary malware analysis is pathological. Analysis and detection are infused with medical-biological discourses that entangle the technological with good and bad, malicious and clean, normal and abnormal. The subjectivities of the human analyst entwine with the technical agencies of big data, the (de)compiler, the Twitter feed. Playing with static renderings and contextual behaviour comes alongside the tensed body, a moment of laughter. Enclosed, sanitised, manipulated ‘sandboxed’ environments simulate those out-there, in the ‘wild’. Logics of analysis and detection are folded into algorithms that (re)define the human analyst. Algorithm and human learn, collaborate, and misunderstand one another. Knowledges are fleeting, temporary, flying away from objective truths. The malicious emerges in the entanglement of human and technology, in this play. The monster, the joker, sneaks through the stoic rigour of mathematics and computation, full of glitch and slippage. Software is ranked, sorted, sifted. It is controlled, secured, cleansed, according to its maliciousness. Yet it is transient; its map forever collapsing. Time and space continue, environments are refigured, maliciousness (re)modified. Drawing on ideas of technē, between communication, art and technology, this paper queries current pathological logics. It asks what a broader grasp of more-than-humans through an ecological approach, one that includes the subjectivities, environments, and social relations after Félix Guattari, could achieve.


New Geographies of Automation – RGS-IBG


Automating the laboratory? Folding securities of malware

Folding, weaving, and stitching are crucial to contemporary analyses of malicious software, generated and maintained through the spaces of the malware analysis laboratory. Technologies entangle (past) human analysis, action, and decision into the ‘static’ and ‘contextual’ detections that we depend on today. A large growth in suspect software on which decisions of maliciousness must be drawn has driven a movement into (seemingly omnipresent) machine learning. Yet this is not the first intermingling of human and technology in malware analysis. It draws on a history of automation, enabling interactions to ‘read’ code in stasis; build knowledges in more-than-human collectives; allow ‘play’ through a monitoring of behaviours in ‘sandboxed’ environments; and draw on big data to develop senses of heuristic reputation scoring.

Though we can draw on past automation to explore how security is folded, made known, rendered as something knowable, contemporary machine learning performs something different. Drawing on Louise Amoore’s recent work on the ethics of the algorithm, this paper queries how points of decision are now more-than-human. Automation has always extended the human, led to loops, and driven alternative ways of living. Yet the contours, the multiple dimensions of the neural net, produce the malware ‘unknown’ that has become the narrative of the endpoint industry. This paper offers a history of the automation of malware analysis, from static and contextual detection, to ask how automation is changing the way cyberspace becomes secured and made governable; and how automation is not something to be feared, but tempered with the opportunities and challenges of our current epoch.

Strava, Sweat, Security

Wearable tech, with its ability to share your fitness stats, suggest routes, follow them, and so on, has been a growing feature of (certain) everyday lifestyles. This ability to share how the body moves, performs, and expresses itself gives many people much satisfaction.

One of the popular methods is Strava, which is primarily used by runners and cyclists to measure performance (perhaps to improve it), and also to share that information with others publicly. There are individual privacy settings that allow you to control what you do and do not share. All seems good and well. An individual can express their privacy settings in the app: that should be the end of the story. Yet Strava’s temptation is to share; otherwise we could just use other wearable tech that does not have such a user-friendly sharing ability, and be done with it.

Strava has recently shared a ‘Global Heatmap‘ that amalgamates all these different individuals sharing their exercise, their sweat, their pace, to Strava, for ‘all’ to access. Here we have a collective (yet dividualised, in the Deleuzian sense) body that has been tracked by GPS, the sweat allowing for an expression of joy, an affective disposition to share. This sharing allows for a comparison to what a normative Strava body may be, allowing for a further generation of sweaty bodies. Yet in the generation of these sweats, security becomes entangled.

This is where privacy and security become entangled in the midst of a fallacy of the individual. The immediate attention on Strava’s release quite literally maps to concerns over ‘secret’ locations, such as secret military bases, but also to some more trivial ones, such as how many people use it around GCHQ in the UK. This has led to calls for bans on those in military units, to reduce the exposure. However, this does not address how the problem arises from multiple individual choices in an app where privacy is framed only as individual anonymisation, even once the data is ‘publicly’ shared. The aggregated picture is ‘fuzzy’, full of traces of the dividual sweaty body. These sweaty bodies are flattened, treated as data points, then recalibrated as though all points are the same. In fact, they are not. Privacy and security are inherently collective.
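The collective character of this exposure can be sketched in a few lines (a toy example of my own; the names and coordinates are invented): even if only aggregate cell counts are published, any grid cell contributed by a single person still reproduces that person's route.

```python
# Toy GPS traces, binned into grid cells, for three hypothetical users.
traces = {
    "runner_a": [(0, 0), (0, 1), (0, 2)],
    "runner_b": [(0, 0), (1, 0), (2, 0)],
    "lone_runner": [(5, 5), (5, 6), (5, 7)],  # e.g. laps around a remote base
}

# Build the 'heatmap': record which users contributed to each cell.
cell_users = {}
for user, route in traces.items():
    for cell in route:
        cell_users.setdefault(cell, set()).add(user)

# Cells touched by exactly one person expose that person's movements.
exposed = {cell for cell, users in cell_users.items() if len(users) == 1}
lone_route = [cell for cell in traces["lone_runner"] if cell in exposed]
print(lone_route)  # the lone runner's entire route is recoverable
```

Anonymity here depends entirely on how many other bodies happen to sweat in the same cells; it is a property of the collective, not a setting any individual can choose.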

So, why does this matter? If individuals chose to share certain information then they were free to do so, and their individual routes are still not (re)identified. Yet privacy in this conception is based on a western canon of law, where the individual is prime. There is a proprietary sense of ownership over our data. This is not something I disagree with; much in feminist studies informs us of the importance of control over our bodies (and therefore the affects they produce: the sweat, on our mobile devices, on Strava). Yet there has to be a simultaneous sense of collective privacy at work. In this case it is rather trivial (unless you run a secret military base), yet it explodes any myths around the use of ‘big data’. Did Strava make clear it would be putting this data together? Was explicit consent sought beyond the initial terms and conditions? Just because something becomes aggregated does not mean we lose our ability to deny access to it.

The use of more internet-connected devices will allow for maps that gain commercial attention, but at the expense of any sense of collective privacy. Previously, only states were able to produce this information through a bureaucracy, and there were public, democratically-agreed steps to protect it (whether these are effective is another question). Now we live in a world where data is seen as something to be utilised, to be open, freely accessible. We need far more conversation that extends beyond the individual to the collective bodies we occupy. To address one without the other is a fallacy.

My, and your, sweaty body is not there for anyone to grab.

RGS-IBG18: A Critical Geopolitics of Data? Territories, topologies, atmospherics?

Nick Robinson and I have put together a CfP for the RGS-IBG Conference, which is below. This movement towards geopolitics is becoming far more dominant in my work, and such a session helps to bring together ideas I have had for many years, particularly since the start of my PhD, on data and its relationship to territory (especially after some excellent lecturing in my undergraduate days from Stuart Elden). I look forward to understanding how data is/are constructive of geopolitics, and how this may tie into some of the historical genealogies of the term.

A Critical Geopolitics of Data? Territories, topologies, atmospherics?

Sponsored by the Political Geography Research Group (PolGRG) and the Digital Geographies Working Group (DGWG)

Convened by Andrew Dwyer (University of Oxford) and Nick Robinson (Royal Holloway, University of London)

This session aims to invigorate lively discussions that are emergent at the intersection between political and digital geographies on the performativities of data and geopolitics. In particular, we grant an attentiveness to the emergent practices, performances, and perturbations of the potentials of the agencies of data. Yet, in concerning ourselves with data, we must not recede from the concrete technologies that assist in technological agencements that explicitly partake in a relationship with data, such as through drone warfare (Gregory, 2011), in cloud computing (Amoore, 2016), or even through the Estonian government’s use of ‘data embassies’ (Robinson and Martin, 2017). Recent literature from critical data studies has supported an acute awareness of the serious and contentious politics of the everyday and the personal, with geographers utilising this, for example, around surveillance and anxiety (Leszczynski, 2015). In recent scholarship, a geopolitical sensitivity has considered the contingent nature of data, the possibilities of risk and the performances of sovereignties (Amoore, 2013), or even the certain dichotomies found in data’s ‘mobility’ and ‘circulation’, and its subsequent impact upon governing risk (O’Grady, 2017). In this, we wish to draw together insights from those on affective and more-than-human approaches in their many guises, to experiment and ‘map’ new trajectories, that emulsify with the more conventional concerns of geopolitics to express what a critical attention to data brings forth.


In this broadening of scope, how we question, and even attempt to capture and absorb, the complex ‘landscapes’ of data is fluid. How do our current theorisations and trajectories of territory, topology and atmospheres both elude and highlight data? Do we need to move beyond these terms to something new, turn to something else such as ‘volume’ (Elden, 2013), or indeed move away from a ‘vertical geopolitics’ in the tenor of Amoore and Raley (2017)? Do we wish to work with, and make difficult, the complex lineages and histories that our current analogies provide us? Has geopolitical discourse, until now, negated the multitude of powers and affects that data exude? In this session, we invite submissions that offer a more critical introspection of data – its performativity, its affectivities, its more-than-human agencies – upon geopolitical scholarship, and that even reconfigure what geopolitics is/are/should be.


Themes might include:

  • Data mobilities
  • Data through, across, and beyond the border
  • Data and its reconfigurations upon notions of sovereignty and territory
  • Vibrancies and new materialism
  • Legalities and net neutrality
  • Affectivities and non-representational approaches
  • Visceralities and broader attentiveness to the body
  • Atmospheres
  • Infrastructures
  • Diplomacy
  • Popular geopolitics


Session information

This session will take the form of multiple paper presentations of 15 minutes, each followed by 5 minutes of questions.

Please send a 200-word abstract, along with name and affiliation, to Andrew and Nick by Monday 5th February 2018.

Further information can be found on the RGS-IBG website here.



New Year, New Me

So, this year will hopefully be the one where I gain a driving licence, pass the DPhil, and at least have a clear idea of what I intend to do in 2019 (if not already doing it).

One of my resolutions is to start writing blog posts alongside my thesis chapters, as a way to begin digesting the astounding amount of stuff I have collected over the past 27 months. However, I am aware of recent troubles where information released too early has then been published by someone else. Rather depressing as that is, it happens, and one should be aware of it (especially as an early career researcher). This also ties in with another resolution: to keep a diary of my broader life alongside the ‘day-job’. I kept one throughout all of my autoethnographic research, and it has been equally enthralling and disturbing to read of the banality of one’s life. Yet the usefulness of keeping a record of what I believe will be an interesting 2018 warrants, I believe, the time and attention it requires.

I’m not exactly sure how these will play out, but the aim is to express some thoughts that may never go into papers in their full form, or that may inform other bits I am writing. Because, in all honesty, much work never rises to the point of publication; not because of quality, but often for lack of time and interest. Then there is the more cynical reason that I need to demonstrate thinking beyond malware, to show my versatility as a researcher. These two objectives are complementary, and can be enjoyable, even if the thrust of one goes against what I would like to admit to myself. So I look forward to the rest of this year, and intend to leave some insightful comments and points along the way for anyone interested. I think my January post will likely revolve around the ‘Ecological Thought’ and its implications for cybersecurity.


This week I attended the #NATOtalk16 conference held at the (infamous) Hotel Adlon, along with a pre-session discussion with the youth arm of the German Atlantic Association (YATA). It was a great few days with a dedicated ‘cyber security’ group. Recommendations were written by all participants (available here), including my short paper on the future of NATO and deterring digital ‘warriors’ (a term I don’t like, but worked with). This is also shown below.

It was interesting to see NATO respond to the election of Trump and to the future of Germany within Europe in the context of Brexit. The event was in partnership with the British Embassy, Berlin, and it was clear there was an emphasis that Brexit should not damage the relationship with NATO. Meanwhile, Germany’s new security doctrine, published in the Weißbuch (White Book, German/English), takes a more assertive stance on Germany’s positioning, with a growth in spending and, in particular, the citing of Russia as a core concern. This has provided me with some interesting background and context for my thesis project on malware ecology, and how this is being thought of in more international relations circles.


Fluid Operations: NATO and Cyberdeterrence

Multiple actors, lack of attribution, and hybrid action are all part of modern warfare. The growth of the internet and other digital systems has rapidly led to cyber security becoming a serious concern, from individual users to (inter)national security. This short piece examines NATO and its ability to deter actors who attempt to subvert its collective security. It follows an analysis of current difficulties in deterrence, namely difficulties with attribution, low engagement barriers, and multiple actors. These concerns are then folded into avenues for further exploration in defensive and offensive operations, and what blended or hybrid responses may entail. An exploration of these issues concludes that the distinction between defensive and offensive operations in cyberspace is fluid, where ‘active defence’ utilising situational awareness provides the best deterrence against most actors.


Alertness to cyber security sharpened with attacks against Estonia in 2007. Although never fully attributed to Russia, it exposed the potential vulnerabilities that existed among allies as dependence on assets in cyberspace has grown. Additional events in Georgia in 2008 and more recently in Ukraine have demonstrated how cyberattacks can be blended in forms of hybrid attacks that aim to destabilise states before more conventional incursions occur. NATO has responded through developing a coordinated cyber security apparatus and the formalisation of doctrine that declares that international norms of engagement apply to cyberspace.

Yet, in comparison to previous decades, there has been considerable difficulty in engaging in forms of deterrence. I identify three of the most pressing:

  • Attribution: Due to the ability to mask location and to lay decoys as to the origin of an attack, conventional forms of deterrence are often not applicable.
  • Low Engagement Barrier: The pervasiveness of digital systems across allied and non-allied states increases the vectors and opportunities for low-skilled actors to engage.
  • Multiple Actors: Due to the low engagement barrier, it is not only states that have an interest in subverting NATO, but also criminals, terrorists, and hired mercenaries who may sell their services to the highest bidder.

Current policy options

We often divide defensive and offensive capacity, which enables clear doctrinal policy but is of little use to cyber security strategy. NATO is responsible only for its own internal systems and for ensuring that these integrate with allied systems. Yet it currently has no offensive capacity of its own, apart from those capacities developed by allies.

Defensive: In all scenarios, defence of critical systems provides the best deterrence against actors in cyberspace. This includes everyday management of critical national infrastructure, ensuring good education, and the monitoring of networks, along with other recognised good cyber security ‘hygiene’. My PhD research on malware ecology demonstrates that maintaining a good cyber security posture often prevents many subversions at entry points to the system. Yet due to interdependencies between systems, and between governments and business, there will always be deficiencies in cyber security, including the opening up of previously unknown vulnerabilities such as zero-day exploits.

Offensive: Discussions of offensive capacity in NATO often focus on the trigger for Article 5 and on what an armed cyberattack may constitute. This often descends into theoretical discussions of ‘cyber weapons’, which I will not go into here. Setting these aside, the options remain either a symmetric or an asymmetric conventional response. The former is often difficult due to the time required to develop a sophisticated response after an attack. The latter could be considered disproportionate, but is an essential arsenal for deterrence.


There is a false dichotomy between defence and offence in cyberspace. Ensuring security often requires scanning for threats prior to an attack or subversion. This means maintaining a high degree of situational awareness, of the kind that espionage traditionally provides. Developing potential offensive operations to be deployed in case of attack therefore provides the most appropriate avenue for deterrence. Publicly disclosing an arsenal of non-specific advanced defensive preparation may deter some attacks. This addresses proportionality, enhances situational awareness, and allows for preparedness. In addition, it aids attribution, as situational awareness of an array of actors can be pinpointed with greater accuracy, enabling responses that do not wrongly attribute to a state the actions of non-state actors.

Policy Recommendations

  1. Further enhance defensive capacity through good practices of cyber security that harmonise across allied states.
  2. Develop an offensive arsenal that can be rapidly deployed in the event of an attack through ‘active defence’.
  3. Maintain conventional asymmetrical response.

AAG2017 – Curating (in)security

I am thoroughly looking forward to the AAG with a session Pip Thornton and I have put together on ‘Curating (in)security: Unsettling Geographies of Cyberspace’. The programme is above, along with the original session outline below.

Curating (in)security: Unsettling Geographies of Cyberspace

In calling for the unsettling of current theorisation and practice, this session intends to initiate an exploration of the contributions geography can bring to cybersecurity and space. This is an attempt to move away from the dominant discourses around conflict and state prevalent in international relations, politics, computer science and security/war studies. As a collective, we believe geography can embrace alternative perspectives on cyber (in)securities that challenge the often masculinist and populist narratives of our daily lives. Thus far, there has been limited direct engagement with cybersecurity within geographical debates, apart from work on ‘cyberwar’ (Kaiser, 2015; Warf, 2015) and privacy (Amoore, 2014), or through the perspective of algorithm and code (Kitchin & Dodge, 2011; Crampton, 2015).

As geographers, we are ideally placed to question the discourses that drive the spatio-temporal challenges made manifest through cyber (in)securities in the early 21st century. This session attempts to provoke alternative ways we can engage and resist in the mediation of our collective technological encounters: exploring what a research agenda for geography in this field might look like, asking why we should get involved, and pushing questions in potentially unsettling directions. The session therefore seeks to explore the curative restrictions and potentials that exude from political engagement, commercial/economic interests, neoliberal control and statist interventions. The intention is not to reproduce existing modes of discourse, but to stimulate creative and radical enquiry, reclaiming curation from those in positions of power not only in terms of control, but by means of restorative invention.

Do cybersecurity objects matter?

Anyone who has been following my Twitter will realise I have been writing about malware as objects. This seems like a fundamentally weird, and perhaps useless, thing to do (and one I have wondered about myself). Yet thinking of objects as something that matters in cybersecurity is essential.

This is a question I posed myself: can malware be an object?

This was somewhat triggered by my other side as a geographer interested in space, time, and place. Evidently, when malware was emerging as a political concern in the 1990s, cyberspace was still often described as ‘frictionless’ and as traversing the Westphalian model of individual sovereign states – all part of a growing post-Soviet triumphalism of western liberalism. This is how malware is often seen: as being ‘out there’, bounded, and travelling with little connection to anything else. Yet I’ve never been able to put my finger on what a malware object may be – it is clearly much more than the software used to construct it. What about the writers (sometimes known as hackers, or artists), or the malware ecology of different interdependencies? Can it extend out to speeches, political discourse, malware laboratories? Some of these things would not exist if it wasn’t for malware. Quite what this object is remains an open question.

In a good start to thinking through these issues and their implications for cybersecurity, Balzacq and Cavelty (2016) (open access available here) take an actor-network theory approach. Though I disagree with some points, they highlight the importance objects have for, in this case, international relations. Objects also have a huge impact on computer science and cybersecurity. I do not want to dwell overly on the philosophy here, but over the past two decades or so there have been movements to appreciate objects as things in themselves, one of these being Object-Oriented Ontology (OOO). This helps us comprehend how objects, such as malware, have an ability to act and to cause change. I am not saying that malwares have intention, as that would suggest a human quality of maliciousness – that belongs to the humans working with them. Of course, objects in computer science have a somewhat different meaning to what I refer to here, but they do fit in: without falling into the trap Alexander Galloway (2013) notes, of orientating our thinking around the technology we talk about, objects have states and behaviours.

However, I do not think we can locate malware at a specific point on a map. If we think of how malware communicates – through command-and-control servers, in botnets, through peer-to-peer networking, using the internet – to download modules, to share information, to activate, then malware is stretched across multiple different places. If it requires information from a server routed through Ukraine, say, but its target is in the USA, then where is the malware object in the broader sense? Yes, there is local software on the individual machine, but it requires a connection to extract information, for instance. Then there are the political reasons that certain groups operate out of certain places, and the training and knowledge required to do certain things – all geographically dispersed. Can you separate the malware object from this? I think not: these become part of the malware object, made up of different elements – the local software on the machine, a server elsewhere, the right political conditions – that enable it to become malware in the sense that we can detect and analyse it, and that it becomes successful.
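To make this concrete, here is a minimal sketch of the argument as a data structure – every component name and location below is invented for illustration, not drawn from any real case:

```python
# Hypothetical sketch: the elements that together make up one "malware object",
# each located somewhere different (all names and places invented).
components = {
    "local payload":       "USA",        # software on the target machine
    "command-and-control": "Ukraine",    # server the payload calls back to
    "peer-to-peer relays": "multiple",   # botnet peers sharing modules
    "operators/knowledge": "dispersed",  # the people and training behind it
}

def where_is(obj: dict) -> set:
    """Asking 'where is the malware?' yields many places, not one."""
    return set(obj.values())

print(where_is(components))  # several answers; no single point on a map
```

The point of the sketch is simply that any query for the object's location returns a set of places rather than a single coordinate.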

So, when we consider malware as geographically distributed in this way, it is in tension, with lots of potential for something to happen (think of the Conficker botnet, which did very little). It is when all elements of the malware object come together in doing something that it really forms, and becomes malicious. Yes, we can see the warning signs through signatures, but it is only when the malware object comes together that it is something we can track, analyse, and detect through networks. This is why Advanced Persistent Threats (APTs) are so interesting: they are so stealthy that the object is very difficult to detect – it may not seem to be acting differently to the norm. When is an APT part of a malware object? This is something I need to do a bit more thinking on.
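As a rough illustration of what signature-based end-point detection does – and why it only ever catches the one element it can see, which is part of the argument above – here is a minimal sketch of hash matching (the signature database entry is invented for illustration):

```python
import hashlib

# Minimal sketch of hash-based signature detection, the classic end-point
# anti-virus technique: a file is flagged only if its hash matches a
# previously catalogued signature.
KNOWN_SIGNATURES = {
    # hypothetical database entry: the hash of an already-analysed payload
    hashlib.sha256(b"known malicious payload").hexdigest(),
}

def is_flagged(file_bytes: bytes) -> bool:
    """Detects only payloads that have already been seen and catalogued."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_SIGNATURES

print(is_flagged(b"known malicious payload"))    # True: signature matches
print(is_flagged(b"novel or stealthy APT code"))  # False: nothing to match
```

A stealthy or novel component produces no match, which is why the rest of the ecology – network behaviour, servers, operators – matters for detection.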

Therefore, when talking about malware, and when detecting it, it is about the entire ecology: it is not just end-point detection, because something only becomes malware when all the elements forge an object. This may sound obvious – but it disrupts the idea that an object is material, located in a fixed place at a certain time, and it adds tension to the mixture. You therefore have to tackle all parts of the ecology – computer science, international relations, crime – to attempt to force it into something that is only ever partially controlled. Connected thinking is essential to consider how to tackle malware; it cannot happen simply at the end-point. Evidently, this is just me dropping an idea at the moment, but I hope to work with it much more as a core tenet of how malware can be reconsidered to assist cybersecurity, and to challenge some geographical thinking.

Evernote – Data Protection Woes…

Unfortunately, it seems Evernote cannot be used for any personal or confidential information if you’re in the EU. As I was wading through the confidentiality and data protection requirements for my DPhil fieldwork, I had to really dig around to find out what the University’s (that is, Oxford’s) policy on cloud storage is. It appears this excludes any transfer of data outside of the EEA (the European Economic Area) – even with the new ‘Privacy Shield’ between the EU and the USA.

I was thinking of using Evernote as a simple tool to store notes and my research diary, with the syncing a useful back-up tool. However, Evernote is not yet a signatory to the new ‘Privacy Shield’, which you can check here. Although Evernote was a signatory to the old ‘Safe Harbor’ agreement, this is now invalid – as can be seen on this page – following the European Court of Justice’s ruling in October 2015. Therefore, if you are a researcher using Evernote with information that falls under data protection law, you are likely falling foul of your obligations to ensure it remains under EU jurisdiction.

Therefore I recommend you follow the instructions here to create a ‘local notebook’ that is stored only on the computer you are using. This is the only way to keep within the requirements of EU data protection and to ensure your research maintains data security integrity. I’m hoping Evernote signs up to ‘Privacy Shield’ soon so that I can sync my notes, as this would be very useful.


If I am wrong, it would be great to know, but after a good time searching I cannot find evidence to the contrary.

CDT Open Day

I wish I could have attended my centre’s open day, which I hear was a major success! It’s great to be part of a group of individuals pursuing some very different areas of cyber security across computer science, international relations, law, philosophy, and geography (well, only me, so far). Below is the poster that summarises my current DPhil onto an A1 sheet. I haven’t yet seen it printed, but it will have been on display yesterday.

I’m currently (still) in the process of organising my fieldwork, but hopefully I will get there. I am also still obsessing and dragging my feet over a piece on ‘objects’ I am writing, which I will present first at the ISA conference in February – this is definitely the furthest in advance I’ve written and thought in depth for a conference, and a subsequent paper, so I think that must be progress?