Seminar – TORCH Oxford – 27 November – A Computational Reckoning: Calculating the Anthropocene

This will be happening in Trinity College’s Sutro Room on 27 November, 2-3pm.

I’ve got another talk coming up, in addition to the talk at King’s College London (7 November), but this time in a seminar session at the research network ‘Life Itself in Theory and Practice’, which is now in its second year. I had a great time going to these last year and I’m very happy to be able to speak on a bit of research I’ve been working on which did not necessarily work itself into my thesis (though it broadly pulls on similar arguments around computation and choice), but which sits within the broader contexts of the Anthropocene debates (an interest which started very early in 2015 when I contributed to a ‘Future Fossils’ exhibit at Society and Space).

I hope to have an interesting conversation from the other side of the spectrum to King’s, and this will hopefully inform how I take forward my wider arguments around choice and computation. I also get to play around with some media (such as the Netflix film Tau). Find the abstract for the seminar below:

Computation continues to deepen its integration, actualisation, and sensoriality among, and through, life, environments, and the ‘anthro’ of this era. Various works on geological media (Parikka, 2014) and technologically-mediated futures (Gabrys, 2016) have opened up how computation impacts this so-called Anthropocene. In this seminar, however, we will explore how computation not only unsettles our sense of dominance, but how, from the emergence of the general electronic, digital computer as the first non-organic ‘cogniser’ (Hayles, 2017), we surrender the last vestiges of human authority over ‘decision-making’ and ‘choice’. This is not restricted to machine learning or ‘AI’ but is at the core of all computation. Unlike much of the debate around ‘learning data’ and the biases of machine learning – all crucial for social justice – there is something more here, something far more nebulous, which cannot be attributed to us. This is the putare, the reckoning of computation: it is political. What, therefore, is political? How does the supposed ‘calculative machine’ make political choices beyond and through us? Together, we will seek to deepen our exploration and reckon with computation’s choices and decisions that are no longer – or, more precisely perhaps, become exposed as not being – our own. Yet, in delegating, even outsourcing, choices and decisions, are we not committing an act of calculative injustice? Is this not another frontier of neoliberalism, a movement away from human politics and intervention itself? How does this reckon with the huge energy required for such choices and decisions, and the implications for our climate? The Anthropocene is then not only about the demise or dominance of humanity, but about a new actor, present from the 1940s onwards, and the formulation and articulation of a new politics in our midst.

Talk at King’s College London – November 7 – Deciding on Choice: Cybersecurity, Politics and (Cyber)War

I am speaking on 7 November 2019 at KCL’s Cyber Security Research Group at their Strand Campus, London. I will be substantively developing my doctoral research on malware, complementing it with a broader appreciation of other computational ‘materialities’ and my critiques of ‘artificial intelligence’. To do so, I will blend and weave Hayles, Whitehead, and Derrida, among others, to establish my thoughts on decision and choice. In particular, this will be orientated towards questions of intentionality and politics in discussions of cybersecurity and, more so, of ‘cyberwar’. I am thoroughly looking forward to this event – and to the (inevitable and welcome) critique that I hope comes forth from a variety of perspectives.

There is an Eventbrite link if you wish to attend and it is open to all: https://www.eventbrite.co.uk/e/deciding-on-choice-cybersecurity-politics-and-cyberwar-tickets-74267759869

The abstract for the talk is below (and I will try to keep it as faithful to this as possible on the day):

Artificial intelligence will solve cybersecurity! It is an existential threat! It will make better decisions for us!

That is at least what we are commonly told.

In this talk, I instead unpick why we talk of decision-making in machine learning, its inherent failings, and the implications of this for the future of cybersecurity. To do so, I build on my doctoral work on malware to explore the intricacies of choice-making as one of computation’s core foundations. I argue that we must see malware – and other computational architectures – as political and as active negotiators in the formation of (in)security.

This means our contemporary notions of weapon and (cyber)war sit on shaky foundations in an age experiencing an explosion in computational choice. We have to decide on the role of choice in our societies, on what makes something ‘political’, and on what happens when we have alternative human and computational cognitive registers working together, on parallel and increasingly divergent paths.

Reflections on Data’s Dirty Tricks: The case of ‘value’

Last week, Oxford’s School of Geography and the Environment’s Political Worlds research cluster (thanks for funding the event!) hosted an event that I and several others organised around ‘data’s dirty tricks’. As chair, I had no idea what each panellist was going to speak on, which made it both a challenge and thought-provoking – there is more information in a previous blog post. Below is my summary as chair; it does not necessarily reflect a truthful account of the event, and any errors in it are mine.

We had three panellists: James Ball (journalist and author of Post-Truth: How Bullshit Conquered the World), Dr Lina Dencik (Director of the Data Justice Lab at Cardiff University and co-author of Digital Citizenship in a Datafied Society), and Dr Vidya Narayanan (Director of Research at the Oxford Internet Institute’s Computational Propaganda Project). Each contributed a different dimension – broadly, from the media, from the tracking of propaganda, and from issues of surveillance and privacy.

What came out most for me was the question: what is value? This was a common theme, whether in James Ball’s account of fake news and how the media attempts to discern what is valuable or not (i.e. more likely to be truthful, deceptive, or indeed an outright lie); in Vidya Narayanan’s discussion of whether Russian bots, for example, had a decisive influence on the ‘Brexit’ process (contrary to popular perception, it seems they did not – they only amplified, or in her words led to further polarisation of, the existing views of groups); or in Lina Dencik’s more fundamental critique of the value of global monopolies such as Facebook and whether they provide any value at all (her discussion of how these companies are steering the debate, and thus shutting down political debate of their actions, was exceptional – why do we have to respect their value frameworks at all?). All three provided an excellent overview of different aspects of data’s dirty tricks – and of how, I think, data has become a question of values. In this sense, how are the collection and processing of data, and the decisions made upon it, laden with different forms of value – and how do these interact and come into conflict in different spaces? The values of Silicon Valley are different to those of Westminster, and both are different again to those of people on Universal Credit who see their data being used, tracked, and analysed.

The core ‘surprise’ was that data’s dirty tricks are actually quite tricky. Cambridge Analytica, the firm that took large amounts of personal data from Facebook and used it in political campaigning, was actually pretty poor at changing people’s minds and votes – nowhere more so than on Brexit, though with perhaps a greater influence on the 2016 US Presidential election. Trying to convert selected data and derive particular forms of value from it is hard – whether you wish someone to buy a product or vote a certain way in an election. No doubt there are some avenues in which data has been used adversely – but as Ball pointed out, it was the hacking of the US Democratic National Committee (DNC) by the Russian state hacking group CozyBear (APT29), which released emails relating to Hillary Clinton, that more likely swung the election. This is not to say that computing and hacking cannot be influential, but that data’s dirty tricks may not be all they’re cracked up to be. This is reinforced by Narayanan’s work on Russian bots, which showed they are semi-automated and rather poor at directing people in certain ways – only polarising those in different groups away from one another. But maybe that’s enough: to cause polarisation?

Whether we have ‘strong’ organisations also cropped up, with Dencik arguing that, due to austerity, there has been a weakening of the state’s ability to counteract the demands of tech companies. This leads governments and other organisations to accept their demands; she cited NHS contracts with Alphabet’s DeepMind that took data with little to no patient consent. It is therefore not only about individual consent over data but about the collective privacy issues that emerge when data is used in these ways. Yet Ball was keen to emphasise that the mainstream media, not social media, is actually the main proponent of fake news, and that it is up to the media to do more fact-checking – though the stresses of journalism make this hard. One thing he said was that we should be proud that the UK has the BBC, as this provides a pivotal place from which to challenge inaccuracies and restrict filter bubbles… However, what is distinctive about data is its ability to move in ways previously impossible – and that is the new challenge: one of speed and distribution, rather than one of distinct difference.

I think the discussion left us with two avenues: one where the contortions of social media data do very little, and another where the political economies (riffing off Dencik) of the use of data are challenging conventional political decision-making. What I find interesting is the recent focus on the former (Facebook, elections, and so on), with little attention paid to the everyday issues of Universal Credit, NHS contracts, and outsourcing that are based on the sharing of data without appropriate consent. Hence, the dominant focus on data’s dirty tricks should perhaps switch to the latter – to asking what the politics behind the use of data are, rather than how data can influence elections (as it turns out, it does very little there). Data’s dirty tricks, to me, seem as laden with power as they ever have been.

Data’s Dirty Tricks – Oxford, 15 November 2018

Data’s dirty tricks: The new spaces of fake news, harvesting, and contortion

As part of the new Dialogues series in the Political Worlds research cluster at the University of Oxford’s School of Geography and the Environment, we are hosting a panel on ‘Data’s Dirty Tricks.’ I have helped organise this event (with Dr Ian Klinke and Dr Daniel Bos) and shall be chairing, with three fantastic speakers. The official link is now available on Oxford Talks.

These are James Ball (journalist and author of Post-Truth: How Bullshit Conquered the World), Dr Lina Dencik (Director of the Data Justice Lab at Cardiff University and co-author of the forthcoming book Digital Citizenship in a Datafied Society), and Dr Vidya Narayanan (a researcher at the Oxford Internet Institute’s Computational Propaganda Project, looking at the impact of AI).

This is being held in the Herbertson Room of the School of Geography and the Environment at 16:30, Thursday 15 November 2018 (Week 6). The blurb is below, and everyone is welcome.

In this panel we invite three individuals from different backgrounds, within and outside of the University of Oxford’s School of Geography and the Environment, to offer their take on data’s dirty tricks. In an age where fake news is on the rise and data is harvested from social media platforms and beyond, what is the impact upon us all? We ask: what are the landscapes of fake news and data harvesting, and their contortions of conventional democratic spaces? How is it possible to respond to, tie together, and understand new forms of geopolitical strategy? How do democracies respond to big data, and what should be done? This panel seeks to explore these questions with people who take alternative approaches, offering insights into how data’s dirty tricks have impacted us so far, what is being done to tackle them, and what should be done in the future.