Publishing here my afterword for “2030, A New Vision for Europe”, the manifesto of the European Data Protection Supervisor, Giovanni Buttarelli, who died this summer. The manifesto was developed by Christian D’Cunha, who works in the EDPS office, based on his many conversations with Giovanni.
“A cage went in search of a bird”
Franz Kafka certainly knew how to write a story. The eight-word aphorism he jotted down in a notebook a century ago reveals so much about our world today. Surveillance goes in search of subjects. Use-cases go in search of profit. Walled gardens go in search of tame customers. Data-extractive monopolies go in search of whole countries, of democracy itself, to envelop and re-shape, to cage and control. The cage of surveillance technology stalks the world, looking for birds to trap and monetise. And it cannot stop itself. The surveillance cage is the original autonomous vehicle, driven by financial algorithms it doesn’t control. So when we describe our data-driven world as ‘Kafka-esque’, we are speaking a deeper truth than we even guess.
Giovanni knew this. He knew that data is power and that the radical concentration of power in a tiny number of companies is not a technocratic concern for specialists but an existential issue for our species. Giovanni’s manifesto, Privacy 2030: A Vision for Europe, goes far beyond data protection. It connects the dots to show how data-maximisation exploits power asymmetries to drive global inequality. It spells out how relentless data-processing actually drives climate change. Giovanni’s manifesto calls for us to connect the dots in how we respond, to start from the understanding that sociopathic data-extraction and mindless computation are the acts of a machine that needs to be radically reprogrammed.
Running through the manifesto is the insistence that we focus not on Big Tech’s shiny promises to re-make the social contract that states seem so keen to slither out of, but on the child refugee whose iris-scan cages her in a camp for life. It insists we look away from flashy productivity PowerPoints and focus on the low-wage workers trapped in bullying drudgery by revenue-maximising algorithms. The manifesto’s underlying ethics insist on the dignity of people: the idea that we have inherent worth, that we live for ourselves and for those we love, and to do good; and not as data-sources to be monitored, monetised and manipulated.
In October 2018, the Vatican’s Papal Nuncio to the European Union wrote to Giovanni to support the work of the 40th International Conference of Data Protection and Privacy Commissioners. He said technology is a precious resource when it’s working for everyone, but that technology alone cannot set the direction of human progress. You don’t have to be a Catholic to insist that we ditch cute, reductionist mind-games like the ‘trolley problem’ to decide who wins and who loses, and insist that technology ethics are instead grounded in respect for people. And you shouldn’t have to sound radical to insist that tech business models must serve and be accountable to us, not the other way around.
The manifesto and its Ten-Point Plan for Sustainable Privacy show there is another digital path forward. Not the oppressive brittleness of China’s state sovereignty model, and not the colonialist extraction of Silicon Valley. There is a European Union version of the Internet that starts with the society we as citizens want to live in, and then figures out how to get there. It recognises that just as we don’t live our lives to serve corporate interests, nor must we sacrifice our private and public spaces to serve the state. Because in any future we actively want to live in, autonomy is for humans, not machines.
The European vision of our digital future will take the work of many of our lifetimes to achieve. That eight-word story doesn’t have an ending we can yet see. The surveillance cage cannot help but try to trap birds. That’s its programming. That’s just what it does. But the cage isn’t the technology; the cage is our flawed and narrow assumptions about what technology can do.
The manifesto says we must be optimistic about the future of technology so we can be optimistic about the future of our world. It’s right. Right now, technology itself is in the cage. There is so much more technology can do – banish inequality, repair our environment and support us all in living our best lives – if we cut it loose from the business models that entrap us all.
When indignant interviewers asked Giovanni if Europe was imposing its views about privacy on the rest of the world, he would reply courteously that Europe was just setting an example. (Countries figuring out how to secure an adequacy finding may disagree!) But he was right. Just the fact that a major trading bloc insists in both word and deed that there is another way, a path to a digital future we actively choose, is almost enough.
Almost.
{ 7 comments }
Gareth Wilson 11.18.19 at 10:29 pm
“Running through the manifesto is the insistence that we focus not on Big Tech’s shiny promises to re-make the social contract that states seem so keen to slither out of, but on the child refugee whose iris-scan cages her in a camp for life.”
If you think that people need iris scans to keep child refugees in a camp for life, you need to reread the Book of Exodus.
notGoodenough 11.19.19 at 10:35 am
@ Maria, OP
Thank you for this – very informative!
While the rate of innovation may be good from a research perspective, sadly all too often the ethical considerations lag behind. While I think most people would agree that advances can be good, I think most would also agree that this isn’t always the case – implementation is a key factor, and letting businesses self-regulate doesn’t always seem to be the wisest course of action (to say the least).
In short, thanks for the interesting post.
notGoodenough 11.19.19 at 11:02 am
Gareth Wilson @ 1
With respect, nowhere in the post is it implied that iris scanners are required for refugee camps, so you would appear to be disagreeing with a position that hasn’t been proposed.
I won’t speak for the OP, but I think it is not a particularly controversial idea that technology can facilitate things (for good or ill), and so it would seem not unreasonable that there should be at least some consideration of potential ethical and social ramifications during the process of implementation.
For example, most people using modern technology generate information about themselves (the websites they visit, who they follow on the twitters, who their facebook friends are, etc.). What should be that person’s right to privacy and control over their information? I don’t think it requires the most active imagination to think of ways this could be exploited (and indeed, one might be able to find examples even now), and leaving these things unregulated does rather open that possibility.
While people may disagree over whether it is problematic for companies to commodify our electronic information (and to what extent, if any, this should be regulated), I would hope you agree that the conversation is worth having?
Claire 11.19.19 at 5:15 pm
My partner and I were literally just complaining that one of the many ways in which Frontline’s “AI” documentary failed was the complete omission of Europe from the discussion. (If you haven’t seen it, the documentary is pretty maddening, depicting “AI” as an unstoppable force taking people’s jobs, manipulating them through social media, and spying on them to ensure they behave according to its rules. At no point do they mention that all of these things are being chosen by people, whether running companies or governments, and that there are other choices that could be made.)
nastywoman 11.20.19 at 4:33 am
– as somebody who supports every single point of this “10-Point Plan for Sustainable Privacy” – I even think it doesn’t go far enough.
And the reason why I think it doesn’t go far enough: I have these friends who – (in the US) – didn’t get some jobs they really, really wanted – just because somebody breached their “privacy”. And they didn’t get “these jobs” – (or “these apartments” or “these boy-or-girlfriends”) – BE-cause somebody breached their privacy.
And that’s so terribly easy with Internet search machines?
AND perhaps – besides the point that the EU provides such great affordable health care and secure jobs – which pay living wages – and long vacations and free education – THE utmost important reason why the EU is, to say it very simply, “sooo preferable” – is that in the EU you STILL can get – (a job – an apartment – a girl-or-boyfriend) – WITHOUT everybody breaching YOUR “privacy”.
Stephen 11.21.19 at 6:59 pm
As far as I understand it, this sort of thing, in spades redoubled, is now happening in China, particularly in regions thought to be disaffected to the central government.
Am I wrong?
Or are the PRC government to be held to different standards?
notGoodenough 11.22.19 at 10:28 am
Stephen @ 6
My apologies, but I think I’m having trouble understanding your comment. In the interests of trying to better follow your reasoning, perhaps I could ask you to expand a little in order to explain your point?
For example, you seem to be implying that in this discussion about a specifically European manifesto, China is being held to a different standard – how did you come to that position?
While there is no in-depth discussion (it would seem to be out of scope for this specific post), the OP does allude to other countries briefly:
“Not the oppressive brittleness of China’s state sovereignty model, and not the colonialist extraction of Silicon Valley.”
Which, at least to me, are not overly flattering terms for the state of electronic privacy in China and America (regardless of whether or not you think those are fair characterizations).
So, if I might impose upon you to indulge me a little, could you help me by clearing up the following:
1) To what end did you make your comment (i.e. is your intention to ask for more discussion of other privacy laws, to highlight what you think is a double standard, etc.)?
2) In what way do you think there is a double standard, and why?
nG