Have you ever noticed how often we use metaphors in our day-to-day language? The words we use matter, and metaphorical language paints mental pictures imbued with hidden and often misplaced assumptions and connotations. When examining the impact of the metaphorical images used to represent the technologies and concepts grouped under the term artificial intelligence, it can be illuminating to drill down into one element of AI: data.
Hattusia recently teamed up with Jen Persson at Defend Digital Me and The Warren Youth Project to consider how the metaphors we attach to data impact UK policy, culminating in a report on data metaphors.
In this report, we explore why and how public conversations about personal data don’t work. We suggest what must change to better include children for the sustainable future of the UK national data strategy.
Our starting point is the influence of common metaphorical language: how does the way we talk about data affect our understanding of it? In turn, how does this inform policy choices, and how do children feel about the use of data about them in practice?
Metaphors are routinely used by the media and politicians to describe something as something else. This brings with it associations in the mind of the reader or listener: we don't only see the image, we also receive the author's opinion or intended meaning.
Metaphors are often deployed to influence the audience's opinion. This matters because policymakers use metaphors to frame and understand problems, and the way you understand a problem shapes how you respond to it and construct a solution.
Looking at children’s policy papers and discussions about data in Parliament since 2010, we worked with Julia Slupska to identify three metaphor groups most commonly used to describe data and its properties.
We found that many academic and journalistic debates frame data as 'the new oil', for example, while others describe it as toxic residue or nuclear waste. The range of metaphors used by politicians is narrower and rarely as critical.
Through our research, we’ve identified the three most prominent sets of metaphors for data used in reports and policy documents. These are:
- Fluid: data can flow or leak
- A resource/fuel: data can be mined, can be raw, data is like oil
- Body or bodily residue: data can be left behind by a person like footprints; data needs protecting
In our workshop at The Warren Youth Project, the participants used all of our identified metaphors in different ways. Some talked about the extraction of data being destructive, while others compared it to a concept that follows you around from the moment you're born. Three key themes emerged from our discussions:
- Misrepresentation: the participants felt that data was often inaccurate, or used by third parties as a single source of truth in decision-making. In these cases, there was a sense that they had no control over how they were perceived by law enforcement and other authority figures.
- Power hierarchies and abuses of power: this theme came out via numerous stories about those with authority over the participants having seemingly unfettered access to their data through opaque processes, leaving the participants feeling powerless.
- The use of data ‘in your best interest’: there was unease expressed over data being used or collected for reasons that were unclear and defined by adults, leaving children with a lack of agency and autonomy.
When looking into how children are framed in data policy, we found they are most commonly represented as criminals or victims, or are simply missing from the discussion. The National Data Strategy makes many claims about how data can benefit society in the UK, but it mentions children only twice and mostly talks about data as if it were a resource to be exploited for economic gain.
The language in this strategy and other policy documents is alienating: it reduces children to data points for the purpose of predicting criminal behaviour or attempting to protect them from online harm, while the voices of children themselves are left out of the conversation entirely. We propose new and better ways to talk about personal data.
To learn more about our research, watch this video (produced by Matt Hewett) in which I discuss the findings. It breaks down exactly what the three metaphor groups were, how the experiences young people and children had with data linked back to those groups, and how changing the metaphors we use when we talk about data could be key to inspiring better outcomes for the whole of society.
We also recommend looking at the full report on the Defend Digital Me website here.