A couple of weeks ago, I went to the launch of Angela Saini’s new book, Superior. Saini explores the history of the concept of race, and how it intersects with politics. The event resonated with some of the things I’ve been thinking about recently. I’m interested in the politics of how data is collected and used to classify, control and assert power.
It also relates to my work as Press Officer at IF, a technology studio working on how to design services that give people agency and control over how data about them is used.
The problem with defining origin based on DNA
During the event, one story in particular caught my attention. Airbnb is working with 23andMe, the American DNA testing company, to offer people discounted travel to their ‘country of origin’.
The blog post announcing the offer says the collaboration will help people find ‘Airbnb Homes and Experiences in their native countries’.
This is problematic partly because many people still don’t understand that race is a social construct, a point Saini also makes in Superior. Defining a person’s origin based on DNA is highly political and oversimplifies the complex nature of identity.
Here’s a common example. You’d be hard pushed to find a British person of colour who hasn’t, at some point, had their identity challenged with the question: ‘But, where are you really from?’ As Saini argues, ‘squaring your appearance with your nationality is one of the hardest things about being an ethnic minority’. And the rise of a xenophobic right in the UK and Europe is encouraging an atmosphere of intolerance.
How is data shared between 23andMe and Airbnb?
As well as the problematic nature of defining origin based on DNA, there’s also the question of exactly how data is shared between the two companies.
23andMe claim they won’t share data with Airbnb. But there must be some kind of exchange between the two companies to be able to offer the service, as Andrew Eland argued in response to my tweet about it.
DNA data is extremely sensitive. So the lack of transparency around how and why data is shared is concerning. What is 23andMe getting from this? A more diverse database? So far, people who identify as white have been overrepresented in 23andMe’s data. This is unsurprising considering the history of how minority ethnic groups have been treated by geneticists. And what about Airbnb? What’s in it for them?
Airbnb and 23andMe are both American companies, and identity politics is on the rise in the US too. Just last month, the US government announced that almost all travellers to the USA will be forced to hand over social media details. How that data will be shared across government agencies is unclear. And civil liberties groups are concerned the data will be used to discriminate against specific groups, like people from countries where Islam is the majority religion.
The 23andMe and Airbnb data sharing agreement is just one example that exposes much deeper problems in the system. 23andMe allows GlaxoSmithKline to use its data to target people for drug trials. It also shares data with government agencies in certain legal circumstances, and police use of consumer DNA databases is, incidentally, how the Golden State Killer was caught. In the current political context, is it too far-fetched to imagine 23andMe one day giving the US government unlimited access to its database?
Data and technology, like science, are not apolitical
Using DNA data to define people’s identity will only serve to reinforce historic biases and oppression. And using science to simplify, organise and codify race and identity has proved destructive in the past. Identity cannot be reduced to a few data points.
Without the right levels of scrutiny, technology will scale historic bias and inequality. So, what needs to change?
"People still don’t understand that race is a social construct"
The way personal data is used needs to be clearer, to give people more agency and control over how that data is used. The business model underpinning services should be more explicit too. If the purpose of a service is to make money by selling DNA data, the company should be required to make that clear to people from the very start.
Collectively, we have the power to scrutinise and challenge the social impact of these data practices. This has already started, with the work of organisations like the ACLU in the US and Citizens Advice in the UK. But they need the right funding, skills and technologies to play the more powerful role we need them to play in data-driven societies.
Angela Saini was the target of a lot of hate speech after the launch of Superior. She recently left Twitter. History has shown that shutting down a diversity of opinion doesn’t end well. We need to find ways to harness the power of the collective to challenge how data, technology - and science - impact our lives and societies. Individuals can’t do it alone.
Big thanks to the IF team for their support in writing this post.
Biography
Grace Annan-Callcott is Press Officer at Projects by IF, a technology studio specialising in ethical and practical uses of data and AI.
Grace's role is central to increasing IF’s visibility and influence, and to ensuring IF shapes the narrative across issues of trust, privacy and data rights. Her expertise and ability to build strong relationships allow her to connect with journalists and influential organisations, establishing IF’s reputation as field-leading experts.
@GAnnanCallcott
@projectsbyif