
New forms of governance needed to safely and ethically unlock value of data

Despite today’s abundance of data, most of it remains locked in silos, meaning new governance structures are needed to open it up and unlock its potential value to society

In an increasingly data-driven world, new forms of governance are needed to unlock the full potential of data while minimising its risks, something the government is trying to achieve through data trusts.

The potential of data trusts is being explored by a partnership between the government’s Office for Artificial Intelligence (AI) – itself a joint venture between the Department for Digital, Culture, Media and Sport (DCMS) and the Department for Business, Energy and Industrial Strategy (BEIS) – and the Open Data Institute (ODI).

The Office was established in 2017 as part of the government’s industrial strategy, which focuses on investment in AI, innovation, digital skills and infrastructure as a way of increasing the UK’s productivity.

The aim of the data trusts is to create a system of data pooling between organisations – government departments and private entities – which will allow them to share data in a safe, fair and ethical way.

It is hoped this will foster collaboration between organisations so they can work together to tackle a range of societal issues, from recycling and food waste to improving the efficiency of roadworks and construction projects.

An example of this is the pilot data trust the ODI will be running with the Royal Borough of Greenwich, which will focus on collecting real-time data from internet of things (IoT) sensors and sharing it with innovators in the technology sector so they can come up with solutions to challenges in the urban environment.

The opening and pooling of civic and corporate data will give organisations a chance to work with previously inaccessible data as, despite today’s abundance of information, many datasets remain in silos, which prevents them being used effectively.

“It’s a thought that when you are generating your own data, when organisations and government departments are generating data, there might be a fundamental utility in pooling that in an organisation that has a declared broader benefit,” said Nigel Shadbolt, chairman and co-founder of the ODI.

“We never thought quite hard enough about how data, as this extraordinarily vital commodity, needs to have some assurance that we can share it in a way that is more broadly empowering and effective for both a collective and individual,” he added.

For the data trusts to work, and for the public to trust them, Shadbolt said the data must be used and available according to very clear rules. “One of the rules might be, for example, that it can’t be privatised in the future, that there’s some kind of lock-in,” he said.

“I think they can only trust it by virtue of its deeds, by how it is set up, and by the strength of the covenants that are in place. How strict are all those trustee obligations, what exactly are the conditions under which that’s being made available?”

Public concerns about data promiscuity

Since the Cambridge Analytica scandal, public concern about data privacy has increased vastly.

“Awareness and interest in personal data is mainstreaming. That’s something we’re starting to observe in speaking to data protection organisations around the UK at the moment,” said Steve Wood, UK deputy information commissioner at the Information Commissioner’s Office (ICO).


“We’ve always seen people who were activists, who were particularly concerned about accessing their own data and exercising their rights, and people would often do that if something went wrong in their lives, perhaps their credit score was wrong, but we’re increasingly seeing people who are genuinely interested to know what data is held on them,” he added.

However, our data is still incredibly promiscuous. Shadbolt explained how his research group in Oxford downloaded more than a million apps from the Google Play Store to examine the code and see where the data was going.

“What we observed was there’s massive amounts of data concentration that regulators need to know about,” said Shadbolt.

“You as an individual are emitting huge amounts of personal data, if you have a sense of it, when you see the landscape, when you see where a particular app is shipping the stuff out to, it does give you that sense of what agency do I have over any of this?

“I had no idea that I was being quite so promiscuous with my data, and it made me think more strongly that we need to make those data destinations very transparent.

“We’re not asking everybody to adjust the settings on every single app, but there needs to be a much stronger set of regulations and assumptions about what is being tracked and what is going where.”

Collaborative governance structures

According to multiple panel discussions and speakers at the ODI Summit 2018, the key to unlocking the value of data, as well as protecting against its potential risks, is collaboration – not only between organisations, but with the public as well.

“One of the things people said [in our focus group work] was it feels like there are no rules on this, or what rules there are, are in favour of the tech companies,” said Catherine Miller, director of policy at Doteveryone, a think tank promoting responsible use of technology.

“There’s an ombudsman for my pension but no ombudsman for the internet, so there’s a sense that people are grappling with technological change very much on their own, without any social infrastructure around them,” she added.

The exact structure of the data trusts, however, is yet to be finalised, meaning a number of questions and challenges remain.


“What I would be interested in seeing here is the process behind this, in terms of how lawful it will be, how transparent it will be, how accountable it will be, where the independent oversight is coming from, for example,” said Nóra Ni Loideain, director and lecturer at the Information Law and Policy Centre.

“These aren’t novel questions – in the 1960s, we had environmental impact assessments, we had social impact assessments, then we had privacy impact assessments in the 1990s, and now with GDPR we have the data protection impact assessment,” she said.

“Legally speaking, if we’re going to be setting up data trusts with massive amounts of data and serious risks in terms of data security and the implications on people’s privacy, questions about consent, we’re going to have to have data protection impact assessments,” she added.

In terms of how decisions are made about data usage, the trusts could potentially help strike a balance between purely giving organisations control, which could encourage monopolistic behaviour and further entrench the power imbalance, and purely giving individuals control, which would require significant effort on their part to manage the vast amounts of data held on them.

“I think the notion that an individual wants to literally manage large amounts of data about themselves is a strange sort of idea,” said Roger Taylor, chair of the Centre for Data Ethics and Innovation, which was set up by the DCMS in June 2018 to help create ethical frameworks for the use of emerging technologies.

“But the ideas that organisations could act on behalf of the individual and [that there would be] some separation, as it were, of the provision of services based on data and the control of that data, are interesting. The role of the centre is to explore these ideas and identify how they can fix these problems.”

A collaborative approach could also help to make controlling data more inclusive, and less prone to discrimination or abuse.

“We also want to recognise that data isn’t a neutral tool – we embed our values into what we collect, into how we use data, how we share it, we embed our values in how we choose to govern it and regulate it, and we embed our values in how we decide who benefits from data, who we protect, who we include in making choices about how data is used,” said Jeni Tennison, CEO of the ODI.

“Our data ecosystem is complex. Sometimes even well-justified, well-intentioned data policies and practices can have unanticipated, harmful side effects. We think we can be more alert to these if we deliberately include a range of voices,” she added.

There are plans to run pilots for these data trusts in the coming months – one with the Mayor of London’s office and one with the Royal Borough of Greenwich – to see how they will work. These pilot trusts will initially look at unlocking the value of environmental and geospatial data.

Read more about data management

  • Although clearly a good thing in theory, open data risks increasing the digital divide and social inequality unless approached in the right way.
  • Chair of the new Centre for Data Ethics and Innovation says collaboration is vital for developing effective frameworks to manage the proliferation of AI and data-driven technologies.
  • Secretary of state for DCMS, Jeremy Wright, announces make-up of Centre for Data Ethics and Innovation board at ODI Summit.
