
Public sector IT projects need ethical data practices from start

Data ethics needs to be integrated into public sector IT projects from the very start, and considered throughout every stage of the process, to be effective

Public sector organisations should take an iterative approach to data ethics that encompasses every stage of a project, from initial data collection all the way through to live implementation and beyond, so that the lessons learned can be truly incorporated into future work, says data ethicist Haroon Ahmed.

Ahmed, commercial partner for data capability at public sector technology delivery firm Made Tech, says that a major focus for Made Tech is integrating data ethics into every project it works on, which it refers to as “ethics by design”.

He tells Computer Weekly that, far too often, data ethics is simply ignored by public sector organisations because it is perceived as too contentious, complicated or time-consuming. He also notes that across governments generally, and in the UK government in particular, there is no clear, standardised direction or approach to data ethics.

“Maybe you can’t [adopt standardised approaches] because there’s so much variation in the types of work different departments do… [but] usually data ethical problems turn up when it’s a bit too late, when things have already gone wrong,” he says.

To overcome this and to catch any issues early on, Ahmed says organisations need to think about the ethics of data inputs and sources from the very start of the process.

“The starting point is looking at the whole data journey. So it’s looking at, what are your data sources? Who has the rights on those data sources? Are there any limitations on data sources?” he says, adding there also needs to be clarity around why data is being collected and processed in the first place.

From here, he says organisations should start measuring the effect of the data on different demographics, so they can see where it is producing both positive and negative impacts.

For example, organisations should look at what kinds of bias are “inadvertently being created” in their models, as well as through the ways they collect, share and use the data.
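Ahmed does not prescribe specific tooling, but one common starting point for this kind of impact measurement is to compare outcome rates across demographic groups. The sketch below is a minimal, hypothetical illustration in Python using pandas; the column names, data and the decision to flag a gap are assumptions for the example, not part of Made Tech’s approach.

```python
import pandas as pd

# Hypothetical data: one row per decision made by a model or process,
# with the demographic group of the person affected. Illustrative only.
df = pd.DataFrame({
    "group":    ["A", "A", "A", "B", "B", "B", "B", "A"],
    "approved": [1,   1,   0,   0,   0,   1,   0,   1],
})

# Positive-outcome (approval) rate per demographic group.
rates = df.groupby("group")["approved"].mean()
print(rates)

# Demographic parity difference: the gap between the best- and
# worst-treated groups. A large gap is a prompt to inspect the data
# sources and collection methods, not proof of bias by itself.
gap = rates.max() - rates.min()
print(f"Outcome-rate gap between groups: {gap:.2f}")
```

A check like this is only a first signal: in practice the gap would be tracked over time and traced back to the data sources and collection methods Ahmed describes.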

“Are there practices that you’re developing that reinforce stereotypes or inequalities that may exist within the data – well, look at the data sources, who’s collecting them, how are you collecting that data?” Ahmed says.

“Look at data that may propagate some falsehoods. A lot of the time we’re structuring data around opinions – for example, if you’re mining data from social media, what sort of information are you gathering? How much can you trust it, what is news and what isn’t? Most of the time, bias [is introduced] in the way you’ve collected the data, so look at your research methods.”

Ahmed says that once negative impacts are identified, organisations should immediately move to think about mitigation: “It’s just asking yourself clear and transparent questions around how you’re setting up the project… there’s also a level of testing and rigour you should put in when you’re building. Have private betas, develop them, test them before going out and launching something live that is then going to give you a bad name and crush everything.”

Communicating purpose 

Ahmed adds that the need for clarity and transparency is even more important for public sector organisations developing citizen-facing services, and that members of the public should ideally be consulted on whatever is being built.

Giving the example of Made Tech’s work with a local authority on a data platform project, Ahmed says that members of the public are consulted virtually on a regular basis, so that they can see the impact the platform is having and how data is being processed by the system.

“When you openly communicate your purpose – when you’re open about how you’re doing things, how data has been used – you instil the most valuable thing when doing data projects, which is trust,” he says. “We are probably in a place where people don’t trust organisations or the government with their data, and that’s a bad place to be… so the transparency, openness, communicating purpose, engaging with people is fairly key.”

Ahmed adds that, even in situations where it is not possible to consult the public – for example, when systems are being built for law enforcement or intelligence purposes, or just internally for use by civil servants – openness among the internal stakeholders is still very important.

All of this should also be documented: “To make [an ethical] framework that’s suitable to what you’re doing as an organisation, keep referring to it, refresh it, measure your outcomes, learn from it… the potential loss of revenue or reputation for an organisation or government department is huge if it goes wrong. With ethics, the win is not getting things wrong, the win is having less negative impact or more positive impact.”

Ahmed adds that, while ethical frameworks should be drawn up at the start of any project or body of work so that they can be referred back to as work progresses, they also need to be “living documents” that are updated with lessons learned.

Giving the example of the UK government’s data ethics framework, published in 2018, Ahmed says: “I can’t tell you any examples of where that’s been used.” He adds that, most of the time, such frameworks are made, published and then “parked”.

“As you learn more, as your business and your organisation changes, as you become more data mature and digitally mature, [frameworks] should be updated, and there should be ownership of that,” he says, adding that while very few public sector clients come to Made Tech with specific requests to consider data ethics, “nobody’s ever against it… or pushes back on it” when the subject is broached.

“It’s about departments proactively thinking about building it into projects, and not just on the ones we’re doing with them – they do tonnes of projects internally,” he says.

Taking these steps should, according to Ahmed, help to create a “culture of data ethics” so that it is no longer seen as a barrier, or otherwise treated as an afterthought, by organisations.
