
Government data quality framework geared to overcome poorly organised data

The Office for National Statistics has published guidelines aimed at improving “disorganised” public sector data

The Office for National Statistics (ONS) has published guidelines to improve the quality of data in the public sector.

The Government data quality framework, developed by the Government Data Quality Hub and Government Digital Service, provides five principles for data quality management in government.

In a joint foreword to the framework, Ian Diamond, national statistician, and Alex Chisholm, chief operating officer for the Civil Service, wrote: “The 2019 Public Accounts Committee report showed that data has not been treated as an asset, and that it has become normal to ‘work around’ poor-quality, disorganised data.

“The extent of the data quality problem within government is poorly understood. Work on data quality is often reactive and not evidence-based. Where quality problems have been identified, the symptoms are often treated instead of the cause, leading to ineffective improvements and wasted resources.”

The framework’s five principles are: commit to data quality; know your users and their needs; assess quality throughout the data lifecycle; communicate data quality clearly and effectively; and anticipate changes affecting data quality.

Julia Lopez, Cabinet Office parliamentary secretary, said in a statement issued by the ONS: “The publication of this framework represents a milestone for data quality in government and is a step towards building a true data quality culture in the civil service.

“We hope to see more decisions being made on the basis of data that is complete, consistent, reliable and accurate.”


James Tucker, head of the Government Data Quality Hub, added: “A data revolution is transforming almost every aspect of people’s lives. Data held by the public sector is therefore one of the UK’s most valuable assets, and it’s vital that it is fit for purpose.

“This framework sets out the principles and practices to enable Government to understand, communicate and improve quality, so that we can all have the utmost confidence in our data.”

The framework document itself stated: “At present, we lack a consistent approach to managing data quality across government. This framework draws on international and industry best practice and sets out a series of principles, practices and tools aimed at achieving fit for purpose data. The framework asks organisations to develop a ‘culture’ of data quality, by treating issues at source, and committing to ongoing monitoring and reporting.”
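To make the framework’s call for “ongoing monitoring and reporting” concrete, the sketch below shows what a minimal automated data quality check might look like in Python. The dataset, the field names (“id”, “postcode”) and the two dimensions measured (completeness and uniqueness) are illustrative assumptions for this article, not part of the published framework.

```python
"""Minimal sketch of an automated data quality check. Field names,
thresholds and the sample data are hypothetical illustrations."""

from dataclasses import dataclass


@dataclass
class QualityReport:
    total_records: int
    missing_postcode: int
    duplicate_ids: int

    @property
    def completeness(self) -> float:
        # Share of records with a postcode present (guard against empty input).
        return 1 - self.missing_postcode / max(self.total_records, 1)


def check_quality(records: list[dict]) -> QualityReport:
    """Measure two simple quality dimensions: completeness and uniqueness."""
    seen_ids: set[str] = set()
    missing = 0
    duplicates = 0
    for record in records:
        if not record.get("postcode"):  # completeness check
            missing += 1
        record_id = record.get("id", "")
        if record_id in seen_ids:       # uniqueness check
            duplicates += 1
        seen_ids.add(record_id)
    return QualityReport(len(records), missing, duplicates)


if __name__ == "__main__":
    sample = [
        {"id": "A1", "postcode": "SW1A 1AA"},
        {"id": "A2", "postcode": ""},           # incomplete record
        {"id": "A1", "postcode": "SW1A 2AA"},   # duplicate id
    ]
    report = check_quality(sample)
    print(f"completeness: {report.completeness:.0%}, "
          f"duplicates: {report.duplicate_ids}")
```

Run regularly against a live dataset, a report like this would support the framework’s emphasis on treating issues at source rather than reacting once poor data has already caused problems downstream.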

The document included three case studies of data quality best practice, including a canvass reform programme overseen by the Cabinet Office, aimed at modernising the electoral registration process.

This project, it said, has “brought data into an outdated auditing process to reduce the cost to the taxpayer and the administrative burden on local government officials, while ultimately improving the experience for citizens”.

From this year, the annual canvass introduces a new “data matching step” at the outset of the canvass process in which Electoral Registration Officers (EROs) compare their electoral registers against data from the Department for Work and Pensions, as well as locally held data sources.

This “data matching” step indicates whether the information on the electoral registers is likely to be correct. The results will tell EROs which properties are likely to have the same people residing in them and which properties are likely to have had a change, with a consequent efficiency gain.
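As an illustration of this data matching idea, the sketch below compares the occupants recorded against each property in a register with those in a reference dataset, flagging each property as a likely match or a likely change. The data shapes, sample values and exact-match rule are simplifying assumptions; the real canvass matching against Department for Work and Pensions data is considerably more involved.

```python
"""Illustrative sketch of a register-to-reference data matching step.
All names, addresses and the matching rule are hypothetical."""


def match_properties(register: dict[str, set[str]],
                     reference: dict[str, set[str]]) -> dict[str, str]:
    """Compare occupant names per property (keyed by address).

    Returns 'match' where both sources list the same occupants, meaning
    the register entry is likely still correct, and 'change' otherwise,
    meaning the property likely needs a fuller canvass follow-up.
    """
    results = {}
    for address, registered_occupants in register.items():
        reference_occupants = reference.get(address, set())
        if registered_occupants == reference_occupants:
            results[address] = "match"   # likely same people residing
        else:
            results[address] = "change"  # likely had a change of occupants
    return results


if __name__ == "__main__":
    register = {
        "1 High St": {"A Smith", "B Smith"},
        "2 High St": {"C Jones"},
    }
    reference = {
        "1 High St": {"A Smith", "B Smith"},
        "2 High St": {"D Patel"},   # occupant differs from the register
    }
    for address, outcome in match_properties(register, reference).items():
        print(address, "->", outcome)
```

The efficiency gain the framework points to comes from this triage: properties flagged as a match can be confirmed with a light-touch process, while full canvass effort is reserved for those flagged as changed.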
