Can GDS avoid becoming just another Whitehall silo? Analysing the NAO's scathing critique

Prior to this week’s publication of the latest National Audit Office (NAO) report on the Government Digital Service (GDS), rumour had it the Whitehall watchdog had pulled back from serious criticism and tempered its observations. A first draft, so it was said, was subsequently toned down.

However, for anyone familiar with NAO-speak – an accountants’ version of Whitehall-speak that couches disapproval in Sir Humphrey’s vague shades of grey – the report is scathing in its analysis of the problems facing GDS.

On the positive side, the NAO reaffirms the importance of GDS’s role as a central authority for digital government, and acknowledges the progress it has made.

“GDS has successfully reshaped government’s approach to technology and transformation,” said the report.

“We have found that methods promoted by GDS, such as agile development, are used widely across government, and that digital leaders are perceived as breaking down traditional barriers between IT and other functions.”

But then the question marks over GDS's present and future take over – and the NAO poses a lot of them. The most fundamental is that the NAO “found widespread views across government that GDS has struggled to adapt to its changing role”.

Of course, GDS supporters will respond – with justification – that the whole point of the organisation is to counter “widespread views” in Whitehall departments; that such comments only exemplify the institutional inertia and silo mentality that successive GDS leaders have fought against. But it’s hard to ignore many of the facts presented for the first time in the NAO report.

The report is 57 pages long – you can read the whole thing here – but I’ll pick out a selection of the most pertinent observations and statistics, with my own comments added. I can assure you none of these quotes are taken out of context.

Strategy and transformation

“Although [GDS’s] budget increased to £150m in 2016-17, GDS expects to underspend against this by £45m, largely because of lower than expected take-up of centrally provided services.”

That underspend, according to sources, is almost entirely down to low take-up of Gov.uk Verify, the common identity assurance service developed by GDS, which continues to miss adoption targets. That unspent cash was allocated for payments to the third-party identity providers who verify users are who they say they are – because of low take-up, those providers are not completing enough transactions, and hence don’t get the money they hoped to receive.

The NAO report also reveals that in 2015-16, Verify accounted for 21% of GDS expenditure – think about that: more than one-fifth of the organisation’s entire spend went on that one project.

“GDS’s new approach is still emerging. It is not yet clear how GDS will prioritise its activities over the next few years, or how it will develop a plan to support its new approach. GDS told us that, in January 2017, it started to work with digital leaders across government to understand the current position and where it needs to get to by 2020. At the time of our assessment, there were no outputs from this process available for review.”

GDS was awarded £455m in November 2015 to cover the 2016 to 2020 spending review period – yet as of March 2017, it could not tell the NAO how it would prioritise its work, and had only just begun working out what is needed by 2020.

“GDS’s role in supporting transformation is not set out clearly in the new Government Transformation Strategy… It is not clear who is responsible for driving business transformation in government to address issues such as culture and process change, which were highlighted in the Government Transformation Strategy. It is also unclear how they will do this.”

The Government Transformation Strategy – what was previously the digital transformation strategy – was written by GDS, and morphed into a “transformation” plan under the leadership of GDS director general Kevin Cunnington, who started in the role in September 2016 with a specific remit for transformation. The strategy was launched in February by Cabinet Office minister Ben Gummer. But the NAO is still “unclear” how it will actually happen.

“GDS publishes data on 802 government services on an online Performance Platform. But only 118 services publish data on costs per transaction and not all services publish required data on digital take-up, the completion rate for online transactions or user satisfaction. In an internal review in 2015, GDS found that there was a lack of clarity about the purpose of the Performance Platform. Some data have not been updated since March 2016.”

When it was introduced by former GDS chief Mike Bracken, the Performance Platform was rightly presented as a critical tool for monitoring the wave of digital services that would appear across government. In particular, it would show ministers how much transactions were costing, as a way to focus spending on priority areas and build a business case for change. As such, it was a sensible and important initiative. Now, GDS has allowed a “lack of clarity” to develop around a purpose that seemed clear when the platform launched in 2012. Of course, GDS’s biggest challenge here – as has so often been the case – is the lack of buy-in from departments.

“The Executive Management Committee within GDS is responsible for overseeing performance and setting strategic direction. We reviewed board minutes and performance dashboards from each of four business groups (operations, digital, data and technology). We found that objectives and results presented to the board were sometimes vague and did not always include baselines or targets. This made it hard to assess progress.”

GDS has been frequently criticised for publishing strategies and plans without measurable targets – the NAO suggests this is not only a problem with its external pronouncements. In mitigation, the NAO noted that, “GDS has recognised this issue. In August 2016, board minutes noted that there was still work to be done to ensure that objectives and results aligned with the organisation’s objectives and could be measured against hard progress measures.”

“Our review of minutes from the Executive Management Committee found mixed evidence about the level of guidance that GDS is providing on priorities for specific programmes. The minutes for four months from September 2016 noted that the Digital Group (which covers Verify and other common services) had to ask the Board to clarify current priorities, to ensure it assigned staff to the right areas.”

To be fair, Kevin Cunnington only started as GDS chief on 1 September 2016 – but even then, the NAO suggests, GDS’s own teams were unclear on what their priorities were meant to be.

Digital exemplars

As a reminder, the exemplars were 25 high-volume transactions targeted for transformation into modern digital services, to demonstrate the benefits and opportunities of GDS’s approach. The programme started in 2012 and was due to deliver the 25 new services by March 2015. In the end, 15 of the 25 exemplars were available as live services and a further five were available to the public in trial form.

GDS reviewed the success of the exemplar programme in 2015, and Computer Weekly has in the past requested a copy of that review under Freedom of Information laws, but the Cabinet Office declined to release it. The NAO report reveals for the first time some of the findings of that review.

“GDS’s analysis indicates that only six of the live exemplars and two of the publicly available trials had provided an integrated service by March 2015. Full transformation and digitisation was not achieved, either for the citizen or for government.”

It’s been an open secret that some of the exemplars did little more than redevelop the web front-end to make it more user-friendly – the so-called “lipstick on a pig” approach. In some cases, departmental staff still retyped data collected by the new digital services into their legacy back-end databases. This is the first time we’ve learned how many of the exemplars were simply better websites – of course, having a better website that’s easier to use is welcome, and the right place to start. But it makes clear the challenges of genuine end-to-end transformation of services – an objective that is one of the central planks of the new transformation strategy.

“GDS analysed whether the value of expected benefits for those exemplars over a 10-year period exceeded the costs of development. In 12 cases the benefits exceeded the costs, but in 10 cases the costs outweighed the benefits.”

GDS is not to blame here – it’s departments’ responsibility to ensure they deliver the return on investment expected. But the fact that 10 of the 25 exemplars actually cost more than the benefits they achieved must raise huge questions about the business case for some digital services – or at least, departments’ ability to deliver the benefits.

“With such a broad remit, GDS faces a significant challenge in meeting possible needs for technical support in areas demanding a deeper technical knowledge and understanding of the existing government landscape. There is limited guidance on replacing or reconfiguring legacy systems to support transformation programmes. GDS has only recently published guidance on using application programming interfaces (APIs) to link administrative systems, despite an emphasis on APIs when GDS was first set up.”

A frequent criticism of GDS from digital teams in departments is that GDS lacks the technical know-how required for migrating off legacy systems – that the expertise in GDS has been focused mostly around web development. However, part of GDS’s remit was always to support departments in moving away from legacy systems and the outsourcing contracts that locked them into that legacy.

The observation about APIs is particularly pertinent – if you go back and read the original 2010 report by Martha Lane Fox that led to the creation of GDS, it was clear about the importance of GDS developing and promoting the use of APIs. The NAO suggests this priority had been somewhat overlooked.
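
To make that point concrete, here is a minimal sketch – entirely hypothetical, with the endpoint, fields and token invented for illustration – of the kind of API integration the Lane Fox report envisaged: a digital front-end posting a completed application straight into a department’s case-management system, rather than leaving staff to re-key the data into a legacy database.

```python
# Hypothetical sketch: a citizen-facing digital service pushes a completed
# application into a departmental back-end over an API, instead of staff
# re-keying the data into a legacy database.
# The URL, token and response shape are illustrative assumptions.
import requests

CASE_API = "https://api.example-department.gov.uk/v1/applications"  # hypothetical

def submit_application(application: dict, api_token: str) -> str:
    """Send a validated application to the back-end case system and
    return the case reference it assigns."""
    response = requests.post(
        CASE_API,
        json=application,
        headers={"Authorization": f"Bearer {api_token}"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()["case_reference"]  # assumed response field
```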

“While GDS has concentrated on developing ‘registers’ (canonical lists, such as countries or local authority areas), there is little strategic overview of the data needs of departments and no common view of how best to assess privacy concerns, consent and security.”

The work on registers is an important part of GDS’s activity around data – there is a clear need and benefit for common standards around data that is re-used across departments. It’s absolutely right that GDS does this work. But the NAO suggests that, once more, engagement with departments is not all it should be.
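
As a small illustration of why canonical registers matter, the sketch below – again hypothetical, assuming a register published as a simple JSON list of entries – shows a service validating input against a shared country register instead of maintaining its own hard-coded list.

```python
# Hypothetical sketch: validate user input against a shared canonical register
# (for example, a country register) rather than a service-specific list.
# The file name and record shape are illustrative assumptions.
import json

def load_register(path: str) -> dict:
    """Load a register published as a JSON list of {"key": ..., "name": ...} entries."""
    with open(path, encoding="utf-8") as f:
        return {entry["key"]: entry["name"] for entry in json.load(f)}

def validate_country(code: str, register: dict) -> str:
    """Return the canonical name for a country code, or raise if it is unknown."""
    if code not in register:
        raise ValueError(f"Unknown country code: {code}")
    return register[code]

countries = load_register("country-register.json")  # hypothetical local copy
print(validate_country("GB", countries))
```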

The NAO goes on to note there is “no overall data strategy to provide clarity of overall purpose”; “Previous [data] mapping attempts failed because of fragmented landscape and burden of detail”; and “No overall view of future state for data for services, sharing, data security and privacy”. The Cabinet Office is recruiting a new chief data officer – clearly they will be kept busy.

Spending controls

GDS introduced cross-government controls on departmental IT spending at an early stage – the process is seen as one of GDS’s success stories, with the NAO highlighting £1.3bn of savings (although it should be noted that these “savings” are more a case of “not spending money we might otherwise have spent” than cuts in existing expenditure).

Against that backdrop, there has been some recent controversy after Cunnington revealed he intended to weaken the controls and allow departments more independence over their spending. Here, the NAO report suggests such a move is justified – despite criticism from some outsiders that diluting the controls would simply bring a return to the bad old ways of poorly procured projects from large systems integrators.

“GDS data shows that requests of up to £1m accounted for 47% of its spending controls team’s time on spending controls. At the same time, these requests produced only 1% of the financial savings claimed in 2015-16.”

In other words, the real savings from spend controls come from oversight of the bigger projects – not the many small ones that GDS has also been approving. However, here too there is a clear need for better engagement with departments, with the NAO noting that: “Departments regularly submit spending proposals for GDS approval at a late stage in the development of programmes and projects. Our examination of 2016-17 data found that 40% of programmes and projects relating to applications received at the full business case stage had not been reviewed previously by GDS.”

Cunnington is introducing a new controls process to address this issue, with departments being mandated to share an 18-month pipeline of forthcoming digital projects.

Suppliers

“While new digital and procurement frameworks targeting SMEs have had some impact, most government procurement with digital and technology suppliers continues to be with large organisations. In 2015-16, 94% of such spending was with large enterprises, a fall of less than one percentage point since 2012-13.”

GDS has made much of its attempts to introduce more SME suppliers, and to reduce dependence on the “oligopoly” of large systems integrators that became associated with government IT overspending and waste over the previous decade. Much has been made of G-Cloud and the Digital Marketplace, the online catalogues that see 64% of sales go to SMEs.

But here, yet again, departments are simply not buying into the plan. It’s not entirely a GDS problem though – last week, MPs on the Public Accounts Committee criticised the Crown Commercial Service – the Cabinet Office’s central procurement agency – saying it had not won departments’ confidence, and had only been able to manage £2.5bn of spend on behalf of seven departments, rather than the £13bn and 17 departments that had been predicted in 2014.

Gov.uk Verify and Government as a Platform

“In 2014, the Civil Service Corporate Management Board asked GDS and HM Treasury to work with departments on the case for adopting a cross-government approach. They stated that ‘a first principle for delivering any of the building blocks of Government as a Platform would be to reuse previous work done by departments’. But so far the main working components are newly built platforms.

“In principle, development effort is reduced when new services can make use of existing common components. GDS’s new platforms are attempting to aggregate demand. The underlying applications (such as text message notification) are already commercially available and used in existing services. It is not clear how new platforms are meeting the greatest need and the direct benefits of aggregation are small.”

External critics of GDS have often questioned why the organisation feels it needs to develop so many of its services in-house, rather than using off-the-shelf software purchased commercially. Gov.uk, the single government website, is considered GDS’s biggest success story, but uses a series of content management systems (CMS) that were all developed internally, instead of using one of the many CMS platforms that are commercially available.

The NAO reserves some of its heaviest criticism for Verify, the troubled identity assurance programme – which, if you’ll remember, accounts for 21% of all GDS spending. It’s worth reading the NAO comments in full:

“To achieve the target of 25 million [Verify] users by April 2020, GDS needs the profile of users to increase at a much sharper rate from April 2019. The September 2015 business case predicted 4.4 million users by the end of March 2017. This projection was reduced to 1.8 million in the October 2016 business case. As of February 2017, Verify had 1.1 million user accounts.

“Verify has not achieved the volume of users in the central forecast of the business cases, in part due to slower development of digital services across government, and fewer than expected services being ready to adopt Verify as the primary access route. In 2014, GDS expected over 100 departmental services to be using Verify by 2016. In October 2016, GDS predicted that 43 services would be using Verify by April 2018. In February 2017, 12 services were using Verify.

“Even services that do use Verify are continuing to use alternative methods to access services online. Of the 12 departmental services connected to Verify as of February 2017, nine also allow access by other means including, for one department, an enhanced version of the existing Government Gateway.

“Reduced take-up means that Verify will need to be centrally funded for longer, and reduces the incentive for the identity providers to lower their prices over time. It is not clear how or when GDS will determine whether continuing with Verify will achieve projected benefits.

“The use of multiple routes to accessing services online undermines the business case for Verify. In October 2016, GDS modelled the scenario of no additional HMRC users and found that this would reduce benefits by £78m over four years leading to a net cost of £40m. It also modelled that failure to achieve sufficient volumes to reduce the commercial costs of the service in 2018-19 could lead to a net cost of £70m in present value terms over four years. Although GDS has estimated a large positive net present value once indirect benefits and a longer time frame are included, the business case is highly reliant on assumptions about savings in departments, and it is not clear whether these are reasonable.”
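
A back-of-the-envelope reading of those scenario figures – assuming the NAO’s numbers are additive over the same four-year window, which the report does not spell out – suggests just how thin the central case is: if losing £78m of benefits turns the position into a £40m net cost, the implied baseline net benefit is only around £38m.

```latex
% Implied baseline net benefit B (in £m) over four years, assuming additive figures:
% B - 78 = -40  =>  B = 38
B - 78 = -40 \quad\Longrightarrow\quad B = 38
```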

The report goes on to document some of the well-publicised issues with Verify functionality, usability and performance, concluding that: “Combined with performance problems, this means that departments face weak incentives to adopt Verify.”

The Cabinet Office says that Verify has been mandated to departments as the one and only system to be used for identifying individual citizens who use digital services. As we know from HM Revenue & Customs (HMRC), some of those departments are less than keen.

It’s true to say that GDS has a dilemma here. On one hand, it’s criticised for slow adoption of common platforms and told it should work with departments and mandate their use. On the other hand, it’s told departments don’t like services to be mandated and it needs to do a better job of persuading them to buy in. There’s no easy answer. But Verify still teeters on the precipice between being the “gold standard” for identity that its original vision hoped for, and becoming an over-complicated, over-specified, over-ambitious and hugely costly failure.

The NAO adds:

“There was no full analysis of how existing services identified customers or analysis of the way in which customer data is held in existing services or how this might affect the user journey from Verify to completion of the service transaction. Such analysis may have provided more understanding about likely rate of take-up and the type of incentives required for departments to use Verify.”

Former GDS staffers say that Verify has been developed in too much of a cocoon, with the team focused too much on their goal of creating a universal identity assurance service that could become a standard across the UK, even in the private sector. As a result, say these sources, Verify has lost its grasp on what is really needed – a means to quickly, effectively and securely allow citizens access to the online services they want to use. HMRC will no doubt be delighted to see this observation from the NAO:

“The Verify business case ruled out development of Government Gateway as an alternative to Verify, based on strategic, technical and contractual grounds saying that to change this service would involve ‘disproportionate and duplicative investment’. Government Gateway currently hosts 138 live public sector services, and the Gateway is being improved. GDS has not reassessed the cost and security implications of an improved Gateway service.”

On Verify, the NAO concludes:

“It is not yet clear whether Verify will be able to overcome the limitations that have prevented its widespread adoption across government, or whether attempts to expand in other ways will be successful in encouraging departments to adopt it. Take-up and cost projections remain optimistic.”

Conclusion

One paragraph of the NAO report neatly summarises the challenges for GDS:

“Digital transformation has a mixed track record across government. It has not yet provided a level of change that will allow government to further reduce costs while still meeting people’s needs. GDS has also struggled to demonstrate the value of its own flagship initiatives such as Verify, or to set out clear priorities between departmental and cross-government objectives.”

Things are changing – that’s why Kevin Cunnington was brought in. The transformation strategy is intended to provide the framework within which GDS and departments can finally engage productively and collaboratively, without the in-fighting that has dogged them in the past.

But the NAO paints a picture of an organisation that is struggling to adapt as it moves on from being a startup-style digital disrupter, while trying to avoid becoming just another example of the Whitehall silos it was created to destroy.

As the NAO concludes:

“GDS’s renewed approach aims to address many of these concerns as it expands and develops into a more established part of government. But there continues to be a risk that GDS is trying to cover too broad a remit with unclear accountabilities. To achieve value for money and support transformation across government, GDS needs to be clear about its role and strike a balance between robust assurance and a more consultative approach.”