
CW@50: From ICL to ITIL

Standardisation helps industries to grow. Here in the UK, a number of highly influential standards emerged that have become building blocks for the modern computer industry.

Standards drive adoption of new products. The IT industry has evolved around numerous industry standards that define hardware and software environments, promote interoperability, encourage competition and give customers choice.

The UK has very much been a driver in the wide adoption of standards. The British Standards Institution (BSI) was the world’s first national standards body.

Standardisation harks back to the golden age of the British Empire. Observing British influence in India in the 19th century, Jeremy Black notes in his book The British Empire: A History and Debate: “The British certainly brought standardisation.”

ISO 9000 was the first internationally recognised management standard. According to the British Assessment Bureau, it came about because the Ministry of Defence needed a set of documented procedures from its equipment suppliers. This was later expanded to the commercial sector.

“The idea of quality assurance spread beyond the military and in 1966, the UK government led the first national campaign for quality and reliability with the slogan ‘Quality is everyone’s business,’” the British Assessment Bureau’s website notes. This eventually led to BS 5750, and later ISO 9000 and ISO 9001.

Martin Campbell-Kelly, computer historian, emeritus professor at the University of Warwick and trustee at the National Museum of Computing, says the age of third-generation computers marked the start of the modern IT industry.

“In the early 1960s, the problem of computing was reprogramming,” he says. “If you changed the computer, your programs didn’t work any more. Even though the Cobol and Fortran programming languages were around, compatibility was not very good.”

Even computers made by the same manufacturer were incompatible. “With IBM, if you had a 650, you would upgrade to a 1401 – but the two were not compatible,” says Campbell-Kelly.

But this was all about to change.

UK’s response to IT hardware standardisation

On 7 April 1964, IBM announced the IBM System/360. Campbell-Kelly recalls: “The plan was that you would have a range of compatible computers from the bottom end right up to supercomputers and that software would work on every computer.”

Unfortunately, IBM did not quite achieve this level of scalability with a single compatible architecture. “They couldn’t do supercomputers or the smallest computer,” he says.

At the top end of scalability, the instruction set was not geared up for the floating point number-crunching required by supercomputing applications, adds Campbell-Kelly.

IBM’s de facto standard

In 1969, IBM unbundled its software, training and services from its mainframe hardware business. Among the products unbundled was CICS (customer information control system), the company’s mainframe transaction processing engine, developed in 1968. In 1974, CICS development shifted to IBM Hursley, the UK laboratory that was home to Spitfire development during the Second World War.

According to IBM, CICS now runs in most global businesses. It is used when a customer withdraws money from a cash machine, buys an insurance policy, pays a utility bill or shops in a large retail store. In 1993, Hursley announced IBM MQSeries, the company’s message queuing software.

For scaling down, the economics did not pan out, says Campbell-Kelly. “For compatibility, you needed a minimum instruction set, which wasn’t possible on a very low-end machine.” The hardware was simply too expensive, he adds.

Nevertheless, the System/360 represented the world’s first attempt at a compatible family of computers and IBM’s global sales operations were gearing up to hammer home their advantage.

“When IBM came up with a compatible range, it locked in the user,” says Campbell-Kelly. “We are locked into System/360 architecture even today.”

Meanwhile, in the UK, things were about to get very interesting. “ICT, which later became ICL, decided to produce the 1900 series, a Canadian design,” says Campbell-Kelly. “It held on to the original Canadian architecture and made bigger and smaller machines.”

This gave the British computer maker something of a head start because it already had a machine on which to base a product line, he says.

Unlike IBM’s System/360, the 1900 used a six-bit character rather than an eight-bit byte.
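
To see why this mattered in practice, here is a rough Python sketch of how the same characters pack differently into the 1900’s 24-bit words and the System/360’s 32-bit words. The character codes are invented for illustration – they are not the real ICL or EBCDIC encodings.

```python
# A rough sketch of why 1900-series data was not portable to a System/360.
# The character codes below are invented for illustration; they are not the
# real ICL 1900 or IBM EBCDIC encodings.

def pack(codes, bits_per_char, word_bits):
    """Pack character codes into fixed-size machine words, left-justified."""
    per_word = word_bits // bits_per_char
    words = []
    for i in range(0, len(codes), per_word):
        group = codes[i:i + per_word]
        word = 0
        for code in group:
            word = (word << bits_per_char) | code
        word <<= bits_per_char * (per_word - len(group))  # pad a short tail
        words.append(word)
    return words

codes = [1, 2, 3, 4, 5, 6, 7, 8]  # eight characters of "text"

# ICL 1900: four six-bit characters per 24-bit word
print([f"{w:06x}" for w in pack(codes, 6, 24)])   # ['0420c4', '1461c8']

# IBM System/360: four eight-bit bytes per 32-bit word
print([f"{w:08x}" for w in pack(codes, 8, 32)])   # ['01020304', '05060708']
```

The same eight characters occupy two words on each machine, but 48 bits on the 1900 against 64 bits on the System/360, laid out quite differently – a tape written on one could not simply be read on the other.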

Various US companies, such as Univac and Burroughs, produced their own compatible ranges.

But for Campbell-Kelly, the most provocative was RCA, which introduced the Spectra 70, an IBM-compatible mainframe.

In the UK, English Electric had a relationship with RCA and made the English Electric System 4, an IBM System/360-compatible machine.

Standardisation of the internet

The history of the UK’s influence on worldwide standards is incomplete without at least a passing reference to the internet. The world wide web was invented by Tim Berners-Lee at Cern in March 1989, as he was looking at how researchers at the particle physics lab could share documents. He developed the HTTP protocol and the HTML markup language, along with a web server running on a NeXT Cube Unix workstation, which together became the first version of the world wide web.
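
The request-and-response shape of that protocol is still recognisable today. The following minimal Python sketch fetches a page over HTTP, using example.com purely as a placeholder host:

```python
# A minimal HTTP GET using Python's standard library; example.com is just
# a placeholder host. The exchange -- a method and path out, a status line,
# headers and an HTML body back -- is the pattern HTTP standardised.
import http.client

conn = http.client.HTTPConnection("example.com", 80, timeout=10)
conn.request("GET", "/")
response = conn.getresponse()
print(response.status, response.reason)               # e.g. 200 OK
print(response.read(120).decode("ascii", "replace"))  # start of the HTML
conn.close()
```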

But the UK’s history with internet standards began more than two decades earlier. As Computer Weekly previously reported, on 10 September 1973 at the University of Sussex, Vint Cerf gave the first public presentation of a seminal piece of research that would define the modern internet.

Cerf and co-author Robert Kahn distributed the paper, entitled A Protocol for Packet Network Intercommunication, to a special meeting of the INWG (International Network Working Group) at the university, describing a method to communicate across a network of networks. This network of networks is what is now known as the internet, and the protocol the paper described later became TCP/IP.
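
What that method ultimately standardised is the reliable byte stream that networked programs still build on. The sketch below runs a tiny TCP exchange on one machine’s loopback interface for convenience; the same socket calls work unchanged across any network of networks that routes IP packets:

```python
# A tiny TCP echo: one listening endpoint, one connecting endpoint.
# Run on loopback here, but the calls are the same across the internet.
import socket
import threading

def serve(listener: socket.socket) -> None:
    conn, _addr = listener.accept()     # wait for one TCP connection
    with conn:
        data = conn.recv(1024)          # read bytes from the stream
        conn.sendall(b"echo: " + data)  # send a reply back

listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", 0))         # any free port on loopback
listener.listen(1)
port = listener.getsockname()[1]
threading.Thread(target=serve, args=(listener,), daemon=True).start()

with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"hello, internet")
    print(client.recv(1024).decode())   # prints "echo: hello, internet"

listener.close()
```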

ICT merged with English Electric in 1968 as part of an initiative by the Wilson government to create a British computer industry that could compete with the US. The problem facing the newly formed ICL was that it now sold two entirely different and incompatible ranges of machines.

“They decided to replace the two ranges with the ICL 2900 series, which was not IBM-compatible,” says Campbell-Kelly. The company chose a different architecture from IBM’s to make its range of computers distinct.

At the time, managing director Arthur Humphreys thought ICL could not hope to compete with IBM on scale. “ICL needed a differentiated product, and to offer superior technology,” says Campbell-Kelly.

The 2900 series was eventually launched in 1974, but according to Campbell-Kelly, ICL never quite got the R&D grant of £25m it had been promised by the Wilson government, because of a financial crisis.

The grant would have helped the company to launch the 2900 series earlier, he says, although “ICL was making profits and was surviving quite well”.

In 1981, ICL signed a deal with Fujitsu to manufacture its semiconductors, substantially reducing ICL’s R&D and manufacturing costs.

Evolution of Britain’s open standards contributions

ICL was doing well with the government at least until the early 1980s. But the Thatcher government, whose agenda was geared towards privatisation and the forces of capitalism, opened the door to IBM and others to tender for government contracts.

Ian Clayton, author and C-level strategist at Customer Centricity, says: “I would suggest the Tories were more about competition – to bring down the price. I know ICL was a slow-moving and expensive government-funded operation. It was losing money and losing ground.”

Opening up the tender process promised to give the UK government better value for money and would potentially enable the UK to benefit from leaps in processing performance. To compete, ICL needed to partner with Japanese tech giant Fujitsu.


At the time, the Central Computer and Telecommunications Agency (CCTA) provided telecoms and computer support for government departments. Politically, as some industry commentators note, a government IT deal involving a Japanese firm would not have sat well alongside the jingoism that followed the Falklands War in 1982.

Operations management is a key part of modern IT, but in the past, much of the expertise to manage and operate complex computer systems was held by the big computer suppliers. Clayton believes it was a self-interested move to arm CCTA staff with the latest ‘how to’ guidance on operating and managing an IBM computer the way IBM and outsourcers such as EDS were managing datacentres.

Clayton used to work at EDS, the remnants of which remain within Hewlett Packard Enterprise. He was team leader of the IT project to build what was then the Midland Bank datacentre. As ICL was now in a less favourable position, he believes government IT staff realised they needed to reskill.

“I was interviewed by the CCTA in 1985 about what practices we were using at EDS to build and staff datacentres,” he says. “Given the stories told by the players at the heart of the CCTA, I don’t think the government IT folks would have moved an inch without fear of losing their pensions by being replaced by IBM knowledgeable folks.”

Best practices

In other words, the UK government wanted some of the best practices that IBM and the big computer companies used to manage IBM mainframe systems. Conveniently, IBM published its research in a set of volumes entitled Information Systems Management Architecture.

According to Clayton, some of this guidance crept onto IBM MVS/XA systems programming tapes, which were the install and implementation procedures available to mainframe system programmers. “I know that because I was the one who found it,” he says. 

When printed out, this documentation gave mainframe administrators the basic language for operations management – start/stop computer procedures, backup/restore, print management, and event management, says Clayton.

It was the CCTA, working on the National Savings Bank IT system, that eventually cemented datacentre operations management into a set of best practices that are now adopted globally as ITIL.

In an article posted on the Service Management University website, Brian Johnson, one of the original creators of ITIL, recalls how it all began. “Between 1985 and 1988, Willy Carroll (RIP) has probably the sole justifiable claim to be the person who first ‘implemented’ ITIL…except that then the method was known as the government IT infrastructure management method (GITIMM),” he says.

According to Johnson, having read an early draft of GITIMM, Carroll believed National Savings needed to change its culture to have any chance of benefiting from the huge investment in mechanisation of the clerical environment, and saw a need to get IT operations organised.

ITIL was not the first IT standard forged by the UK government. The CCTA also established SSADM (structured systems analysis and design method) in 1981 to provide a systematic approach to the analysis and design of application software. It was adopted as the standard method for software development across UK government departments.

The CCTA also adopted PROMPT II (project resource organisation management planning techniques) for managing government IT projects. In 1989, this evolved into Prince, and later Prince2, which is widely used as a project management methodology in government and commercial IT projects.

Rewinding standardisation

During the 1980s, a computer revolution took shape in the UK, leading to bedroom coders and home computers such as the ZX81, the Spectrum, the Dragon 32, the Acorn BBC Micro and later the Amstrad PCW, all of which were made in the UK. Kevin Murrell, a fellow trustee at the National Museum of Computing, says: “We had many more designs in the UK than in the US.”

At the time, the IBM PC was far too expensive for amateurs and even small businesses to afford, so the UK computer market became highly fragmented. Just as in the early 1960s, before compatible ranges, software and even data could not easily be copied from one home computer architecture to another.

By the start of the 1990s, the PC revolution was taking shape. It enabled a new way of working with IT and the dawn of client-server computing, which supplanted the mainframe as the preferred way to run enterprise systems. As for ICL, it was eventually acquired by Fujitsu in 1992.
