IT greats: Top 10 greatest IT people
For every world-famous name with a world-famous fortune – think Bill Gates, Steve Jobs and Michael Dell – there are hundreds of other individuals who have moved the IT industry and its technology inexorably forward.
Fame and fortune have rarely been their immediate spur. A passion for changing the world through technology is the hallmark of the IT greats. Some have changed the technology itself; others have transformed the way technology is marketed, or radically altered the way IT is perceived by society.
Some have been involved in great leaps forward, some have made incremental changes that have stood the test of time.
Whatever the case, our industry is truly one where we all stand on the shoulders of giants, and we are proud to pay tribute to some of them in the results of our IT greats poll.
Top 10 greatest IT people
1. Steve Jobs
2. Tim Berners-Lee
3. Bill Gates
4. James Gosling
5. Linus Torvalds
6. Richard Stallman
7. Arthur C Clarke
8. Ted Codd
9. Steve Shirley
10. Martha Lane Fox
1. Steve Jobs
Steve Jobs, the co-founder and chief executive of Apple Computer, topped the Computer Weekly 40th anniversary poll due to the devoted following he has generated through his pioneering work in personal computing and product design.
Jobs was born in 1955 in San Francisco, and during his high school years he showed his early enthusiasm for computing by attending after-school lectures at the Hewlett-Packard Company in Palo Alto, California. He met fellow Apple founder Steve Wozniak during a summer job at HP.
In the autumn of 1974, Jobs, who had dropped out of university after one term, began attending meetings of the Homebrew Computer Club with Wozniak. He took a job as a technician at Atari, a manufacturer of popular video games.
At the age of 21, Jobs saw a computer that Wozniak had designed for his own use, and convinced his friend to market the product.
Apple Computer was founded as a partnership on 1 April 1976. Though the initial plan was to sell just printed circuit boards, Jobs and Wozniak ended up creating a batch of completely assembled computers, and entered the personal computer business.
Their second machine, the Apple II, was introduced the following year and became a huge success, turning Apple into an important player in the nascent personal computer industry.
In 1983, Apple launched the Lisa, the first PC with a graphical user interface (GUI) – an essential element in making computing accessible to the masses. It flopped because of its prohibitive price, but the following year Apple launched the distinctive, lower-priced Macintosh, which became the first commercially successful GUI machine.
Despite his success in founding Apple, Jobs left following a boardroom row in 1985. But his influence on the computer industry did not end there.
Jobs moved on to found Next Computer, then in 1986 he bought the little-known Graphics Group from Lucasfilm – a company which, renamed Pixar, achieved global dominance in animated feature films during the 1990s.
Much of Next’s technology had limited commercial success, but it laid the foundations for future computing developments. The company pioneered object-oriented software development, built-in Ethernet connectivity and collaborative software. It was the Next interface builder that allowed Tim Berners-Lee to develop the original worldwide web system at Cern.
Without Jobs, Apple had stumbled. Market share fell while it struggled to release new operating systems. Its answer was to buy Jobs’ company Next, together with its innovative operating system, and welcome back its charismatic former CEO.
On returning to Apple, Jobs drove the company ever deeper into the consumer electronics and computing market, launching the iMac and iPod.
Steve Jobs died on 5 October 2011 after a long battle with cancer, but his place in computing history is guaranteed.
2. Tim Berners-Lee
Dot coms, bloggers and Google all have one man to thank for their place in the 21st century world. In 1990, Tim Berners-Lee made the imaginative leap to combine the internet with the hypertext concept, and the worldwide web was born.
Berners-Lee was born in London in 1955. His parents, both mathematicians, worked together on the team that built the Ferranti Mark 1, one of the earliest commercial computers.
After attending school in London, Berners-Lee went on to study physics at The Queen’s College, Oxford, where he built a computer with a soldering iron, TTL gates, an M6800 processor and an old television. While at Oxford, he was caught hacking with a friend and was subsequently banned from using the university computer.
He worked at Plessey Telecommunications from 1976 as a programmer and in 1980 began working as an independent contractor at the European nuclear research centre Cern.
In December 1980, Berners-Lee proposed a project based on the concept of hypertext, to facilitate sharing and updating information among researchers. While there, he built a prototype system called Enquire.
He joined Cern on a full-time basis in 1984 as a fellow. In 1989, Cern was the largest internet node in Europe, and Berners-Lee saw an opportunity. “I just had to take the hypertext idea and connect it to the TCP and DNS ideas,” he said – and the worldwide web was born.
He wrote his initial proposal in March 1989, and in 1990, with the help of Robert Cailliau, produced a revision which was accepted by his manager, Mike Sendall.
He used similar ideas to those underlying the Enquire system to create the worldwide web, for which he designed and built the first web browser and editor – called WorldWideWeb and developed on NeXTSTEP – and the first web server, the Hypertext Transfer Protocol daemon (HTTPD).
The first website was built at http://info.cern.ch/ and put online on 6 August 1991. The URL is still in use today. The site explained what the worldwide web was, how to obtain a browser and how to set up a web server. It was also the world’s first web directory, since Berners-Lee maintained a list of other websites there.
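Mechanically, the leap Berners-Lee described – connecting hypertext to “the TCP and DNS ideas” – is small enough to show in a few lines. The sketch below, in Java (a language that post-dates these events; the class name is our own invention, not Berners-Lee’s code), does what every web client since WorldWideWeb has done: resolve the server’s name via DNS, open a TCP connection, send an HTTP request for a page of hypertext and print what comes back.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.Socket;

// Illustrative sketch only: fetching a web page the way early clients did.
public class SimpleWebClient {
    public static void main(String[] args) throws Exception {
        String host = "info.cern.ch"; // the first web server's address, still online

        // Opening the socket resolves the name via DNS and connects over TCP
        try (Socket socket = new Socket(host, 80)) {
            PrintWriter out = new PrintWriter(socket.getOutputStream(), true);

            // A bare-bones HTTP request: method, path and protocol version
            out.print("GET / HTTP/1.0\r\nHost: " + host + "\r\n\r\n");
            out.flush();

            // The reply is plain text: headers first, then the HTML hypertext
            BufferedReader in = new BufferedReader(
                    new InputStreamReader(socket.getInputStream()));
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}
```

Everything else – browsers, servers, search engines – is elaboration on that one transaction.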
In 1994, Berners-Lee founded the World Wide Web Consortium (W3C) at the Massachusetts Institute of Technology. It comprised various companies willing to create standards and recommendations to improve the quality of the web.
Berners-Lee made his ideas available freely, with no patent and no royalties due. He is now a professor in the University of Oxford’s department of computer science, working alongside his long-time research collaborator Nigel Shadbolt as a fellow of Christ Church. He is also a professor at the Massachusetts Institute of Technology (MIT) in the US, and continues to shape the future of the web through his work with the World Wide Web Consortium and the World Wide Web Foundation.
3. Bill Gates
As joint founder of the world’s biggest software company, Microsoft, Bill Gates’ approach to technology and business was instrumental in making technology available to the masses.
Gates was born in Seattle, Washington, in 1955 to a wealthy family: his father was a prominent lawyer and his mother served on the board of directors for First Interstate Bank and The United Way.
At school, Gates excelled in mathematics and the sciences, and by the age of 13 he was deeply engrossed in software programming.
With other schoolmates he programmed and fixed bugs for the Computer Center Corporation, and in 1970 he formed a venture called Traf-O-Data with fellow pupil and future Microsoft co-founder Paul Allen, making traffic counters based on the Intel 8008 processor.
In 1973, Gates enrolled at Harvard University, where he met future business partner Steve Ballmer. While there, Gates and Allen developed a version of the Basic programming language for the Altair 8800, one of the first microcomputers.
Soon afterwards Gates left Harvard to found “Micro-Soft” – later Microsoft Corporation – with Allen. Microsoft took off when Gates began licensing its MS-Dos operating system to manufacturers of IBM PC clones. Its drive to global dominance continued with the development of Windows, its version of the graphical user interface, layered on top of the Dos command line.
By the early 1990s, Windows had driven other Dos-based GUIs such as Gem and Geos out of the market. It performed a similar feat with the Office productivity suite.
Gates fought hard to establish Microsoft’s dominant position in the software industry – and fought even harder to defend it. His ability to get Microsoft software pre-installed on most PCs shipped in the world made Microsoft the world’s largest software house and Gates one of the world’s richest men. It also meant Microsoft found itself on the wrong end of antitrust legislation in both the US and Europe.
Gates stood down as chief executive of Microsoft in 2000 to focus on software development, and on 16 June 2006 he announced that he would move to a part-time role with Microsoft in 2008 to focus on his philanthropic work.
Since 2000, Gates has given away about £15.5bn – a third of his wealth – to charity. Such is his fame outside computing that fictional Gates characters have appeared in cartoons including The Simpsons, South Park and Family Guy.
4. James Gosling
Of your choices for the most influential people in IT, James Gosling is the true geek. Unlike Bill Gates and Steve Jobs, neither of whom finished college, Gosling completed a PhD in computer science and contributed to software innovation at a deeply technical level.
Born in 1955 near Calgary, Canada, Gosling is best known as the father of the Java programming language – the first programming language designed with the internet in mind, built to adapt to highly distributed applications.
Gosling received a BSc in computer science from the University of Calgary in 1977, and while working towards his doctorate he created the original version of the Emacs text editor for Unix (Gosmacs). He also built a multi-processor version of Unix, as well as several compilers and mail systems before starting work in the industry.
In 1984, Gosling joined Sun Microsystems, where he went on to become chief technology officer of the developer products group.
In the early 1990s, Gosling initiated and led a project code-named Green that eventually became Java. Green aimed to develop software that would run on a variety of computing devices without having to be customised for each one.
Although much of the technology developed as part of Green never saw the light of day, Gosling realised that some of the underlying principles they had created would be very useful in the internet age.
Sun formally launched Java in 1995. Gosling produced the original design of Java and implemented its original compiler and virtual machine, for which he was elected to the US National Academy of Engineering. He has also made major contributions to several other software systems, such as NeWS and Gosling Emacs.
Although some critics say Java has not lived up to its initial “write once, run anywhere” claim, Gosling’s success in the Computer Weekly poll rests precisely on that idea: Java has allowed the creation of robust, reusable code which runs on devices as diverse as mobile phones, PCs and mainframes.
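“Write once, run anywhere” rests on one design decision: the Java compiler targets not a particular processor but a platform-neutral bytecode, which each device’s Java virtual machine then executes. A minimal sketch (the class name is our own invention) illustrates the principle:

```java
// The compiler turns this source into platform-neutral bytecode
// (Greetings.class); the same .class file then runs unmodified on
// any Java virtual machine, because the JVM - not the program -
// absorbs the differences between platforms.
public class Greetings {
    public static void main(String[] args) {
        // Ask the JVM which platform it is hiding from us
        System.out.println("Hello from Java on " + System.getProperty("os.name"));
    }
}
```

Compiled once with javac Greetings.java, the resulting Greetings.class can be copied unchanged to any machine with a JVM and run with java Greetings – whether that machine is a phone, a PC or a mainframe.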
5. Linus Torvalds
As the creator of the Linux operating system, Linus Torvalds has been a driving force behind the whole open source movement, which not only poses an ever-increasing challenge to proprietary software, but has also inspired the industry’s move towards open standards.
Torvalds remains the ultimate authority on what new code is incorporated into the Linux kernel.
6. Richard Stallman
Richard Stallman is the founder of the GNU Project, an initiative to develop a complete Unix-like operating system that is free software. Stallman has written several popular tools, created the GNU General Public Licence and campaigns against software patents.
7. Arthur C Clarke
2001: A Space Odyssey writer Arthur C Clarke has consistently been ahead of his time in predicting how technology will change the world. Most notably, in 1945 he suggested that geostationary satellites would make ideal telecoms relays.
8. Ted Codd
Ted Codd devised the relational model of data and the 12 rules on which every relational database is built – an essential ingredient of business computer systems.
9. Steve Shirley
Steve Shirley was an early champion of women in IT. She founded the company now known as Xansa, pioneered new work practices and in doing so created new opportunities for women in technology.
10. Martha Lane Fox
With Brent Hoberman, Martha Lane Fox created Lastminute.com in 1998, and as “the face” of Lastminute raised the profile of e-commerce ever higher in the public consciousness.
1955: a good year for computing
The top four people in our poll were all born in 1955, making it a vintage year for the world of computing.
It may have been a good year for computing, but 1955 was a sad year for science, as Albert Einstein died on 18 April.
It was also the year that the first McDonald’s fast food franchise opened: we’ll leave you to make up your own mind about that one.
Your big names
Outside the main choices for greatest IT people, the most popular readers’ suggestions were:
- Ken Olsen, founder of Dec, who invented the minicomputer
- Clive Sinclair, home computer visionary
- Vint Cerf, one of the internet's founding fathers
- Bill Joy, co-founder of Sun Microsystems
- Larry Ellison, founder of Oracle
- Steve Wozniak, Apple co-founder
- Dennis Ritchie, inventor of the C programming language
- Donald Davies, co-inventor of packet switching
- Ken Thompson, co-creator of Unix
- Grace Hopper, Cobol pioneer