The Leading Edge

ClearPath Forward | 4-minute read | June 22nd, 2012

A study, a particular event and an anniversary gave me the ideas for this blog. A colleague at Unisys pointed me at the study and the event. Both concerned costs: one the cost of system downtime, the other the consequences of data theft. I’ll come to the anniversary later.

The cost of downtime was the subject of a report by the Aberdeen Group. In 2010, they published Datacentre Downtime: How Much Does It Really Cost?, based on an analysis of a large number of organisations worldwide. The report to which my colleague drew my attention is an update published in March 2012.

The update shows that the cost of data centre downtime has grown more serious in the two years between the reports. Although industry-leading organisations today suffer minimal losses, those without adequate provision are paying a high price, measured in millions of dollars per year.

The second trigger was a news item about the theft of a PC containing confidential information about a large number of people. The cost of informing them all was over $6 million. The consequences of data being stolen from a data centre by someone hacking into the system are likely to be orders of magnitude higher: the data centre is supposed to be secure, so confidence in the organisation would be badly damaged.

Where does this fit with ClearPath systems? They have outstanding availability and security attributes, making them ideally suited to the demanding environments in which they normally operate. Tools such as BCA, XTC and Operations Sentinel are provided for maintaining or restoring service continuity in the event of the loss of a data centre; if business requirements demand it, service interruption can be all but eliminated. As for security, ClearPath OS 2200 and MCP are the only operating systems with no reported vulnerabilities in the NIST database of operating system vulnerabilities.
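To make the continuity idea concrete, here is a toy sketch in Python – not the actual BCA, XTC or Operations Sentinel mechanisms, whose internals are far more sophisticated – of the simplest form of automated failover: poll a primary site’s health endpoint and switch to a standby site after repeated failures. The URLs, threshold and polling interval are invented for illustration.

# A toy illustration of automated service continuity: poll a primary
# site's health endpoint and redirect to a standby site after repeated
# failures. This is NOT how BCA, XTC or Operations Sentinel work; the
# endpoints and thresholds below are invented for illustration.
import time
import urllib.request

PRIMARY = "https://primary.example.com/health"   # hypothetical endpoint
STANDBY = "https://standby.example.com/health"   # hypothetical endpoint
FAILURE_THRESHOLD = 3   # consecutive failures before failing over
POLL_SECONDS = 5

def healthy(url, timeout=2):
    """Return True if the site answers its health check with HTTP 200."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:          # covers URLError and timeouts
        return False

def monitor():
    failures = 0
    active = PRIMARY
    while True:
        if healthy(active):
            failures = 0
        else:
            failures += 1
            if failures >= FAILURE_THRESHOLD and active == PRIMARY:
                active = STANDBY   # fail over to the standby data centre
                failures = 0
                print("Failing over to standby data centre")
        time.sleep(POLL_SECONDS)

Real continuity tooling must also handle data replication, split-brain protection and failback, all of which this sketch ignores.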

In short, ClearPath systems are at the leading edge of platforms for mission-critical applications. Unisys has been at the leading edge since the start: it produced the first general-purpose commercial computer, the UNIVAC. Before that, computers had tended to be one-off systems, often built for special purposes. Readers interested in the history of Unisys computing should see the excellent book ‘Unisys Computers: An Introductory History’ by George Gray and Ron Q. Smith.

Which brings me, in a roundabout way, to my other trigger for this piece. This year marks the centenary of the birth of Alan Turing, who was at the leading edge of computer science. He was born in London on 23rd June 1912. (For a number of years I walked past his birthplace – now a hotel – on my way to work for Unisys.)

Although best known for his codebreaking work during the Second World War – he was instrumental in cracking the German Enigma code – he made outstanding contributions to the theory of computer science, starting with his seminal 1936 paper, “On Computable Numbers, with an Application to the Entscheidungsproblem”, in which he developed the abstract device that became known as the Turing machine.
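For readers who have never met one, a Turing machine is just a finite table of rules driving a read/write head along an unbounded tape. The minimal Python sketch below simulates one; the sample rule table, which increments a binary number, is my own illustration and is not drawn from Turing’s paper.

# A minimal Turing machine simulator. The machine is a transition table:
# (state, symbol) -> (symbol to write, head move, next state).
# The sample rules implement binary increment; the state and symbol
# names are illustrative inventions, not from Turing's 1936 paper.
from collections import defaultdict

def run(rules, tape, state="start", blank="_", max_steps=1000):
    cells = defaultdict(lambda: blank, enumerate(tape))
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        write, move, state = rules[(state, cells[head])]
        cells[head] = write
        head += 1 if move == "R" else -1
    lo, hi = min(cells), max(cells)
    return "".join(cells[i] for i in range(lo, hi + 1)).strip(blank)

# Binary increment: walk right to the end, then add one with carry.
rules = {
    ("start", "0"): ("0", "R", "start"),
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("_", "L", "carry"),
    ("carry", "1"): ("0", "L", "carry"),   # 1 + carry = 0, keep carrying
    ("carry", "0"): ("1", "L", "done"),    # 0 + carry = 1, stop carrying
    ("carry", "_"): ("1", "L", "done"),    # overflow: new leading 1
    ("done",  "0"): ("0", "L", "done"),
    ("done",  "1"): ("1", "L", "done"),
    ("done",  "_"): ("_", "R", "halt"),
}

print(run(rules, "1011"))  # prints "1100": 11 + 1 = 12

Running it on the tape ‘1011’ (eleven) prints ‘1100’ (twelve). The remarkable point of the 1936 paper is that so spare a device captures everything we now regard as computable.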

Between 1936 and 1938 he studied for a PhD at Princeton University under Alonzo Church, and visited the US on a number of subsequent occasions, spending some time at Bell Labs among other places. On one of those visits he may have met the UNIVAC pioneers J. Presper Eckert and John Mauchly.

There are events worldwide to mark the centenary. The Association for Computing Machinery (ACM) named its most prestigious prize, the Turing Award, after him; it is often described as the Nobel Prize of computing. The surviving winners gathered in San Francisco for a centenary celebration on the 15th and 16th of June.

Alan Turing deserves to be remembered as a leading-edge pioneer by all of us in the business of computing.