The collaboration behind Colossus

When I first heard about the heroic efforts during WWII to break Nazi communications codes such as Enigma, I had in mind the image of a lone cryptanalyst with pencil and paper trying to puzzle out solutions, or using a series of mechanical devices such as the Bombe to run through the various combinations.

But it turns out I couldn’t have been more wrong. The efforts of the thousands of men and women stationed at Bletchley Park in England were intensely collaborative, and demanded flawless execution of a complex and very precise series of steps. And while the Enigma machines get most of the publicity, the real challenge was a far more complex German High Command cipher called Lorenz, named after the manufacturer of the machines used to produce it.

The wartime period has gotten a lot of recent attention, what with a new movie about Alan Turing just playing in theaters. This got me looking around the Web for other materials, and my weekend was lost to watching a series of videos filmed at the National Museum of Computing at Bletchley Park. The videos show how the decoding process worked using Colossus, generally regarded as the first programmable electronic digital computer. Through the efforts of several folks who maintained the equipment during wartime, the museum was able to reconstruct the device and get it into working order. This is no small feat when you realize that most of the wiring diagrams were destroyed immediately after the war ended, for fear that they would fall into the wrong hands, and that few of the people who attended to Colossus’ operations are still alive.

The name was apt in several ways: first, the equipment easily filled a couple of rooms, and used miles of wire and thousands of vacuum tubes. At the time, that was all they had, since transistors wouldn’t be invented for several more years. Tube technology was touchy and subject to failure; the Brits figured out that if they kept Colossus running continuously, the tubes would last longer. It also wielded enormous processing power, roughly equivalent to a CPU with a 5 MHz rating. That surpassed the power of the original IBM PC, which is pretty astounding given the decades between the two machines.

But the real story about Colossus isn’t the hardware, but the many people who worked around it in a complex dance to input data and transfer it from one part of the machine to another. Back in the 1940s that meant punched paper tape. My first computer in high school used it too, and let me tell you, working with paper tape was painful. Other transfers happened manually, copying information from printed teletype output into a series of plugboard switches, similar to the telephone operator consoles you might recall from a Lily Tomlin routine. And since any of these transfers could introduce an error, the settings had to be rechecked carefully, adding more time to the decoding process.

Speaking of mistakes, there is an interesting side note. The amount of sheer focus the Bletchley teams had on cracking German codes was enormous. Remember, the coded messages were transmitted over the air by radio. It turns out the Germans made a few critical mistakes in sending their transmissions, and those mistakes were what enabled the codebreakers to figure out how the system worked and actually reconstruct machines they had never seen. When you think about the millions of characters transmitted, just finding those errors was pretty amazing.
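
If you want a concrete sense of why a re-sent message was such a gift to the codebreakers, here is a minimal Python sketch of my own (an illustration of the general principle, not anything shown in the museum videos). Lorenz added a pseudo-random key stream to the plaintext character by character using modulo-2 addition (XOR); if two messages went out under the same key settings, a "depth", XORing the two intercepts cancels the key entirely. The key bytes and messages below are hypothetical.

```python
def xor_bytes(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

# Hypothetical key stream and two near-identical plaintexts (illustrative only).
key        = bytes([0x5A, 0x13, 0x7E, 0x21, 0x44, 0x0C, 0x39, 0x65, 0x2B, 0x70])
plaintext1 = b"ATTACK NOW"
plaintext2 = b"ATTACK NO!"          # the operator re-sends a slightly edited message

# Both messages enciphered with the SAME key settings -- the fatal mistake.
cipher1 = xor_bytes(plaintext1, key)
cipher2 = xor_bytes(plaintext2, key)

# The codebreakers never see the key, only the two intercepts.
# XORing them removes the key completely: c1 ^ c2 == p1 ^ p2.
depth = xor_bytes(cipher1, cipher2)
assert depth == xor_bytes(plaintext1, plaintext2)
print(depth.hex())  # zeros wherever the two plaintexts agree -- a huge leak
```

From the combined plaintexts, cryptanalysts could tease the two messages apart by hand, and from there work backward to deduce how the key stream itself was generated.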

What is even more remarkable about Colossus is that people worked together without actually knowing what the overall effort was accomplishing. There was an amazing amount of wartime secrecy, and indeed the existence of Colossus itself wasn’t widely known until about 15 or 20 years ago, when the Brits finally lifted the ban on talking about the machine. Yet several of the Colossus decrypts played critical roles in the success of the D-Day Normandy invasion.

At its peak, Bletchley employed 9,000 people from all walks of life, and the genius was in organizing all these folks so that the ultimate objective, breaking codes, actually happened. One of the principal figures, engineer Tommy Flowers, is noteworthy here: he paid for part of the early development out of his own pocket. Another interesting historical side note is the contribution of several Polish mathematicians to the early codebreaking work.

As you can see, this is a story about human/machine collaboration that I think hasn’t been equaled since. If you are looking for an inspirational story, take a closer look at what happened here.



More Stories By David Strom

David Strom is an international authority on network and Internet technologies. He has written extensively on the topic for 20 years for a wide variety of print publications and websites, such as The New York Times, TechTarget.com, PC Week/eWeek, Internet.com, Network World, Infoworld, Computerworld, Small Business Computing, Communications Week, Windows Sources, c|net and news.com, Web Review, Tom's Hardware, EETimes, and many others.
