The recent VE Day 75th anniversary was celebrated with bunting, street parties and non-socially-distanced congas. At the end of the actual war in Europe, ticker tape was more popular. However, paper computer program tape would have been a far more appropriate way to mark the event!

The end of World War 2 in Europe and developments in Radiotherapy are intrinsically linked.

We take a look at the earliest computer-based treatment planning systems and at recent advances in AI and machine learning for planning precision radiotherapy, but first we bow to the computer scientists who shortened World War 2, saving many thousands of lives.

What was ticker tape actually designed for?

Ticker tape was used for transmitting stock price information over telegraph lines, in use from around 1870 through to 1970. It consisted of a paper strip that ran through a machine called a stock ticker, which printed abbreviated company names as alphabetic symbols followed by numeric stock transaction price and volume information. The term “ticker” came from the sound the machine made as it printed. Ticker tape was the earliest dedicated electrical financial communications medium.

Pic: NYC ticker tape parade

Paper ticker tape was phased out in the 1960s as TV and computers were increasingly used to transmit financial information; a scrolling ticker is still used today, for instance at the bottom of the Sky News feed.

Old ticker tape was used as confetti and thrown from windows above parades, either cut up into scraps or thrown as whole reels; this became known as a ticker tape parade. Ticker tape parades celebrated major events, such as the end of WW1 and WW2, or the safe return of one of the early astronauts, as shown above.

Bletchley Park saved lives and created the first programmable computer using paper tape

I was watching TV at the time of the VE Day 75th anniversary (or “Semisesquicentennial”) celebrations when a programme came on about the thousands of lives saved by the code-breaking work at Bletchley Park, and how those efforts shortened the war in Europe by many months at a time when over 10 million lives were being lost worldwide every year. That is where the idea for this blog came from.

I had always believed that this was mainly due to the work of Alan Turing on breaking the Enigma code. In our February 2019 News-Zone we gave a shout-out to his work when he was voted the nation’s icon of the 20th century in BBC Two’s Icons finale.

Alan Turing is considered to be the father of modern computing and artificial intelligence. His concept of the Turing machine is still one of the most widely examined theories of computation.

We said then that we should consider his impact in the field of radiotherapy, as we treat our patients daily with ever more complex machines; without his foresight, the world of cancer treatment would look very different today! I thought it would be good to explore this a little further in this blog.

On Icons Chris Packham said: “Alan Turing’s genius brought Britain back from the brink during WW2. While he was punished for being different, his work celebrated diversity. Under the circumstances, that makes him truly iconic.”

https://us13.campaign-archive.com/?u=f37734b7d4289a51e4f451fea&id=850935f241

However, that is only part of the story of the code breaking work at Bletchley Park.

The Colossus Machine

Designed by Tommy Flowers at Bletchley Park, Colossus was developed to crack Nazi codes.

Tommy Flowers was born in 1905 with a basic East End of London education, a strong cockney accent and a bricklayer for a father, so he was not someone who might have been expected to one day have a dramatic impact on WW2.

He completed an apprenticeship in mechanical engineering and a degree in electrical engineering at the University of London, then joined the telecommunications branch of the General Post Office (GPO), moving to the research station at Dollis Hill near Wembley, where he had what can only be described as a computer epiphany!

From 1935 he explored the use of electronics for telephone exchanges and by 1939, he was convinced that an all-electronic system was possible. This background in electronics would prove crucial for his computer design in WW2.

Flowers entered the world of codebreaking when he was approached to work with the aforementioned Alan Turing’s Bombe, a system designed to break the Enigma codes of the German Navy in the early war years. This allowed the UK to intercept and decipher encrypted radio messages to and from U-Boats, keeping the Atlantic safer for all forms of shipping, especially critical when the UK was reliant on overseas food supplies. Enigma traffic was broadcast in Morse code; from 1941, the less well-known “Fish” transmissions were based on electric teleprinter technology.

Pic: U-Boat

A new and more complicated German machine called Lorenz required an even more complex system to decipher it. To break the code, Flowers proposed the design for the machine that would become known as Colossus, the world’s first programmable electronic computer.

Pic: Tommy Flowers.

During WW2, the Germans used the Lorenz machines to encrypt messages between the German High Command in Berlin and their armed forces throughout Europe, allowing the transmission of what they believed to be top secret messages and intelligence.

These strategic communications required the encryption of much longer messages, many thousands of characters in length. Adolf Hitler allegedly called Lorenz his ‘secret writer’, and the breaking of this cipher was of extreme importance to Bletchley Park, as it also provided the Allies with an insight into Hitler’s longer-term plans and strategic intentions.

Pic: German High Command badge

Lorenz or ‘Tunny’ (yes, as in tuna fish) transmissions, as the workers at Bletchley called them, came from a 12-wheel rotor cipher machine that allowed every starting position for a new message to be different, so no two messages should have had the same initial position.

There were two central cogs and five on each side, but of course no one at Bletchley knew this, as they had never seen the machine!

Pic: Lorenz ‘Tunny’ coding machine

The code was broken thanks to one hugely lucky but momentous human error by a German operator. On 30th August 1941, the same message was sent twice with the same settings but small changes to the text, a mistake that was very useful to the research codebreakers. This blunder allowed Bill Tutte, a less well-known but equally important colleague, to identify the code and produce a description of the unseen ‘Tunny’ machine.

He managed to unmask systematic patterns in the messages. He worked out very presciently that the masking letters were produced inside the Tunny machine by a system of 12 different wheels.
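Technically, Lorenz was an additive (Vernam-style) cipher: each 5-bit teleprinter character was combined with a key character using XOR. That is why the repeated transmission was so damaging: two ciphertexts sent with the same wheel settings, XORed together, cancel the unknown key completely, leaving only the XOR of the two plaintexts for the codebreakers to attack. A minimal Python sketch of this “depth” effect (the keystream here is a toy stand-in for the real wheel patterns, and the messages are invented):

```python
import random

def keystream(seed, n):
    """Toy stand-in for the Lorenz wheel output: n pseudo-random 5-bit characters."""
    rng = random.Random(seed)
    return [rng.randrange(32) for _ in range(n)]

def vernam(chars, key):
    """Additive (Vernam) cipher: XOR each 5-bit character with the key character."""
    return [c ^ k for c, k in zip(chars, key)]

# Two different messages sent with the SAME wheel settings ("in depth").
p1 = [3, 14, 14, 27, 5, 9]
p2 = [3, 14, 19, 27, 5, 2]
key = keystream(1941, len(p1))

c1 = vernam(p1, key)
c2 = vernam(p2, key)

# XORing the two ciphertexts cancels the key entirely,
# leaving only the XOR of the two plaintexts:
depth = [a ^ b for a, b in zip(c1, c2)]
assert depth == [a ^ b for a, b in zip(p1, p2)]

# Decryption is the same operation as encryption:
assert vernam(c1, key) == p1
```

Tutte’s reconstruction of the wheels started from exactly this kind of key-free residue.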

Thanks to the computer designed by Flowers, operational from 1944, and once the code was cracked, Colossus was able to decrypt over 63 million characters of German communications in hours rather than days. Vital WW2 plans, such as the German preparations for D-Day, were deciphered, helping to shorten the war by many months and save thousands of lives.

Not only was eavesdropping critical, it was also possible to “suggest” to the Germans that the Allied D-Day landings would happen at Calais, when we now know the Normandy beaches were chosen, after the Allies were able to calculate the weather forecast and the local availability of German boats and planes.

Pic: You can buy this book by B. Jack Copeland on Amazon for just £11.99 in paperback, and it’s a good read, I did! It’s called Colossus: The Secrets of Bletchley Park’s Codebreaking Computers.

Modern computing design came directly from the work at Bletchley Park

The 1944 computer was the first large-scale electronic computer ever built, and it helped to show that digital, programmable, electronic devices could be built and maintained. Even though most of the Colossus machines were destroyed shortly after the war, many of Bletchley’s team took an interest in the development of computers, particularly Alan Turing, who as mentioned went on to build the first post-war computer in the UK.

The Colossus computer did not store programs in its memory at all; completing different tasks and jobs required manual rewiring using switches, relays and plug panels. It had an electric typewriter for output, and the CPU used custom circuits with thermionic valves and thyratrons: a total of 1,600 valves in the Mk 1 and 2,400 in the Mk 2 machines. It had no memory or RAM, the display was a simple indicator-lamp panel, and the program was input using paper tape of up to 20,000 × 5-bit characters in a continuously running loop.

It proved that large numbers of electronic circuits could be made to do reliable calculations at speed. Tommy Flowers and his team overcame many of the reliability issues of working with valves by leaving Colossus permanently switched on for the duration of the war!

Pic: Colossus machine at Bletchley Park

A system of wheels on Colossus, shown on the right in the photograph above, guided the punched paper tape containing the encrypted German message through an optical reader as a continuous loop of 5-bit teleprinter code. Characters on the tape were read into Colossus over and over again at the amazing rate of 5,000 characters per second.

Once the paper tape was set up, around 4-6 hours would pass before Colossus generated the results of the analysis of the message. These results, together with further work by the codebreakers, would result in the breaking of the German Lorenz cipher to reveal the strategic message it disguised.
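What Colossus actually did on each pass of the loop was statistical: for every candidate wheel setting, it counted how often a chosen Boolean condition held across the whole tape, and flagged counts that cleared a set threshold. The Python sketch below imitates that counting idea with an invented 5-position “wheel” and a toy match condition; the real machine used a much subtler double-delta statistic over the chi wheels:

```python
# Toy imitation of Colossus-style counting (invented data, not the real
# statistic): for every candidate start position of a repeating wheel
# pattern, count agreements with the bit stream read from the "tape",
# and keep only the counts that clear a set threshold.

def best_offset(stream, wheel, threshold):
    """Try every rotation of the wheel; score positions where stream XOR wheel == 0."""
    n = len(wheel)
    scores = {}
    for offset in range(n):
        count = sum(1 for i, bit in enumerate(stream)
                    if bit ^ wheel[(i + offset) % n] == 0)
        if count >= threshold:
            scores[offset] = count
    return max(scores, key=scores.get) if scores else None

wheel = [1, 0, 1, 1, 0]                             # toy 5-position wheel pattern
stream = [wheel[(i + 3) % 5] for i in range(200)]   # stream aligned at offset 3

print(best_offset(stream, wheel, threshold=150))    # → 3
```

Colossus did this electronically at 5,000 characters per second, which is what made an otherwise hopeless brute count practical.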

The end of Colossus

Ten Colossus units were built during the war, providing key intelligence to the Allied forces. All but two were dismantled after the war, and these last two units were decommissioned in 1959 and 1960 after further intelligence use at GCHQ. The existence of the Colossus system was only revealed in the 1970s, after the Russians had used reconditioned “Tunny” machines during the Cold War!

Bill Tutte was honoured for his efforts by being invited to sign the Royal Society Charter Book, joining illustrious names such as Darwin, Churchill and Turing, while Tommy Flowers had a street and an education centre in east London named after him; the centre has sadly since closed down. How the mighty fall!

You can read more about Royal Society recognition here:

https://blogs.royalsociety.org/history-of-science/2013/09/13/charter-book-history/

The EMI Plan at the Middlesex Hospital used continuous reels of computer tape too!

In 1980, when I started as a student radiographer and was thrown into the planning department and real-life patients on day one, part of this three-week stint was to liaise with “Physics”, who created the computerised treatment plan for each patient. This liaison usually took the form of acting as porter, delivering and collecting the requisite information inter-departmentally.

One more interesting aspect was a visit to the EMI Plan room, the forerunner of the RT Plan and GE Target treatment planning systems, based on the work of Bentley and Milan using 2D central-axis depth dose data and off-axis ratios to create a plan. Nowadays, if you Google “Bentley Milan” you get the website for a prestigious car dealership in Italy, but this was state of the art in its day, and it continued in various updated forms as the gold standard for treatment planning in the UK for many years, until the advent of commercial 3D planning systems in the early 90s. I should also mention in dispatches the work of Jack Cunningham and his ETAR (equivalent tissue-air ratio) dose calculation model, which was also popular in the UK at this time, but more so in the US and Canada once incorporated into Theraplan.
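The Bentley-Milan style of calculation can be sketched very simply: the dose at a point is the central-axis percentage depth dose (PDD) at the point’s depth, scaled by an off-axis ratio (OAR) for its distance from the beam axis. The tables below are invented illustrative numbers, not real beam data:

```python
# Toy sketch of a 2D central-axis dose model: dose = PDD(depth) * OAR(off-axis).
# The PDD and OAR tables are invented for illustration only.

def interp(table, x):
    """Linear interpolation in a dict of sorted (x -> y) sample points."""
    xs = sorted(table)
    if x <= xs[0]:
        return table[xs[0]]
    for lo, hi in zip(xs, xs[1:]):
        if x <= hi:
            t = (x - lo) / (hi - lo)
            return table[lo] + t * (table[hi] - table[lo])
    return table[xs[-1]]

PDD = {0: 100.0, 5: 85.0, 10: 65.0, 15: 48.0}   # % dose vs depth (cm)
OAR = {0: 1.0, 3: 0.98, 5: 0.5, 7: 0.05}        # ratio vs off-axis distance (cm)

def dose(depth_cm, off_axis_cm):
    """Dose (% of maximum) at a point in the 2D plane of the beam."""
    return interp(PDD, depth_cm) * interp(OAR, abs(off_axis_cm))

print(dose(7.5, 0))   # → 75.0 (midway between the 5 cm and 10 cm entries, on axis)
```

A whole plan was then just a grid of such points summed over the treatment beams, which is why this kind of system could run on very modest early-eighties hardware.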

This system was uniquely installed into an EMI CT Scanner console

When EMI were developing their whole-body CT scanner, they presciently foresaw that using a CT image would have an important role to play in radiotherapy treatment planning. Without CT data, planning was less precise and a largely analogue process that involved a trade-off between the accuracy of the planning process and the ability of the Linac to deliver the treatment.

Pic: An advert from the 70s for the latest CT; the EMI Plan used to sit in a similar operator console.

They decided to integrate the planning system into the CT console so that the scan could be used to define the patient more accurately, allow CT data and inhomogeneities (Hounsfield numbers) to be used in the dose calculation, and overlay the dose distribution and isodoses on the images for assessment and treatment delivery. This, of course, also allowed for the accurate delineation of critical structures to avoid.

There was also a digitiser to allow for the input of patient information, collected by hand in the form of “outlines” taken with bendy lead wire (mainly used for breast plans), and a printer to print out the end product for the radiographers to transcribe manually onto the patient.

The EMI Plan ran on a continuously fed program tape, very similar to the Colossus as far as I can recall, with no RAM either, and did so until the advent of large floppy disks that could hold the program updates and also archive the treatment plans.

The RT Plan initiative that followed was also installed in a quasi-CT-scanner console, but it did not need to be directly connected to a CT scanner in the same room or area; it could communicate remotely. At Mount Vernon, when I moved there, the CT was about half a mile away in the Paul Strickland Scanner Centre, so notable progress had been made!

The “light pen” was a seemingly magical pen with a computer cable attached. It allowed you to move the treatment beams and isocentre on the display monitor, creating the required plan by pointing the pen at the relevant part of the screen; the isodoses would refresh some moments later (seconds to minutes, and a long time for a complex plan) once the computer had recalculated the dose distribution. Theraplan used a trackball and cursor for the same purpose; in both early products, this was really a form of automated hand planning.

I found this image below on the internet, but there is not much else, I’m afraid. If you have anything to add (my memory is fading!), it would be good to hear from you.

Pic: Remember these? Various sized floppy disks, the big ones were used initially to archive cases and run programs on early eighties TPS.

From automated hand planning and paper tape to AI and Machine Learning

In the space of just 40 years, AI and machine learning have dramatically changed the way we plan patients’ radiotherapy from what was essentially automated hand planning in the early eighties.

Pic: A journey through time to AI

The short road from 2D to 3D treatment planning was travelled with the integration of various diagnostic imaging modalities and MLCs. We went quickly from forward to inverse planning and optimisation, and added dose calculation models such as pencil beam, convolution-superposition and Monte Carlo, where precision versus computation time was the ultimate trade-off. The moves from dependence on geometric parameters to the specification of volumes of tumour targets and sensitive structures, as well as their dose constraints, have all been bridged.
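To give a flavour of that precision-versus-time trade-off, here is a toy Monte Carlo sketch: photon interaction depths in a water-like medium are sampled from the exponential attenuation law and binned into a crude depth histogram. Real Monte Carlo dose engines track scatter, secondary electrons and full energy spectra; the attenuation coefficient here is invented for illustration:

```python
# Toy Monte Carlo sketch: sample photon interaction depths from exponential
# attenuation and histogram them by depth. Illustrative only, not a dose engine.
import math
import random

def mc_depth_histogram(n_photons, mu_per_cm, depth_cm, n_bins, seed=0):
    """Count first-interaction sites per depth bin for n_photons primaries."""
    rng = random.Random(seed)
    bins = [0] * n_bins
    width = depth_cm / n_bins
    for _ in range(n_photons):
        # Sample the free path length: d = -ln(u) / mu, with u in (0, 1].
        d = -math.log(1.0 - rng.random()) / mu_per_cm
        if d < depth_cm:
            bins[int(d / width)] += 1
    return bins

bins = mc_depth_histogram(n_photons=100_000, mu_per_cm=0.2, depth_cm=20, n_bins=10)

# More photons interact in shallow bins than deep ones, as attenuation predicts;
# the statistical noise shrinks only as you throw more histories at the problem.
assert bins[0] > bins[-1]
```

Doubling the accuracy costs roughly four times the histories, which is exactly why Monte Carlo was the slow-but-gold-standard end of the spectrum.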

However, it seems like Machine Learning offers far more and is the future

RaySearch say that they created the first TPS with machine learning, where it’s not just automation that is important; it’s personal!

Their “deep learning segmentation” allows for contouring in less than a minute, saving time and manual work. This, they say, uses machine learning models based on previous clinical cases to create contours of the patient’s organs automatically; these can be reviewed and adjusted as required.

They then generate “personal” plans to review and approve at a speed suitable for adaptive therapy, and create multiple, editable plans for different modalities to assist in treatment decision support, with a choice of varying plan trade-offs.

Pic: RayStation TPS

I thought that this quote on their website was a good one to finish this blog with, it is by Tom Purdie, a Medical Physicist at the Princess Margaret Cancer Centre in Canada.

“Machine learning is a natural fit for automating the complex treatment planning process. We expect it will enable us to generate highly personalized radiation treatment plans more efficiently, thereby allowing clinical resources or specialist technical staff to dedicate more time to patient care. Deemed clinically acceptable by experts around the world, RayStation algorithms generate high-quality treatment plans that are preferred or deemed equivalent to clinical plans.”

You can read more about Machine Learning and RaySearch here: https://www.raysearchlabs.com/machine-learning-in-raystation/

It’s all change in radiotherapy, and in such a short space of time!

Pic: I promised you Bentleys in Milan!

I’m not sure that there is any other form of medicine that has changed so dramatically in such a short space of time as radiotherapy has in just 40 years. From Cobalt Units to MRI Linacs and Bentleys in Milan to Machine Learning in Canada, what an amazing transformation.

If you want to consider a career in radiotherapy, contact admin@RadPro.eu or follow @RadProWebsite for more information, if I was you…I would!!

Duncan Hynd- July 2020