Richard Hind MBCS explores the computers that took Neil Armstrong and Buzz Aldrin to the surface of the moon and considers their legacy.
The computer industry was barely 10 years old when JFK announced that ‘we choose to go to the moon’ and the sky was no longer the limit. Getting to the moon is hard. Indeed, it is rocket science, which meant computers were essential. In 1952, a UNIVAC machine had predicted Eisenhower would win the US election with 82% of the electoral college votes (in the end he took 83%), and the computer had come of age. However, packing an eight-ton computer, measuring 4.3m by 2.4m by 2.6m, into the Apollo command module (let alone the lunar excursion module) was not going to happen.
NASA used industry standard IBM mainframes on the ground. In 1968, a year before Apollo 11, they upgraded to an IBM System/360 Model 50, costing $45k per month to rent. They also purchased an ILLIAC IV supercomputer (Ceruzzi). Even the mini-computers of the day were too big and bulky for the Apollo missions but, fortunately, by the mid-sixties miniaturisation was well under way, with transistors already being replaced by integrated circuits...
Launch vehicle control
The Saturn V launch vehicle was under the control of the Instrument Unit (IU): an instrumentation ring located on top of the third stage of the Saturn V. It contained all the navigation, guidance, control and sequencing hardware for the main launch vehicle, using triple redundancy for safety. The unit was built by IBM in Huntsville, Alabama. It was approximately six metres in diameter and one metre tall, weighing just over two tons. The electronics were liquid-cooled by a methanol/water mix, circulated by nitrogen gas.
Of course, cooling is still one of the big issues in computing, as any enthusiast will tell you. At T-minus 15 seconds, Jack King (the voice of mission control) announced ‘guidance is internal’, meaning that guidance was now, as NASA explains, ‘inertial and holds an orientation with respect to the stars’. In short, it used three gyroscopes mounted on an inertial platform. According to NASA’s official flight manual, the IU computer system was made up of three sub-systems: the Launch Vehicle Digital Computer (LVDC), the Launch Vehicle Data Adapter (LVDA) and the (analogue) flight computer.
The LVDC hardware specification was:
- 28-bit word, split into two ‘syllables’ of 13 data bits plus one parity bit each (see the sketch after this list)
- 2.048 MHz clock
- I/O handled by the LVDA (serial communication at 512Kbps)
- 32k x 28-bit words (RAM) magnetic core memory
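As a rough illustration of that word format, here is a minimal Python sketch of packing two 13-bit syllables, each with its own parity bit, into one 28-bit word. The bit ordering and the use of odd parity are assumptions made for clarity, not the documented LVDC layout:

```python
# Illustrative only: odd parity and this bit ordering are assumptions
# for clarity, not the documented LVDC layout.

def odd_parity(value: int) -> int:
    """Parity bit that makes the total number of 1s odd."""
    return (bin(value).count("1") + 1) % 2

def pack_word(syllable_a: int, syllable_b: int) -> int:
    """Pack two 13-bit syllables, each gaining a parity bit, into 28 bits."""
    assert 0 <= syllable_a < 2**13 and 0 <= syllable_b < 2**13
    a = (syllable_a << 1) | odd_parity(syllable_a)  # 14 bits
    b = (syllable_b << 1) | odd_parity(syllable_b)  # 14 bits
    return (a << 14) | b                            # 28 bits in total

print(f"{pack_word(0b1010101010101, 0b0000000000111):028b}")
```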
From lift-off, the LVDC was reporting on 1,348 systems (Ryan, p.66) and transmitting back to mission control via a 51.2Kbps down-link on S-band (2-4GHz microwave, frequencies now also used for Wi-Fi): the same speed as the last generation of analogue modems from the late nineties. Interestingly, Ryan notes the up-link speed was only 2.4Kbps, so ADSL broadband was not the first example of an asymmetric wide area network connection!
In the same year that the prototype internet appeared (ARPANET), NASA was already using NASCOM (the NASA Communications Network), which linked up 16 receiver dishes around the world, from Europe to Australia, and even four ships located in the Pacific and Atlantic. The central hub was the Goddard Space Flight Center, north-east of Washington DC. However, the most revolutionary development was the Apollo Guidance Computer (AGC).
The Apollo Guidance Computer
The AGC was fairly compact for a computer of its time, being the first to employ integrated circuits. There were three guidance computers on each Apollo mission: one AGC in the command module (CM) and, in the lunar excursion module (LEM), a second AGC plus a separate backup machine. You will see the LEM systems referred to as ‘PGNCS’ (pronounced ‘pings’) and ‘AGS’: the Primary Guidance, Navigation and Control System computer and the Abort Guidance System computer, which ran open loop in parallel as a backup. The job of the AGS was to abort the landing and get the LEM to a safe orbit, where the two crew members could await rescue by their colleague in the CM.
In NASA’s own words, ‘the invisibility of the AGS is a tribute to PGNCS, since the AGS was never needed to abort a landing. It was, however, an interesting and pioneering system in its own right’. Each AGC measured 61cm by 32cm by 15cm, about the size of an old ‘full tower’ form factor PC. It weighed 31.8kg and used 70 watts of power, which, again, is comparable to PCs of the mid-nineties. The hardware specification was:
- 16-bit word (15 data bits and one parity bit; the parity check is sketched after this list)
- 1.024 MHz clock
- 16 I/O channels
- 36k x 16-bit words (ROM) magnetic core rope memory
- 2k x 16-bit words (RAM) magnetic core memory
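That single parity bit bought cheap error detection on every memory read. Here is a tiny Python sketch of the idea; odd parity is assumed for illustration:

```python
# Sketch of the check a parity bit enables on every memory read.
# Odd parity is assumed here: a valid 16-bit word always has an odd
# number of 1s, so any single flipped bit is detectable.

def make_word(data: int) -> int:
    """Append a parity bit so the 16-bit word has an odd number of 1s."""
    assert 0 <= data < 2**15
    parity = (bin(data).count("1") + 1) % 2
    return (data << 1) | parity

def read_word(word: int) -> int:
    """Return the 15 data bits, or raise if the word was corrupted."""
    if bin(word).count("1") % 2 != 1:
        raise ValueError("parity error: word corrupted in memory")
    return word >> 1

w = make_word(0b101010101010101)
print(read_word(w))          # reads back cleanly
try:
    read_word(w ^ 0b100)     # flip a single bit
except ValueError as err:
    print(err)               # the parity check catches it
```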
The ‘core rope’ memory was a new development and boasted an incredible 2,000 bits per cubic inch: almost 20 times more compact than the core memory technology of the day. It was programmed by threading the wires through (or around) the individual ferrite cores, all 598,000 of them (a toy model of a rope read appears after the comparison below)! Compare this specification to a typical PIC24 family micro-controller chip, which costs under £3, for which you get:
- 16-bit word
- 70 MHz clock (maximum)
- 31 I/O channels (6 A/D & 4 D/A)
- 256KB of addressable program memory
- 32KB of data memory
All this operates at 3-3.6V, draws just milliwatts of power and measures 4cm by 1cm by 0.5cm.
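Going back to the core rope, a toy model in Python may help. The threading patterns below are invented, and a real rope’s sense and inhibit wiring was far more intricate, but the essence is that reading a word amounts to sensing which cores the wire threads:

```python
# Toy model of core rope ROM: each word was fixed at manufacture by
# whether the sense wire threads through (True) or bypasses (False)
# each core. The patterns below are invented for illustration only.

THREADING = [
    [True, False, True, False, True],   # word 0
    [False, False, True, True, True],   # word 1
]

def read_rope_word(address: int) -> int:
    """Pulse the word line; threaded cores induce a 1 on the sense wire."""
    value = 0
    for threaded in THREADING[address]:
        value = (value << 1) | int(threaded)
    return value    # read-only: changing a bit means weaving a new rope

print(f"{read_rope_word(0):05b}")   # -> 10101
print(f"{read_rope_word(1):05b}")   # -> 00111
```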
Development of the AGC
The AGC was designed at MIT in 1962 and built by Raytheon, a major U.S. defence contractor. What made it such an innovative machine is that it was the first computer to use integrated circuit technology (which first appeared in 1958). Each used about 4,000 Fairchild ‘type-G’ (3-input NOR gate) circuits, which equates to 24,000 transistors per computer. By today’s standards, this is nothing to shout about (high performance processors now have billions of transistors) but compared to an IBM 7090 mainframe of the day, with about 50,000 transistors (and filling a room), that’s impressive.
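Why does a single gate type suffice? The NOR gate is logically ‘universal’: every other function can be composed from it. Here is a quick Python demonstration of the idea, using software stand-ins for the Fairchild hardware:

```python
# A 3-input NOR is logically 'universal'. Software stand-ins for the
# Fairchild hardware, showing NOT, OR and AND built from NOR alone.

def nor3(a: int, b: int, c: int) -> int:
    """3-input NOR: output is 1 only when all inputs are 0."""
    return int(not (a or b or c))

def not_(a: int) -> int:
    return nor3(a, a, a)               # tie all inputs together to invert

def or_(a: int, b: int) -> int:
    return not_(nor3(a, b, 0))         # NOR, then invert

def and_(a: int, b: int) -> int:
    return nor3(not_(a), not_(b), 0)   # De Morgan: AND from inverted inputs

for a in (0, 1):
    for b in (0, 1):
        print(f"a={a} b={b}  AND={and_(a, b)}  OR={or_(a, b)}")
```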
It is reported that in the summer of 1963, AGC prototypes consumed 60% of the US supply of ICs. By 1965, construction of the AGCs had used 200,000 ICs, costing $20-$30 each. The computerhistory.org website suggests ‘this is one of the few cases in which NASA's requirements acted as a direct spur to the computer industry’. It was a big risk though and, to minimise it, MIT took the decision to use just one type of IC for the whole design.
According to Paul Ceruzzi, this meant that ‘no Apollo Guidance Computer, on either the command or lunar modules, ever experienced a hardware failure during a mission.’ Eldon C. Hall, lead designer of the Apollo Guidance Computer, is credited with pushing for the use of ICs, despite much resistance from NASA’s own safety team. He realised that this was the only way to deliver a machine that was both powerful and compact enough for the mission. By the seventies, it was standard practice to build computers from discrete ICs and, ultimately, microprocessors, heralding the home computer revolution of the eighties and beyond.
The AGC user interface
The guidance computers would have been of little use without a suitable user interface: in this case, the DSKY (display and keyboard, pronounced 'diss-key'). Its most interesting feature was how the astronauts configured and programmed it in flight; limited memory capacity meant the computers had to be updated regularly throughout the mission. NASA estimates that ‘it took about 10,500 keystrokes to complete a lunar mission.’
Ramon Alonso, a computer scientist working on the AGC, came up with the idea of using numerical ‘verbs’ and ‘nouns’ to make it easier to program. A verb would be a command, such as 06, meaning ‘display the information for the following (noun)’, and the noun could be, for example, 44: ‘orbit information’ (Sourceforge). This was clearly a hit with the astronauts: in an interview for the Hack the Moon (2019) website, Alonso recalls Dave Scott (Apollo 9 and 15) saying ‘I don't know who thought up that verb and noun, but that was so good because we astronauts could really understand it’. How many user interfaces get such glowing reviews from end users these days?
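For flavour, here is a loose Python sketch of the verb/noun idea. The two-entry tables below are simplified stand-ins rather than the real AGC program data:

```python
# A loose sketch of the verb/noun idea: the verb says what to do, the
# noun says what to apply it to. These tables are simplified stand-ins,
# not the real AGC program data.

NOUNS = {
    44: "orbit information",
    36: "mission clock time",      # invented entry for illustration
}

VERBS = {
    6: "display decimal",          # the 'display the following' verb
    16: "monitor decimal",         # invented entry for illustration
}

def key_in(verb: int, noun: int) -> str:
    """Emulate a 'VERB 06 NOUN 44 ENTR' key sequence on the DSKY."""
    if verb not in VERBS or noun not in NOUNS:
        return "OPR ERR"           # the DSKY's operator-error response
    return f"{VERBS[verb]}: {NOUNS[noun]}"

print(key_in(6, 44))    # -> display decimal: orbit information
print(key_in(99, 1))    # -> OPR ERR
```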
Lessons learned from the Apollo project
As with any advanced technology project, much was learned about project management and the development lifecycle. In NASA’s own review, the lesson that software is more difficult to develop than hardware was one of the most important, the review reflecting that the choice of memory should have been driven by the software’s requirements. Perhaps this is just something we take for granted these days? They also identified that more modularisation of the software was needed, a core theme in any programming fundamentals course now.
The final observations come from computer science pioneer Margaret Hamilton, who worked on the AGC software at MIT. She is widely credited with inventing the term ‘software engineering’: as she explains, it was an effort to bring the discipline the legitimacy and respect it deserved. In a presentation available on YouTube (2018), she looks back on the AGC project and the importance of developing real-time error recovery that provided ‘program specific to system wide protection’.
They developed a priority interrupt system for the processors (pretty much standard these days) which allowed the highest priority jobs to keep running in the event of a problem. The infamous 1202 alarm reported by Buzz Aldrin during the lunar descent is an example of this. Fortunately, the software behaved as it should: ‘[the] software warned of a hardware problem AND compensated for it’. Testing, fault finding, bug fixing and patching are still as much of an issue now as they were back in the sixties, and Margaret laments this in her presentation.
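To give a flavour of that job-shedding behaviour, here is a minimal Python sketch under stated assumptions: the priorities, job names and slot limit are all invented for illustration, and the real AGC executive was far more sophisticated.

```python
import heapq

# Invented priorities, job names and MAX_JOBS limit; the real AGC
# executive was far more sophisticated than this toy scheduler.

MAX_JOBS = 3   # stand-in for the AGC's limited job slots

def schedule(jobs: list[tuple[int, str]]) -> list[str]:
    """Run the MAX_JOBS highest-priority jobs; shed and report the rest."""
    heap = [(-priority, name) for priority, name in jobs]   # min-heap trick
    heapq.heapify(heap)
    kept = [heapq.heappop(heap)[1] for _ in range(min(MAX_JOBS, len(heap)))]
    for _, name in heap:                                    # the overflow
        print(f"executive overflow alarm: shedding '{name}'")
    return kept

print(schedule([(5, "landing guidance"), (4, "DSKY display"),
                (3, "navigation update"), (1, "rendezvous radar data")]))
```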
She gives a perfect example of how easy it is to miss something vital because of assumptions made in the design of test plans. Her six-year-old daughter wanted to ‘play astronauts’ and managed to crash the simulator they were using for testing. It transpired she had triggered program P01 (the pre-launch program) during a simulated flight. However, the problem was not addressed because ‘management’ assumed astronauts would never do anything that silly! Only after the Apollo 8 astronauts triggered P01 during their flight (and spent hours resetting everything) was the bug fixed. Perhaps the lesson is: employ your kids to do the alpha testing!
Image: NASA