Highlights of the Early Days of Computing at The Aerospace Corporation

The progression of computers from mainframes to personal computers, and the programming languages and software that evolved with these advances, has greatly enhanced the way Aerospace provides technical support to its customers.

William Clarkson

 

The Mainframe Era

From the Orbiter, June 23, 1963: A console operator communicates with one of the laboratory’s two computers by electric typewriter. Instead of using specially built equipment, the facility is made up of standard items used in a unique manner.

In the 1960s, computers were large, power-hungry beasts housed in climate-controlled rooms and attended to by a select few. In today’s parlance they would be termed “mainframes,” although that descriptor didn’t come into use until there were other kinds of computers to compare them with.

Early Aerospace “scientific” (vs. “business”) computers were the IBM 709/7090/7094 series. The 709 used vacuum tubes. It could execute 42,000 add or subtract instructions or 5000 multiplies per second, and had 32,768 words of 36-bit magnetic core memory. In today’s terms, that is roughly 150 kilobytes (32,768 words × 36 bits ≈ 147,000 bytes)—less than a single image from a typical digital camera. The 7090 was a transistorized version of the 709, which sold for $2.9 million or rented for $63,000 per month. It had a basic memory cycle time of 2.8 microseconds. The 7094 was an upgraded version introduced in 1962.

In the mid-1960s, Aerospace transitioned to Control Data computers for scientific computing—the CDC 6000 series. The first member of that series, the CDC 6600, performed up to three million instructions per second. It sold for $6–10 million. The 6000 series computers represented a significant technological advance over the previous generation. They had functional units for specific types of operations, an instruction stack, and independent asynchronous input/output (I/O). They could also run multiple programs simultaneously.

The last CDC computer at Aerospace was the 7600. It had a 27.5 nanosecond clock cycle and 65,000 60-bit words of memory (almost half a megabyte by today’s measurement!). It featured “pipeline” instruction processing, which had significant speed advantages over earlier architectures. Critical data were archived on large magnetic tapes. By 1966, the Aerospace tape library contained over 8000 tapes.

Ervin Frazier of the Information Processing Division’s systems programming department explains hard copy plot processing in the computer center during a February 21, 1979, tour by high school students.

Processing launch vehicle telemetry has always been an integral part of the Aerospace core mission. In the early 1960s, telemetry data from launch ranges were sent on tape to the Telemetry Data Reduction Center, which processed the data of interest in batches (the remaining data were stored in raw form); the processed data were then written to tape. To view a set of parameters for a given flight or test, an analyst would submit a request to the Telemetry Data Reduction Center, which would then either process the raw data or retrieve the processed data from tape and then generate appropriate plots and listings. The elapsed time from an actual flight to when an analyst could review the data was measured in weeks.

Programming and Terminals

At first, only “programmers” were allowed to write programs and submit them to the computers, based on designs provided to them by “engineers.” Later, Aerospace took the progressive step of allowing “open shop” programming.

The FORTRAN programming language was introduced for the 709 and was widely used at Aerospace for many years. Programming efficiency was important. The computers were limited in capability and expensive to operate (charges were by the second of compute time), and programs were prepared on punched cards, 2000 to a box. Programmers (especially the open shoppers) didn’t like to carry more than one box to the dispatch area.

From The Aerospace Corporation annual report, 1976: Administrative tasks, such as payroll, accounting, and reports, are handled by this IBM 370 computer in the company’s computation center.

Remote terminals connected to the mainframe appeared, beginning with one called the Aerospace remote calculator (ARC). These terminals allowed engineers to perform analyses using an internally designed and developed scientific programming language. By 1968, 32 remote calculators had been installed. Instead of waiting for results overnight, engineers had immediate access to the computer. Later terminals provided virtually complete access to the power of the mainframe computer. Despite their seemingly modest capabilities, these early computers were used by the Aerospace technical staff to develop and apply significant analytic capabilities in support of the corporation’s customers.

An important tool developed in the late 1960s was the Eclectic Simulator Program (ESP). It was a precompiler for FORTRAN and a collection of subroutines that facilitated the fast, easy solution of ordinary differential equations. Expressions involving vectors, matrices, and scalars could be used directly without breaking them down into equivalent FORTRAN. Most notably, ESP played a significant role in the Aerospace rescue of two tumbling satellites, later recognized with an Aerospace President’s Achievement Award. ESP was used extensively by control system engineers and others for about 35 years, the last 25 of which were without further development or bug fixes.
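
To give a flavor of the kind of computation ESP streamlined (without reproducing its actual syntax), the fragment below propagates a simple two-body vector differential equation in modern C. The constants, step size, and routine names are illustrative assumptions, not ESP code.

```c
/* Minimal sketch (not ESP): fixed-step propagation of the vector equation
 * r'' = -mu * r / |r|^3, the kind of computation ESP let engineers express
 * directly in vector form instead of hand-expanding into scalar FORTRAN. */
#include <math.h>
#include <stdio.h>

#define MU 398600.4418              /* km^3/s^2, Earth gravitational parameter */

static void accel(const double r[3], double a[3])
{
    double rm = sqrt(r[0]*r[0] + r[1]*r[1] + r[2]*r[2]);
    double k = -MU / (rm * rm * rm);
    for (int i = 0; i < 3; i++)
        a[i] = k * r[i];
}

int main(void)
{
    double r[3] = {7000.0, 0.0, 0.0};   /* km; illustrative circular orbit */
    double v[3] = {0.0, 7.546, 0.0};    /* km/s                            */
    double dt = 1.0;                    /* s                               */

    /* Simple Euler steps over roughly one orbital period (~5829 s).
     * A real tool would use a higher-order integrator; this only
     * illustrates the vector bookkeeping ESP handled automatically. */
    for (int step = 0; step < 5829; step++) {
        double a[3];
        accel(r, a);
        for (int i = 0; i < 3; i++) {
            r[i] += v[i] * dt;
            v[i] += a[i] * dt;
        }
    }
    printf("r = (%.1f, %.1f, %.1f) km\n", r[0], r[1], r[2]);
    return 0;
}
```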

People Power

Computer program validation and verification as described in February 1971. The three center boxes represent the program development flow. The outer boxes represent the program checking functions.

From the mid-1960s to the mid-1970s, Air Force satellite programs were supported by a large (by contemporary standards) computer complex managed by the Satellite Control Facility in Sunnyvale, California, with worldwide locations. That facility was heavily supported by Aerospace personnel and contractors. The primary computer systems were the CDC 3600 series, which were upgraded to CDC 3800 machines in the early 1970s. Notable aspects of this era were the extensive use of custom hardware and software and a heavy reliance upon people. The interface from the primary Satellite Control Facility mainframes was a “tennis shoe” connection, as command tapes were transferred from the mainframes to a number of smaller CDC 160A computers, which were connected to the tracking stations in the network.

The CDC 3600 and 3800 machines principally handled I/O via large banks of magnetic tape drives. Those tapes were used to load the software into the machines and receive the generated satellite commands. The software was essentially custom developed by Aerospace, the Air Force, and contractors. These software programs included the computer operating system, known as System II; the Satellite Control Facility standard compiler, JOVIAL J3; the orbital ephemeris programs, known as the Advanced Orbital Ephemeris System (AOES); and a number of program-specific command-and-control programs.

From the Orbiter, June 23, 1963: ASTRA lab technicians watch the operation of an automatic graph-making machine. The device can produce calibrated charts—and under special conditions even diagrams—from information on reels of tape.

System II, JOVIAL, and AOES were all developed, tested, maintained, and operated by the Satellite Control Facility. Program-specific software was developed, tested, and maintained by the program offices using their own software contractors. These were large software packages that relied heavily on large numbers of people for their development, testing, and maintenance. This led to a great deal of desk-checking of code and manual maintenance of computer programs on decks of punched cards. But people were considered relatively inexpensive to hire, whereas computers were expensive to buy, so the system seemed to work well.

Another example of the heavy reliance on people in the software process was that each set of generated commands was hand-checked by flight support personnel after it was generated on the mainframes and before the command tapes were sent to the tracking stations for upload to the satellite onboard command and control computers. Aerospace program office satellite experts were the lead flight support personnel. Each command set required an explicit sign-off by these experts and the contractor support team prior to release for upload to the satellite.

Guidance

The onboard computers in early inertially guided launch vehicles could be characterized (approximately) by the following parameters:

  • Word length: 24 bits
  • Memory: 4K (4096) words
  • Add time: 30 microseconds
  • Multiply time: 300 microseconds
  • Weight: 40 pounds
  • Power consumption: 100 watts
  • Reliability: 7000 hours mean time between failures

Early launch vehicles and satellites contained a plethora of special-purpose hardware—for example, sequencers, timers, and analog circuitry. In the mid-1960s, Aerospace promoted a concept known as integrated digital mission management. It could be summarized as: “There’s a computer onboard with excess capacity. Let it run the show.”

The rationale for assigning nonguidance tasks to the guidance computer was simply that such assignment would result in lower weight and power consumption on the launch vehicle and satellite, and less complexity (and therefore higher reliability) than the implementation of those functions by separate, special-purpose hardware. There were benefits to connecting subsystems to the computer rather than to each other, and mechanizing as many inline functions as practical within the computer. In fact, the concepts of integrated digital mission management were later employed in many other industries, most notably in the automobile industry.

A related idea for reliability improvement was to use the computer to detect malfunctions in other subsystems and initiate corrective or compensating actions. This flexibility would allow some required late functional changes to be resolved without hardware changes. However, this idea failed to anticipate that as onboard software became lengthier and more complex, changing it would become just as daunting as changing hardware.

Radio guidance systems using ground-based computers were used on early Atlas boosters. Later Atlases used onboard inertial guidance, as did the Titan launch vehicles. Initially, guidance equations used a perturbation method wherein the nominal powered flight trajectory was stored, and guidance was based on deviations from that nominal. This method placed relatively low requirements on computer capacity. Later, a more accurate and flexible form of guidance equations, termed “explicit guidance,” used a two-body model of the vehicle flight. Explicit guidance required considerably greater computer capacity.
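
The difference between the two approaches can be sketched in textbook form; these are illustrative relations, not the actual Titan guidance equations. Perturbation guidance steers on deviations from a stored nominal trajectory, whereas explicit guidance computes, from the current state and a two-body model, the velocity still to be gained and steers along it:

```latex
% Illustrative textbook forms, not the actual flight equations.
% Perturbation guidance: steer on deviations from the stored nominal trajectory.
\[
\delta\mathbf{x}(t) = \mathbf{x}(t) - \mathbf{x}_{\mathrm{nom}}(t),
\qquad
\mathbf{u}(t) = -K(t)\,\delta\mathbf{x}(t)
\]
% Explicit (required-velocity) guidance: from the current position and a
% two-body model, compute the velocity that reaches the target, then thrust
% along the velocity yet to be gained.
\[
\mathbf{v}_{g}(t) = \mathbf{v}_{\mathrm{req}}\bigl(\mathbf{r}(t),\,t\bigr) - \mathbf{v}(t),
\qquad
\hat{\mathbf{u}}(t) = \frac{\mathbf{v}_{g}(t)}{\lVert \mathbf{v}_{g}(t) \rVert}
\]
```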

May 1963: Robert Mercer, member of the technical staff, Computation and Data Processing Center, explains the control console for the 7090 computer to Narbonne High School students who visited Aerospace on Los Angeles City Schools’ Boys-in-Industry Day.

Explicit guidance was at the leading edge of guidance methodology, and Aerospace was a pioneer in that technology. Aerospace was responsible for the validation of guidance equations via a scientific simulation in which the equations (programmed in FORTRAN) were run together with a vehicle simulation on a mainframe computer.

The first Titan IIIC inertial guidance computer used a rotating magnetic drum for memory. The flight computations were locked to the drum rotation rate and phase. The guidance program had to be aware of the drum position and speed relative to the central processing unit (CPU) speed. If the program needed a quantity but it had just passed under the drum read heads, the program had to wait until the needed data came back around, thereby incurring a serious time penalty.

The guidance program had a major cycle of 1 second in which the guidance calculations were performed. There was also a concurrent synchronized minor cycle of 50 milliseconds in which the vehicle turning commands were calculated. (The vehicle needed smoother attitude commands than a jerk every second.) A later Titan guidance computer had random access memory, which was a great improvement and allowed relaxation of all the drum synchronization requirements.
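
The timing structure described above (guidance once per second, attitude commands every 50 milliseconds) can be pictured as nested loops. The sketch below is a modern C illustration of that major/minor cycle pattern; the routine names are placeholders, not the Titan flight program.

```c
/* Illustration only: the 1 s major / 50 ms minor cycle structure described
 * above, with 20 minor cycles per major cycle. The routines are placeholders,
 * not the Titan flight program. */
#include <stdio.h>

static void guidance_major_cycle(int n)
{
    printf("major cycle %d: recompute guidance solution\n", n);
}

static void attitude_minor_cycle(int major, int minor)
{
    /* smooth vehicle turning commands would be issued here every 50 ms */
    (void)major;
    (void)minor;
}

int main(void)
{
    const int minors_per_major = 20;          /* 20 x 50 ms = 1 s */

    for (int major = 0; major < 3; major++) {
        guidance_major_cycle(major);          /* once per second  */
        for (int minor = 0; minor < minors_per_major; minor++) {
            attitude_minor_cycle(major, minor);
            /* on flight hardware, wait here for the next 50 ms tick */
        }
    }
    return 0;
}
```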

Validation and Verification

The quality requirements for mission-critical real-time software such as guidance programs were considerably more stringent than those for ordinary computer programs. It was (and still is) expected that most ordinary computer programs would enter operational usage with several undiscovered errors remaining in them (despite a reasonable amount of testing). But an operational program failure (i.e., a lost vehicle or mission) would cost millions of dollars, and that was unacceptable.

Accordingly, Aerospace led the way in computer program validation and verification. The validation functions were similar to testing as employed for ordinary software—i.e., “black box” testing. Verification was the process of determining, by detailed analysis of its internal structure, that a program conformed to its specification.

Aerospace developed many innovative tools for validation and verification, including interpretive computer simulations in which the flight computer was modeled at the bit level on a mainframe (see sidebar, Aerospace-Developed Software). Most guidance computer programs were certified in this manner rather than on the actual flight hardware. Aerospace was responsible for the verification via simulation of the actual Titan IIIC flight program tapes produced by the contractor (“the bits that fly”). Those tapes were used to load the flight computer before liftoff.
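
At its core, an interpretive simulation of this kind is a software fetch-decode-execute loop running over an image of the flight computer's memory. The fragment below sketches the idea for an assumed toy instruction set; it does not represent the actual Titan computer or any Aerospace tool.

```c
/* Generic sketch of an interpretive (instruction-level) simulator: a toy
 * accumulator machine with four opcodes, not an actual flight computer. */
#include <stdint.h>
#include <stdio.h>

enum { OP_HALT = 0, OP_LOAD, OP_ADD, OP_STORE };   /* toy instruction set */

int main(void)
{
    /* memory image: a tiny "flight program" followed by its data */
    uint16_t mem[16] = {
        (OP_LOAD  << 8) | 10,    /* 0: acc  = mem[10]  */
        (OP_ADD   << 8) | 11,    /* 1: acc += mem[11]  */
        (OP_STORE << 8) | 12,    /* 2: mem[12] = acc   */
        (OP_HALT  << 8) | 0,     /* 3: stop            */
        [10] = 2, [11] = 3
    };
    uint16_t pc = 0, acc = 0;

    for (;;) {                                   /* fetch-decode-execute */
        uint16_t word = mem[pc++];
        uint16_t op   = word >> 8;
        uint16_t addr = word & 0xFF;
        if      (op == OP_HALT)  break;
        else if (op == OP_LOAD)  acc = mem[addr];
        else if (op == OP_ADD)   acc += mem[addr];
        else                     mem[addr] = acc;   /* OP_STORE */
    }
    printf("mem[12] = %u\n", mem[12]);           /* prints 5 */
    return 0;
}
```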

A technician works at the console of the Control Data 6600 (circa 1966).

The classic onboard computer for most of the 1960s and into the 1970s was the CDC 469. This rugged little computer started out with a “plated wire” memory, which gradually evolved to a semiconductor memory. It had limited memory (a few thousand words), a small instruction set, and essentially no reload or update capability once on orbit. Therefore, the code was extensively tested, and changes made from flight to flight were introduced slowly.

Aerospace personnel were the key technical experts on the hardware design and development and the related software development for the CDC 469. Aerospace was involved in the evaluation, monitoring, and testing of each development. An Aerospace engineer was the top expert on the CDC 469 hardware and software during those years.

Aerospace’s last scientific mainframe was a Cray-1, housed in a newly constructed underground facility. It weighed about 5 tons, including its required refrigeration system, and consumed about 115 kilowatts; the cooling and storage systems required a similar amount of power. The Cray computers were the definitive supercomputers of the 1970s and 1980s, but the fastest PCs today perform more than 40 billion floating-point operations per second (40 GFLOPS)—more than 130 times faster than a Cray-1.

Standardization

In the 1970s and early 1980s, the Air Force and Aerospace led the evolution of spacecraft computers and, eventually, the mandate to use a 1750A (MIL-STD-1750A) computer for space systems. The 1750A mandate defined the set of basic computer instructions a manufacturer was required to implement, but not the hardware characteristics of the implementation. This led to a number of 1750A computers becoming available, but little interchangeability. Because the intent of the mandate was to gain more onboard computing capability, cut costs, and reduce development risk, it was viewed as only a partial success.

From The Aerospace Corporation annual report, 1976: This Control Data terminal, used primarily for computations in scientific analysis, is linked with one of the most powerful of today’s research computers.

Meanwhile, the Ada programming language was being designed under contract to the Department of Defense (DOD) from 1977 to 1983 to supersede the hundreds of programming languages then used by the DOD. It was originally targeted at embedded and real-time systems, and compilers were validated for reliability in mission-critical applications. In 1987, the DOD began to require the use of Ada (“the Ada mandate”) for every software project in which new code represented more than 30 percent of the total (though exceptions were frequently granted). The Ada mandate was rescinded in 1997 when the DOD began to use significant amounts of commercial off-the-shelf (COTS) software.

The evolution of onboard spacecraft processing proceeded in parallel with an equally intense effort to improve the ground processing of data from the satellite systems. A related concern in the late 1970s and early to mid-1980s was the erosion of the U.S. semiconductor industrial base. Two initiatives constituted a major Air Force effort to address these issues.

The first was the Very High Speed Integrated Circuit (VHSIC) program. The goals of this multicontractor effort were to enhance the national capability to produce large high-speed integrated circuits and to develop custom hardware for processing the specialized data being collected from satellites. The second was a large six-contractor program to develop a generic VHSIC spacecraft computer (GVSC). The goal was to develop a suite of interchangeable 1750A processors using the new high-capability integrated circuits from the VHSIC program.

The initial statement of work and technical guidelines for GVSC were developed by a team headed by Aerospace, and the technical development required extensive coordination with a number of government laboratories. The teams responsible for the technical control of the VHSIC and GVSC program were headed up and staffed by Aerospace.

Simulation became an important analytical tool at Aerospace. “Discrete event” simulations were developed in FORTRAN, and also in specialized simulation languages such as Simscript. They were first used to analyze the performance of computer systems, including spaceborne computers, but other applications soon arose. During the 1970s, Aerospace worked on transportation systems for agencies such as the Department of Energy and the Department of Transportation. Aerospace developed and applied a novel simulation approach to transportation modal split analysis to determine the patronage of alternative travel modes. Such studies had traditionally been done using regression methods. The simulation approach was much more accurate and flexible.
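
The discrete-event mechanism itself is simple: hold pending events in time order, repeatedly remove the earliest, advance the simulation clock to its timestamp, and let each event schedule its successors. The C sketch below illustrates the mechanism with an assumed toy arrival/departure model, not one of the Aerospace simulations described here.

```c
/* Minimal discrete-event skeleton (illustration only): a time-ordered event
 * list driving a toy single-server queue with a fixed service time. */
#include <stdio.h>

#define MAX_EVENTS 64
enum { ARRIVAL, DEPARTURE };

struct event { double time; int type; };
static struct event queue[MAX_EVENTS];
static int nqueued = 0;

static void schedule(double t, int type)        /* insert, keeping time order */
{
    int i = nqueued++;
    while (i > 0 && queue[i - 1].time > t) {
        queue[i] = queue[i - 1];
        i--;
    }
    queue[i].time = t;
    queue[i].type = type;
}

int main(void)
{
    double clock = 0.0;
    int in_system = 0;

    schedule(1.0, ARRIVAL);                     /* seed two arrivals */
    schedule(2.5, ARRIVAL);

    while (nqueued > 0) {
        struct event e = queue[0];              /* earliest pending event */
        for (int i = 1; i < nqueued; i++)
            queue[i - 1] = queue[i];
        nqueued--;

        clock = e.time;                         /* advance the clock      */
        if (e.type == ARRIVAL) {
            in_system++;
            schedule(clock + 2.0, DEPARTURE);   /* fixed 2.0 service time */
        } else {
            in_system--;
        }
        printf("t=%.1f  in_system=%d\n", clock, in_system);
    }
    return 0;
}
```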

Beginning in the 1970s, computing at Aerospace became more distributed (although the mainframes were usually the computing engines behind the curtain). APL (“A Programming Language”) became available to those who could make their peace with it. It was characterized as a “write only” language, in that APL programs were extremely hard to read and understand.

In the early 1970s, the Hewlett-Packard HP-35 scientific calculator was introduced—a computing tool with no mainframe behind the curtain! It was the first calculator with a full suite of trigonometric and transcendental functions, and every Aerospace engineer wanted one. But at $395 each, budget limitations prevailed.

Engineers began to take HP-35s into analyst rooms at the Satellite Test Center (STC) in Sunnyvale. But there was a requirement that all “software” used at the STC be certified. Did this rule apply to HP-35 “programs”? It was decided that since those “programs” resided in the minds of the engineers, they were OK. Later, truly programmable calculators caused the issue to be revisited, and more stringent controls were put into place.

The Rise of the PC

Arguably, the first “personal computer” at Aerospace was a Zenith Z-89, acquired in the early 1980s. The CPU was a Zilog Z80 8-bit microprocessor, which ran at 2 MHz. The maximum amount of memory was 64 KB. Aerospace engineers pondered the potential utility of this computer, and decided it might only be useful as a terminal to the mainframes. But it had a rudimentary spreadsheet program, and a C compiler, so it could be (and was) more than a terminal.

December 1966: Tape librarian Marianna Subis retrieves one of the more than 8000 tapes in the computer data processing center.

In 1981, the IBM PC was introduced, and everything changed. The early PCs used the Intel 8088 16-bit processor and had 64 KB of memory (expandable to 256 KB) and two floppy disk drives. Available software included Microsoft BASIC, VisiCalc (an early spreadsheet program), and VolksWriter (an early word processor). Additional applications appeared at a rapid pace, and improved models (the XT and AT) were introduced. The PC had an “open architecture,” which permitted other manufacturers to develop compatible models. PCs came into Aerospace as rapidly as budgets would permit (see sidebar, PCs: Preliminary Concerns).

Early PCs at Aerospace generally weren’t “personal”—they were located in bays and intended to be shared. Users quickly discovered that a PC without a hard disk was slow. Almost immediately, a clamor for hard disks arose. A memo from 1982 states: “A 10 to 20 MB hard disk can currently be purchased for approximately $3500.” Certainly, a far cry from the 2-terabyte hard drives that can be purchased today for about $100! PCs also introduced a number of questions and unanticipated costs: For example, how to manage shared hard disk space? How to acquire and pay for software? How to provide technical support?

From the Orbiter, March 22, 1961: Barbara Morgan, Lynn Muessel and Karen Kite, keypunch operators, prepare the cards for the computer. (“The computer” was the UNIVAC Solid State 80, Aerospace’s newest electronic computer.)

During this time, Aerospace developed an assembler and simulator for the 1750A instruction set, which ran on a PC and provided tools to analyze contractor-developed code in detail. As FORTRAN and C compilers became available for the PC, many mainframe applications were ported over. Examples include the TRACE program and the MADLIB subroutine library, which formed the basis for ASTROLIB—the astrodynamics software library. ASTROLIB was created to assist programmers and analysts in solving a wide range of orbital analysis problems in mission planning, architectural studies, satellite visibility/coverage analysis, and similar areas. It has been continuously expanded and adapted to new missions and is still in use today on mainframes and PCs.

In the mid-1980s, Aerospace pioneered the development and use of 3-D orbital simulations in the development of the Satellite Orbit Analysis Program (SOAP). Initial versions of this software were used to verify spacecraft orbits, bus/payload geometry, and attitude before analysts committed a scenario to extensive thermal analysis. The program was first implemented for the Evans and Sutherland family of vector graphics terminals, with FORTRAN programs written on a VAX host downloading the display commands.

From the Orbiter, March 22, 1961: W. W. Drake, vice president, administration; F. E. Leslie, head of accounting and banking department; and N. B. Kaufman, supervisor of data processing operations, examine a printed circuit similar to those used in the body of the UNIVAC Solid State 80 computer recently installed in the data processing department.

In 1988, the software was completely redeveloped in C to run on IBM PC/AT class computers furnished with the new enhanced graphics adapter raster display. The scope of the software was gradually expanded to encompass applications such as modeling the emerging GPS constellation.

As desktop computers became more sophisticated, SOAP was expanded to encompass many different applications. In 1995, the advent of the Windows 95 operating system hastened the transition to 32-bit computing, and by 1998, the widespread availability of the OpenGL 3-D graphics standard allowed the software to obtain hardware-accelerated displays from graphics processing units. SOAP was adopted by the Jet Propulsion Laboratory for use in deep space missions such as Cassini and NEAR.

As the software grew in complexity, portability became a concern, because the software teams were now supporting Macintosh and UNIX clients. Although OpenGL solved this problem for the 3-D subsystem, the use of different user interaction models and application programmer interfaces across these platforms (and even across newer models of the same platform) presented a major development challenge. The challenge was met through the use of portable graphical user interface software that allowed the same application code to run on all platforms. This approach is now widely used in engineering applications throughout industry, with Google Earth and MATLAB as prominent examples.

Parallel Computing

In the mid-1980s, Aerospace pioneered the development and use of 3-D orbital simulations in the development of the Satellite Orbit Analysis Program (SOAP). Shown here are computer-generated screen shots of SOAP.

Starting in the 1980s, Aerospace also began intensive work in parallel computing, networks, and distributed computing. Parallel computing essentially packaged many CPUs together in the same chassis where they could work together on the same computational task. As mission data volumes grew and processing deadlines shortened, the operational use of parallel computing became critical. Over the years, Aerospace used the technique for computational fluid dynamics, launch trajectory analysis, GPS orbital analysis, DMSP weather image processing, signal processing, space debris conjunction analysis, thermal analysis, and other applications.

Simultaneously, Aerospace was installing its internal network. Workstations were able to share files and send email, and could be used together as a small-scale, parallel computer. Programming tools such as Parallel Virtual Machine enabled a group of workstations to be used as a “message-passing” parallel machine. This network-based communication allowed workstations and parallel computers at different locations, and even outside institutions, to be used together in a distributed environment. This gave rise to “grid computing,” whereby computing resources from different organizations across the country could be securely shared to work on massive computing problems. Aerospace took part in a set of grid demonstrations in 1995 involving 38 different institutions from across the country.
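
The message-passing style that Parallel Virtual Machine supported lives on in MPI. As an illustration of the pattern (using MPI as a modern stand-in rather than PVM's own calls), the sketch below splits a sum across processes and has one process collect the total.

```c
/* Message-passing illustration using MPI as a modern stand-in for the
 * PVM-style programming described above: each process computes a partial
 * sum and rank 0 collects the total. Build with an MPI compiler (mpicc). */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* each process sums its own slice of 1..1000 */
    long local = 0;
    for (long i = rank + 1; i <= 1000; i += size)
        local += i;

    long total = 0;
    MPI_Reduce(&local, &total, 1, MPI_LONG, MPI_SUM, 0, MPI_COMM_WORLD);

    if (rank == 0)
        printf("sum = %ld\n", total);           /* 500500 */

    MPI_Finalize();
    return 0;
}
```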

In the late 1980s and into the 1990s, onboard processing systems and satellite ground processing systems evolved to encompass various combinations of custom hardware and software, frequently based upon similar commercial developments and commercial products. The rigor of development and testing that had evolved over the years in government programs greatly influenced the commercial developments of that era. The primary driver for much of the move toward commercial computing systems was the size of the commercial market, which often exceeded the military market.

Telemetry Processing Advances

Aerospace personnel supporting launch activities in the A9 building’s Telemetry Data Reduction Center.

During the last 20 years, Aerospace software and telemetry engineers have incorporated a myriad of advances in computing and communication technology into telemetry processing and display systems. Two advances in the early 1990s are particularly noteworthy for the increased efficiency and functionality they provided.

First, the Telemetry Data Reduction Center acquired a new telemetry processing system, which enabled raw data to be ingested and fully processed in real time. This eventually eliminated the delay of more than a week in providing processed data to analysts.

Second, the Spacelift Telemetry Acquisition and Reporting System (STARS) facilities were opened at the corporation’s El Segundo headquarters and at the Vandenberg and Cape Canaveral launch sites. These facilities still play a vital role in the corporation’s launch operations support. Each facility houses a dedicated Telemetry Data Reduction Center, where raw data from the launch site can be processed and disseminated. Analysts in El Segundo can review processed data from the launch site within minutes. STARS also provides immediate access to the corporation’s extensive archive of launch vehicle telemetry dating back to the 1960s; more than 7000 items are held in the data library. This large collection of historical information, combined with the immediate processed data, enables analysts to compare a vehicle’s current performance with past performance and determine whether it is “within family.”

Speed and Power

SOAP continued to evolve throughout the 1990s and 2000s, offering Aerospace and its customers a diverse set of general-purpose orbital modeling and simulation capabilities. The software was used for visualization and analysis of ground, sea, and air trajectories; attitude, coverage, and radio frequency interference; and collision avoidance. Areas of study today include all phases of the space system life cycle and encompass applications ranging from military tactical simulations to the modeling of deep space missions. The SOAP development team can also rapidly prototype extensions to the software in response to the emerging needs of the space system community. The team is also tracking developments in parallel and 64-bit computing.

The integrated use of commodity processors and networks gave rise to the notion of “cluster computing” as an architectural approach to parallel supercomputing. Cluster computers are more economical than mainframe parallel computers, which have additional engineering costs for their dedicated communication networks and other support hardware. Aerospace began building its own corporate cluster, called “Fellowship,” in 2001. Having grown in size every year, it is now up to 1392 cores that are being used for GPS orbital analysis, Space Tracking and Surveillance System communication modeling, and thermal analysis.

Aerospace has been active in the area of “cloud computing”—the use of virtual machines in a massive data center that can dynamically provision computing requirements, rather than having racks of computing equipment that are dedicated to just one task or mission. Many of the corporation’s customers are contemplating data center migration as a way to host future mission requirements. Aerospace is charting this future direction by building a small in-house cloud using an open-source package and implementing several government prototypes in consultation with its users. Based on this initial work, Aerospace is exploring the installation of a larger cloud computing facility for both internal and government use, which may well be one of the next-generation efforts in computing.

The Path Forward

Throughout the evolution of computers and software at Aerospace, there were a number of fundamental changes in the roles filled by the Air Force and Aerospace. In the 1960s, the role was to design and implement custom hardware and software system components, often to the circuit and code levels required to support satellite missions. In the 1970s and early 1980s, the work evolved into pushing the development of the contractor and commercial base, often while continuing custom development for the support of military satellite systems. During the 1980s and into the 1990s, the role evolved into system design, requiring the ability to integrate state-of-the-art commercial systems, often augmented with custom hardware and software, into large satellite constellations and ground systems.

The current thrust is to design and implement large systems using best commercial practices, augmented by risk-reduction design and process requirements, in a cost-effective manner. The future would appear to require the establishment of more commonality in the systems to allow more interchangeability and risk reduction.

Please note: This article is primarily historical. See future issues of Crosslink for articles on current and more recent developments related to computing at The Aerospace Corporation.

Acknowledgment

The author thanks these Aerospace employees for their valuable contributions to this article: John Coggi, Suellen Eslinger, Ira Gura, Craig Lee, Frederic Pollack, Kenneth Steffan, David Stodden, Merlin Thimlar, and Joe Wertz.
