Information Technology and Space Exploration

Written by John Patota, December 2006

Space exploration in the twenty-first century would not be possible without the help of Information Technology. Moon landings, space stations, missions to other planets and even the study of Earth are all facilitated by the processing, transmission, storage and distribution of data and data products. This report investigates those relationships, from when mankind first looked to the stars, through the race to the moon, to the missions of today and tomorrow, examining how IT enables and continues to facilitate man's quest for the final frontier.

Introduction

It was September of 1962 when President Kennedy stood before students and teachers at Rice University Stadium giving his famous speech calling for Americans to lead the way in space exploration:

We choose to go to the moon. We choose to go to the moon in this decade and do the other things, not because they are easy, but because they are hard, because that goal will serve to organize and measure the best of our energies and skills, because that challenge is one that we are willing to accept, one we are unwilling to postpone, and one which we intend to win[9].

We accepted that challenge by landing a man on the moon before the end of the decade, making one of the most ambitious national goals ever set a reality. It was not the first time man had looked to the stars and wondered what lay beyond the sky. Since the beginning of time we have sought to explore the heavens. Space exploration, and by extension astronomy, began in its earliest form with cultures explaining worldly events by the positions of stars. IT was there then, and it was there when Galileo gazed through his telescope to chart the motions of moons and planets. When we first ventured into space we could not have done so without the assistance of IT, and it will continue to define what is possible in the years to come.

Early Astronomy

In the traditional sense, Information Technology pertains strictly to the computer-aided processing, transmission, storage and distribution of data and data products[8]. At its root, IT is about sharing and distributing information in a way that is accessible to everybody. Current systems account for the fact that any data can be interpreted differently depending on an individual's viewpoint. The evolution of data processing recognizes that one man's knowledge is another's data, and that combining different data sets creates new knowledge; without the transfer of data, information, and knowledge, users have only their own undeveloped feedback[10]. Take for instance ethnoastronomy: the study of the knowledge, interpretations, and practices of contemporary cultures regarding celestial objects or phenomena[7].

The Milky Way was not always known to contain the 400 billion stars making up the spiral galaxy we call home[6]. The hazy band of white light appearing across the night sky was interpreted in many different ways by many different peoples attempting to explain its significance. Mesopotamians and Australian Aborigines saw it as the smoke of ancient peoples' sacrifices. Egyptians believed it was the River Nile, into which the goddess Isis, in her flight to escape the monster Typhon, threw stalks of grain that formed the stars. Those who lived in India knew it as the Path of the Serpent. Yet regardless of which culture you consulted, all seemed to agree that the Milky Way was the path souls take on their way to the afterlife[5].

With folklore as their source of authority, these kinds of explanations became widely accepted truths. Stories of sacrifices, monsters and the afterlife were so deeply embedded in cultures that information passed seamlessly from one generation to the next, while the lateral differences between cultures suggest their isolation from one another. Without that cross-cultural exchange, one group's beliefs did not travel beyond the bounds of everyday interaction and therefore never approached today's standards of data quality.

Before we were able to slip the surly bonds of earth, it was apparent we knew only a fraction of what there was to know about the worlds outside our own. Galileo Galilei was one of the pioneers of observational astronomy, but at the turn of the seventeenth century technology was limited to a telescope with 3x magnification. He would hand-write books of which so few copies were produced that only a select few could ever read and expand upon his findings[4]. While still better off than the civilizations that came before him, had data sharing been more efficient, the knowledge he discovered could have had a greater impact sooner.

Eventually telescopes got better and information collaboration improved, allowing for a number of groundbreaking discoveries that advanced planetary science. But it wasn't until the advent of computer technology, and its facilitation of space exploration, that many of these theories could be tested and improved, and new ones constructed.

Going to the Moon

In an era of punch cards, of Central Processing Units that occupied entire rooms, and of 1,800-foot magnetic tape reels for storage, it's hard to imagine how we could ever use those resources for anything as significant as exploring space. Rockets which once were designed to carry small payloads were getting larger and more robust, flying higher every day. Rockets that first carried missiles nobody thought could reach another continent, never mind the moon, propelled astronauts into space. For both computing and space exploration technology to reach their full potential, each had to feed off the achievements of the other. Spaceflight was possible only as a direct result of information technology and computing systems; even more impressive is how spaceflight, in turn, influenced IT.

The first flights into orbit were not flights in the traditional sense, in that there was no pilot to man the controls. Once the initial rocket launch was complete, vehicles had no maneuvering capability except for limited attitude-control jets, leaving the ship dependent on the ground. To facilitate this, National Aeronautics and Space Administration (NASA) flight controllers, personnel on the ground monitoring all aspects of the mission, had to be in constant communication with the craft as it traveled through space, calculating velocities, trajectories and altitudes. They would collect data from on-board sensors as well as tracking information from ground stations throughout the world, all communicating in real time. Commands would be beamed up to the craft for events such as course corrections and reentry, causing small on-board jets to fire accordingly and effectively nudging the spacecraft toward its destination. With a scope of operation limited to testing whether we could actually fly in space at all, this system of technology sufficed.

With more asked of each following mission's goals, the need for on-board navigation systems aboard spacecraft became clear. Justified in part by the need to prevent saturation of ground stations should multiple missions fly simultaneously, it was the reality of physics, a round-trip signal delay of roughly 2.5 seconds between the earth and the moon, that provided the real motivation for a computer in the lunar landing vehicle[1]. With the dangerous landing conditions that were expected, requiring quick decision making and feedback, NASA wanted less reliance on ground-based computing. In addition, when passing behind the far side of the moon there was absolutely no communication with mission control: the line of sight that communication systems relied on for transmitting data was disrupted for a considerable amount of time. If a course correction or docking procedure needed to happen during that window, it would be impossible with a navigation system completely dependent on flight controllers in Houston[1].
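
The physics behind that delay is easy to check. Here is a quick back-of-the-envelope calculation in Python, using the mean earth-moon distance, showing why a ground-based control loop could never react fast enough during a landing:

    # Rough check on the earth-moon signal delay; figures are mean values.
    MOON_DISTANCE_KM = 384_400          # mean earth-moon distance
    SPEED_OF_LIGHT_KM_S = 299_792.458   # speed of light in a vacuum

    one_way = MOON_DISTANCE_KM / SPEED_OF_LIGHT_KM_S
    print(f"one-way delay:    {one_way:.2f} s")      # ~1.28 s
    print(f"round-trip delay: {2 * one_way:.2f} s")  # ~2.56 s

Add the time for a controller to notice a problem and compute a response, and a lander dropping toward the surface could travel a long way before a correction ever arrived.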

To the drawing board went NASA, knowing full well what would be expected of them. Like any successful collaborative system, they needed, as an agency, to develop the architecture allowing for the transmission and processing of data, along with the controls and procedures to prevent errors in that data. They needed documentation and training materials for the technology they created, while defining specific job roles and users within the system. All of this had to be achieved over and over again to land a man on the moon.

To keep data flowing efficiently there had to be some sort of centralized facility where information was received, processed, and acted upon. That center was called Mission Control. Each facet of the mission had its own system, monitored by a team of people governing it. In the days of Apollo, Mission Control monitored the launch vehicle, the spacecraft trajectory, and the launch and landing windows. It had flight surgeons, electrical and environmental systems personnel, communications technicians, and flight directors managing the people who managed the systems making critical decisions.

The technology these controllers were working with was anything but completely dependable. Magnetic tape, a primary form of storage in the early 1960s, had an error rate of 1 bit in 100,000[1]. The chance, while remote, still existed for bad data to be acted upon, jeopardizing the mission and crew. NASA had to come up with a data redundancy and error-detection solution on spacecraft already pushing the limits of technology.
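
A minimal sketch of the kind of safeguard this implies, assuming nothing about NASA's actual scheme: a single even-parity bit appended to each word is enough to detect the one-bit errors that dominated tape failures.

    # Illustrative only: an even-parity bit per 8-bit word detects any
    # single flipped bit when the word is read back from tape.
    def add_parity(word: int) -> int:
        parity = bin(word).count("1") % 2
        return (word << 1) | parity

    def parity_ok(coded: int) -> bool:
        return bin(coded).count("1") % 2 == 0

    coded = add_parity(0b10110010)
    assert parity_ok(coded)               # clean read passes
    assert not parity_ok(coded ^ 0b100)   # a single flipped bit is caught

Detection alone is not correction, of course; a real system would re-read the block or fall back to a redundant copy.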

In the pursuit of breaking down tasks and systems into manageable, error-tolerant pieces, NASA decided to send two ships to the moon. The Command Module (CM) would be the main capsule, used for its powerful engine and guidance systems. Together with the Lunar Module (LEM), the two ships traveled as one until they reached the moon's orbit, where they would undock, allowing the LEM to descend and land on the surface. Two spacecraft meant two navigation systems, identical in design but with different software. The computer each used was called the Primary Guidance, Navigation, and Control System, or PGNCS. The LEM had an additional computer as part of the Abort Guidance System (AGS). Ground systems backed up the CM computer and its associated guidance system so that if the CM system failed, the spacecraft could be guided manually based on data transmitted from the ground. Since the lunar landing did not allow the ground to act as an effective backup, the LEM had the AGS to provide backup ascent and rendezvous guidance. If the PGNCS failed during descent, the AGS would abort to lunar orbit and assist in rendezvous with the CM[1].
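
The resulting fallback hierarchy can be summarized in a toy decision rule. The function below is purely illustrative, bearing no resemblance to actual flight software:

    # Toy model of the Apollo guidance fallbacks described above.
    def select_guidance(phase: str, pgncs_ok: bool, ground_link_ok: bool) -> str:
        if pgncs_ok:
            return "PGNCS"                        # primary system flies the phase
        if phase == "lunar descent":
            return "AGS abort to lunar orbit"     # ground too slow to back up a landing
        if ground_link_ok:
            return "manual flying on ground data" # the CM's ground-aided backup mode
        return "AGS"                              # last-resort ascent/rendezvous guidance

    print(select_guidance("lunar descent", pgncs_ok=False, ground_link_ok=True))
    # -> AGS abort to lunar orbit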

Astronauts and flight controllers trained on these systems every day. The majority of crew training was conducted in flight simulators designed to recreate, as nearly as possible, the conditions and procedures of the mission. Simulators were controlled by computers into which any part of the projected flight could be programmed, from normal operation to disaster recovery. Testing the crew and flight controllers in this way enabled them to work out procedures for overcoming problems before they occurred during the real event.

When programs running on on-board computers were limited to some 200 lines by the lack of resources, efficiency was the most important concern. During development, strict standards of documentation, configuration control, and the management of changes and correction of errors were maintained. Breaking the application down into smaller, potentially interchangeable parts, or modules, became a primary technique. Communication between programming teams working on different but interconnected modules had to be kept clear and unambiguous. It is in these areas that NASA has had its greatest impact on software engineering.
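
The modular idea itself translates directly into modern terms. In the sketch below (with illustrative names, not NASA code), each team owns a small module behind an agreed interface, so implementations can be tested and swapped independently:

    from typing import Protocol

    class NavigationModule(Protocol):
        def position_error(self, measured: float, target: float) -> float: ...

    class SimpleNavigation:
        # One team's implementation; another team could swap in its own.
        def position_error(self, measured: float, target: float) -> float:
            return target - measured

    def control_command(nav: NavigationModule, measured: float, target: float) -> float:
        # The control team codes against the interface, never the internals.
        return 0.5 * nav.position_error(measured, target)  # toy proportional gain

    print(control_command(SimpleNavigation(), measured=98.0, target=100.0))  # 1.0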

NASA finally pulled everything together in the summer of 1969, when Neil Armstrong landed the Lunar Module Eagle in the Sea of Tranquility. Armstrong and Buzz Aldrin walked on the surface of the moon, collecting rock samples and deploying scientific instruments for the community back on earth to use. They and Command Module Pilot Michael Collins returned to earth amidst the celebration of this great technological breakthrough.

Space Exploration Today

The early days of exploration were more of a proving exercise than anything else: an era of technological discovery and expedition centered more on the accomplishment of feats than on investigating science. Today NASA aims to strike that balance by hosting a variety of projects under each part of its mission statement.

The Space Shuttle is the cornerstone of manned spaceflight, catering to both interests. The Shuttle serves as the main vehicle for transporting large and delicate satellites into orbit, for conducting experiments during its short missions of up to 14 days, and for building the International Space Station (ISS). This reusable spacecraft was built upon the conceptual systems of the Apollo craft that preceded it, which is to say navigation is on-board and supported by the same kinds of enterprise systems. Advancements have been made in some areas, however, such as the shuttle's computer systems, as one article explains:

The Space Shuttle avionics system contains five identical general purpose computers, each capable of communicating with the avionic subsystems to perform flight-critical and non-critical functions. During time critical mission phases such as launch, reentry, and landing, four of these computers operate as a redundant set, receiving the same input data, performing the same flight-critical computations, and transmitting the same output commands while the fifth computer performs non-critical computations. In this mode of operation, comparison of output commands and voting on the results in the redundant set provide the basis for efficient detection and identification of two flight-critical computer failures. After two failures, the remaining two computers in the set use comparison and self-test techniques to provide tolerance of a third fault. The Space Shuttle represents the first planned operational use of multiple, internally simplex computers to provide continuous correct system operation in the presence of computer hardware failures[2].
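
The voting idea at the heart of that passage is simple to sketch. The snippet below is a bare illustration of majority voting over redundant outputs, not the Shuttle's actual redundancy-management logic:

    from collections import Counter

    def vote(outputs):
        """Return the majority command and any computers that dissented."""
        majority, _ = Counter(outputs.values()).most_common(1)[0]
        dissenters = [name for name, value in outputs.items() if value != majority]
        return majority, dissenters

    command, suspects = vote({"GPC1": 42, "GPC2": 42, "GPC3": 42, "GPC4": 41})
    print(command, suspects)  # 42 ['GPC4']

With four machines voting, a single disagreeing computer is immediately identified; as the passage notes, once the set is down to two, comparison alone cannot say which machine is wrong, so self-test techniques take over.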

When dealing with the most complex machine ever built, NASA aims for perfection in all facets of space exploration. If even a single line of code has a bug, it could spell disaster for the entire crew of a mission. In these life-or-death situations it is critical that all systems perform well in every conceivable simulation before they are ever executed in flight.

With that in mind, NASA's Independent Verification and Validation (IV&V) team strives for the highest achievable levels of safety and cost-effectiveness for mission-critical software. It has learned, over more than 40 years of manned spaceflight, how to better predict and identify error-prone computer code in the early stages of software development, saving time, money, and potentially lives. IT benefits when NASA makes such software available to the international software development community, contributing its small part to the underlying software systems that make so many facets of IT successful[3].

Rather than the traditional goal of proving that an act of exploration can be done, as in going to the moon, scientific discovery now leads the way in modern space exploration. With the ISS as an example, the international community processes data and conducts experiments together, challenging NASA to provide seamless access to resources at high data transfer rates using goal-oriented, human-centered systems.

In catering to this need, NASA has made large commitments to advancing technology, including the creation of the Computing, Information, and Communications Technology (CICT) program, whose primary mission is to develop crosscutting technologies for a variety of aviation and space applications such as intelligent communications, data gathering, and operational systems.

The Future of Space Exploration

The future is being shaped by NASA research centers, academic research laboratories and industry partners. The current approach focuses on placing numerous scientific instruments on relatively large and expensive platforms. All of these instruments are held to the same high standard of redundancy that NASA strives for, making failure-proofing every experiment time consuming and expensive. The expense of such missions cuts down on the number of launch opportunities, limiting the amount of scientific discovery that can take place.

The situation is similar to the problems computer systems faced in the age of mainframes: expensive machines that were hard to get to and operate were great at centralizing data but poor when multiple teams of people wanted to collaborate on them. Just as computer networks evolved to replace single supercomputers with many smaller systems, each managing a piece of the whole puzzle, NASA plans to do the same with its satellites.

NASA foresees a sensor-web system being deployed around a target like Mars: small instrument satellites networked together into an organic measurement system. Each satellite will have its own on-board capabilities, allowing it to autonomously reach and provide coverage where it is needed within the bounds of the other devices in the system. They will, of course, also be able to be directed by mission controllers on earth and by other satellites in the networked system.
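
A toy model makes the idea concrete. Here, hypothetical satellites re-partition a 360-degree coverage band among whichever nodes are still alive; a real sensor web would of course negotiate coverage far more intelligently:

    def assign_coverage(satellites):
        """Split a 360-degree band evenly among the live satellites."""
        width = 360.0 / len(satellites)
        return {sat: (i * width, (i + 1) * width) for i, sat in enumerate(satellites)}

    web = ["sat-a", "sat-b", "sat-c", "sat-d"]
    print(assign_coverage(web))   # four 90-degree slices
    web.remove("sat-c")           # one node fails...
    print(assign_coverage(web))   # ...and the survivors widen to 120 degrees each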

Advantages here include a plug-and-play approach to deploying new sensors as upgrades or replacements for current equipment. Rather than having to replace an entire satellite to restore failed functionality, these cheaper systems allow for on-the-fly upgrades that keep the network current with the state of the art.

Space exploration has indeed advanced a long way from its beginnings in stargazing, and so have computing and Information Technology from the days of the abacus and folklore. Each has progressed based on needs presented by the other, and neither could be as developed without having faced the same common challenges. From the days of Apollo and the race to create redundant and efficient computing environments, to today, when NASA manages the world's most complex machines, and tomorrow, when the international community shares knowledge with the goal of advancing humanity, Information Technology continues to complement Space Exploration as the two grow and develop together.

References

