INTERCONNECTS
Simulations Enable Successful Hubble Navigation Experiment
By Jarrett Cohen -- During May's Hubble Space Telescope Servicing Mission 4 (SM4), NASA astronauts outfitted the telescope with enhanced science instruments and extended its life through at least 2014. SM4 also included a successful experiment demonstrating new onboard navigation technologies that will enable Hubble's eventual de-orbit by an unmanned vehicle and future servicing of other space assets.
Months-long simulations at the NASA Center for Computational Sciences (NCCS) produced some 100,000 images that engineers say were critical for planning and testing this Relative Navigation Sensor (RNS) Experiment.
"It is 100 percent clear to me that synthetic imagery is the way to test these capabilities," said Bo Naasz, RNS Experiment principal investigator and engineer in the Navigation and Mission Design Branch at NASA Goddard Space Flight Center (GSFC). "We pushed this to the limits."
Naasz will be presenting RNS Experiment results in GSFC's Systems Engineering Seminar on Tuesday, August 4 at 1:00 p.m. Eastern Time.
Interconnected Technologies
The RNS Experiment flew in the cargo bay of Space Shuttle Atlantis. Relative navigation entails determining the position and velocity of two vehicles relative to each other. Performing this task for SM4 required multiple technologies. Three cameras covering long, medium, and short ranges continuously imaged Hubble during the 6 hours of rendezvous and deployment. After rendezvous, astronauts installed a Soft Capture Mechanism on the telescope for future docking. Naasz said that imaging this Soft Capture Mechanism was the experiment's main goal, as this capability is key to flying one vehicle up to another autonomously.
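In vector terms, the relative state is simply the difference between the two vehicles' position and velocity vectors; the hard part is estimating those vectors from camera imagery alone. A minimal sketch, with made-up orbital numbers purely for illustration:

```python
import numpy as np

# Hypothetical inertial states (km, km/s) for the two vehicles;
# real values would come from the onboard navigation filters.
r_hubble  = np.array([6993.0, 0.00, 0.0])
v_hubble  = np.array([0.0, 7.5600, 0.0])
r_shuttle = np.array([6992.9, -0.05, 0.0])
v_shuttle = np.array([0.0, 7.5605, 0.0])

# Relative navigation reduces to estimating these two difference
# vectors; the RNS cameras had to infer them from imagery alone.
r_rel = r_hubble - r_shuttle   # relative position (km)
v_rel = v_hubble - v_shuttle   # relative velocity (km/s)

print(f"range: {np.linalg.norm(r_rel) * 1000:.0f} m")            # ~112 m
print(f"closing speed: {np.linalg.norm(v_rel) * 1000:.2f} m/s")  # ~0.50 m/s
```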
Working in concert with the cameras was the GSFC-designed SpaceCube computer. This onboard multiprocessor system ran a variety of software applications, most notably camera automatic gain control and pose algorithms. Gain control tells the camera to increase or decrease the exposure time, and "is critical for getting a picture because Hubble is shiny like a mirror," Naasz said. Two different pose algorithms calculated the relative position and orientation of Atlantis and Hubble.
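The article does not detail the flight gain-control algorithm, but the idea can be sketched as a damped proportional controller that nudges exposure time toward a target brightness. All numbers and names below are illustrative assumptions, not flight values:

```python
import numpy as np

TARGET_MEAN = 90.0       # desired average pixel intensity (8-bit scale); assumed
MIN_EXPOSURE = 0.1e-3    # exposure bounds in seconds; illustrative only
MAX_EXPOSURE = 50e-3

def update_exposure(frame: np.ndarray, exposure_s: float) -> float:
    """Return a new exposure time that nudges the frame's mean
    brightness toward TARGET_MEAN. A specular target like Hubble can
    swing from near-black to saturated between frames, so the
    correction is damped to avoid oscillating."""
    mean = frame.mean()
    if mean <= 0:
        return MAX_EXPOSURE              # black frame: open the shutter fully
    correction = TARGET_MEAN / mean      # proportional brightness error
    damped = correction ** 0.5           # square-root damping softens glints
    return float(np.clip(exposure_s * damped, MIN_EXPOSURE, MAX_EXPOSURE))

# Example: a dim frame (mean intensity 30) drives the exposure up.
dim_frame = np.full((480, 640), 30, dtype=np.uint8)
print(update_exposure(dim_frame, 5e-3))  # ~8.7 ms, up from 5 ms
```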
With only one shot at success, the RNS team performed rigorous physical laboratory tests before flying their experiment. First using NASA Marshall Space Flight Center's Flight Robotics Laboratory (FRL) and later GSFC's High Bay Clean Room, they suspended large Hubble models in front of the cameras and ran SpaceCube through its paces. The FRL sessions were particularly valuable for testing and validating the hardware interfaces of the RNS system components. However, the engineers needed another tool "to recreate the lighting environment in space, which is extremely challenging," Naasz said.
The Physics of Lighting
Failed attempts using standard graphics for lighting led them to ray tracing simulations. "Ray tracing allows you to precisely say this much light is shining on this spot," Naasz explained. "Ray tracing is based on physics and allows us to methodically test with a scene that is as physically close to reality as possible." The team's particular interests were testing camera gain control and the Goddard Natural Feature Image Recognition (GNFIR) pose algorithm.
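Phillum's internals are not described here, but the core of any physically based ray tracer is a Monte Carlo estimate: average the radiance carried by many randomly jittered rays through each pixel. A toy sketch, with a stand-in analytic scene in place of a real geometry model:

```python
import random

def trace(x: float, y: float) -> float:
    """Stand-in for a physical trace: the radiance arriving at
    image-plane point (x, y). A real tracer would follow the ray
    through the scene geometry, bouncing off reflective surfaces
    and accumulating the Sun's illumination."""
    r2 = (x - 0.5) ** 2 + (y - 0.5) ** 2   # hypothetical glint at center
    return 1.0 / (1.0 + 400.0 * r2)

def render_pixel(px: int, py: int, width: int, height: int, rays: int) -> float:
    """Monte Carlo pixel estimate: the mean of many jittered ray
    samples. Noise falls off as 1/sqrt(rays), which is why
    mirror-like scenes need so many rays per pixel."""
    total = 0.0
    for _ in range(rays):
        # Jitter each sample uniformly within the pixel's footprint.
        x = (px + random.random()) / width
        y = (py + random.random()) / height
        total += trace(x, y)
    return total / rays

print(render_pixel(32, 24, 64, 48, rays=256))
```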
With no suitable existing tools, team members Steve Queen and John Van Eepoel developed new software. Their Geomod geometry modeler incorporates high-fidelity 3D models of Hubble, the Shuttle, the Sun, the Earth, and other solar system bodies. Their Phillum software performs the ray tracing by combining physical illumination with the three cameras' optical characteristics.
"We started using the ray tracer on an 8-processor machine in our lab. It blew a power circuit in here!" said Van Eepoel, an engineer in GSFC's Guidance, Navigation, and Control Systems Engineering Branch.
Soon afterwards, Queen and Van Eepoel turned to NCCS for help. NCCS systems architect Jim McElvaney coordinated access to the 6,656-processor Discover computing cluster (since expanded to 11,032 processors). NCCS system administrators Bruce Pfaff, Gene Gooden, and Nicko Acks then set up a front-end control process using one of Discover's visualization nodes to manage the ray tracing across hundreds of processors.
The Discover computing cluster at the NASA Center for Computational Sciences (NCCS) performed ray tracing simulations generating 100,000 highly realistic scenes of the relative navigation between Atlantis and Hubble.
"The Discover cluster enables orders of magnitude more than we could do with a single machine," Van Eepoel said. Testing the gain control and GNFIR pose algorithm involved following 27 different trajectories of 90 minutes each and rendering scenes at a rate of 3 frames per second. The lighting physics is of such intensity that a single processor could only render a line segment 5 to 10 pixels long, thus many processors had to act in parallel to build the scene pixel by pixel.
Moreover, "as you get closer to the telescope, you must increase the ray count," Van Eepoel explained. Long-range Camera 1 (28 to 260 meters) images have more black and relatively few reflections; Discover rendered each image in 5 minutes or less. Short-range Camera 3 (2 to 5.5 meters) images are "almost all mirror" with a lot of reflections. "There is significant 'noise,' and then we send many rays to clean it up," he said. Camera 3 images required as many as 50,000 rays per pixel and took 8 hours each to render.
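The connection between reflections, noise, and ray count follows from Monte Carlo statistics: the standard error of a pixel estimate shrinks only as one over the square root of the number of rays, so mirror-like close-range scenes need dramatically more of them. A hypothetical adaptive-sampling sketch; only the 50,000-ray cap comes from the article, and the other thresholds are invented:

```python
import math
import random

MAX_RAYS = 50_000      # per-pixel cap cited for close-range Camera 3 scenes
NOISE_TARGET = 0.005   # hypothetical standard-error threshold

def sample_radiance() -> float:
    """Stand-in for one ray's radiance. Reflective scenes yield
    high-variance samples: rare, very bright glints among dim hits."""
    return 5.0 if random.random() < 0.01 else 0.05

def adaptive_pixel():
    """Fire rays until the running estimate's standard error drops
    below NOISE_TARGET, or the ray budget runs out."""
    total = total_sq = 0.0
    n = 0
    while n < MAX_RAYS:
        v = sample_radiance()
        total += v
        total_sq += v * v
        n += 1
        if n >= 64:  # wait for a minimally stable variance estimate
            mean = total / n
            var = max(total_sq / n - mean * mean, 0.0)
            if math.sqrt(var / n) < NOISE_TARGET:  # standard error of mean
                break
    return total / n, n

value, rays_used = adaptive_pixel()
print(f"pixel = {value:.4f} after {rays_used} rays")
```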
These demanding simulations kept at least 1,024 Discover processors busy at a time, with peak usage approaching 2,000 processors. With nearly non-stop rendering over 5 months, the simulations consumed more than 1.5 million processor-hours. The total output was 100,000 images. "Being able to generate all those images gave us a far greater ability to test our algorithms with a wider range of relative motions and lighting conditions," Naasz said.
Back to Hubble and Beyond
When Hubble and Atlantis rendezvoused on May 13, years of preparation paid off as the RNS Experiment began. During the mission, SpaceCube ran a compression engine for downlinking camera imagery to NASA Johnson Space Center. Seeing the compressed imagery for the first time, the team was excited but not surprised. "I definitely remember somebody saying, 'That's the imagery you guys ray traced,'" Naasz recounted.
These images of Hubble demonstrate the realism enabled by the ray tracing simulations. The image on the left was rendered several months prior to SM4 using the NCCS Discover cluster. The image on the right was taken by long-range Camera 1 during SM4 Flight Day 3 (May 13, 2009) and compressed for downlinking during the mission.
Naasz said with satisfaction that the team met its primary goal of imaging the Soft Capture Mechanism and even exceeded secondary goals, including demonstrating an accurate pose algorithm. "First and foremost, NASA is better prepared to go back to Hubble," he said. The nominal end-of-life plan is to de-orbit Hubble into the ocean, potentially using an autonomous vehicle. The Soft Capture Mechanism is also compatible with docking by NASA's Orion crew exploration vehicle, and Congress has asked for a study on using Orion to service spacecraft.
Meanwhile, the RNS team received 750 gigabytes of uncompressed, on-orbit data in late June and has begun quantitatively analyzing how well the experiment performed. The Space Shuttle Program is also assembling a detailed report. The team will need at least a few months to complete all the analyses.
The RNS Experiment team has begun analyzing 750 gigabytes of uncompressed images, such as these scenes from (left to right) long-range Camera 1, mid-range Camera 2, and close-range Camera 3.
Discover will again come into play as the GSFC engineers refine their camera models to better match the SM4 data and ultimately extend the tools to new missions. "These techniques are useful for any mission that requires rendezvous and docking," Van Eepoel said. Robotic servicing, precision landing, and surface navigation are among the potential applications for future simulations.
"With these computational tools, give us a 3D model of your spacecraft, and we'll pump it through NCCS," Naasz said.
Other RNS Experiment collaborators include Richard Burns of GSFC; Joel Hannah of Advanced Optical Systems, Inc.; Shaun Oborn of Solo Effects, Inc.; and Eugene Skelton of Lockheed Martin Corp.