NCSA & AMNH Take Remote Collaboration to New Level to Create Space Show

CHAMPAIGN, IL -- The new space show, "The Search for Life: Are We Alone?" at the American Museum of Natural History's (AMNH) Hayden Planetarium at the Rose Center for Earth and Space features the work of visualization artists at the National Center for Supercomputing Applications (NCSA), integrated into the show through an unprecedented process of remote virtual collaborative visualization. The new show premiered March 2 at the planetarium.

"The Search for Life" was developed by the American Museum of Natural History in collaboration with the National Aeronautics and Space Administration. The show was written by Ann Druyan, writer/producer and Carl Sagan's long-time collaborator, and Steven Soter, an astrophysicist and writer in the museum's Division of Physical Sciences, with music by Stephen Endelman. This same team also worked on the highly acclaimed "Passport to the Universe," the first-ever space show presented to the public in the Rose Center, which opened on February 19, 2000. "The Search For Life: Are We Alone?" is made possible through the generous support of Swiss Re.

The film, narrated by Harrison Ford, looks at the possibility of life outside our solar system and includes segments on the birth of our Sun and solar system. About eight minutes of the show were created with help from NCSA's Virtual Director team--Donna Cox, an NCSA senior researcher and professor in the University of Illinois School of Art and Design; Robert Patterson, NCSA visualization programmer; and Stuart Levy, NCSA senior research programmer. The Virtual Director software team also worked with staff at the museum to create animations from scientific data for "Passport to the Universe." This time, however, the work included real-time interaction between staff at NCSA on the University of Illinois campus in Urbana-Champaign and at the Rose Center in New York City.
"We have been involved in remote collaborative efforts before, but certainly nothing at this level," said Patterson, who spent many late nights working with the museum's production team over a high-speed connection to Internet2's Abilene research network. "This is an example of connecting two sites with two unique display systems to put together visualization sequences in real time over long distances. We never once needed to travel to the museum."

The Virtual Director software allows users to navigate through complex computer datasets, record and edit their movements through the data, and share tracker and camera data. In effect, the user choreographs--or directs--with a virtual camera to create animations. To create the new space show's animations, Patterson would start up the Virtual Director software in the NCSA CAVE, a virtual reality display system, while the museum's production team in New York ran Virtual Director on the Hayden's Digital Dome. As Patterson, Cox, and Levy created Digital Dome flight paths through the data, the museum production team in New York was able to "fly" along, view the images in the dome, and discuss choreography and production issues with their colleagues at NCSA.

Each participant in the process was represented by a graphical avatar, so that everyone could see where the others were located in the dataset. Each avatar could move freely through the data or attach to the flight path. Levy handled data conversions and interface issues. His software, PartiView, was connected directly to Virtual Director so the teams could preview representations of the simulation data in real time, and Levy could alter the data representations on both display systems as needed. In addition, Patterson was able to connect directly to the Hayden's display system and make changes to the choreography from the New York user's point of view. Each work session focused on analyzing the scientific data and identifying its most interesting aspects.
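The choreography idea described above--recording camera positions through a dataset and playing back a smooth flight path between them--can be illustrated with a minimal keyframe-interpolation sketch. The names and structure here are purely illustrative, not Virtual Director's actual interface:

```python
# Illustrative sketch of keyframe-based camera choreography: a director
# records camera positions at chosen times, and the software interpolates
# a flight path between them. All names here are hypothetical.
from dataclasses import dataclass

@dataclass
class Keyframe:
    t: float            # time in seconds
    pos: tuple          # camera position (x, y, z) in dataset coordinates

def camera_at(keys, t):
    """Linearly interpolate the camera position at time t along the path."""
    keys = sorted(keys, key=lambda k: k.t)
    if t <= keys[0].t:
        return keys[0].pos
    if t >= keys[-1].t:
        return keys[-1].pos
    for a, b in zip(keys, keys[1:]):
        if a.t <= t <= b.t:
            u = (t - a.t) / (b.t - a.t)
            return tuple(pa + u * (pb - pa) for pa, pb in zip(a.pos, b.pos))

# A two-keyframe flight path; the camera at t=1.0 is halfway along it.
path = [Keyframe(0.0, (0.0, 0.0, 0.0)), Keyframe(2.0, (10.0, 0.0, 4.0))]
print(camera_at(path, 1.0))   # -> (5.0, 0.0, 2.0)
```

In a shared session like the one described here, each site would evaluate the same path data locally, which is why collaborators at both ends could "fly" along the same choreography.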
Ultimately, the teams sought to tell a complicated story in a way that was scientifically accurate and visually engaging. "Our goal is to bring the most accurate science to the greatest number of people, using the most advanced tools of our age and presented in the most aesthetic manner possible," said Cox. "This process has helped us realize that goal."

Once a path through the data was complete, the NCSA team transferred the data to Dave Nadeau, a visualization programmer at the San Diego Supercomputer Center (SDSC). Nadeau rendered every frame with SDSC volumetric rendering software at a resolution of 1280 x 1024 pixels. Each full dome image consists of seven of these frames blended together to create an image with about 30 times the resolution of a standard TV screen image. The entire sequence required more than 75 hours to render on SDSC's Blue Horizon, one of the world's largest computers.

"`The Search for Life: Are We Alone?' is a perfect example of what makes public education at the museum unique," said Myles Gordon, vice president for education at the museum. "Through our exhibitions and public programs, we are not only able to educate people about breakthrough scientific concepts, but also to do so with the benefit of insights and first-hand information provided by the museum's scientists and researchers. It is the dynamic collaboration of astrophysicists, educators, computer scientists, science visualizers, artists, producers, engineers, sound designers, script writers, and composers that makes this new space show a unique experience and a potent educational tool."

The computer simulation data is the work of Mordecai-Mark Mac Low, assistant curator in the department of astrophysics at AMNH and part of the National Computational Science Alliance Cosmology team. Mac Low's data simulates the birth of a star, and computing it required about four days of dedicated time on a 512-processor SGI Origin2000 supercomputer at NCSA.
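The dome-image figures quoted above can be checked with quick arithmetic. The press release does not define "standard TV screen image"; the sketch below assumes a roughly 640 x 480 pixel (NTSC-class) image, which makes the numbers line up:

```python
# Check the quoted dome-image resolution figures. The 640 x 480 "standard
# TV" size is an assumption, not a figure from the press release.
frame = 1280 * 1024           # pixels in one rendered frame
dome = 7 * frame              # seven frames blended into one full-dome image
tv = 640 * 480                # assumed standard TV image
print(round(dome / tv))       # -> 30, i.e. "about 30 times" TV resolution
```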
Other NCSA visualizations incorporated into the space show include a flight through the Milky Way and beyond, and views of the "local" universe created using an astronomical database of 35,000 observed galaxies developed by Brent Tully, an astronomer at the University of Hawaii. In total, the NCSA team computed about 70,000 frames for the 2002 space show.

For more information, visit www.ncsa.uiuc.edu.