Southwest Research Institute scientists developed a new process in planetary formation modeling that explains the size and mass difference between Earth and Mars. Mars is much smaller and has only 10 percent of the mass of the Earth. Conventional solar system formation models generate good analogs to Earth and Venus, but predict that Mars should be of similar size to, or even larger than, Earth.

New process explains massive differences between Earth and Mars

Using a new process in planetary formation modeling, where planets grow from tiny bodies called "pebbles," Southwest Research Institute scientists in Boulder, CO, can explain why Mars is so much smaller than Earth. This same process also explains the rapid formation of the gas giants Jupiter and Saturn, as reported earlier this year.

"This numerical simulation actually reproduces the structure of the inner solar system, with Earth, Venus, and a smaller Mars," said Hal Levison, an Institute scientist at the SwRI Planetary Science Directorate. He is the first author of a new paper published in the Proceedings of the National Academy of Sciences of the United States (PNAS) Early Edition.

The fact that Mars has only 10 percent of the mass of the Earth has been a long-standing puzzle for solar system theorists. In the standard model of planet formation, similarly sized objects accumulate and assimilate through a process called accretion: rocks incorporate other rocks, creating mountain-sized bodies; mountains then merge to form city-sized objects, and so on. While typical accretion models generate good analogs to Earth and Venus, they predict that Mars should be of similar size to, or even larger than, Earth. These models also overestimate the overall mass of the asteroid belt.

"Understanding why Mars is smaller than expected has been a major problem that has frustrated our modeling efforts for several decades," said Levison. "Here, we have a solution that arises directly from the planet formation process itself."

New calculations by Levison and co-authors Katherine Kretke, Kevin Walsh and Bill Bottke, all of SwRI's Planetary Science Directorate, follow the growth and evolution of a system of planets. They demonstrate that the structure of the inner solar system is actually the natural outcome of a new mode of planetary growth known as Viscously Stirred Pebble Accretion (VSPA). With VSPA, dust readily grows into "pebbles" -- objects a few inches in diameter -- some of which gravitationally collapse to form asteroid-sized objects. Under the right conditions, these primordial asteroids can efficiently feed on the remaining pebbles, as aerodynamic drag pulls pebbles into orbit around them, where they spiral down and fuse with the growing planetary body. This allows certain asteroids to become planet-sized over relatively short time scales.

However, these new models find that not all of the primordial asteroids are equally well-positioned to accrete pebbles and grow. For example, an object the size of Ceres (about 600 miles across), which is the largest asteroid in the asteroid belt, would have grown very quickly near the current location of the Earth. But it would not have been able to grow effectively near the current location of Mars, or beyond, because aerodynamic drag is too weak for pebble capture to occur.

"This means that very few pebbles collide with objects near the current location of Mars. That provides a natural explanation for why it is so small," said Kretke. "Similarly, even fewer hit objects in the asteroid belt, keeping its net mass small as well. The only place that growth was efficient was near the current location of Earth and Venus."

"This model has huge implications for the history of the asteroid belt," said Bottke. Previous models have predicted that the belt originally contained a couple of Earth-masses' worth of material, meaning that planets began to grow there. The new model predicts that the asteroid belt never contained much mass in bodies like the currently observed asteroid.

"This presents the planetary science community with a testable prediction between this model and previous models that can be explored using data from meteorites, remote sensing, and spacecraft missions," said Bottke.

This work complements the recent study published in Nature by Levison, Kretke, and Martin Duncan (Queen's University), which demonstrated that pebbles can form the cores of the giant planets and explain the structure of the outer solar system. Combined, the two works present the means to produce the entire solar system from a single, unifying process.

"As far as I know, this is the first model to reproduce the structure of the solar system -- Earth and Venus, a small Mars, a low-mass asteroid belt, two gas giants, two ice giants (Uranus and Neptune), and a pristine Kuiper Belt," said Levison. The article, "Growing the Terrestrial Planets from the Gradual Accumulation of Sub-meter Sized Objects," is published online byPNAS. Authors H.F. Levison, K.A. Kretke, K. Walsh, and W. Bottke are all of Southwest Research Institute's Space Science and Engineering Division. This work was supported by the NASA Solar System Exploration Research Virtual Institute (SSERVI) through institute grant number NNA14AB03A.

Simulation of the visible structures of the universe: The most comprehensive simulation within the Magneticum Pathfinder Project covers a cubic volume with a box size of 12.5 billion light years. Credit: Klaus Dolag/LMU, www.magneticum.org

The world's most elaborate cosmological simulation of the evolution of our universe was accomplished by theoretical astrophysicists of the Ludwig-Maximilians-Universität München (LMU) in cooperation with experts of the Excellence Cluster Universe’s datacentre C2PAP and of the Leibniz Supercomputing Centre. The most comprehensive simulation within the "Magneticum Pathfinder" project follows the evolution of a record 180 billion tiny spatial elements in a previously unmatched volume spanning 12.5 billion light years. For the first time, a hydrodynamic cosmological simulation is large enough to be directly compared with large-scale astronomical surveys.

Within modern cosmology, the Big Bang marks the beginning of the universe and the creation of matter, space and time about 13.8 billion years ago. Since then, the visible structures of the cosmos have developed: billions of galaxies which bind gas, dust, stars and planets with gravity and host supermassive black holes in their centres. But how could these visible structures have formed from the universe’s initial conditions?

To answer this question, theoretical astrophysicists carry out cosmological simulations. They transform their knowledge about the physical processes forming our universe into mathematical models and simulate the evolution of our universe on supercomputers over billions of years.

A group of theoretical astrophysicists from the LMU led by Klaus Dolag has now, as part of the Magneticum Pathfinder project, performed a new, unique hydrodynamic simulation of the large-scale distribution of the universe’s visible matter. It takes into account the most recent results on the universe’s three most important ingredients: dark energy, dark matter and visible matter.

The scientists incorporated a variety of physical processes in the calculations, including three that are considered particularly important for the development of the visible universe: first, the condensation of matter into stars, second, their further evolution when the surrounding matter is heated by stellar winds and supernova explosions and enriched with chemical elements, and third, the feedback of supermassive black holes that eject massive amounts of energy into the universe.

The most comprehensive simulation covers a cubic volume with a box size of 12.5 billion light years. This tremendously large section of the universe had never been simulated before. It was divided into a previously unattained 180 billion resolution elements, each representing the detailed properties of its part of the universe and containing about 500 bytes of information.

For the first time, these numerous characteristics make it possible to compare a cosmological simulation in detail with large-scale astronomical surveys. "Astronomical surveys from space telescopes like Planck or Hubble observe a large segment of the visible universe while sophisticated simulations so far could only model very small parts of the universe, making a direct comparison virtually impossible," says Klaus Dolag. "Thus, Magneticum Pathfinder marks the beginning of a new era in computer-based cosmology."

This achievement was preceded by ten years of research and development, carried out with experts of the Leibniz Supercomputing Centre (LRZ) of the Bavarian Academy of Sciences, one of the most powerful scientific computing centres in Europe. "One of the biggest challenges for such a complex problem is to find the right balance between optimizing the simulation code and the development of the astrophysical modelling," explains Klaus Dolag. "While the code permanently needs to be adjusted to changing technologies and new hardware, the underlying models need to be improved by including better or additional descriptions of the physical processes which form our visible universe."

The realization of this largest simulation within the Magneticum Pathfinder project took about two years. The research group of Klaus Dolag was supported by the physicists of the datacentre C2PAP, which is operated by the Excellence Cluster Universe and located at the LRZ. Within the framework of several one-week workshops, the Magneticum Pathfinder team got the opportunity to use the LRZ's entire highest-performance supercomputer SuperMUC for its simulation. "I do not know any datacentre that would have allowed me to use the entire computing capacity for such a long time," says Klaus Dolag.

Overall, the Magneticum Pathfinder simulation utilised all 86,016 computing cores and the complete usable main memory – 155 out of a total of 194 terabytes – of the expansion stage "Phase 2" of the SuperMUC which was put into operation recently. The entire simulation required 25 million CPU hours and generated 320 terabytes of scientific data.
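
Those published figures allow a rough sanity check. The short calculation below (a simplification that assumes the 25 million CPU hours were spread evenly across all cores, and uses the quoted 500 bytes per resolution element) gives an equivalent wall-clock time of about 12 days of continuous full-machine running and roughly 90 TB of raw element data, comfortably within the 155 TB of usable main memory:

```python
# Back-of-the-envelope scale check using the figures quoted in the article.
# Assumes (simplistically) that the 25 million CPU hours were spread evenly
# over all 86,016 cores of SuperMUC "Phase 2".
cores = 86_016
cpu_hours = 25e6
elements = 180e9                   # resolution elements
bytes_per_element = 500            # "about 500 bytes of information"

wall_clock_days = cpu_hours / cores / 24
raw_data_tb = elements * bytes_per_element / 1e12   # decimal terabytes

print(f"Equivalent full-machine wall-clock time: ~{wall_clock_days:.0f} days")
print(f"Raw element data held in memory: ~{raw_data_tb:.0f} TB")
```

The real campaign, including queueing, checkpointing and analysis, stretched over the roughly two years mentioned above.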

These data are now available for interested researchers worldwide. The Munich-based astrophysicists are already engaged in further projects: Among others, Klaus Dolag is currently collaborating with scientists from the Planck collaboration to compare observations of the Planck satellite with the calculations of Magneticum Pathfinder. 

A filament stretches across the lower half of the sun in this image captured by NASA’s Solar Dynamics Observatory on Feb. 10, 2015. Filaments are huge tubes of relatively cool solar material held high up in the corona by magnetic fields. Researchers simulated how the material moves in filament threads to explore how a particular type of motion could contribute to the extremely hot temperatures in the sun’s upper atmosphere, the corona.

Modern telescopes and satellites have helped us measure the blazing hot temperatures of the sun from afar. Mostly the temperatures follow a clear pattern: The sun produces energy by fusing hydrogen in its core, so the layers surrounding the core generally get cooler as you move outwards--with one exception. Two NASA missions have just made a significant step towards understanding why the corona--the outermost, wispy layer of the sun's atmosphere --is hundreds of times hotter than the lower photosphere, which is the sun's visible surface.

In a pair of papers in The Astrophysical Journal, published on August 10, 2015, researchers--led by Joten Okamoto of Nagoya University in Japan and Patrick Antolin of the National Astronomical Observatory of Japan--observed a long-hypothesized mechanism for coronal heating, in which magnetic waves are converted into heat energy. Past papers have suggested that magnetic waves in the sun -- Alfvénic waves -- have enough energy to heat up the corona. The question has been how that energy is converted to heat.

"For over 30 years scientists hypothesized a mechanism for how these waves heat the plasma," said Antolin. "An essential part of this process is called resonant absorption -- and we have now directly observed resonant absorption for the first time."

Resonant absorption is a complicated wave process in which repeated waves add energy to the solar material, a charged gas known as plasma, the same way that a perfectly-timed repeated push on a swing can make it go higher. Resonant absorption has signatures that can be seen in material moving side to side and front to back.
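
The swing analogy corresponds to the textbook behaviour of a driven, lightly damped oscillator: when the driving frequency matches the natural frequency, energy is absorbed far more efficiently and the amplitude grows until damping balances the drive. The snippet below is a generic resonance demonstration, not a model of the solar plasma:

```python
# Generic resonance demo (not a solar plasma simulation): a damped oscillator
# x'' + 2*gamma*x' + w0^2 * x = F*cos(w_drive*t), integrated with a simple
# semi-implicit Euler scheme.  The response is largest when w_drive == w0.
import math

def peak_amplitude(w_drive, w0=1.0, gamma=0.02, F=1.0, dt=0.001, t_end=200.0):
    x, v, peak = 0.0, 0.0, 0.0
    for i in range(int(t_end / dt)):
        t = i * dt
        a = F * math.cos(w_drive * t) - 2 * gamma * v - w0 ** 2 * x
        v += a * dt
        x += v * dt
        peak = max(peak, abs(x))
    return peak

for w in (0.5, 0.9, 1.0, 1.1, 2.0):
    print(f"drive at {w:.1f} x natural frequency: peak amplitude {peak_amplitude(w):6.1f}")
```

Off resonance the amplitude stays of order one; at resonance it climbs to roughly 1/(2*gamma), about 25 here, showing how well-timed pushes pump energy into the oscillation.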

To see the full range of motions, the team used observations from NASA's Interface Region Imaging Spectrograph, or IRIS, and the Japan Aerospace Exploration Agency (JAXA)/NASA's Hinode solar observatory to successfully identify signatures of the process. The researchers then correlated the signatures to material being heated to nearly corona-level temperatures. These observations told researchers that a certain type of plasma wave was being converted into a more turbulent type of motion, leading to lots of friction and electric currents, heating the solar material.

The researchers focused on a solar feature called a filament. Filaments are huge tubes of relatively cool plasma held high up in the corona by magnetic fields. Researchers developed a supercomputer model of how the material inside filament tubes moves, then looked for signatures of these motions with sun-observing satellites.

"Through numerical simulations, we show that the observed characteristic motion matches well what is expected from resonant absorption," said Antolin.

The signatures of these motions appear in three dimensions, making them difficult to observe without the teamwork of several missions. Hinode's Solar Optical Telescope was used to make measurements of motions that appear, from our perspective, to be up-and-down or side-to-side, a perspective that scientists call plane-of-sky. The resonant absorption model relies on the fact that the plasma contained in a filament tube moves in a specific wave motion called an Alfvénic kink wave, caused by magnetic fields. Alfvénic kink waves in filaments can cause motions in the plane-of-sky, so evidence of these waves came from observations by Hinode's extremely high-resolution optical telescope.

More complicated were the line-of-sight observations--line-of-sight means motions in the third dimension, toward and away from us. The resonant absorption process can convert the Alfvénic kink wave into another type of Alfvénic wave motion. To see this conversion, researchers needed to simultaneously observe motions in the plane-of-sky and along the line of sight. This is where IRIS comes in. IRIS takes a special type of data called spectra. For each image taken by IRIS's ultraviolet telescope, it also creates a spectrum, which breaks down the light from the image into different wavelengths.

Analyzing separate wavelengths can provide scientists with additional details such as whether the material is moving toward or away from the viewer. Much like a siren moving toward you sounds different from one moving away, light waves can become stretched or compressed if their source is moving toward or away from an observer. This slight change in wavelength is known as the Doppler effect. Scientists combined their knowledge of the Doppler effect with the expected emissions from a stationary filament to deduce how the filaments were moving in the line-of-sight.
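
For the modest speeds involved, the non-relativistic Doppler relation is enough: the fractional wavelength shift equals the line-of-sight speed divided by the speed of light. A minimal example, using the Mg II k line near 2796 Å that IRIS records and a made-up wavelength shift:

```python
# Minimal Doppler example: delta_lambda / lambda_rest = v / c.
# The 0.2-angstrom shift below is a made-up value for illustration.
C_KM_S = 299_792.458                 # speed of light [km/s]

def los_velocity(lambda_rest, delta_lambda):
    """Line-of-sight speed [km/s]; positive = redshift = moving away."""
    return C_KM_S * delta_lambda / lambda_rest

print(f"{los_velocity(2796.35, 0.2):.1f} km/s")   # ~21 km/s away from the observer
```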

"It's the combination of high-resolution observations in all three regimes--time, spatial, and spectral--that enabled us to see these previously unresolved phenomena," said Adrian Daw, mission scientist for IRIS at NASA's Goddard Space Flight Center in Greenbelt, Maryland.

Using both the plane-of-sky observations from Hinode and line-of-sight observations from IRIS, researchers discovered the characteristic wave motions consistent with their model of this possible coronal heating mechanism. What's more, they also observed material heating up in conjunction with the wave motions, further confirming that this process is related to heating in the solar atmosphere.

"We would see the filament thread disappear from the filter that is sensitive to cool plasma and reappear in a filter for hotter plasma," said Bart De Pontieu, science lead for IRIS at Lockheed Martin Solar and Astrophysics Lab in Palo Alto, California.

In addition, comparison of the two wave motions showed a time delay between them, known as a phase difference. The researchers' model predicted this phase difference, providing some of the strongest evidence that the team correctly understood its observations.
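
A standard way to quantify such a phase difference between two oscillating signals is to find the time lag that maximizes their cross-correlation. The sketch below recovers a known lag from two synthetic sine waves; it is a generic technique illustration, not the authors' analysis code:

```python
# Generic phase-lag estimate via cross-correlation (illustration only).
import numpy as np

dt = 1.0                                  # cadence [s], arbitrary
t = np.arange(0, 600, dt)
period = 240.0                            # oscillation period [s], arbitrary
true_lag = 60.0                           # a quarter period, i.e. 90 degrees

a = np.sin(2 * np.pi * t / period)                # first signal
b = np.sin(2 * np.pi * (t - true_lag) / period)   # same signal, delayed

lags = np.arange(-len(t) + 1, len(t)) * dt
corr = np.correlate(b - b.mean(), a - a.mean(), mode="full")
best_lag = lags[np.argmax(corr)]

print(f"Recovered lag: {best_lag:.0f} s ({360 * best_lag / period:.0f} degrees of phase)")
```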

Though resonant absorption plays a key role in the complete process, it does not directly cause heating. The researchers' simulation showed that the transformed wave motions lead to turbulence around the edges of the filament tubes, which heats the surrounding plasma.

It seems that resonant absorption is an excellent candidate for the role of an energy transport mechanism--though these observations were taken in the transition region rather than the corona, researchers believe that this mechanism could be common in the corona as well.

"Now the work starts to study if this mechanism also plays a role in heating plasma to coronal temperatures," said De Pontieu.

With the launch of over a dozen missions in the past twenty years, our understanding of the sun and how it interacts with Earth and the solar system is better than at any time in human history. Heliophysics System Observatory missions are working together to unravel the coronal heating problem and the sun's other remaining mysteries.

The Víctor M. Blanco Telescope at the Cerro Tololo Inter-American Observatory in Chile, where the Dark Energy Camera is being used to collect image data for the DECam Legacy Survey. The glint off the dome is moonlight; the Small and Large Magellanic Clouds can be seen in the background. (Image: Dustin Lang, University of Toronto)

MANTISSA collaboration uses statistical inference to revolutionize sky surveys

The roots of tradition run deep in astronomy. From Galileo and Copernicus to Hubble and Hawking, scientists and philosophers have been pondering the mysteries of the universe for centuries, scanning the sky with methods and models that, for the most part, changed little until the last two decades.

Now a Berkeley Lab-based research collaboration of astrophysicists, statisticians and computer scientists is looking to shake things up with Celeste, a new statistical analysis model designed to enhance one of modern astronomy’s most time-tested tools: sky surveys.

A central component of an astronomer's daily activities, surveys are used to map and catalog regions of the sky, fuel statistical studies of large numbers of objects and enable interesting or rare objects to be studied in greater detail. But the ways in which image datasets from these surveys are analyzed today remain stuck in, well, the Dark Ages.

“There are very traditional approaches to doing astronomical surveys that date back to the photographic plate,” said David Schlegel, an astrophysicist at Lawrence Berkeley National Laboratory and principal investigator on the Baryon Oscillation Spectroscopic Survey (BOSS, part of SDSS) and co-PI on the DECam Legacy Survey (DECaLS). “A lot of the terminology dates back to that as well. For example, we still talk about having a plate and comparing plates, when obviously we’ve moved way beyond that.”

Surprisingly, the first electronic survey—the Sloan Digital Sky Survey (SDSS)—only began capturing data in 1998. And while today there are multiple surveys and high-resolution instruments operating 24/7 worldwide and collecting hundreds of terabytes of image data annually, the ability of scientists from multiple facilities to easily access and share this data remains elusive. In addition, practices originating a hundred years ago or more continue to proliferate in astronomy—from the habit of approaching each survey image analysis as though it were the first time anyone had looked at that patch of sky to antiquated terminology such as "magnitude system" and "sexagesimal" that can leave potential collaborators outside of astronomy scratching their heads.

It’s conventions like these in a field he loves that frustrate Schlegel.

“There’s a history of how the data are used in astronomy, and the language and terminology reflect a lot of the problems,” he said. “For example, the magnitude system—it is not some linear system of how bright objects are, it is an arbitrary label dating back thousands of years. But you can still pick up any astronomy paper and they all use the magnitude system.”
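
For readers outside astronomy, the sticking point is that the modern (Pogson) magnitude scale is logarithmic and inverted, with a historically chosen zero point: m = -2.5 log10(F/F0), so five magnitudes span a factor of 100 in brightness and brighter objects get smaller numbers. A short illustration:

```python
# The modern (Pogson) magnitude scale: m = -2.5 * log10(F / F0), where the
# zero-point flux F0 is a historical convention.
import math

def magnitude(flux, flux_zero_point=1.0):
    return -2.5 * math.log10(flux / flux_zero_point)

def flux_ratio(m1, m2):
    """How many times brighter a magnitude-m1 source is than a magnitude-m2 one."""
    return 10 ** (0.4 * (m2 - m1))

print(magnitude(100.0))        # a source 100x the zero-point flux -> -5.0 mag
print(flux_ratio(1.0, 6.0))    # a 1st-magnitude star is 100x brighter than a 6th
```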

When it comes to analyzing image data from sky surveys, Schlegel is certain existing methods can be improved upon as well—especially in light of the more complex computational challenges expected to emerge from next-generation surveys like DECaLS and higher-resolution instruments like the Large Synoptic Survey Telescope (LSST).

“The way we deal with data analysis in astronomy is through ‘data reduction,’” he said. “You take an image, apply a detection algorithm to it, take some measurements and then make a catalog of the objects in that image. Then you take another image of the same part of the sky and you say, ‘Oh, let me pretend I don’t know what’s going on here, so I’ll start by identifying objects, taking measurements of those objects and then make a catalog of those objects.’ And this is done independently for each image. So you keep stepping further and further down into these data reduction catalogs and never going back to the original image.”

A Hierarchical Model

These challenges prompted Schlegel to team up with Berkeley Lab’s MANTISSA (Massive Acceleration of New Technologies in Science with Scalable Algorithms) project, led by Prabhat from the National Energy Research and Scientific Computing Center (NERSC), a DOE Office of Science User Facility. “To tackle this grand challenge, we have engaged leading researchers from UC Berkeley, Harvard, Carnegie Mellon and Adobe Research,” said Prabhat.

The team spent the past year developing Celeste, a hierarchical model designed to catalog stars, galaxies and other light sources in the universe visible through the next generation of telescopes, explained Jeff Regier, a Ph.D. student in the UC Berkeley Department of Statistics and lead author on a paper outlining Celeste presented in July at the 32nd International Conference on Machine Learning. The new model will also enable astronomers to identify promising galaxies for spectrograph targeting, define galaxies they may want to explore further and help them better understand Dark Energy and the geometry of the universe, he added.

A DeCAM/DeCALs image of galaxies observed by the Blanco Telescope. The Legacy Survey is producing an inference model catalog of the sky from a set of optical and infrared imaging data, comprising 14,000 deg² of extragalactic sky visible from the northern hemisphere in three optical bands and four infrared bands. (Image: Dark Energy Sky Survey)

“What we want to change here in a fundamental way is the way astronomers use these data,” Schlegel said. “Celeste will be a much better model for identifying the astrophysical sources in the sky and the calibration parameters of each telescope. We will be able to mathematically define what we are solving, which is very different from the traditional approach, where it is this set of heuristics and you get this catalog of objects, then you try to ask the question: mathematically what was the problem I just solved?”

In addition, Celeste has the potential to significantly reduce the time and effort that astronomers currently spend working with image data, Schlegel emphasized. “Ten to 15 years ago, you’d get an image of the sky and you didn’t even know exactly where you were pointed on the sky. So the first thing you’d do is pull it up on the computer and click around on stars and try to identify them to figure out exactly where you were. And you would do that by hand for every single image.”

Applied Statistics

To alter this scenario, Celeste uses analytical techniques common in machine learning and applied statistics but not so much in astronomy. The model is fashioned on a code called the Tractor, developed by Dustin Lang while he was a post-doctoral fellow at Princeton University.

“Most astronomical image analysis methods look at a bunch of pixels and run a simple algorithm that basically does arithmetic on the pixel values,” said Lang, formerly a post-doc at Carnegie Mellon and now a research associate at the University of Toronto and a member of the Celeste team. “But with the Tractor, instead of running fairly simple recipes on pixel values, we create a full, descriptive model that we can compare to actual images and then adjust the model so that its claims of what a particular star actually looks like match the observations. It makes more explicit statements about what objects exist and predictions of what those objects will look like in the data.”
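
The approach Lang describes can be sketched generically as fitting a rendered model of a star to the pixels by adjusting its parameters until the residuals are minimized. The toy below uses a Gaussian point-spread function and ordinary least squares; it is an illustration of the idea, not the Tractor's actual code or API:

```python
# Toy forward-model fit (not the Tractor itself): render a Gaussian "star",
# compare to an image, and adjust position and flux to minimize squared
# pixel residuals.
import numpy as np
from scipy.optimize import minimize

def render_star(params, shape=(32, 32), psf_sigma=1.5):
    """Model image of one star with parameters (x, y, flux)."""
    x0, y0, flux = params
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    psf = np.exp(-((xx - x0) ** 2 + (yy - y0) ** 2) / (2 * psf_sigma ** 2))
    return flux * psf / psf.sum()

# Fake "observed" image: a star at (14.3, 17.8) with flux 1000, plus noise.
rng = np.random.default_rng(0)
observed = render_star((14.3, 17.8, 1000.0)) + rng.normal(0, 0.5, (32, 32))

def loss(params):
    residual = observed - render_star(params)
    return np.sum(residual ** 2)

fit = minimize(loss, x0=[16.0, 16.0, 500.0], method="Nelder-Mead")
print("Recovered (x, y, flux):", np.round(fit.x, 2))
```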

The Celeste project takes this concept a few steps further, implementing statistical inference to build a fully generative model to mathematically locate and characterize light sources in the sky. Statistical models typically start from the data and look backwards to determine what led to the data, explained Jon McAuliffe, a professor of statistics at UC Berkeley and another member of the Celeste team. But in astronomy, image data analysis typically begins with what isn’t known: the locations and characteristics of objects in the sky.

“In science what we do a lot is take something that is hard and try to decompose it into simpler parts and then put the parts back together,” McAuliffe said. “That’s what is going on in the hierarchical model. The tricky part is, there are these assumed or imagined quantities and we have to reason about them even though we didn’t get to observe them. This is where statistical inference comes in. Our job is to start from the pixel intensities in the images and work backwards to where the light sources were and what their characteristics were.”
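
The forward/inverse structure McAuliffe describes can be sketched with a minimal generative model: an unobserved star at some position produces Poisson photon counts through a known point-spread function, and inference runs backwards from the recorded pixels to a posterior over where the star is. This is purely illustrative; Celeste's real model covers stars and galaxies, multiple bands and calibration parameters, and uses variational inference at SDSS scale rather than the brute-force grid used here:

```python
# Minimal generative-model sketch (emphatically not Celeste itself): a latent
# star generates Poisson photon counts; a grid of Poisson log-likelihoods with
# a flat prior gives a posterior over the star's position.
import numpy as np

SHAPE, SIGMA, FLUX, SKY = (24, 24), 1.5, 400.0, 3.0

def expected_counts(x0, y0):
    """Expected photon counts per pixel for a star at (x0, y0)."""
    yy, xx = np.mgrid[0:SHAPE[0], 0:SHAPE[1]]
    psf = np.exp(-((xx - x0) ** 2 + (yy - y0) ** 2) / (2 * SIGMA ** 2))
    return SKY + FLUX * psf / psf.sum()

# Forward direction: the latent star generates the observed image.
rng = np.random.default_rng(1)
observed = rng.poisson(expected_counts(10.4, 13.7))

# Inverse direction: Poisson log-likelihood on a grid of candidate positions.
grid = np.linspace(5, 19, 141)
log_post = np.array([[np.sum(observed * np.log(expected_counts(x, y))
                             - expected_counts(x, y))
                      for x in grid] for y in grid])
iy, ix = np.unravel_index(np.argmax(log_post), log_post.shape)
print(f"Most probable position: x = {grid[ix]:.1f}, y = {grid[iy]:.1f}")
```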

So far the group has used Celeste to analyze pieces of SDSS images, whole SDSS images and sets of SDSS images on NERSC’s Edison supercomputer, McAuliffe said. These initial runs have helped them refine and improve the model and validate its ability to exceed the performance of current state-of-the-art methods for locating celestial bodies and measuring their colors.

“The ultimate goal is to take all of the photometric data generated up to now and that is going to be generated on an ongoing basis and run a single job and keep running it over time and continually refine this comprehensive catalog,” he said.

The first major milestone will be to run an analysis of the entire SDSS dataset all at once at NERSC. The researchers will then begin adding other datasets and begin building the catalog—which, like the SDSS data, will likely be housed on a science gateway at NERSC. In all, the Celeste team expects the catalog to collect and process some 500 terabytes of data, or about 1 trillion pixels.

“To the best of my knowledge, this is the largest graphical model problem in science that actually requires a supercomputing platform for running the inference algorithms,” Prabhat said. “The core methods being developed by Jon McAuliffe, Jeff Regier and Ryan Giordano (UC Berkeley), Matt Hoffman (Adobe Research) and Ryan Adams and Andy Miller (Harvard) are absolutely key for attempting a problem at this scale.”

The next iteration of Celeste will include quasars, which have a distinct spectral signature that makes them more difficult to distinguish from other light sources. The modeling of quasars is important to improving our understanding of the early universe, but it presents a big challenge: the most important objects are those that are far away, but distant objects are the ones for which we have the weakest signal. Andrew Miller of Harvard University is currently working on this addition to the model, which couples high-fidelity spectral measurements with survey data to improve our estimates of remote quasars.

“It may be a little surprising that up to now the worldwide astronomy community hasn’t built a single reference catalog of all the light sources that are being imaged by many, many different telescopes worldwide over the past 15 years,” McAuliffe said. “But we think we can help with that. This is going to be a catalog that will be incredibly valuable for astronomers and cosmologists in the future.”

Figure: (Left) For reference, an image of the entire Sun taken by SDO/AIA in extreme ultra-violet light (false color). (Right) An image of a solar prominence at the limb of the Sun taken by Hinode/SOT in visible light (Ca II H line, false color). As shown in the image, a prominence is composed of long, thin structures called threads. A scale model of the Earth is shown on the right for reference.

Solar physicists have captured the first direct observational signatures of resonant absorption, thought to play an important role in solving the “coronal heating problem” which has defied explanation for over 70 years.

An international research team from Japan, the U.S.A., and Europe led by Drs. Joten Okamoto and Patrick Antolin combined high resolution observations from JAXA’s Hinode mission and NASA’s IRIS (Interface Region Imaging Spectrograph) mission, together with state-of-the-art numerical simulations and modeling from NAOJ’s ATERUI supercomputer. In the combined data, they were able to detect and identify the observational signatures of resonant absorption.

Resonant absorption is a process where two different types of magnetically driven waves resonate, strengthening one of them. In particular this research looked at a type of magnetic waves known as Alfvénic waves which can propagate through a prominence (a filamentary structure of cool, dense gas floating in the corona). Here, for the first time, researchers were able to directly observe resonant absorption between transverse waves and torsional waves, leading to a turbulent flow which heats the prominence. Hinode observed the transverse motion and IRIS observed the torsional motion; these results would not have been possible without both satellites.

This new information can help explain how the solar corona reaches temperatures of 1,000,000 degrees Celsius, the so-called "coronal heating problem."

The scientific paper on which this article is based appears in the Astrophysical Journal. 

Journal References:

  1. P. Antolin, T. J. Okamoto, B. De Pontieu, H. Uitenbroek, T. Van Doorsselaere, T. Yokoyama. "Resonant Absorption of Transverse Oscillations and Associated Heating in a Solar Prominence. II. Numerical Aspects." The Astrophysical Journal, 2015; 809 (1): 72. DOI: 10.1088/0004-637X/809/1/72
  2. Takenori J. Okamoto, Patrick Antolin, Bart De Pontieu, Han Uitenbroek, Tom Van Doorsselaere, Takaaki Yokoyama. "Resonant Absorption of Transverse Oscillations and Associated Heating in a Solar Prominence. I. Observational Aspects." The Astrophysical Journal, 2015; 809 (1): 71. DOI: 10.1088/0004-637X/809/1/71
