CAPTION This is a still from a simulation of a Type Ia supernova. In the simulation, a Type Ia supernova explodes (dark brown color). The supernova material is ejected outwards at a velocity of about 10,000 km/s. The ejected material slams into its companion star (light blue). Such a violent collision produces an ultraviolet pulse which is emitted from the conical hole carved out by the companion star. CREDIT Courtesy of Dan Kasen

Type Ia supernovae, one of the most dazzling phenomena in the universe, are produced when small dense stars called white dwarfs explode with ferocious intensity. At their peak, these supernovae can outshine an entire galaxy. Although thousands of supernovae of this kind have been found in recent decades, the process by which a white dwarf becomes one has remained unclear.

That began to change on May 3, 2014, when a team of Caltech astronomers working on a robotic observing system known as the intermediate Palomar Transient Factory (iPTF)--a multi-institute collaboration led by Shrinivas Kulkarni, the John D. and Catherine T. MacArthur Professor of Astronomy and Planetary Science and director of the Caltech Optical Observatories--discovered a Type Ia supernova, designated iPTF14atg, in nearby galaxy IC831, located 300 million light-years away.

The data that were immediately collected by the iPTF team lend support to one of two competing theories about the origin of white dwarf supernovae, and also suggest the possibility that there are actually two distinct populations of this type of supernova.

The details are outlined in a paper, with Caltech graduate student Yi Cao as lead author, appearing May 21 in the journal Nature.

Type Ia supernovae are known as "standardizable candles" because they allow astronomers to gauge cosmic distances by how dim they appear relative to how bright they actually are. It is like knowing that, from one mile away, a light bulb looks 100 times dimmer than another located only one-tenth of a mile away. This consistency is what made these stellar objects instrumental in measuring the accelerating expansion of the universe in the 1990s, earning three scientists the Nobel Prize in Physics in 2011.
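
The light-bulb analogy is just the inverse-square law. A minimal sketch (with illustrative numbers, not real measurements) shows how a known luminosity plus a measured flux yields a distance, which is the trick behind standardizable candles:

```python
import math

# Inverse-square law: a source of luminosity L seen from distance d
# has apparent flux f = L / (4 * pi * d^2).
def apparent_flux(luminosity, distance):
    return luminosity / (4 * math.pi * distance ** 2)

# Two identical bulbs, one a tenth of a mile away and one a mile away:
near = apparent_flux(100.0, 0.1)
far = apparent_flux(100.0, 1.0)
print(near / far)  # 10x farther -> 100x dimmer

# Inverting the relation gives distance from measured flux:
# d = sqrt(L / (4 * pi * f)).
def distance_from_flux(luminosity, flux):
    return math.sqrt(luminosity / (4 * math.pi * flux))
```

Because every Type Ia supernova peaks at (nearly) the same luminosity, measuring how dim one appears fixes its distance by the same inversion.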

There are two competing origin theories, both starting with the same general scenario: the white dwarf that eventually explodes is one of a pair of stars orbiting around a common center of mass. The interaction between these two stars, the theories say, is responsible for triggering supernova development. What is the nature of that interaction? At this point, the theories diverge.

According to one theory, the so-called double-degenerate model, the companion to the exploding white dwarf is also a white dwarf, and the supernova explosion initiates when the two similar objects merge.

However, in the second theory, called the single-degenerate model, the second star is instead a sunlike star--or even a red giant, a much larger type of star. In this model, the white dwarf's powerful gravity pulls, or accretes, material from the second star. This process, in turn, increases the temperature and pressure in the center of the white dwarf until a runaway nuclear reaction begins, ending in a dramatic explosion.

The difficulty in determining which model is correct stems from the facts that supernova events are very rare--occurring about once every few centuries in our galaxy--and that the stars involved are very dim before the explosions.

That is where the iPTF comes in. From atop Palomar Mountain in Southern California, where it is mounted on the 48-inch Samuel Oschin Telescope, the project's fully automated camera optically surveys roughly 1000 square degrees of sky per night (approximately 1/20th of the visible sky above the horizon), looking for transients--objects, including Type Ia supernovae, whose brightness changes over timescales that range from hours to days.

On May 3, the iPTF took images of IC831 and transmitted the data for analysis to computers at the National Energy Research Scientific Computing Center, where a machine-learning algorithm analyzed the images and prioritized real celestial objects over digital artifacts. Because this first-pass analysis occurred when it was nighttime in the United States but daytime in Europe, the iPTF's European and Israeli collaborators were the first to sift through the prioritized objects, looking for intriguing signals. After they spotted the possible supernova--a signal that had not been visible in the images taken just the night before--the European and Israeli team alerted their U.S. counterparts, including Caltech graduate student and iPTF team member Yi Cao.

Cao and his colleagues then mobilized both ground- and space-based telescopes, including NASA's Swift satellite, which observes ultraviolet (UV) light, to take a closer look at the young supernova.

"My colleagues and I spent many sleepless nights on designing our system to search for luminous ultraviolet emission from baby Type Ia supernovae," says Cao. "As you can imagine, I was fired up when I first saw a bright spot at the location of this supernova in the ultraviolet image. I knew this was likely what we had been hoping for."

UV radiation has higher energy than visible light, so it is particularly suited to observing very hot objects like supernovae (although such observations are possible only from space, because Earth's atmosphere and ozone layer absorb almost all of this incoming UV). Swift measured a pulse of UV radiation that declined initially but then rose as the supernova brightened. Because such a pulse is short-lived, it can be missed by surveys that scan the sky less frequently than the iPTF does.

This observed ultraviolet pulse is consistent with a formation scenario in which the material ejected from a supernova explosion slams into a companion star, generating a shock wave that ignites the surrounding material. In other words, the data are in agreement with the single-degenerate model.

Back in 2010, Daniel Kasen, an associate professor of astronomy and physics at UC Berkeley and Lawrence Berkeley National Laboratory, used theoretical calculations and supercomputer simulations to predict just such a pulse from supernova-companion collisions. "After I made that prediction, a lot of people tried to look for that signature," Kasen says. "This is the first time that anyone has seen it. It opens up an entirely new way to study the origins of exploding stars."

According to Kulkarni, the discovery "provides direct evidence for the existence of a companion star in a Type Ia supernova, and demonstrates that at least some Type Ia supernovae originate from the single-degenerate channel."

Although the data from supernova iPTF14atg support a single-degenerate origin for that event, other Type Ia supernovae may result from double-degenerate systems. In fact, observations in 2011 of SN2011fe, another Type Ia supernova discovered in the nearby galaxy Messier 101 by PTF (the precursor to the iPTF), appeared to rule out the single-degenerate model for that particular supernova. That means both theories may actually be valid, says Caltech professor of theoretical astrophysics Sterl Phinney, who was not involved in the research. "The news is that it seems that both sets of theoretical models are right, and there are two very different kinds of Type Ia supernovae."

"Both rapid discovery of supernovae in their infancy by iPTF, and rapid follow-up by the Swift satellite, were essential to unveil the companion to this exploding white dwarf. Now we have to do this again and again to determine the fractions of Type Ia supernovae akin to different origin theories," says iPTF team member Mansi Kasliwal, who will join the Caltech astronomy faculty as an assistant professor in September 2015.

The iPTF project is a scientific collaboration between Caltech; Los Alamos National Laboratory; the University of Wisconsin-Milwaukee; the Oskar Klein Centre in Sweden; the Weizmann Institute of Science in Israel; the TANGO Program of the University System of Taiwan; and the Kavli Institute for the Physics and Mathematics of the Universe in Japan.

An image comparing the data showing the many galaxies and the X-ray emission from the hot gas (left) with the model of the hot gas (right). The "comet" shape of the X-ray data is well reproduced by the model.

The results of the work led by Tom Broadhurst, the Ikerbasque researcher of the UPV/EHU-University of the Basque Country, challenge the accepted view that dark matter is made up of heavy particles

Tom Broadhurst, the Ikerbasque researcher in the Department of Theoretical Physics of the UPV/EHU, together with Sandor Molnar of the National Taiwan University and visiting Ikerbasque researcher at the UPV/EHU in 2013, have conducted a simulation that explains the collision between two clusters of galaxies. Clusters of galaxies are the biggest objects that exist in the universe. They are collections of hundreds of thousands of galaxies pulled together by gravity.

In general, galaxy clusters grow by merging with each other to become increasingly larger. Gravitational forces cause them to slowly come together over time despite the expansion of the universe. The system known as "El Gordo", the biggest known cluster of galaxies, is itself the result of the collision between two large clusters. The collision process compresses the gas within each cluster to very high temperatures, so that it shines in the X-ray region of the spectrum. In the X-ray band this gas cloud is comet shaped, with two long tails stretching between the dense cores of the two clusters of galaxies. This distinctive configuration allowed the researchers to establish the relative speed of the collision, which is extreme (~2,200 km/s), putting it at the limit of what is allowed by current theory for dark matter.

These rare, extreme examples of clusters caught in the act of colliding seem to challenge the accepted view that dark matter is made up of heavy particles, since no such particles have actually been detected, despite the efforts to find them with the Large Hadron Collider (LHC) in Geneva and the Large Underground Xenon experiment (LUX), an underground dark matter detector in the United States. In Tom Broadhurst's opinion, "it's all the more important to find a new model that will enable the mysterious dark matter to be understood better". Broadhurst is one of the authors of a wave-dark-matter model published in Nature Physics last year.

This new piece of research has entailed interpreting the observed gas and dark matter of El Gordo "hydrodynamically", through the development of an in-house computational model that includes both the dark matter, which comprises most of the mass, and the hot gas, which can be observed in the X-ray region of the spectrum because of its extremely high temperature (100 million kelvin). Dr Broadhurst and Dr Molnar obtained a unique computational solution for this collision because of the comet-like shape of the hot gas and the locations and masses of the two dark matter cores, which have passed through each other at an oblique angle at a relative speed of about 2,200 km/s. This means that the total energy release is bigger than that of any other known phenomenon, with the exception of the Big Bang.

Tom Broadhurst has a PhD in Physics from the University of Durham (United Kingdom). Until being recruited by Ikerbasque, he had done research at top research centres in the United Kingdom, United States, Germany, Israel, Japan and Taiwan. He has had 184 papers published in leading scientific journals, and so far his work has received 11,800 citations from other scientists.

In 2010, he was recruited by Ikerbasque and since then has been carrying out his work at the UPV/EHU's department of Theoretical Physics. His lines of research focus on observational cosmology, dark matter and the formation of galaxies.

 Two-dimensional map of the sky that will identify the galaxies that will be the targets for our spectroscopic measurements once DESI is built. Source: http://legacysurvey.org/viewer

A next-generation experiment will create the largest 3D map of the universe and help Berkeley Lab scientists get a handle on dark energy.

For the past several years, scientists at the U.S. Department of Energy’s Lawrence Berkeley National Lab (Berkeley Lab) have been planning the construction of and developing technologies for a very special instrument that will create the most extensive three-dimensional map of the universe to date. Called DESI for Dark Energy Spectroscopic Instrument, this project will trace the growth history of the universe rather like the way you might track your child’s height with pencil marks climbing up a doorframe. But DESI will start from the present and work back into the past.

DESI will make a full 3D map pinpointing galaxies’ locations across the universe. The map, unprecedented in its size and scope, will allow scientists to test theories of dark energy, the mysterious force that appears to cause the accelerating expansion and stretching of the universe first discovered in observations of supernovae by groups led by Saul Perlmutter at Berkeley Lab and by Brian Schmidt, now at Australian National University, and Adam Riess, now at Johns Hopkins University.

Michael Levi and David Schlegel, physicists at Berkeley Lab, have been key players in DESI from the beginning. We sat down to discuss the future of the project and how this forthcoming map will help scientists better understand dark energy.

What does it mean to make a 3D map of the universe and how big will it be?

Michael Levi: To start, DESI will use 2D imaging surveys to pick tens of millions of galaxies to study. Then the DESI instrument will give us redshifts of about 25 million galaxies. The redshifts are what give you the depth information.

David Schlegel: The size of this survey will be huge. In the first five years of operation, we will have measured the distance to more galaxies than previously measured by all other telescopes in the world combined.

Exterior of Kitt Peak Observatory in Tucson, Arizona. Image: NOAO/AURA/NSF

How does the redshift give you the depth information?

ML: We’re collecting spectra, or wavelengths of light, from galaxies. From the spectra, we get a galaxy’s redshift, the ratio of the wavelength we see to the wavelength the light had when it left the galaxy. The light will stretch exactly the way the universe stretches. For some galaxies, we look for light that starts with a wavelength of 373 nanometers. If we then see it at 746 nanometers, we know the universe stretched by a factor of 2 while the light was making its way to us.
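
Levi's 373-nanometer example can be checked in a couple of lines (a sketch of the relation he describes; the wavelengths are the ones from the interview):

```python
# The cosmological stretch factor equals the ratio of observed to
# emitted wavelength: 1 + z = lambda_observed / lambda_emitted.
def stretch_factor(lam_emit_nm, lam_obs_nm):
    return lam_obs_nm / lam_emit_nm

def redshift(lam_emit_nm, lam_obs_nm):
    return stretch_factor(lam_emit_nm, lam_obs_nm) - 1.0

# Light emitted at 373 nm and observed at 746 nm:
print(stretch_factor(373.0, 746.0))  # universe has stretched by a factor of 2
print(redshift(373.0, 746.0))        # i.e. redshift z = 1
```

Measuring that one ratio for about 25 million galaxies is what turns DESI's 2D target catalog into a 3D map.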

What kind of telescope collects this kind of spectra?

DS: We’re actually retrofitting a 4-meter telescope at Kitt Peak in Arizona called the Mayall telescope. We’re adding enormous lenses that give a huge field of view. A set of 5000 optical fibers, each like the ones used for long-distance Internet connections, are used to pick off light from up to 5000 galaxies at a time.

The fibers direct light to the 30 cameras and spectrographs we have connected to the whole rig. We’ll observe these galaxies for about 20 minutes, then we will point the telescope in a new direction and 5000 little robots will rearrange these fibers to look at a new collection of up to 5000 galaxies, one fiber per galaxy. In other words, the positions of the optical fibers mimic the positions of the galaxies so that each fiber collects light from one galaxy. Point the telescope in a different direction and the fibers need to take on a new configuration.

ML: The telescope was built like a battleship in the 1970s, and one of the advantages of this venerable telescope is that it can support the weight of these instruments, which weigh as much as a school bus. This heavy equipment would overwhelm a modern telescope.

Mayall Telescope at Kitt Peak. Image: NOAO/AURA/NSF

How much of the sky will DESI be able to see?

ML: You can’t see through the Milky Way—its dust cuts out about a third of the sky. So it works out that from one spot on Earth, you can see about one third of the sky.

DS: If we had another telescope in the southern hemisphere, we could get another third of the sky. But as it is, we’re expecting DESI to see 20 to 30 million galaxies, looking out over a distance of about 10 billion light years. The universe is 13.7 billion years old, and so at a distance of 10 billion light years, we’re looking back in time some 70 percent of the age of the universe. The map from DESI will be both much larger than any previous map and extend much further into the past.
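
The "some 70 percent" figure is just the ratio of light-travel time to the age of the universe (a back-of-envelope sketch, treating the quoted 10 billion light-years as a light-travel distance):

```python
# Light from 10 billion light-years away left its galaxy 10 billion
# years ago, so we see that galaxy as it was at a lookback time that
# is a large fraction of the universe's 13.7-billion-year age.
age_gyr = 13.7       # age of the universe, in billions of years
lookback_gyr = 10.0  # light-travel time, in billions of years

fraction = lookback_gyr / age_gyr
print(f"Looking back {fraction:.0%} of the age of the universe")
```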

Aside from virtual tours of our universe, what scientific purpose does a map like this have? How do we use it to understand dark energy?

ML: What we know is that dark energy, which was discovered in 1998, seems to be responsible for the present accelerating expansion of the universe. But we still don’t know what it is, and we also haven’t made very good measurements of it.

A project called BOSS started to scratch the surface of this question using spectroscopic techniques similar to DESI’s. Other imaging surveys like the Dark Energy Survey and the Large Synoptic Survey Telescope measure dark energy in a different way, by observing how matter lying between us and distant galaxies distorts their light. All of these projects are working together to better understand what dark energy actually is.

Once we have our very large and very precise map from DESI, we can measure the distances between pairs of galaxies. These distances enable us to track the expansion of the universe.

More than that, the 3D clustering of the galaxies lets us calculate the gravitational field between them, testing how cosmic structure grew over time. It can even check whether Einstein’s theory of gravity works in both the early and late universe.

Has the universe always been expanding like it is today?

DS: In the early universe, dark energy didn’t dominate like it does now and we believe that the expansion was slowing down then. But we only have one data point from that time. All the other data we have is when the universe’s expansion was speeding up, thanks to dark energy.

ML: Maybe the expansion speed comes and goes, but our data are not sufficient to tell us.

Where does DESI come in?

DS: DESI helps us understand the accelerating universe through the structure of this 3D map because the features on this map—the positions and clusters of galaxies—could vary significantly as we look deeper into the universe.

For instance, the map close to us is stretched out a lot more than it should be because of this acceleration due to dark energy. And then in the early universe, when there’s not as much dark energy, it shouldn’t be as stretched out, but this is where we don’t have much data yet. Depending on when dark energy was pushing the universe apart, it will push apart different parts of the map. This data will help us eliminate a number of theories about the way dark energy works.

What’s Berkeley Lab’s role in all this?

DS: We’re running the construction project. We’ve also raised the money for the large corrector optics, which are in the process of being made. We’re designing and manufacturing two-thirds of the CCD cameras used in the spectrographs. And we’re designing and building miniaturized electronics and actuators for the robotic elements that automate the positions of the fiber optics. Also, our supercomputing center, NERSC, is handling the hundreds of terabytes of data DESI will produce.

How many researchers and institutions are involved in DESI and who’s funding it?

ML: The U.S. Department of Energy is expected to provide the majority of the project construction funds with additional contributions from the UK, France, Switzerland, and Spain. We have 180 collaborators from 25 institutions. DESI’s very fortunate to have donations from the Heising-Simons Foundation and the Gordon and Betty Moore Foundation, and support from the Science and Technology Facilities Council in the UK. We’re also supported by the National Optical Astronomy Observatory.

What are your next steps?

DS: We’re releasing the first data from the initial 2D map of the sky that will identify the galaxies that will be the targets for our spectroscopic measurements once DESI is built. This preliminary survey contains a huge amount of data, and we’re still just getting warmed up.

In March, we passed a “CD-1 review,” an early stage “critical decision” process by the DOE’s Office of Project Assessment that tracks the progress on projects. We’ll start installation at Kitt Peak in 2017, and we’re aiming for first light by the end of 2018.

Illustration of DESI. 1. 5000 fibers in robotic actuators. 2. Optics that provide a 3.2-degree field of view. 3. 10 fiber cable bundles.

Astronomers and astroparticle physicists are celebrating a €15 million EU funding boost for European telescopes with the launch of the ASTERICS project (Astronomy ESFRI and Research Infrastructure Cluster), which will help solve the Big Data challenges of European astronomy and give members of the public direct interactive access to some of the best of Europe’s astronomy images and data.

Astronomy is experiencing a surge of data from its current generation of observatories, with a size and complexity not seen before. The surge will become a deluge with the next generation of telescopes prioritised in the European Strategy Forum on Research Infrastructures (ESFRI), and with other world-class projects.  This €15 million funding boost will help Europe’s world-leading observatories work together to find common solutions to their Big Data challenges, their interoperability and scheduling, and their data access. ASTERICS will also open up these facilities to the full international community, from professionals to the public, through the International Virtual Observatory Alliance and by funding citizen science mass participation experiments for the current and next generation of world-leading European observatories.

The project is led by the Netherlands institute for radio astronomy ASTRON, with a consortium of 22 European partner institutions, including The Open University, which is leading the citizen science and public engagement work package.

Explaining the ASTERICS acronym, Professor Mike Garrett, Principal Investigator of the ASTERICS project, said “ASTERICS stands for Astronomy ESFRI and Research Infrastructure Cluster. For the first time, it brings together the astronomy, astrophysics and particle astrophysics communities to find imaginative new solutions to our common data avalanche problems by working together and directly engaging with industry and specialised businesses.”

Dr. Stephen Serjeant, Head of Astronomy at The Open University and leading the ASTERICS citizen science and public engagement, said “ASTERICS will open up some of the very best of European astronomy to everybody. Crowdsourcing can solve major scientific problems, because the human eye is often much better than computers at recognising and classifying patterns. We are especially pleased to be working with the leading citizen science platform, the Zooniverse.”

Prof. Chris Lintott, PI of the Zooniverse at Oxford University, said “It’s very clear that even the largest and most sophisticated of surveys can benefit from the involvement of citizen scientists; I’m sure the million-strong army of Zooniverse volunteers is looking forward to being able to help!”

The facilities supported by the ASTERICS programme include:

•        The Square Kilometre Array (SKA), a radio telescope currently being built at two locations in Australia and South Africa, as well as precursor/pathfinder experiments;

•        The Cherenkov Telescope Array (CTA), the first high-energy gamma-ray world-wide observatory, comprising two large arrays of Cherenkov telescopes in the two hemispheres;

•        KM3NeT, a telescope at the bottom of the Mediterranean Sea aiming to detect ghostly neutrino particles from space;

•        The European Extremely Large Telescope (E-ELT), an optical and infrared telescope currently being built in Chile, as well as precursor optical and infrared telescopes.

Other facilities benefitting from ASTERICS support include forthcoming experiments such as the Einstein gravitational-wave Telescope (ET, coordinated by the European Gravitational Observatory-EGO), the Euclid Space Telescope and the Large Synoptic Survey Telescope (LSST), and current facilities such as the Low Frequency Array (LOFAR), the High Energy Stereoscopic System (H.E.S.S.), Major Atmospheric Gamma Imaging Cherenkov (MAGIC), the gravitational-wave detector Advanced Virgo and the European Very Long Baseline Interferometry Network (EVN). The funding was made through the European Union’s Horizon 2020 Framework Programme, which is the biggest EU Research and Innovation programme ever, with nearly €80 billion of funding over 7 years (2014 to 2020).

The inner solar system’s biggest known collision was the moon-forming giant impact between a large protoplanet and the proto-Earth. Kilometer-size fragments from this impact hit main belt asteroids at much higher velocities than typical main belt collisions, heating the surface and leaving behind a permanent record of the impact event. (Illustration: NASA/JPL-Caltech)

Through a combination of data analysis and numerical modeling work, researchers have found a record of the ancient moon-forming giant impact observable in stony meteorites 

Their work will appear in the April 2015 issue of the journal Science. The work was done by NASA Solar System Exploration Research Virtual Institute (SSERVI) researchers led by Principal Investigator Bill Bottke of the Institute for the Science of Exploration Targets (ISET) team at the Southwest Research Institute, and included Tim Swindle, director of the University of Arizona's Lunar and Planetary Laboratory.

The inner Solar System's biggest known collision was the Moon-forming giant impact between a large protoplanet and the proto-Earth. The timing of this giant impact, however, is uncertain, with the ages of the most ancient lunar samples returned by the Apollo astronauts still being debated. Numerical simulations of the giant impact indicate this event not only created a disk of debris near Earth that formed the Moon, but it also ejected huge amounts of debris completely out of the Earth-Moon system. The fate of this material, comprising as much as several percent of an Earth mass, had not been closely examined until recently. However, it is likely some of it blasted main belt asteroids, with a record plausibly left behind in their near-surface rocks. Collisions on these asteroids in more recent times delivered these shocked remnants to Earth, and scientists have now used them to date the Moon's formation.

The research indicates numerous kilometer-sized fragments from the giant impact struck main belt asteroids at much higher velocities than typical main belt collisions, heating the surface and leaving behind a permanent record of the impact event. Evidence that the giant impact produced a large number of kilometer-sized fragments can be inferred from laboratory and numerical impact experiments, the ancient lunar impact record itself, and the numbers and sizes of fragments produced by major main belt asteroid collisions.

Once the team concluded that pieces of the Moon-forming impact hit main belt asteroids and left a record of shock heating events in some meteorites, they set out to deduce both the timing and the relative magnitude of the bombardment. By modeling the evolution of giant impact debris over time and fitting the results to ancient impact heat signatures in stony meteorites, the team was able to infer the Moon formed about 4.47 billion years ago, in agreement with many previous estimates. The most ancient Solar System materials found in meteorites are about one hundred million years older than this age.

Insights into the last stages of planet formation in the inner solar system can be gleaned from these impact signatures. For example, the team is exploring how they can be used to place new constraints on how many asteroid-like bodies still existed in the inner Solar System in the aftermath of planet formation. They can also help researchers deduce the earliest bombardment history of ancient bodies like Vesta, one of the targets of NASA's Dawn mission and a main belt asteroid whose fragments were delivered to Earth in the form of meteorites. It is even possible that tiny remnants of the Moon-forming impactor or proto-Earth might still be found within meteorites that show signs of shock heating by giant impact debris. This would allow scientists to explore for the first time the unknown primordial nature of our homeworld. 

Co-author Swindle, who specializes in finding the times when meteorites or lunar samples were involved in large collisions, said: "Bill Bottke had the idea of looking at the asteroid belt to see what effect a Moon-forming giant impact would have, and realized that you would expect a lot of collisions in the period shortly after that.

"Here at LPL, we had been determining ages of impact events that affected meteorites, and when we got together, we found that our data matched his predictions," he added. "It's a great example of taking advantage of groups that work in two different specialties - orbital dynamics and chronology - and combining their expertise."

Intriguingly, some debris may have also returned to hit the Earth and Moon after remaining in solar orbit over timescales ranging from tens of thousands of years to 400 million years. 

"The importance of giant impact ejecta returning to strike the Moon could also play an intriguing role in the earliest phase of lunar bombardment," said Bottke, who is an alumnus of the University of Arizona's Lunar and Planetary Laboratory. "This research is helping to refine our time scales for 'what happened when' on other worlds in the Solar System."

Yvonne Pendleton, Director of the NASA SSERVI Institute, notes: "This is an excellent example of the power of multidisciplinary science. By linking studies of the Moon, of main belt asteroids, and of meteorites that fall to Earth, we gain a better understanding of the earliest history of our Solar System."
