Three-dimensional simulation of a Type Ia supernova explosion (image: F. K. Röpke, MPI for Astrophysics, Garching).

Supernova researcher Friedrich Röpke leads the new research group "Physics of Stellar Objects" at HITS and is a professor at Heidelberg University. He uses supercomputer simulations to study the high-energy processes at work when stars die.

Modern astronomy began with a supernova. In November 1572, the Danish astronomer Tycho Brahe discovered a new star, and with it demolished the idea of an immutable sky of fixed stars. Today, we know that Brahe was observing the death of a star, which ended in a massive explosion. Friedrich Röpke aims to find out how these supernova explosions proceed. The astrophysicist now leads the new research group "Physics of Stellar Objects" (PSO) at the Heidelberg Institute for Theoretical Studies (HITS). As of March 1, 2015, he has also been appointed Professor of Theoretical Astrophysics at Heidelberg University; his workplace remains HITS. The joint appointment underscores the close cooperation between the two institutions. With Friedrich Röpke and Volker Springel, there are now two HITS astrophysicists who also hold professorships at Heidelberg University.

"The new group is another important component of our concept," says Klaus Tschira, who founded HITS in 2010 as a non-profit research institute. "Research on stellar astrophysics, such as Friedrich Röpke's, is a perfect complement to the work of Volker Springel's group on large-scale processes such as galaxy formation."

Friedrich Röpke (40) studied physics at the University of Jena and at the University of Virginia in Charlottesville, USA, and received his PhD in 2003 from the Technische Universität München. In the following years, he worked as a postdoc at the Max Planck Institute for Astrophysics (MPA) in Garching and at the University of California, Santa Cruz, USA. In 2008, Friedrich Röpke completed his habilitation at the TU München and became leader of an Emmy Noether research group at the MPA. Three years later, he was appointed Professor of Astrophysics at the University of Würzburg. In 2010, the researcher received the "ARCHES Award" from the German Federal Ministry of Education and Research, together with Prof. Avishay Gal-Yam from the Weizmann Institute in Rehovot, Israel. The award honors young scientists whose work shows great potential to have a noticeable impact on their respective fields of research.

Friedrich Röpke studies Type Ia supernovae. Observations of these cosmic explosions allow astronomers to determine distances in space. In 2011, the Nobel Prize in Physics was awarded to researchers who used supernovae to demonstrate the accelerated expansion of the Universe. The PSO group collaborates closely with one of the 2011 laureates, Brian Schmidt (Australian National University, Canberra), in a program supported by the German Academic Exchange Service (DAAD). Friedrich Röpke's research aims to understand exactly what happens when stars die. Together with other scientists, he used computer simulations to show that some highly luminous supernovae result from the merger of two compact stars, so-called white dwarfs. He also investigates alternatives by modeling the explosion of a white dwarf when it reaches its maximum stable mass (the Chandrasekhar limit), using highly complex simulations on supercomputers. White dwarfs are only about the size of the Earth and are extremely dense; when one explodes as a supernova, it can outshine an entire galaxy. "Our detailed simulations helped us to predict data that closely reproduce actual telescope observations of Type Ia supernovae," explains the astrophysicist.
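To give a feel for the Chandrasekhar limit mentioned above, the back-of-the-envelope sketch below (in Python) evaluates the standard polytropic formula for the maximum white dwarf mass. The Lane-Emden constant ω ≈ 2.018 and the choice μ_e = 2 for a carbon-oxygen composition are textbook values used purely for illustration; none of this is taken from the PSO group's simulation codes.

```python
# Back-of-the-envelope Chandrasekhar mass for a carbon-oxygen white dwarf.
# Textbook estimate for illustration only; not the PSO group's code.
import math

hbar  = 1.054571817e-34   # J s
c     = 2.99792458e8      # m/s
G     = 6.67430e-11       # m^3 kg^-1 s^-2
m_u   = 1.66053907e-27    # kg, atomic mass unit
M_sun = 1.989e30          # kg

mu_e  = 2.0               # mean molecular weight per electron (C/O composition, assumed)
omega = 2.01824           # Lane-Emden constant for a polytrope of index n = 3

M_ch = (omega / 2.0) * math.sqrt(3.0 * math.pi) \
       * (hbar * c / G) ** 1.5 / (mu_e * m_u) ** 2

# Prints a value close to the familiar limit of about 1.4 solar masses.
print(f"Chandrasekhar mass ~ {M_ch / M_sun:.2f} solar masses")
```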

"Modelling supernova explosions is, however, just one part of our research at HITS," says Friedrich Röpke. "We also strive for a better understanding of how stars evolve and how the elements that make up our world are formed within them." Classical astrophysics follows stellar evolution using highly simplified assumptions. "To improve the predictive power of the models, we have to describe the physical processes taking place within stars in a dynamic way," says the astrophysicist. He and his group have developed a new supercomputer code that, combined with the rapidly increasing capacities of supercomputers, opens new perspectives for the modelling of stars.

In contrast to what we are used to from our solar system, most stars in the Universe are part of multiple-star systems. The interaction between those stars greatly affects their evolution, but the physical processes involved are still poorly understood. The two astrophysics groups at HITS are cooperating on new supercomputer simulations to shed light on these processes.

Caption: The yellow shape in this figure is the heliopause, the boundary between the heliosphere and the local interstellar medium. The sun sits at the center of this large bubble but is too small to be seen here. The gray lines are the solar magnetic field lines, and the red lines are the interstellar magnetic field. Credit: M. Opher

The magnetic field of the sun plays a crucial role in shaping the heliosphere -- the domain of the sun -- by accelerating the solar wind into a pair of jets

As the sun skims through the galaxy, it flings out charged particles in a stream of plasma called the solar wind, and the solar wind creates a bubble extending far outside the solar system known as the heliosphere. For decades, scientists have visualized the heliosphere as shaped like a comet, with a very long tail extending thousands of times as far as the distance from the Earth to the sun.

New research suggests that the sun's magnetic field controls the large-scale shape of the heliosphere "much more than had been previously thought," says Merav Opher, associate professor of astronomy and director of the Center for Space Physics at Boston University (BU). In the new model, the magnetic field squeezes the solar wind along the sun's north-south axis, producing two jets that are then dragged downstream by the flow of the interstellar medium through which the heliosphere moves.

The model indicates that the heliospheric tail does not extend to large distances but is instead split in two by the jets, and that their structure is similar to that of the astrophysical jets observed around many other stars and around black holes.

"Most researchers don't believe in the importance of the solar magnetic field, because the magnetic pressure on the solar wind's particles is far lower than the thermal pressure of the particles," says Opher, lead author on a paper appearing today in Astrophysical Journal Letters. However, the model shows that tension of the magnetic field controls what happens to the solar wind in the tail. 

Picture a tube of toothpaste with rubber bands wrapped around it, suggests co-author James Drake, professor of physics and director of the Joint Space-Science Institute at the University of Maryland. In this case, the toothpaste is the jet's plasma, and the rubber bands are the rings of the solar magnetic field. "Magnetic fields have tension just like rubber bands, and these rings squeeze in," he says. "So imagine you wrap your toothpaste tube very tightly with a lot of rubber bands, and they will squeeze the toothpaste out the end of your tube."
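To make the pressure-versus-tension argument concrete, here is a minimal Python sketch comparing thermal pressure, magnetic pressure, and the magnetic tension force for a parcel of plasma. All of the plasma parameters plugged in are rough, assumed order-of-magnitude placeholders chosen for illustration; they are not values from Opher and Drake's model.

```python
# Order-of-magnitude comparison of thermal pressure, magnetic pressure, and
# magnetic tension in a plasma. All inputs are illustrative assumptions,
# not results from the heliosphere model discussed above.
import math

mu0 = 4.0e-7 * math.pi      # vacuum permeability, T m / A
k_B = 1.380649e-23          # Boltzmann constant, J / K

# Assumed, heliosheath-like plasma parameters (rough placeholders):
n   = 2.0e3                 # number density, m^-3 (~0.002 per cm^3)
T   = 1.0e6                 # temperature, K
B   = 0.15e-9               # magnetic field strength, T (~0.15 nT)
R_c = 1.5e13                # radius of curvature of a field line, m (assumed, ~100 AU)

P_thermal  = n * k_B * T            # thermal pressure, Pa
P_magnetic = B**2 / (2.0 * mu0)     # magnetic pressure, Pa
tension    = B**2 / (mu0 * R_c)     # magnetic tension force per unit volume, N / m^3

print(f"thermal pressure : {P_thermal:.2e} Pa")
print(f"magnetic pressure: {P_magnetic:.2e} Pa  (plasma beta = {P_thermal/P_magnetic:.1f})")
print(f"tension force    : {tension:.2e} N/m^3")
```

With these assumed numbers the magnetic pressure is indeed the smaller of the two, yet the tension term acts like the rubber bands in Drake's analogy, squeezing plasma along the curved field lines rather than pushing on it isotropically.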

"Jets are really important in astrophysics," Drake adds. "And from what we can tell, the mechanism that's driving these heliospheric jets is basically the same as it is in, for example, the Crab Nebula. Yet this is really close by. If we're right about all of this, it gives us a local test bed for exploring some very important physics."

"It's also exciting that these jets are very turbulent, and will be very good particle accelerators," says Opher. The jets might, for example, play a role in the acceleration of so-called anomalous cosmic rays "We don't know where these particles are accelerated; it's a bit of a puzzle," she says.

Solving such puzzles will be important for space travel. The heliosphere acts as "a cocoon to protect us, by filtering galactic cosmic rays," she says. "Understanding the physical phenomena that govern the shape of the heliosphere will help us understand the filter." 

The new view of the heliosphere was discovered by accident as the team studied surprising data from the Voyager 1 spacecraft and tried to understand how the galaxy's magnetic field interacts with the heliosphere. 

One of two identical twin spacecraft launched in 1977, Voyager 1 in 2012 became the first man-made object to exit the heliosphere and plunge into interstellar space, according to the National Aeronautics and Space Administration (NASA). 

As the spacecraft approached and then crossed this boundary, "Voyager had very bizarre observations," remarks Opher. It did not register the anticipated major change in the direction of the magnetic field as it made the crossing.

Struggling to explain these unexpected results, the team initially focused on the nose of the heliosphere rather than its tail. "The Voyagers had a flashlight in the kitchen, and nobody was looking in the attic," she remarks. "We noticed, while studying the draping of the galaxy's magnetic field around the nose, that the heliosphere was much shorter than we anticipated." When she ran a much larger numerical simulation that continued to follow the flow of the solar wind, she unveiled the unforeseen two-tailed shape.

More data on the heliosphere's boundaries will become available sometime in the next few years when Voyager 2, like its twin, crosses into interstellar space. 

In the meantime, additional evidence about the shape of the heliosphere is available from two spacecraft that measure so-called energetic neutral atoms (ENAs), particles created by interactions between the solar wind and neutral atoms from the interstellar medium, whose presence gives an indication of the heliosphere's border.

Opher says that results from the Interstellar Boundary Explorer (IBEX) project can be interpreted to offer support for the two-lobed tail model, although she notes that IBEX scientists offer a different interpretation. Data from the Cassini spacecraft's ENA measurements also may suggest an almost "tailless" heliosphere, she adds.

DSCOVR will deliver better information for better space weather forecasts

NOAA's Deep Space Climate Observatory (DSCOVR) lifted off from Cape Canaveral, Florida, at 6:03 p.m. EST on its way to an orbit one million miles from Earth. DSCOVR will give forecasters at NOAA's Space Weather Prediction Center (SWPC) more reliable measurements of solar wind conditions, improving their ability to monitor potentially harmful solar activity.

When it reaches its final destination about 110 days from now, and after it completes a series of initialization checks, DSCOVR will be the nation's first operational satellite in deep space, orbiting between Earth and the Sun at the first Lagrange point, or L1. It will take its place at L1 alongside NASA's Advanced Composition Explorer (ACE) research satellite, replacing the 17-year-old ACE as America's primary warning system for solar magnetic storms headed toward Earth. Meanwhile, ACE will continue its important role in space weather research.

Data from DSCOVR, coupled with a new forecast model that is set to come online later this year, will enable NOAA forecasters to predict geomagnetic storm magnitude on a regional basis. Geomagnetic storms occur when plasma and magnetic fields streaming from the sun impact Earth’s magnetic field. Large magnetic eruptions from the sun have the potential to bring major disruptions to power grids, aviation, telecommunications, and GPS systems.

“Located in line between the sun and the Earth, DSCOVR will be a point of early warning whenever it detects a surge of energy that could trigger a geomagnetic storm destined for Earth,” said Stephen Volz, Ph.D., assistant administrator for NOAA’s Satellite and Information Service. “According to the National Academies of Sciences, a major solar storm has the potential to cost upwards of $2 trillion, disrupting telecommunications, GPS systems, and the energy grid.  As the nation’s space weather prediction agency, when DSCOVR is fully operational and our new space weather forecast models are in place, we will be able to provide vital information to industries and communities to help them prepare for these storms.” 
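As a rough illustration of the early warning a monitor at L1 can provide, the sketch below converts DSCOVR's distance from Earth (about one million miles, as stated above) into a lead time for a few solar wind speeds. The speeds are generic, assumed illustrative values, not NOAA forecast parameters.

```python
# Rough warning lead time from a solar wind monitor at L1, about one million
# miles upstream of Earth. The wind speeds below are assumed illustrative values.

distance_km = 1.0e6 * 1.609  # ~1 million miles, converted to kilometres

for speed_km_s in (400.0, 800.0, 2000.0):   # slow wind, fast wind, fast CME (assumed)
    lead_time_min = distance_km / speed_km_s / 60.0
    print(f"solar wind at {speed_km_s:5.0f} km/s -> ~{lead_time_min:.0f} minutes of warning")
```

For typical wind speeds this works out to a few tens of minutes of lead time, shrinking for the fastest eruptions.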

In addition to space weather–monitoring instruments, DSCOVR is carrying two NASA Earth-observing instruments that will gather a range of measurements, from ozone and aerosol amounts to changes in Earth's radiation.

The DSCOVR mission is a partnership between NOAA, NASA, and the U.S. Air Force. NOAA will operate DSCOVR from its NOAA Satellite Operations Facility in Suitland, Maryland, and process the space weather data at SWPC in Boulder, Colorado, one of NOAA’s nine National Centers for Environmental Prediction.  SWPC will then distribute these space weather data to users within the United States and around the world. The data will be archived at NOAA’s National Geophysical Data Center also in Boulder.

NASA received funding from NOAA to refurbish the DSCOVR spacecraft and its solar wind instruments, develop the command and control portion of the ground segment, and manage launch and activation of the satellite. The Air Force funds and oversees the Falcon 9 launch services for DSCOVR. This is the first Falcon 9 mission launched for the Department of Defense. DSCOVR will also host NASA-funded secondary sensors for Earth and Space science observations. The Earth science data will be processed at NASA’s DSCOVR Science Operations Center and archived and distributed by NASA’s Atmospheric Science Data Center.

A moderately realistic accretion disk, created by Double Negative artists and gravitationally lensed by a black hole. Credit: © Classical and Quantum Gravity, 2015.

The team responsible for the Oscar-nominated visual effects at the centre of Christopher Nolan's epic, Interstellar, have turned science fiction into science fact by providing new insights into the powerful effects of black holes.

In a paper published today, 13 February, in IOP Publishing's journal Classical and Quantum Gravity, the team describe the innovative supercomputer code that was used to generate the movie's iconic images of the wormhole, black hole and various celestial objects, and explain how the code has led them to new science discoveries.

Using their code, the Interstellar team, comprising London-based visual effects company Double Negative and Caltech theoretical physicist Kip Thorne, found that when a camera is close up to a rapidly spinning black hole, peculiar surfaces in space, known as caustics, create more than a dozen images of individual stars and of the thin, bright plane of the galaxy in which the black hole lives. They found that the images are concentrated along one edge of the black hole's shadow.

These multiple images are caused by the black hole dragging space into a whirling motion and stretching the caustics around itself many times. It is the first time that the effects of caustics have been computed for a camera near a black hole, and the resulting images give some idea of what a person would see if they were orbiting around a hole.

The discoveries were made possible by the team's supercomputer code, which, as the paper describes, mapped the paths of millions of light beams and their evolving cross-sections as they passed through the black hole's warped spacetime. The code was used to create images of the movie's wormhole and of the black hole, Gargantua, and its glowing accretion disk, with unparalleled smoothness and clarity.

It showed portions of the accretion disk swinging up over the top and down under Gargantua's shadow, and also in front of the shadow's equator, producing an image of a split shadow that has become iconic for the movie.

This weird distortion of the glowing disk was caused by gravitational lensing--a process by which light beams from different parts of the disk, or from distant stars, are bent and distorted by the black hole, before they arrive at the movie's simulated camera.

This lensing happens because the black hole creates an extremely strong gravitational field, literally bending the fabric of spacetime around itself, like a bowling ball lying on a stretched-out bed sheet.

Early in their work on the movie, with the black hole encircled within a rich field of distant stars and nebulae instead of an accretion disk, the team found that the standard approach of using just one light ray for one pixel in a supercomputer code--in this instance, for an IMAX picture, a total of 23 million pixels--resulted in flickering as the stars and nebulae moved across the screen.

Co-author of the study and chief scientist at Double Negative, Oliver James, said: "To get rid of the flickering and produce realistically smooth pictures for the movie, we changed our code in a manner that has never been done before. Instead of tracing the paths of individual light rays using Einstein's equations--one per pixel--we traced the distorted paths and shapes of light beams."
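As a much simplified illustration of what tracing the path of an individual light ray through curved spacetime involves, the Python sketch below integrates the photon orbit equation for a non-spinning (Schwarzschild) black hole and measures how much a single ray is bent. It is a toy example only: DNGR itself traces the shapes of whole light beams through the spacetime of a spinning (Kerr) black hole, which is far more involved, and the mass and impact parameter below are arbitrary assumed values.

```python
# Toy ray tracer: bend one light ray around a non-spinning (Schwarzschild)
# black hole by integrating the photon orbit equation
#     d^2u/dphi^2 = 3*M*u^2 - u,   with u = 1/r   (units G = c = 1).
# Illustration only -- this is not the DNGR algorithm described in the paper.
import math

M = 1.0       # black hole mass (sets the length scale)
b = 20.0      # impact parameter of the incoming ray, in units of M (assumed)

def rhs(u, du):
    """Photon orbit ODE written as a first-order system."""
    return du, 3.0 * M * u * u - u

def rk4_step(u, du, h):
    """One classical fourth-order Runge-Kutta step of size h in phi."""
    k1u, k1d = rhs(u, du)
    k2u, k2d = rhs(u + 0.5 * h * k1u, du + 0.5 * h * k1d)
    k3u, k3d = rhs(u + 0.5 * h * k2u, du + 0.5 * h * k2d)
    k4u, k4d = rhs(u + h * k3u, du + h * k3d)
    return (u + h * (k1u + 2 * k2u + 2 * k3u + k4u) / 6.0,
            du + h * (k1d + 2 * k2d + 2 * k3d + k4d) / 6.0)

# The ray starts at infinity (u = 0) heading inward with du/dphi = 1/b.
u, du, phi, h = 0.0, 1.0 / b, 0.0, 1.0e-4
while True:
    u, du = rk4_step(u, du, h)
    phi += h
    if phi > 1.0 and u <= 0.0:      # the ray has swung past the hole and escaped
        break

deflection = phi - math.pi          # a straight, unbent ray would sweep exactly pi
print(f"numerical deflection angle: {deflection:.4f} rad")
print(f"weak-field estimate 4M/b  : {4.0 * M / b:.4f} rad")
```

Repeating such an integration once per pixel is the "one light ray per pixel" approach the article describes; DNGR's innovation was to follow bundles of rays and their cross-sections instead.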

Co-author of the study Kip Thorne said: "This new approach to making images will be of great value to astrophysicists like me. We, too, need smooth images."

Oliver James continued: "Once our code, called DNGR for Double Negative Gravitational Renderer, was mature and creating the images you see in the movie Interstellar, we realised we had a tool that could easily be adapted for scientific research."

In their paper, the team report how they used DNGR to carry out a number of research simulations exploring the influence of caustics--peculiar, creased surfaces in space--on the images of distant star fields as seen by a camera near a fast spinning black hole.

"A light beam emitted from any point on a caustic surface gets focussed by the black hole into a bright cusp of light at a given point," James continued. "All of the caustics, except one, wrap around the sky many times when the camera is close to the black hole. This sky-wrapping is caused by the black hole's spin, dragging space into a whirling motion around itself like the air in a whirling tornado, and stretching the caustics around the black hole many times."

As each caustic passes by a star, it either creates two new images of the star as seen by the camera, or annihilates two old images of the star. As the camera orbits around the black hole, film clips from the DNGR simulations showed that the caustics were constantly creating and annihilating a huge number of stellar images.

The team identified as many as 13 simultaneous images of the same star, and as many as 13 images of the thin, bright plane of the galaxy in which the black hole lives.

These multiple images were only seen when the black hole was spinning rapidly and only near the side of the black hole where the hole's whirling space was moving toward the camera, which they deduced was because the space whirl was 'flinging' the images outward from the hole's shadow edge. On the shadow's opposite side, where space is whirling away from the camera, the team deduced that there were also multiple images of each star, but that the whirl of space compressed them inward, so close to the black hole's shadow that they could not be seen in the simulations. 

This is a graphical representation of the question of how fine-tuned life on Earth is under variations of the average light quark mass and α_{EM}. Image courtesy of Dean Lee.

For nearly half a century, theoretical physicists have made a series of discoveries suggesting that certain constants in fundamental physics are extraordinarily fine-tuned to allow for the emergence of a life-enabling universe. Constants that crisscross the Standard Model of particle physics guided the formation of hydrogen nuclei during the Big Bang, along with the carbon and oxygen atoms first fused at the centers of massive first-generation stars that exploded as supernovae; these processes in turn set the stage for solar systems and planets capable of supporting carbon-based life dependent on water and oxygen.

The theory that an Anthropic Principle guided the physics and evolution of the universe was initially proposed by Brandon Carter while he was a post-doctoral researcher in astrophysics at the University of Cambridge; this theory was later debated by Cambridge scholar Stephen Hawking and a widening web of physicists around the world.

German scholar Ulf-G Meißner, chair in theoretical nuclear physics at the Helmholtz Institute, University of Bonn, adds to a series of discoveries that support this Anthropic Principle.

In a new study titled "Anthropic considerations in nuclear physics" and published in the Beijing-based journal Science Bulletin (previously titled Chinese Science Bulletin), Professor Meißner provides an overview of the Anthropic Principle (AP) in astrophysics and particle physics and states: "One can indeed perform physics tests of this rather abstract [AP] statement for specific processes like element generation."

"This can be done with the help of high performance [super]computers that allow us to simulate worlds in which the fundamental parameters underlying nuclear physics take values different from the ones in Nature," he explains.

"Specific physics problems we want to address, namely how sensitive the generation of the light elements in the Big Bang is to changes in the light quark mass m_q and also, how robust the resonance condition in the triple alpha process, i.e. the closeness of the so-called Hoyle state to the energy of 4He+8Be, is under variations in m_q and the electromagnetic fine structure constant α_{EM}," he adds.

Brandon Carter initially posited the theory: "The universe (and hence the fundamental parameters on which it depends) must be such as to admit the creation of observers within it at some stage."

Stephen Hawking, expert on the Big Bang and cosmic inflation, extended the dialogue on the Anthropic Principle in a series of papers and books. In "A Brief History of Time," he outlines an array of astrophysics phenomena and constants that seem to support the AP theory, and asks: "Why did the universe start out with so nearly the critical rate of expansion that separates models that recollapse from those that go on expanding forever, that even now, ten thousand million years later, it is still expanding at nearly the critical rate?"

"If the rate of expansion one second after the Big Bang had been smaller by even one part in a hundred thousand million million," he explains, "the universe would have recollapsed before it ever reached its present size."

Professor Ulf-G Meißner, explaining his groundbreaking new study, states: "The Universe we live in is characterized by certain parameters that take specific values that appear to be remarkably fine-tuned to make life, including on Earth, possible."

"For example, the age of the Universe must be large enough to allow for the formation of galaxies, stars and planets, and for second- and third-generation stars that incorporated the carbon and oxygen propagated by earlier exploding stars," he says.

"On more microscopic scales, he adds, "certain fundamental parameters of the Standard Model of light quark masses or the electromagnetic fine structure constant must take values that allow for the formation of neutrons, protons and atomic nuclei."

And while Big Bang nucleosynthesis gave rise to hydrogen nuclei and alpha particles (4He nuclei), elements widely regarded as essential to life, including carbon and oxygen, were only produced later, inside massive stars that burned bright and died quickly, some in supernova explosions that spread these elements to later generations of star systems.

In one series of experiments involving intricate supercomputer simulations on JUQUEEN at Forschungszentrum Jülich, Professor Meißner and his colleagues altered the values of the light quark masses from those found in Nature to determine how great a variation would prevent the formation of carbon or oxygen inside massive stars. "Variations in the light quark masses of up to 2-3 percent are unlikely to be catastrophic to the formation of life-essential carbon and oxygen," he concludes (see Figure 1).

Turning to the earlier epoch in which the Big Bang generated the nuclei of the first two elements in the Periodic Table, he notes: "From the observed element abundances and the fact that the free neutron decays in about 882 seconds and the surviving neutrons are mostly captured in 4He, one finds a stringent bound on the light quark mass variations ... under the reasonable assumption that the masses of all quarks and leptons appearing in neutron β-decay scale with the Higgs vacuum expectation value."
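The role the 882-second neutron lifetime plays in such bounds can be illustrated with the classic back-of-the-envelope estimate of the primordial helium-4 mass fraction: freeze the neutron-to-proton ratio at a Boltzmann factor, let the free neutrons decay until nucleosynthesis begins, and lock every surviving neutron into 4He. In the sketch below, the freeze-out temperature and the start time of nucleosynthesis are rough, assumed textbook values, not Meißner's computed bounds.

```python
# Back-of-the-envelope primordial helium-4 mass fraction.
# Rough textbook estimate for illustration; not the Science Bulletin calculation.
import math

delta_m  = 1.293    # neutron-proton mass difference, MeV
T_freeze = 0.75     # assumed weak freeze-out temperature, MeV (rough value)
tau_n    = 882.0    # free neutron lifetime, s (value quoted above)
t_nuc    = 200.0    # assumed time at which deuterium, then 4He, forms, s

# Neutron-to-proton ratio at freeze-out, then reduced by free-neutron decay.
n_over_p_freeze = math.exp(-delta_m / T_freeze)
survive         = math.exp(-t_nuc / tau_n)
n = n_over_p_freeze * survive                    # surviving neutrons per freeze-out proton
p = 1.0 + n_over_p_freeze * (1.0 - survive)      # protons gain the decayed neutrons

n_over_p = n / p
Y_helium = 2.0 * n_over_p / (1.0 + n_over_p)     # mass fraction if all neutrons end up in 4He

print(f"n/p at freeze-out      : {n_over_p_freeze:.2f}")
print(f"n/p at nucleosynthesis : {n_over_p:.2f}")
print(f"helium-4 mass fraction : {Y_helium:.2f}   (observed: about 0.25)")
```

Because the result depends exponentially on the quark-mass-dependent neutron-proton mass difference and on the neutron lifetime, even modest shifts in these parameters push the predicted abundances away from the observed ones, which is the sense in which Big Bang nucleosynthesis sets such tight limits on quark mass variations.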

"Thus," Professor Meißner states, "the Big Bang Nucleosynthesis sets indeed very tight limits on the variations of the light quark mass."

"Such extreme fine-tuning supports the anthropic view of our Universe," he adds.

"Clearly, one can think of many universes, the multiverse, in which various fundamental parameters take different values leading to environments very different from ours," Professor Meißner states.

Professor Stephen Hawking states that even slight alterations in the life-enabling constants of fundamental physics in this hypothesized multiverse could "give rise to universes that, although they might be very beautiful, would contain no one able to wonder at that beauty."

Professor Meißner agrees: "In that sense," he says, "our Universe has a preferred status, and this is the basis of the so-called Anthropic Principle."
