LSU prof develops astrophysics code that rapidly models stellar collisions

A breakthrough astrophysics code, named Octo-Tiger, simulates the evolution of self-gravitating and rotating systems of arbitrary geometry using adaptive mesh refinement and a new method to parallelize the code to achieve superior speeds.

This new stellar-collision code runs much faster than established codes used for such numerical simulations. The research came from a unique collaboration between experimental computer scientists and astrophysicists in the Louisiana State University Department of Physics & Astronomy, the LSU Center for Computation & Technology, Indiana University Kokomo, and Macquarie University, Australia, culminating in over a year of benchmark testing and scientific simulations, supported by multiple NSF grants, including one specifically designed to break the barrier between computer science and astrophysics.


"Thanks to a significant effort across this collaboration, we now have a reliable computational framework to simulate stellar mergers," said Patrick Motl, professor of physics at Indiana University Kokomo. "By substantially reducing the computational time to complete a simulation, we can begin to ask new questions that could not be addressed when a single-merger simulation was precious and very time-consuming. We can explore more parameter space, examine a simulation at very high spatial resolution or for longer times after a merger, and we can extend the simulations to include more complete physical models by incorporating radiative transfer, for example."

A recently published paper investigates the code's performance and precision through benchmark testing. The authors, Dominic C. Marcello, postdoctoral researcher; Sagiv Shiber, postdoctoral researcher; Juhan Frank, professor; Geoffrey C. Clayton, professor; Patrick Diehl, research scientist; and Hartmut Kaiser, research scientist, all at Louisiana State University--together with collaborators Orsola De Marco, professor at Macquarie University, and Patrick M. Motl, professor at Indiana University Kokomo--compared their results to analytic solutions, where known, and to other grid-based codes, such as the popular FLASH. They also computed the interaction between two white dwarfs from the early mass transfer through to the merger and compared the results with past simulations of similar systems.

"A test on Australia's fastest supercomputer, Gadi, showed that Octo-Tiger, running on a core count over 80,000, displays excellent performance for large models of merging stars," De Marco said. "With Octo-Tiger, we can not only reduce the wait time dramatically, but our models can answer many more of the questions we care to ask."

Octo-Tiger is currently optimized to simulate the merger of well-resolved stars that can be approximated by barotropic structures, such as white dwarfs or main-sequence stars. The gravity solver conserves angular momentum to machine precision, thanks to a correction algorithm. This code uses HPX parallelization, allowing the overlap of work and communication and leading to excellent scaling properties to solve large problems in shorter time frames.
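Octo-Tiger's HPX runtime is a C++ library, but the core idea of overlapping work and communication with asynchronous tasks can be illustrated in a few lines of Python. The sketch below is only an analogy (the function names and timings are invented for illustration, not taken from Octo-Tiger): while simulated halo exchanges for grid blocks are in flight, interior computation proceeds instead of waiting.

```python
# Illustrative analogy only: Octo-Tiger uses the C++ HPX runtime, not Python.
# The pattern shown -- launch communication asynchronously, keep computing,
# and only then collect the results -- is the "overlap of work and
# communication" that task-based runtimes enable.
from concurrent.futures import ThreadPoolExecutor
import time

def exchange_halo(block_id):
    """Stand-in for communication: pretend to send/receive boundary data."""
    time.sleep(0.05)  # simulated network latency
    return f"halo-{block_id}"

def compute_interior(block_id):
    """Stand-in for local work that needs no remote data."""
    return sum(i * i for i in range(10_000))  # dummy arithmetic

def step(blocks):
    with ThreadPoolExecutor() as pool:
        # Launch all halo exchanges asynchronously ...
        halo_futures = {b: pool.submit(exchange_halo, b) for b in blocks}
        # ... and do interior work while they are in flight.
        interiors = {b: compute_interior(b) for b in blocks}
        # Only now block on the communication results.
        halos = {b: f.result() for b, f in halo_futures.items()}
    return interiors, halos

interiors, halos = step(range(4))
print(halos[0])  # -> halo-0
```

In an MPI-style bulk-synchronous code, every block would wait for its neighbours at a barrier before computing; here the latency of the (simulated) exchanges is hidden behind useful work, which is the scaling advantage the paper attributes to the task-based approach.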

"This paper demonstrates how an asynchronous task-based runtime system can be used as a practical alternative to Message Passing Interface to support an important astrophysical problem," Diehl said.

The research outlines the current and planned areas of development aimed at tackling several physical phenomena connected to observations of transients.

"While our particular research interest is in stellar mergers and their aftermath, there are a variety of problems in computational astrophysics that Octo-Tiger can address with its basic infrastructure for self-gravitating fluids," Motl said.

The animation was prepared by Shiber, who says: "Octo-Tiger shows remarkable performance both in the accuracy of the solutions and in scaling to tens of thousands of cores. These results demonstrate Octo-Tiger as an ideal code for modeling mass transfer in binary systems and in simulating stellar mergers."

UMass Lowell physics prof wins $1M in funding for quantum supercomputing research

Kamal receives early career awards from NSF, Air Force

UMass Lowell researcher Archana Kamal has won two early career development awards totaling more than $1 million from the U.S. Air Force and the National Science Foundation (NSF) for her research in the emerging field of quantum information processing (QIP) with open quantum systems.

QIP is based on the principles of quantum mechanics, which mathematically describe the behavior and interaction of matter and light on the atomic and subatomic scale.

While today's digital computers encode data in the form of binary digits, or "bits," which are a series of zeros and ones, quantum supercomputers convert information into quantum bits, or "qubits." A qubit, the basic unit of quantum information, represents a two-state, or two-level, quantum system, such as the up and down spin of an electron or the horizontal and vertical polarization of a photon.
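As a minimal numerical sketch of the two-level idea (ours, not tied to any particular hardware), a qubit state can be written as a unit vector in a two-dimensional complex space, and the squared amplitudes give the probabilities of measuring each basis state:

```python
# A qubit state is alpha*|0> + beta*|1> with |alpha|^2 + |beta|^2 = 1;
# |alpha|^2 and |beta|^2 are the probabilities of the two measurement outcomes.
import numpy as np

ket0 = np.array([1.0, 0.0], dtype=complex)  # e.g. "spin up" / |0>
ket1 = np.array([0.0, 1.0], dtype=complex)  # e.g. "spin down" / |1>

# An equal superposition: measurement yields 0 or 1 with 50% probability each.
psi = (ket0 + ket1) / np.sqrt(2)

probs = np.abs(psi) ** 2
print(probs)  # [0.5 0.5]
```

Unlike a classical bit, the state before measurement carries both amplitudes at once, which is what quantum algorithms exploit.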

Scientists worldwide, including Kamal, an assistant professor in UMass Lowell's Department of Physics and Applied Physics, are developing next-generation quantum supercomputing technologies with processors that can solve large, highly complex problems much faster than existing supercomputers running the best-known algorithms. This is why, for more than a decade, tech giants like Google, IBM, and Microsoft have been investing heavily in quantum supercomputer hardware research.

The biggest challenge to realizing usable quantum processors is "decoherence," the loss and erasure of quantum information due to strong interactions between qubits and the uncontrolled, "noisy" environment around them. This is the central issue that Kamal is addressing in her research.

Aside from quantum computing, Kamal's projects could lead to advances in QIP applications and other innovative technologies, including quantum sensing, quantum communication, and quantum cryptography (using quantum mechanical properties to store and transmit data securely).

Kamal was recognized by the Air Force Office of Scientific Research with a Young Investigator Program grant - worth $450,000 over three years - for her work on tunable quantum dissipation, which can be used to develop autonomous, quantum error-correction protocols.

The grant is awarded to faculty researchers who "show exceptional ability and promise" in conducting creative, fundamental research in science and engineering, according to the Air Force.

"My project aims to develop self-correcting qubits using the new field of quantum reservoir engineering; that is, correcting quantum errors by controlling the environment seen by a quantum system instead of controlling the system directly," said Kamal, who lives in Lowell.

She said this approach turns the tables on decoherence by designing environments that preserve "quantumness" instead of destroying it.

"Specifically, the Air Force project will focus on extending this unique approach to large, multi-qubit networks, and realizing scalable error correction," Kamal said.

Kamal's five-year NSF CAREER grant totaling more than $557,000 will support her research into the entanglement dynamics of quantum systems in the presence of non-trivial noise.

The CAREER grant is the NSF's "most prestigious award in support of early-career faculty who have the potential to serve as academic role models in research and education and to lead advances in the mission of their department or organization," according to the agency.

Kamal's project will explore how quantum reservoir engineering can be implemented for environments that retain a "memory" of the system's quantum states, that is, environments with non-trivial quantum dynamics of their own that can be imprinted, to some degree, on the system, steering it in ways that would not be possible if it were left alone.

"These aspects of autonomous quantum control are interesting and uncharted territory, both theoretically and experimentally," said Kamal. "Our ultimate goal is to enable quantum technologies that can form the backbone of future quantum computers, which hold out the promise of offering unprecedented advantages over their classical (non-quantum) counterparts, and to answer fundamental questions in quantum physics in the process."

Reading the physics hiding in data

A multidisciplinary team of scientists finds a way for detecting phase transitions in raw data

Information is encoded in data. This is true for most aspects of modern everyday life, but it is also true in most branches of contemporary physics, and extracting useful and meaningful information from very large data sets is a key mission for many physicists.

In statistical mechanics, large data sets are daily business. A classic example is the partition function, a complex mathematical object that describes physical systems at equilibrium. This mathematical object can be seen as made up of many points, each corresponding to a degree of freedom of the physical system, that is, to one of the minimal set of variables needed to describe all of its properties.
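To make the partition function concrete, here is a toy example of ours (not from the paper): for a system with discrete energy levels, the partition function at inverse temperature beta is the sum of Boltzmann factors, Z = sum over states of exp(-beta * E), and dividing each factor by Z gives the equilibrium occupation probabilities.

```python
# Partition function of a two-level system with energies 0 and E
# (E measured in units of k_B * T, so beta = 1/(k_B*T) is dimensionless here).
import math

def partition_function(energies, beta):
    """Z = sum_s exp(-beta * E_s) over the system's states."""
    return sum(math.exp(-beta * E) for E in energies)

energies = [0.0, 1.0]  # ground state and one excited state
beta = 1.0
Z = partition_function(energies, beta)

# Boltzmann probabilities: each state's weight divided by Z.
p = [math.exp(-beta * E) / Z for E in energies]
print(round(p[0], 3))  # ground-state occupation, about 0.731
```

Real systems have astronomically many such states, which is why the partition function behaves as the very large data set the article describes.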

An interdisciplinary team of scientists from the Abdus Salam International Centre for Theoretical Physics (ICTP) and the Scuola Internazionale Superiore di Studi Avanzati (SISSA) showed that such a massive collection of data can be combed through, bringing out fundamental physical properties of an unknown system.

These results were highlighted in a paper just published in Physical Review X, introducing a new data-based viewpoint on phase transitions. The team showed that a generic statistical property of large data sets that describe a broad range of physical systems at equilibrium, known as intrinsic dimension, can in fact reveal the occurrence of a phase transition.

The authors of the paper, coordinated by Marcello Dalmonte, a researcher in ICTP's Condensed Matter and Statistical Physics Section and SISSA collaborator, come from different backgrounds. Tiago Mendes, a former postdoctoral fellow at ICTP and now at the Max Planck Institute for the Physics of Complex Systems, in Dresden, Germany, works mainly in numerical methods applied to statistical mechanics. Alex Rodriguez is a chemist, previously working at SISSA and now at ICTP, who works in the implementation of complex system algorithms and the development of machine learning methods. Xhek Turkeshi, a Ph.D. student at SISSA, works mostly in statistical physics.

The researchers focussed on a generic statistical property of the data sets, called the intrinsic dimension. The simplest way to describe this property is as the minimum number of variables needed to represent a given data set without any loss of information. "Take, for example, all the people around the world," explains Rodriguez. "That is a data set by itself. Now, if you want to specify the position of the people around the world, in theory, you would need the coordinates of all their positions in space, that is, three numbers for each person. But since we can approximate the Earth as a bidimensional surface, we will only need two parameters, that is, the latitude and the longitude. This is what intrinsic dimension is: if the data set were humanity, then the intrinsic dimension would be 2, not 3."

In the more theoretical context of statistical systems, the paper shows that this property of intrinsic dimension can reveal collective properties of partition functions at thermal phase transitions. This means that, regardless of what system is under consideration, the data can show if and when that system is undergoing a phase transition. The team has developed a theoretical framework to explain why generic data exhibit such a 'universal' behavior, common to a broad range of different phase transitions, from melting ice to ferromagnets.

"The work introduces a new viewpoint on phase transitions by showing how the intrinsic dimension reveals corresponding structural transitions in data space," the scientists say. "When the ice melts, its data structure does as well."

What is really new in this work is that raw data mirror the physical behavior of the systems under consideration, and that is important for physicists, as it allows them to analyze a system without knowing the physics underlying it. Looking at the data is enough to see if there is a transition happening in the system or not, without even knowing what kind of transition it is. "We could say that this method is completely agnostic," says Mendes. "You don't need to know a priori all the parameters of the system; you just work with raw data and see what comes out of them."

After the interesting results obtained in this research, the team plans to continue working together in the same direction, broadening their field of analysis. They are already working on a second paper, focussing on so-called 'quantum phase transitions', that is, phase transitions in quantum systems that happen at zero temperature and are induced by external parameters, such as a magnetic field.

In terms of applications of these findings, the possibilities are many - from experiments with supercomputer simulations of quantum systems to more fundamental branches of physics, such as quantum chromodynamics, which could also have an impact on nuclear physics. "An interesting possibility of application is in the use of statistical physics techniques to understand machine learning," says Rodriguez. "In this kind of research, which goes from quantum computing to the study of neural networks, for example, phase transitions are very often involved, and we could try to use our method to tackle all these kinds of different problems."