RIT scientists confirm a highly eccentric black hole merger for the first time

For the first time, scientists believe they have detected a merger of two black holes with eccentric orbits. According to researchers from Rochester Institute of Technology’s Center for Computational Relativity and Gravitation (CCRG) and the University of Florida, this could help explain why some of the black holes in mergers detected by the LIGO Scientific Collaboration and the Virgo Collaboration are much heavier than previously thought possible.

(Image: artist’s impression of binary black holes about to collide. Credit: Mark Myers, ARC Centre of Excellence for Gravitational Wave Discovery (OzGrav).)

Eccentric orbits are a sign that black holes could be repeatedly gobbling up others during chance encounters in areas densely populated with black holes such as galactic nuclei. The scientists studied the most massive gravitational wave binary observed to date, GW190521, to determine if the merger had eccentric orbits.

“The estimated masses of the black holes are more than 70 times the mass of our sun each, placing them well above the estimated maximum mass predicted currently by stellar evolution theory,” said Carlos Lousto, a professor in the School of Mathematical Sciences and a member of the CCRG. “This makes an interesting case to study as a second-generation binary black hole system and opens up new possibilities for formation scenarios of black holes in dense star clusters.”

A team of RIT researchers including Lousto, Research Associate James Healy, Jacob Lange ’20 Ph.D. (astrophysical sciences and technology), Professor and CCRG Director Manuela Campanelli, Associate Professor Richard O’Shaughnessy, and collaborators from the University of Florida took a fresh look at the data to see if the black holes had highly eccentric orbits before they merged. They found the merger is best explained by a high-eccentricity, precessing model. To achieve this, the team performed hundreds of new full numerical simulations on local and national laboratory supercomputers, taking nearly a year to complete.

“This represents a major advancement in our understanding of how black holes merge,” said Campanelli. “Through our sophisticated supercomputer simulations and the wealth of new data provided by LIGO and Virgo’s rapidly advancing detectors, we are making new discoveries about the universe at astonishing rates.”

An extension of this analysis by the same RIT and UFL team used a possible electromagnetic counterpart observed by the Zwicky Transient Facility to independently compute the cosmological Hubble constant, treating GW190521 as an eccentric binary black hole merger. They found excellent agreement with the expected values and recently published the work in the Astrophysical Journal.
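The idea behind such a "standard siren" measurement can be sketched with simple arithmetic: the gravitational-wave signal yields a luminosity distance, the electromagnetic counterpart yields a redshift, and at low redshift the two combine as H0 ≈ cz/d. The numbers below are hypothetical, chosen only to illustrate the calculation, and are not taken from the GW190521 analysis.

```python
# Illustrative standard-siren estimate of the Hubble constant.
# At low redshift, H0 ~ c * z / d_L, where the luminosity distance
# d_L comes from the gravitational-wave signal and the redshift z
# from the electromagnetic counterpart.

C_KM_S = 299_792.458  # speed of light in km/s

def hubble_constant(redshift: float, distance_mpc: float) -> float:
    """Low-redshift approximation: H0 = c * z / d_L, in km/s/Mpc."""
    return C_KM_S * redshift / distance_mpc

# Hypothetical counterpart redshift and GW-inferred distance:
h0 = hubble_constant(redshift=0.016, distance_mpc=70.0)
print(f"H0 ≈ {h0:.1f} km/s/Mpc")  # ≈ 68.5 km/s/Mpc
```

A real analysis (especially at GW190521's distance) must account for cosmological corrections and the full posterior distributions on distance and redshift, but the proportionality above is the core of the method.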

Canada's new simulations can improve avalanche forecasting

Supercomputer simulations of snow cover can accurately forecast avalanche hazards, according to a new international study involving researchers from Simon Fraser University.

Currently, avalanche forecasts in Canada are made by experienced professionals who rely on data from local weather stations and on-the-ground observations from ski and backcountry ski operators, avalanche control workers for transportation and industry, and volunteers who manually test the snowpack.

But simulated snow cover models developed by a team of researchers can detect and track weak layers of snow and identify avalanche hazards in a completely different way, providing forecasters with another reliable tool when local data is insufficient or unavailable, according to a new study published in the journal Cold Regions Science and Technology.

“As far as natural hazards go, avalanches are still one of the leading causes of fatalities in Canada,” says Simon Horton, a post-doctoral fellow with the SFU Centre for Natural Hazards Research and a forecaster with Avalanche Canada.

“We’ve had these complex models that simulate the layers in the snowpack for a few decades now and they’re getting more and more accurate, but it’s been difficult to find out how to apply that to actual decision-making and improving safety.”

Researchers took 16 years’ worth of daily meteorological, snow cover, and avalanche data from two sites in Canada (Whistler and Rogers Pass, both in British Columbia) and Weissfluhjoch in Davos, Switzerland, and ran supercomputer simulations that could classify different avalanche situations.

The simulations could determine avalanche risk, for either natural or artificial release, for problem types such as new snow, wind slab, persistent weak layers, and wet snow conditions. 
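As a toy illustration of this kind of classification, the sketch below maps a few simulated snowpack properties to the problem types named above. The feature names and thresholds are invented for illustration; the study's actual classification scheme is far more sophisticated and data-driven.

```python
# Hypothetical rule-based sketch: map simulated snowpack properties
# to avalanche problem types. All thresholds are invented examples,
# not values from the study.

def classify_problems(new_snow_cm: float, wind_speed_ms: float,
                      has_buried_weak_layer: bool,
                      liquid_water_pct: float) -> list[str]:
    problems = []
    if new_snow_cm >= 30:                          # heavy recent snowfall
        problems.append("new snow")
    if new_snow_cm >= 10 and wind_speed_ms >= 10:  # wind-transported snow
        problems.append("wind slab")
    if has_buried_weak_layer:                      # buried persistent layer
        problems.append("persistent weak layer")
    if liquid_water_pct >= 3:                      # melt or rain wetting
        problems.append("wet snow")
    return problems

print(classify_problems(35, 12, True, 0.5))
# → ['new snow', 'wind slab', 'persistent weak layer']
```

The appeal of an automated version of this mapping is consistency: the same simulated snowpack always yields the same problem assessment, which forecasters can then weigh against field observations.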

“In the avalanche forecasting world, describing avalanche problems – the common scenarios that you might expect to find – is a good way for forecasters to describe avalanche hazard and communicate it to the public, so they know what kind of conditions to expect when they head out,” says Horton. “So that information is already available, except those are all done through an expert assessment based on what they know from available field observations. In a lot of situations, there’s a fair bit of uncertainty about the human assessment of what these types of avalanche problems will be.

“That’s where having more automated tools that can help predict potential hazards can help forecasters better prepare an accurate, precise forecast.”

The results of the study showed the modeling was consistent with the real observed frequencies of avalanches over those 16 years and that the approach has the potential to support avalanche forecasting in the future.

Researchers also believe the modeling might be useful to study the future impacts of climate change on snow instability.

RIKEN scientists achieve key elements for fault-tolerant quantum computation in silicon spin qubits

Researchers from RIKEN and QuTech have achieved a key milestone toward the development of a fault-tolerant quantum computer. They were able to demonstrate a two-qubit gate fidelity of 99.5 percent—higher than the 99 percent considered to be the threshold for building fault-tolerant computers—using electron spin qubits in silicon, which are promising for large-scale quantum computers as the nanofabrication technology for building them already exists.

The world is currently in a race to develop large-scale quantum computers that could vastly outperform classical computers in certain areas. However, these efforts have been hindered by many factors, in particular the problem of decoherence, or noise generated in the qubits. This problem becomes more serious as the number of qubits increases, hampering efforts to scale up. To achieve a large-scale computer that could be used for practical applications, a two-qubit gate fidelity of at least 99 percent is believed to be required to implement the surface code for error correction. This threshold has been reached in certain types of computers, using qubits based on superconducting circuits, trapped ions, and nitrogen-vacancy centers in diamond, but these are hard to scale up to the millions of qubits required for practical, error-corrected quantum computation.
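To see why fractions of a percent matter at scale, consider a deliberately oversimplified model with no error correction: if each gate succeeds with probability f, a circuit of n gates succeeds with probability roughly f**n. The sketch below compares the two fidelities mentioned in the article; it illustrates the compounding effect only, not the surface-code threshold argument itself.

```python
# Oversimplified illustration of compounding gate errors:
# a circuit of n gates, each with fidelity f, succeeds with
# probability ~ f**n (ignoring error correction and error structure).

def circuit_success(fidelity: float, n_gates: int) -> float:
    return fidelity ** n_gates

for f in (0.99, 0.995):
    print(f"f={f}: 100 gates -> {circuit_success(f, 100):.3f}, "
          f"1000 gates -> {circuit_success(f, 1000):.5f}")
```

At 100 gates the gap is already large (roughly 0.37 versus 0.61 success probability), which is why error correction, and hence gate fidelities above the correction threshold, is essential for any circuit of useful depth.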

For the current work, the group experimented with a quantum dot structure fabricated on a strained silicon/silicon-germanium quantum well substrate, implementing a controlled-NOT (CNOT) gate. In previous experiments, gate fidelity was limited by slow gate speed. To improve it, they carefully designed the device and tuned its operating conditions via voltages applied to the gate electrodes, combining an established fast single-spin rotation technique using micromagnets with a large two-qubit coupling. This allowed them to increase the gate speed by a factor of 10 compared with previous work. Interestingly, while it was previously believed that increasing gate speed would always improve fidelity, they found there was a limit beyond which higher speed made the fidelity worse.

Through the work, they discovered that a property called the Rabi frequency—a marker of how the qubits change states in response to an oscillating field—is key to the performance of the system, and they found a range of frequencies for which the single-qubit gate fidelity was 99.8 percent and the two-qubit gate fidelity was 99.5 percent, clearing the required threshold.
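The role of the Rabi frequency can be sketched with the textbook resonant-drive formula: the excited-state population oscillates as P(t) = sin²(π · f_Rabi · t), so the Rabi frequency sets how fast a gate such as a spin flip can be performed. The values below are hypothetical, for illustration only, and are not the operating point found in the study.

```python
import math

# Toy model of a resonant Rabi oscillation: the probability of finding
# the qubit in its excited state after driving for time t is
# P(t) = sin^2(pi * f_rabi * t). A full spin flip (pi-pulse) therefore
# takes t = 1 / (2 * f_rabi): a faster Rabi frequency means a faster gate.

def excited_population(f_rabi_hz: float, t_s: float) -> float:
    return math.sin(math.pi * f_rabi_hz * t_s) ** 2

f_rabi = 5e6                 # hypothetical 5 MHz Rabi frequency
t_pi = 1 / (2 * f_rabi)      # pi-pulse duration: 100 ns
print(excited_population(f_rabi, t_pi))   # ~1.0: full spin flip
```

The study's finding is that pushing f_Rabi (and hence gate speed) too high eventually degrades fidelity, so the team searched for the frequency range where both single- and two-qubit fidelities cleared the threshold.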

Through this, they demonstrated universal operations: all the basic operations from which quantum circuits are built, namely single-qubit and two-qubit operations, could be performed with gate fidelities above the error correction threshold.

To test the capability of the new system, the researchers implemented a two-qubit Deutsch-Jozsa algorithm and the Grover search algorithm. Both algorithms output correct results with high fidelities of 96-97 percent, demonstrating that silicon quantum computers can perform quantum calculations with high accuracy.
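The Deutsch-Jozsa algorithm is a natural benchmark because it needs exactly the demonstrated ingredients: single-qubit rotations plus one two-qubit (oracle) gate. Below is a minimal state-vector simulation of the one-bit case (the Deutsch algorithm), a generic textbook construction rather than the RIKEN team's implementation: a single oracle query decides whether f: {0,1} → {0,1} is constant or balanced.

```python
import itertools

# Minimal state-vector simulation of the one-bit Deutsch-Jozsa
# (Deutsch) algorithm on two qubits: query qubit + ancilla.

H = [[2**-0.5, 2**-0.5], [2**-0.5, -2**-0.5]]  # Hadamard gate
I2 = [[1.0, 0.0], [0.0, 1.0]]                  # identity

def kron(a, b):
    """Kronecker product of two matrices."""
    return [[x * y for x in row_a for y in row_b]
            for row_a in a for row_b in b]

def apply(m, state):
    """Matrix-vector product: apply gate m to the state vector."""
    return [sum(m[i][j] * state[j] for j in range(len(state)))
            for i in range(len(m))]

def oracle(f):
    """U_f |x, y> = |x, y XOR f(x)> as a 4x4 permutation matrix."""
    u = [[0.0] * 4 for _ in range(4)]
    for x, y in itertools.product((0, 1), repeat=2):
        u[2 * x + (y ^ f(x))][2 * x + y] = 1.0
    return u

def deutsch(f):
    state = [0.0, 1.0, 0.0, 0.0]            # start in |0>|1>
    state = apply(kron(H, H), state)         # Hadamard on both qubits
    state = apply(oracle(f), state)          # one oracle query
    state = apply(kron(H, I2), state)        # Hadamard on query qubit
    p0 = state[0] ** 2 + state[1] ** 2       # P(query qubit reads 0)
    return "constant" if p0 > 0.5 else "balanced"

print(deutsch(lambda x: 0))   # → constant
print(deutsch(lambda x: x))   # → balanced
```

On hardware, the measured 96-97 percent algorithm fidelity reflects how closely the physical gates reproduce this ideal unitary evolution.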

Akito Noiri, the first author of the study, says, “We are very happy to have achieved a high-fidelity universal quantum gate set, one of the key challenges for silicon quantum computers.”

Seigo Tarucha, leader of the research group, said, "The presented result makes spin qubits, for the first time, competitive against superconducting circuits and ion traps in terms of universal quantum control performance. This study demonstrates that silicon quantum computers are promising candidates, along with superconductivity and ion traps, for research and development toward the realization of large-scale quantum computers."