Huge simulation finds new origin of supermassive black holes

Supercomputer simulations conducted by astrophysicists at Tohoku University in Japan have revealed a new theory for the origin of supermassive black holes. In this theory, the precursors of supermassive black holes grow by swallowing up not only interstellar gas but also smaller stars. This helps to explain the large number of supermassive black holes observed today.

Almost every galaxy in the modern Universe has a supermassive black hole at its center. Their masses can reach up to 10 billion times the mass of the Sun, yet their origin remains one of the great mysteries of astronomy. A popular theory is the direct collapse model, in which primordial clouds of interstellar gas collapse under self-gravity to form supermassive stars that then evolve into supermassive black holes. But previous studies have shown that direct collapse only works with pristine gas consisting solely of hydrogen and helium. Heavier elements such as carbon and oxygen change the gas dynamics, causing the collapsing gas to fragment into many smaller clouds that form small stars of their own rather than a few supermassive stars. Direct collapse from pristine gas alone cannot explain the large number of supermassive black holes seen today.

CAPTION: Snapshots of the simulations showing the distribution of matter in the Universe at the time of black hole formation (top) and the density distribution of black hole-producing gas clouds (bottom). In the bottom panel, the black dots near the center represent massive stars, which are thought to evolve into black holes in time. The white dots represent stars smaller than 10 solar masses that were formed by the fragmentation of the gas cloud. Many of the smaller stars merge with the supermassive stars at the center, allowing the massive stars to grow efficiently. CREDIT: Sunmyon Chon

Sunmyon Chon, a postdoctoral fellow at the Japan Society for the Promotion of Science and Tohoku University, and his team used the National Astronomical Observatory of Japan's supercomputer "ATERUI II" to perform long-term, high-resolution 3D simulations testing whether supermassive stars could form even in heavy-element-enriched gas. Star formation in gas clouds containing heavy elements has been difficult to simulate because of the computational cost of modeling the violent fragmentation of the gas, but advances in supercomputing power, specifically the high calculation speed of "ATERUI II", commissioned in 2018, allowed the team to overcome this challenge. These new simulations make it possible to study the formation of stars from gas clouds in far greater detail.

Contrary to previous predictions, the research team found that supermassive stars can still form from heavy-element-enriched gas clouds. As expected, the gas cloud breaks up violently and many smaller stars form. However, a strong gas flow toward the center of the cloud drags the smaller stars inward, where they are swallowed up by the massive stars at the center. The simulations resulted in the formation of a star 10,000 times more massive than the Sun. "This is the first time that we have shown the formation of such a large black hole precursor in clouds enriched in heavy elements. We believe that the giant star thus formed will continue to grow and evolve into a giant black hole," says Chon.

This new model shows that not only primordial gas but also gas containing heavy elements can form giant stars, which are the seeds of black holes. "Our new model is able to explain the origin of more black holes than the previous studies, and this result leads to a unified understanding of the origin of supermassive black holes," says Kazuyuki Omukai, a professor at Tohoku University.

Crusoe Energy Systems donates supercomputing resources to coronavirus vaccine research, discovery efforts

Wasted natural gas to power the fight against COVID-19

Crusoe Energy Systems has deployed more than twenty energy-intensive computing modules throughout America’s oil and gas fields as part of its Digital Flare Mitigation system, which captures otherwise flared or wasted natural gas to power computing processes at the wellhead. Today the company announces that it has begun allocating a portion of its computing systems to the search for a coronavirus vaccine.

Crusoe is working with the Folding@Home Consortium, a distributed supercomputing system for life-science research launched out of Stanford University. The Consortium allows researchers to remotely utilize Crusoe’s computational resources for the vaccine search and discovery process and recently launched a new protein folding simulation project specifically targeting vaccines and therapeutic antibodies for COVID-19.

Crusoe has configured eight of its most advanced graphics processing units to support the Consortium's vaccine development project and began processing work units for COVID-19 research at Crusoe's field operations center in North Dakota earlier this week. Crusoe is now one of the largest contributors of computing power to the protein folding Consortium, ranking in the top 10% of computational power providers for the vaccine research system. Crusoe ultimately plans to deploy protein folding servers to multiple flare gas-powered computing modules in the oilfield after expanding network bandwidth at selected sites.

The virus that causes COVID-19, SARS-CoV-2, is closely related to the SARS coronavirus. Both coronaviruses infect the lungs when viral proteins bind to receptor proteins in lung cells. A SARS therapeutic antibody, a protein that can prevent the SARS coronavirus from binding to lung receptors, has been developed previously. To develop a similar antibody for COVID-19, researchers need to better understand how the SARS-CoV-2 spike protein binds to receptors in the human body. The Consortium's new protein folding project simulates antibody proteins and how they might prevent COVID-19 viral infection; however, the simulation process is very computationally intensive and therefore energy-intensive.

Crusoe can support this vaccine research using its distributed computing resources deployed at natural gas flaring sites in Montana, North Dakota, Wyoming, and Colorado. Today, Crusoe consumes millions of cubic feet of natural gas per day that would have otherwise been wasted by burning in the air, or “flared.” Instead, that waste gas powers Crusoe’s mobile, modular computing systems, which are deployed directly to the wellhead to mitigate flaring. Crusoe’s initial computational use case was blockchain processing. More recently the company has been developing high performance and general-purpose cloud computing solutions, which are used in a variety of applications including machine learning, artificial intelligence, and protein folding.

“At this time of growing global concern around the coronavirus, we are grateful to have the opportunity to support the Folding@Home Consortium’s search for a vaccine,” said Chase Lochmiller, CEO and co-founder of Crusoe. “We’ve configured very powerful computing hardware that is typically used for machine learning and artificial intelligence research to search for helpful therapies against coronavirus. This is very much in keeping with Crusoe’s vision that distributed computing resources have an important role to play in solving real-world problems.”

Crusoe began processing work units for COVID-19 on March 15th. In addition to COVID-19, the company has previously completed work units related to cancer research.

Scientists from Lyon and São Paulo present technologies that enable the use of light in quantum supercomputing systems at FAPESP Week France

The researchers covered integrated photonics, nanotechnology, and quantum entanglement, among other fields that have broadened the research possibilities in optical communication systems and could support the development of quantum supercomputers.

“Quantum characteristics inspire new concepts in information science, in a context in which efficiencies in computing, storage, and information transport can be increased,” said Paulo Nussenzveig, a full professor at the University of São Paulo’s Physics Institute (IFUSP), at the symposium held in Lyon and Paris through November 27, 2019.

Nussenzveig spoke about quantum information with multiple modes of light and highlighted that quantum light is suitable not only for communication but also for supercomputing.

The researcher also spoke about another key concept in the development of devices for transmitting quantum information: entanglement.

Quantum entanglement is a phenomenon of quantum mechanics in which two or more objects (which can be waves or particles) are so interlinked that the state of one cannot be described without reference to the other, even if they are separated by millions of kilometers. This leads to very strong correlations between the observable physical properties of the particles involved.
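As an illustrative sketch (not drawn from the article itself), the simplest entangled state is a Bell pair of two photons whose polarizations are perfectly correlated:

```latex
% A Bell state of two photons, where H and V denote horizontal and
% vertical polarization. Neither photon has a definite polarization
% on its own, yet the two outcomes are perfectly correlated.
\[
  |\Phi^{+}\rangle
  = \frac{1}{\sqrt{2}}
    \left( |H\rangle_{1}|H\rangle_{2} + |V\rangle_{1}|V\rangle_{2} \right)
\]
% Measuring photon 1 and finding H immediately implies photon 2 is H
% (and likewise for V), no matter how far apart the photons are.
```

The six-wave entanglement described below generalizes this idea from two modes of light to six.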

Nussenzveig recalled what Einstein, Podolsky, and Rosen wrote when they introduced the concept of quantum entanglement: “Measurements on one particle provide information about the state of another particle. Since, at the time of measuring, the systems are not interacting, no change can occur in the second system as a consequence of something that occurs in the first system,” said Nussenzveig.

In an experiment carried out by Marcelo Martinelli’s and Nussenzveig’s group at IFUSP, the researchers managed to entangle six light waves, generated by a light source developed by the scientists called an Optical Parametric Oscillator (OPO).

The results of the research – conducted with FAPESP’s support within the scope of the Thematic Project “Exploring quantum information with atoms, crystals, and chips,” led by Martinelli – were published in the journal Physical Review Letters in 2018.

Previously, in 2009, the group at IFUSP obtained the entanglement of three light waves. The research was published in Science.

Integrated quantum photonics

Felippe Alexandre Silva Barbosa, of the Gleb Wataghin Institute of Physics of the University of Campinas (UNICAMP), spoke about integrated quantum photonics in the same session at FAPESP Week France.

“In integrated photonics, the main objective is to develop technologies that enable information to be encoded, transmitted, and processed using light,” said Barbosa, who is also carrying out research with the Optical Parametric Oscillator together with Nussenzveig and Martinelli – all three are authors of the 2018 article in Physical Review Letters.

“The area of integrated photonics has been at the front line of scientific and technological research for more than a decade. More recently, developments in integrated photonics platforms have led to progress in other fields, such as non-linear optics, microfluidics, machine learning, and quantum information,” he said.

According to Barbosa, many advances in the areas of quantum information and integrated photonics have helped to “push the limits both in fundamental scientific research and in cutting-edge technology.”

“In the last few years, there has been a growing intersection between these two areas of research, resulting in the emerging field of integrated quantum photonics,” he said. With FAPESP’s support, Barbosa is carrying out a research project whose main objective is to study this new field.

“The research is focused on preparing squeezed and entangled states of individual photons, using thin films of silicon nitride and lithium niobate, two scalable, integrated photonic platforms,” he said.

Nanotechnology in Lyon

“The Lyon Institute of Nanotechnology is a collaborative structure with a presence on various campuses of the University of Lyon. We have around 200 people working in materials science and on new concepts in devices and systems integration,” said Christian Seassal, assistant director of the Lyon Institute of Nanotechnology, to Agência FAPESP.

One of the speakers at FAPESP Week France, Seassal mentioned that the Lyon Institute of Nanotechnology has research in four main areas: Functional Materials; Electronics; Photonics and Photovoltaic Energy; and Biotechnology and Materials Engineering.  

“Since 2007, we have worked with dedicated technologies in terms of functional oxides, semiconductor materials, nanomaterials including nanowires, for example, and on new concepts in information processing, energy generation, biotechnology, and health, among others,” he said. 

Photonic crystals, nanotechnological systems integration, silicon photovoltaic devices, biomedical sensors, intelligent clothing, and nanofluids are other research topics at the Lyon Institute of Nanotechnology.