NSF renews funding for Two-Dimensional Crystal Consortium

Penn State facility enables the development of new ultra-thin materials for advanced electronics

The National Science Foundation (NSF) announced a renewal of funding for the Two-Dimensional Crystal Consortium (2DCC), the Materials Innovation Platform (MIP) national user facility at Penn State's Materials Research Institute (MRI). The 2DCC is one of four MIPs in the United States and was awarded $20.1 million over five years, an increase of 13% above the initial award in 2016.

MIPs are NSF-funded facilities, each focused on a specific topic, that aim to stimulate innovation in materials research and foster the growth of a national community of users developing next-generation materials. These facilities seek to substantially increase the rate at which new materials and new materials phenomena are discovered. The 2DCC at Penn State follows the "materials by design" concept, combining synthesis, characterization, and theory/simulation applied to targeted outcomes to accelerate materials discovery.

The 2DCC received its first five years of funding in 2016, which was used to nucleate and grow the MIP by developing state-of-the-art equipment for thin film deposition with integrated characterization tools, establishing a bulk growth facility, developing new computational tools and a facility-wide database, and initiating an external user program. In 2020, the 2DCC underwent a renewal process for the second five years of funding.

"Over the past five years, the 2DCC has established itself as a premier facility for the synthesis of wafer-scale 2D films and bulk crystals with unique quantum properties," said Joan Redwing, director of the 2DCC and synthesis lead, and professor of materials science and engineering and electrical engineering.

Two-dimensional (2D) materials are atomically thin layers that, because electron and atom motion is restricted to a plane, have physical characteristics not present in three-dimensional materials. The 2DCC focuses on the bulk and thin-film synthesis of 2D chalcogenides, i.e., layered compounds of transition metals with chalcogen elements such as selenium and sulfur. By controlling the growth of these materials at the atomic level, new materials can be created with unique properties and exotic quantum states that hold the potential for revolutionary new device technologies, such as flexible electronics and quantum computing.

"As an inaugural Materials Innovation Platform, 2DCC MIP exemplifies the power of the Materials Genome Initiative approach with close experiment-theory interactions," said Charles Ying, program director for MIPs and National Facilities and Instrumentation with the Division of Materials Research of the National Science Foundation. "Multi-year efforts of studying and refining growth conditions have paid off, leading to reproducible synthesis of 2D materials that have already benefited more than 100 scientists nationwide. The new experimental and data tools will bring 2DCC to a new level in its second five years."

As a core component of the 2DCC's efforts in synthesizing ground-breaking 2D chalcogenide materials, the 2DCC offers a user program that advances 2D materials research across the U.S., not just at Penn State.

"Researchers outside of Penn State at other universities, companies, or national labs can come on-site to receive training in the facility and carry out their research or request samples grown by 2DCC staff," said Redwing. "In addition to the user program, we also have an in-house team of researchers who collaborate with users on their projects. We've sponsored over 125 user projects in our facility since we started in 2016. So, a big part of the MIP is indeed the user program."

The Penn State MIP has proven to be very beneficial for the development of 2D materials.

"Even before the MIP, Penn State had a number of faculty working on 2D material research," said Redwing. "But getting the MIP funded enabled us to expand and more deeply integrate that activity and initiate research collaborations with other universities and national labs through our user program. It's really helped to make Penn State one of the main centers of activity in 2D materials in the world."

During its first five years, the 2DCC has managed to meet challenges of complexity, scale, and even an unexpected obstacle that affected the entire globe.

"The discovery of high-performance materials is a complex process, and the framework of the MIP integrates research methodologies that efficiently aid the optimal synthesis of 2D materials, with a teaming of theory, synthesis science, in situ metrologies, and machine learning from the large data sets," said Clive Randall, director of MRI and distinguished professor of materials science and engineering. "In addition, the outreach has been very impressive, aiding researchers from all over the United States, and even globally. The 2DCC also maintained their mission during the COVID crisis, including holding a virtual research experience for undergraduates program in the summer of 2020."

The 2DCC is one of four user facilities in MRI, along with the Materials Characterization Lab, the Nanofabrication Lab, and the Materials Computation Center. The 2DCC research staff includes 17 faculty and 13 doctorate-level researchers. Graduate students are also involved in the in-house research program.

Along with Redwing, the 2DCC executive leadership team includes Nitin Samarth, associate director and characterization lead and professor and George A. and Margaret M. Downsbrough Department Head in Physics; Vincent Crespi, theory lead and distinguished professor of physics, materials science and engineering and chemistry; Joshua Robinson, director of user programs and professor of materials science and engineering; Eric Hudson, director of education, outreach and diversity programs and associate professor of physics; Zhiqiang Mao, bulk growth lead and professor of physics; Roman Engel-Herbert, industry lead and associate professor of materials science and engineering and physics; Adri van Duin, distinguished professor of mechanical engineering, chemistry, materials science & engineering, chemical engineering and engineering science and mechanics; Jun Zhu, professor of physics; Wes Reinhart, assistant professor of materials science and engineering and Institute for Computational and Data Sciences faculty co-hire; and Kevin Dressler, operations and user facilities director and affiliate assistant professor of civil engineering.

"The 2DCC has created this critical mass of research activity that has brought considerable attention to Penn State and MRI over the last five years," said Redwing. "The funding we received for new equipment, research support, and other activities has established Penn State and MRI as one of the leading institutes for 2D materials research."

With the renewed funding, the 2DCC will work to build on the progress made in 2D materials research through new collaborations and the existing ones created in the MIP's first five years. Plans for the next five years include the addition of a double crucible Bridgman system for synthesis of bulk crystals with improved composition control, development of an integrated etch/deposition tool for the synthesis of 2D metals, and an expansion of the facility database to enable materials discovery through data science methods.

"We are very proud of the 2DCC leadership, researchers, and staff who have partnered with the NSF to develop, refine and model the MIP program as one of the inaugural awardees back in 2016," said Randall. "We are looking forward to this next era and the scientific discoveries that will inevitable come with the multiple university partnerships which will emerge via the MIP program."

Penn State astrophysicist builds dark matter map that reveals hidden bridges between galaxies

A new map of dark matter in the local universe reveals several previously undiscovered filamentary structures connecting galaxies. The map, developed using machine learning by an international team including a Penn State astrophysicist, could enable studies about the nature of dark matter as well as about the history and future of our local universe.

Dark matter is an elusive substance that makes up about 80% of the matter in the universe. It also provides the skeleton for what cosmologists call the cosmic web, the large-scale structure of the universe that, through its gravitational influence, dictates the motion of galaxies and other cosmic material. However, the distribution of local dark matter is currently unknown because it cannot be measured directly. Researchers must instead infer its distribution from its gravitational influence on other objects in the universe, like galaxies.

Image caption: An international team of researchers has produced a map of the dark matter within the local universe, using a model to infer its location from its gravitational influence on galaxies (black dots). These density maps, each a cross section in different dimensions, reproduce known, prominent features of the universe (red) and also reveal smaller filamentary features (yellow) that act as hidden bridges between galaxies. The X denotes the Milky Way galaxy and arrows denote the motion of the local universe due to gravity.

"Ironically, it's easier to study the distribution of dark matter much further away because it reflects the very distant past, which is much less complex," said Donghui Jeong, associate professor of astronomy and astrophysics at Penn State and a corresponding author of the study. "Over time, as the large-scale structure of the universe has grown, the complexity of the universe has increased, so it is inherently harder to make measurements about dark matter locally."

Previous attempts to map the cosmic web started with a model of the early universe and then simulated the evolution of the model over billions of years. However, this method is computationally intensive and so far has not been able to produce results detailed enough to see the local universe. In the new study, the researchers took a completely different approach, using machine learning to build a model that uses information about the distribution and motion of galaxies to predict the distribution of dark matter.

The researchers built and trained their model using a large set of galaxy simulations, called Illustris-TNG, which includes galaxies, gas, and other visible matter, as well as dark matter. The team specifically selected simulated galaxies comparable to those in the Milky Way and ultimately identified which properties of galaxies are needed to predict the dark matter distribution.

"When given certain information, the model can essentially fill in the gaps based on what it has looked at before," said Jeong. "The map from our models doesn't perfectly fit the simulation data, but we can still reconstruct very detailed structures. We found that including the motion of galaxies--their radial peculiar velocities--in addition to their distribution drastically enhanced the quality of the map and allowed us to see these details."

The research team then applied their model to real data from the local universe from the Cosmicflows-3 galaxy catalog. The catalog contains comprehensive data about the distribution and movement of more than 17,000 galaxies in the vicinity of the Milky Way, within 200 megaparsecs. The resulting map of the local cosmic web is published in a paper appearing online on May 26 in The Astrophysical Journal.

The map successfully reproduced known prominent structures in the local universe, including the "local sheet" (a region of space containing the Milky Way, nearby galaxies in the "local group," and galaxies in the Virgo cluster) and the "local void" (a relatively empty region of space next to the local group). Additionally, it identified several new structures that require further investigation, including smaller filamentary structures that connect galaxies.

"Having a local map of the cosmic web opens up a new chapter of the cosmological study," said Jeong. "We can study how the distribution of dark matter relates to other emission data, which will help us understand the nature of dark matter. And we can study these filamentary structures directly, these hidden bridges between galaxies."

For example, it has been suggested that the Milky Way and Andromeda galaxies may be slowly moving toward each other, but whether they may collide in many billions of years remains unclear. Studying the dark matter filaments connecting the two galaxies could provide important insights into their future.

"Because dark matter dominates the dynamics of the universe, it basically determines our fate," said Jeong. "So we can ask a [super]computer to evolve the map for billions of years to see what will happen in the local universe. And we can evolve the model back in time to understand the history of our cosmic neighborhood."

The researchers believe they can improve the accuracy of their map by adding more galaxies. Planned astronomical surveys, for example using the James Webb Space Telescope, could allow them to add faint or small galaxies that have yet to be observed and galaxies that are farther away.

BU researchers use artificial intelligence to determine extent of damage in kidney disease

Chronic kidney disease (CKD) is most often caused by diabetes and hypertension. In 2017, the global prevalence of CKD was 9.1 percent, or approximately 700 million cases. Chronic kidney damage is assessed by scoring the amount of interstitial fibrosis and tubular atrophy (IFTA) in a renal biopsy sample. Although image digitization and morphometric (measuring external shapes and dimensions) techniques can better quantify the extent of histologic damage, a more widely applicable way to stratify kidney disease severity is needed.

Now, researchers from Boston University School of Medicine (BUSM) have developed a novel Artificial Intelligence (AI) tool to predict the grade of IFTA, a known structural correlate of progressive and chronic kidney disease.

"Having a computer model that can mimic an expert pathologist's workflow and assess disease grade is an exciting idea because this technology has the potential to increase efficiency in clinical practices," explained corresponding author Vijaya B. Kolachalama, Ph.D., assistant professor of medicine at BUSM.

A pathologist's typical workflow at the microscope involves manual operations such as panning and zooming in and out of specific regions on the slide to evaluate various aspects of the pathology. In the 'zoom out' assessment, pathologists review the entire slide and perform a 'global' evaluation of the kidney core. In the 'zoom in' assessment, they perform an in-depth, microscopic evaluation of 'local' pathology in regions of interest.

An international team of five practicing nephropathologists independently determined IFTA scores on the same set of digitized human kidney biopsies using web-based software (PixelView, deepPath Inc.). Their average scores were taken as a reference estimate to build the deep learning model. To emulate the nephropathologist's approach to grading the biopsy slides under a microscope, the researchers used AI to incorporate patterns and features from sub-regions (or patches) of the digitized kidney biopsy image as well as the entire (global) digitized image to quantify the extent of IFTA. Through this combination of patch-level and global-level data, a deep learning model was designed to accurately predict IFTA grade.
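As a rough illustration of what such a "global plus local" model can look like, here is a hedged PyTorch sketch, not the published BUSM architecture: one branch processes a downsampled view of the whole biopsy core, a second branch processes high-magnification patches with shared weights, and the pooled features are combined to predict an IFTA grade. The layer sizes, patch counts, and four-grade output are assumptions made for this example.

```python
# Illustrative two-branch network combining whole-slide ("global") and
# patch-level ("local") features to predict an IFTA grade. Architecture
# details are assumptions, not the published model.
import torch
import torch.nn as nn

class IFTAGradeModel(nn.Module):
    def __init__(self, num_grades=4):
        super().__init__()
        # Global branch: sees a downsampled view of the whole biopsy core.
        self.global_branch = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Patch branch: shared weights applied to each high-magnification patch.
        self.patch_branch = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.classifier = nn.Linear(32 + 32, num_grades)

    def forward(self, global_img, patches):
        # global_img: (batch, 3, H, W); patches: (batch, n_patches, 3, h, w)
        g = self.global_branch(global_img)
        b, n, c, h, w = patches.shape
        p = self.patch_branch(patches.reshape(b * n, c, h, w))
        p = p.reshape(b, n, -1).mean(dim=1)     # average patch features
        return self.classifier(torch.cat([g, p], dim=1))

# Toy forward pass with random tensors standing in for a digitized biopsy.
model = IFTAGradeModel()
logits = model(torch.randn(2, 3, 256, 256), torch.randn(2, 8, 3, 64, 64))
print(logits.shape)   # (2, 4): one score per assumed IFTA grade
```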

Once validated, Kolachalama believes, AI models that can automatically score the extent of chronic damage in the kidney could serve as second-opinion tools in clinical practice. "Eventually, it may be possible to use this algorithm to study other organ-specific pathologies focused on evaluating fibrosis. Such methods may hold the potential to give more reproducible IFTA readings than readings by nephropathologists," he adds.

Japanese astrophysicist's supercomputer simulations of plasma jets reveal magnetic fields far, far away

Radio telescope images enable a new way to study magnetic fields in galaxy clusters millions of light-years away

For the first time, researchers have observed plasma jets interacting with magnetic fields in a massive galaxy cluster 600 million light-years away, thanks to the help of radio telescopes and supercomputer simulations. The findings can help clarify how such galaxy clusters evolve.

Galaxy clusters can contain up to thousands of galaxies bound together by gravity. Abell 3376 is a huge cluster forming as a result of a violent collision between two sub-clusters of galaxies. Very little is known about the magnetic fields that exist within this and similar galaxy clusters.

"It is generally difficult to directly examine the structure of intracluster magnetic fields," says Nagoya University astrophysicist Tsutomu Takeuchi, who was involved in the research. "Our results clearly demonstrate how long-wavelength radio observations can help explore this interaction." A black hole (marked by the red x) at the centre of galaxy MRC 0600-399 emits a jet of particles that bends into a "double-scythe" T-shape that follows the magnetic field lines at the galaxy subcluster's boundary.

An international team of scientists has been using the MeerKAT radio telescope in the Northern Cape of South Africa to learn more about Abell 3376's huge magnetic fields. One of the telescope's very high-resolution images revealed something unexpected: plasma jets emitted by a supermassive black hole in the cluster bend to form a unique T-shape as they extend outward as far as 326,156 light-years (about 100 kiloparsecs). The black hole is in galaxy MRC 0600-399, which lies near the centre of Abell 3376.

The team combined their MeerKAT radio telescope data with X-ray data from the European Space Agency's XMM-Newton space telescope to find that the plasma jets bend at the boundary of the subcluster in which MRC 0600-399 exists.

"This told us that the plasma jets from MRC 0600-399 were interacting with something in the heated gas, called the intracluster medium, that exists between the galaxies within Abell 3376," explains Takeuchi.

To figure out what was happening, the team conducted 3D 'magnetohydrodynamic' simulations using one of the world's most powerful supercomputers, ATERUI II, located at the National Astronomical Observatory of Japan.

The simulations showed that the jet streams emitted by MRC 0600-399's black hole eventually reach and interact with magnetic fields at the border of the galaxy subcluster. The jet stream compresses the magnetic field lines and moves along them, forming the characteristic T-shape.

"This is the first discovery of an interaction between cluster galaxy plasma jets and intracluster magnetic fields," says Takeuchi.

An international team has just begun construction of what is planned to be the world's largest radio telescope, called the Square Kilometre Array (SKA).

"New facilities like the SKA are expected to reveal the roles and origins of cosmic magnetism and even to help us understand how the universe evolved," says Takeuchi. "Our study is a good example of the power of radio observation, one of the last frontiers in astronomy."

Michigan physicist suggests a fix to the cosmological cornerstone Hubble constant

More than 90 years ago, astronomer Edwin Hubble observed the first hint of the rate at which the universe expands, called the Hubble constant.

Almost immediately, astronomers began arguing about the actual value of this constant, and over time, realized that there was a discrepancy in this number between early universe observations and late universe observations.

Early in the universe's existence, light moved through a plasma (there were no stars yet), and from oscillations in that plasma similar to sound waves, scientists deduced that the Hubble constant was about 67 kilometers per second per megaparsec. This means the universe expands about 67 kilometers per second faster for every 3.26 million light-years of distance.

Image caption: Pictured is the Type Ia supernova 1994D in galaxy NGC 4526. The supernova is the bright spot in the lower left corner of the image.
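As a quick check of the unit conversion described above, here is a small, purely illustrative Python snippet (not part of the study): saying the Hubble constant is 67 kilometers per second per megaparsec is the same as saying recession speed grows by 67 kilometers per second for every 3.26 million light-years of distance, since one megaparsec is about 3.26 million light-years.

```python
# Back-of-the-envelope conversion: km/s per megaparsec vs. km/s per
# million light-years. Numbers here are just the standard unit factors.
LY_PER_PARSEC = 3.26156            # light-years in one parsec
ly_per_mpc = LY_PER_PARSEC * 1e6   # light-years in one megaparsec

H0 = 67.0                          # km/s per megaparsec
print(f"1 Mpc = {ly_per_mpc:.3g} light-years")
print(f"Recession speed grows by {H0} km/s for every "
      f"{ly_per_mpc / 1e6:.2f} million light-years of distance")
```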

But this observation differs when scientists look at the universe's later life after stars were born and galaxies formed. The gravity of these objects causes what's called gravitational lensing, which distorts light between a distant source and its observer.

Other phenomena in this late universe include extreme explosions and events related to the end of a star's life. Based on these later-life observations, scientists calculated a different value, around 74 kilometers per second per megaparsec. This discrepancy is called the Hubble tension.

Now, an international team including a University of Michigan physicist has analyzed a database of more than 1,000 supernova explosions, supporting the idea that the Hubble constant might not actually be constant.

Instead, it may change based on the expansion of the universe, growing as the universe expands. This explanation likely requires new physics to explain the increasing rate of expansion, such as a modified version of Einstein's gravity.

The team's results are published in The Astrophysical Journal.

"The point is that there seems to be a tension between the larger values for late universe observations and lower values for early universe observation," said Enrico Rinaldi, a research fellow in the U-M Department of Physics. "The question we asked in this paper is: What if the Hubble constant is not constant? What if it actually changes?"

The researchers used a dataset of supernovae, spectacular explosions that mark the final stage of a star's life. Specifically, the researchers were looking at Type Ia supernovae, which shine with a characteristic, predictable luminosity.

These supernovae were used to discover that the expansion of the universe is accelerating, Rinaldi said, and they are known as "standard candles," like a series of lighthouses that all use the same lightbulb. If scientists know their luminosity, they can calculate their distance by observing their apparent brightness in the sky.
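The arithmetic behind the standard-candle idea is the inverse-square law: the measured flux falls off as the square of the distance, so an assumed luminosity plus a measured flux yields a distance. The sketch below uses hypothetical numbers chosen only for illustration, not data from the study.

```python
# Standard-candle distance from the inverse-square law:
# F = L / (4 * pi * d^2), so d = sqrt(L / (4 * pi * F)).
import math

def standard_candle_distance(luminosity_watts, flux_watts_per_m2):
    """Distance in meters from an assumed luminosity and a measured flux."""
    return math.sqrt(luminosity_watts / (4.0 * math.pi * flux_watts_per_m2))

L_PEAK = 1e36    # assumed peak luminosity of a Type Ia supernova, watts
flux = 1e-14     # hypothetical measured flux at the telescope, W/m^2

d_m = standard_candle_distance(L_PEAK, flux)
print(f"distance ~ {d_m / 9.461e15:.2e} light-years")
```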

Next, the astronomers use what's called the "redshift" to calculate how the universe's rate of expansion might have increased over time. Redshift is the name of the phenomenon that occurs when light stretches as the universe expands.

The essence of Hubble's original observation is that the farther an object is from the observer, the more its light's wavelength is stretched, like a Slinky tacked to a wall that stretches as you walk away holding the other end. Redshift and distance are related.

In their analysis, the researchers separated these stars into bins based on intervals of redshift. They placed the stars at one interval of distance in one "bin," then an equal number of stars at the next interval of distance in another bin, and so on. The closer the bin is to Earth, the younger the stars are.

In Rinaldi's team's study, each bin of stars has a fixed reference value of redshift. By comparing the redshifts across bins, the researchers can extract the Hubble constant for each of the different bins.
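The sketch below illustrates the binning idea with synthetic numbers (the actual analysis used more than 1,000 observed Type Ia supernovae and a full cosmological fit): for nearby objects, Hubble's law v ≈ H0 × d together with v ≈ c × z lets one estimate the constant separately in each redshift bin.

```python
# Hedged, synthetic illustration of estimating the Hubble constant in bins
# of distance/redshift; not the team's analysis or data.
import numpy as np

C_KM_S = 299_792.458                       # speed of light, km/s
rng = np.random.default_rng(1)

# Synthetic "catalog": distances in Mpc and redshifts generated with a
# toy Hubble constant that drifts slightly with distance, plus noise.
distance_mpc = np.sort(rng.uniform(20, 400, size=1000))
true_H0 = 67.0 + 0.02 * distance_mpc       # purely illustrative drift
redshift = true_H0 * distance_mpc / C_KM_S * (1 + rng.normal(0, 0.02, 1000))

# Split the supernovae into bins with equal numbers of objects, then
# estimate H0 = v / d within each bin.
n_bins = 5
for bin_idx, indices in enumerate(np.array_split(np.arange(1000), n_bins)):
    v = C_KM_S * redshift[indices]                   # km/s
    H0_bin = np.mean(v / distance_mpc[indices])      # km/s/Mpc
    print(f"bin {bin_idx}: mean distance {distance_mpc[indices].mean():6.1f} Mpc, "
          f"H0 ~ {H0_bin:5.1f} km/s/Mpc")
```

If the constant truly were constant, each bin would return roughly the same value; a systematic trend across bins is the kind of signal the paragraph below describes.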

"If it's a constant, then it should not be different when we extract it from bins of different distances. But our main result is that it actually changes with distance," Rinaldi said. "The tension of the Hubble constant can be explained by some intrinsic dependence of this constant on the distance of the objects that you use."

Additionally, the researchers found that their analysis of the Hubble constant changing with redshift allows them to smoothly "connect" the value of the constant from early universe probes with the value from late universe probes, Rinaldi said.

"The extracted parameters are still compatible with the standard cosmological understanding that we have," he said. "But this time they just shift a little bit as we change the distance, and this small shift is enough to explain why we have this tension."

The researchers say there are several possible explanations for this apparent change in the Hubble constant--one being the possibility of observational biases in the data sample. To help correct for potential biases, astronomers are using Hyper Suprime-Cam on the Subaru Telescope to observe fainter supernovae over a wide area. Data from this instrument will increase the sample of observed supernovae from remote regions and reduce the uncertainty in the data.