LATEST

NIH award to propel biomedical informatics, clinical, and animal studies

Researchers from Case Western Reserve University School of Medicine and collaborators have received a five-year, $2.8 million grant from the National Institute on Aging to identify FDA-approved medications that could be repurposed to treat Alzheimer's disease. The award enables the researchers to develop supercomputer algorithms that search existing drug databases, and to test the most promising drug candidates using patient electronic health records and Alzheimer's disease mouse models.

Rong Xu, PhD, is the principal investigator on the new award and an associate professor of biomedical informatics in the Department of Population and Quantitative Health Sciences at Case Western Reserve University School of Medicine. The project builds on Xu's work developing DrugPredict, a computer program that connects drug characteristics with information about how they may interact with human proteins in specific diseases, such as Alzheimer's. The program has already helped researchers identify new indications for old drugs.

"We will use DrugPredict, but the scope of this project is much more ambitious," Xu said. Xu plans to develop new algorithms as well as apply DrugPredict to Alzheimer's, building a publicly available database of putative Alzheimer's drugs in the process. The database will include drugs that have potentially beneficial mechanisms of action and that are highly likely to cross the blood-brain barrier.

The blood-brain barrier has been a major obstacle in drug discovery for brain disorders, including Alzheimer's disease. The protective membrane surrounds the brain to keep out foreign objects--like microbes--but can also keep out beneficial drugs. "Finding drugs that can pass the blood-brain barrier is the 'holy grail' for neurological drug discovery," Xu said. "With this award, we will develop novel machine-learning and artificial intelligence algorithms to predict whether chemicals can pass the blood-brain barrier and whether they may be effective in treating Alzheimer's disease."
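As a rough illustration of what such a predictor can look like (a minimal sketch only, not the team's DrugPredict pipeline; the descriptor set and toy data below are assumptions chosen for demonstration), a classifier can be trained on simple molecular properties and asked whether a new compound is likely to cross the barrier:

    # Minimal sketch of a blood-brain-barrier permeability classifier.
    # Hypothetical descriptors and toy labels for illustration only; a real
    # model would be trained on curated permeability data for known drugs.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    # Columns: molecular weight, logP, polar surface area, H-bond donors
    X = np.array([
        [180.2, 1.4,  63.6, 2],
        [300.5, 3.1,  45.0, 1],
        [450.9, 0.2, 120.3, 5],
        [250.3, 2.7,  30.1, 0],
        [500.6, 4.5, 140.8, 6],
        [150.1, 2.0,  20.4, 1],
    ])
    y = np.array([1, 1, 0, 1, 0, 1])  # 1 = crosses the barrier, 0 = does not

    model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
    print(model.predict([[320.4, 2.2, 55.0, 1]]))  # hypothetical new compound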

The research team includes XiaoFeng Zhu, PhD, professor of population and quantitative health sciences at Case Western Reserve University School of Medicine, and David Kaelber, MD, PhD, chief medical informatics officer at MetroHealth. Together, they will work with Xu's research group to validate repurposed candidate drugs with computer-simulated clinical trials involving electronic health records from more than 53 million patients nationwide. Xu will also team up with the award's co-principal investigator, Riqiang Yan, PhD, vice chair of neurosciences at Cleveland Clinic's Lerner Research Institute, to study the most promising drug candidates in mice genetically modified to have Alzheimer's-like disease.

The new award could lead to human studies testing alternative medications for Alzheimer's disease. "The unique and powerful strength of our project is our ability to seamlessly combine novel computational predictions, clinical corroboration, and experimental testing," Xu said. "This approach allows us to rapidly identify innovative drug candidates that may work in real-world Alzheimer's disease patients. We anticipate the findings could then be expeditiously translated into clinical trials."

Xu was invited to speak about the newly funded project at the National Institutes of Health Alzheimer's Research Summit this March. She will also serve as a panelist on Emerging Therapeutics at the conference.

CAPTION The schematic figure illustrates the concept and behavior of magnetoresistance. The spins are generated in topological insulators. Those at the interface between ferromagnet and topological insulators interact with the ferromagnet and result in either high or low resistance of the device, depending on the relative directions of magnetization and spins.

University of Minnesota researchers demonstrate the existence of a new kind of magnetoresistance involving topological insulators

From magnetic tapes and floppy disks to computer hard disk drives, magnetic materials have been storing our electronic information, along with our valuable knowledge and memories, for well over half a century.

More recently, a phenomenon known as magnetoresistance, the tendency of a material to change its electrical resistance when an externally applied magnetic field or its own magnetization changes, has found success in hard disk drive read heads, magnetic field sensors and the rising star of memory technologies, magnetoresistive random access memory.
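For reference, the size of the effect is conventionally quoted as a ratio, MR = (R_high - R_low) / R_low, where R_high and R_low are the device resistances in the two magnetic states; the larger the ratio, the easier it is to tell the two states apart when reading a stored bit.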

A new discovery, led by researchers at the University of Minnesota (Seymour Cray's alma mater), demonstrates the existence of a new kind of magnetoresistance involving topological insulators that could result in improvements in future computing and computer storage. The details of their research are published in the most recent issue of the scientific journal Nature Communications.

"Our discovery is one missing piece of the puzzle to improve the future of low-power computing and memory for the semiconductor industry, including brain-like computing and chips for robots and 3D magnetic memory," said University of Minnesota Robert F. Hartmann Professor of Electrical and Computer Engineering Jian-Ping Wang, director of the Center for Spintronic Materials, Interfaces, and Novel Structures (C-SPIN) based at the University of Minnesota and co-author of the study.

Emerging technology using topological insulators

While magnetic recording still dominates data storage applications, magnetoresistive random access memory is gradually finding its place in computing memory. Unlike hard disk drives, with their mechanically spinning disks and swinging heads, these memories are solid-state chips of the kind you would find soldered onto circuit boards in a computer or mobile device.

Recently, a group of materials called topological insulators has been found to further improve the writing energy efficiency of magnetoresistive random access memory cells in electronics. However, the new device geometry demands a new magnetoresistance phenomenon to accomplish the read function of the memory cell in 3D systems and networks.

Following the recent discovery of unidirectional spin Hall magnetoresistance in conventional metal bilayer material systems, researchers at the University of Minnesota collaborated with colleagues at Pennsylvania State University and demonstrated for the first time the existence of such magnetoresistance in topological insulator-ferromagnet bilayers.

The study confirms the existence of such unidirectional magnetoresistance and reveals that adopting topological insulators, compared to heavy metals, doubles the magnetoresistance performance at 150 Kelvin (-123.15 Celsius). From an application perspective, this work provides the missing piece of the puzzle for a proposed 3D, cross-bar type computing and memory device involving topological insulators by supplying a read function that was previously missing or very inconvenient to implement.

In addition to Wang, researchers involved in this study include Yang Lv, Delin Zhang and Mahdi Jamali from the University of Minnesota Department of Electrical and Computer Engineering and James Kally, Joon Sue Lee and Nitin Samarth from Pennsylvania State University Department of Physics.

This research was funded by the Center for Spintronic Materials, Interfaces and Novel Architectures (C-SPIN) at the University of Minnesota, a Semiconductor Research Corporation program sponsored by the Microelectronics Advanced Research Corp. (MARCO) and the Defense Advanced Research Projects Agency (DARPA).

To read the full research paper entitled "Unidirectional spin-Hall and Rashba-Edelstein magnetoresistance in topological insulator-ferromagnet layer heterostructures," visit the Nature Communications website.

Starboard Value, the largest shareholder of Mellanox Technologies, with an ownership interest of approximately 10.7% of the Company's outstanding shares, has delivered a letter to Eyal Waldman, Mellanox's President and Chief Executive Officer, and the Mellanox Board of Directors. Starboard's letter attributed Mellanox's underperformance to “weak execution,” “excessive spending” and “missed growth opportunities.” The letter is accompanied by a set of supplemental slides that provide further detail on several key topics.

“We invested in Mellanox because we believe the company is deeply undervalued and there are significant opportunities to create value based on actions within the control of management and the Board,” wrote Peter Feld, a managing member of Starboard. “Mellanox has a highly differentiated product set and a leading market position in a number of key verticals.” However, “Despite an extremely strong product and technology portfolio, Mellanox has been one of the worst performing semiconductor companies for an extended period of time,” Feld wrote.

The full text of Starboard's letter to the CEO and Board, along with the accompanying slides, has been posted publicly by Starboard.

CAPTION This is Mohammed Alshayeb (left) and Afnan Barri. CREDIT Rick Hellman

With global warming an ever-present worry, renewable energy, particularly solar power, is a burgeoning field. Now, two doctoral students in the University of Kansas School of Architecture & Design (Arc/D) have demonstrated methods of optimizing the capture of sunlight in experiments at the Center for Design Research (CDR).

Green-roof boost

Mohammed Alshayeb started by asking himself what might be done to boost the performance of solar panels. "The efficiency of a photovoltaic panel is measured under standard testing conditions - at 77 degrees Fahrenheit," he said. "Every degree that the temperature increases decreases performance."
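To put a rough number on that (a typical figure for crystalline-silicon modules, not one from Alshayeb's study): panels commonly lose on the order of 0.3 to 0.5 percent of their rated power for every degree Celsius the cells run above the 25 C (77 F) test condition. Using the midpoint, P = P_rated x [1 - 0.004 x (T_cell - 25 C)] would put a panel operating at 45 C roughly 8 percent below its rated output.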

Alshayeb wondered if there was a way to "extract the heat out of the panels" when the temperature rises above 77. Because most solar panels are installed on building roofs, Alshayeb decided to compare the effects of three different types of roof materials - highly reflective (i.e., white), conventional (black) and vegetated (green) - on the panels' performance.

The CDR roof is mostly covered with sedum, planted in trays. So Alshayeb established his test bed there, installing a solar panel monitoring system over the green roof as well as over nearby white and black portions. He also installed temperature, humidity and light sensors and a weather station to record conditions like wind speed. The sensors made recordings every five minutes for a year, and Alshayeb then analyzed the data.

What he found was that, contrary to industry practice, which favors white roofs over black, white roofs actually slightly decreased the efficiency of the solar panels because of the heat they reflected up toward the panels. Overall, though, the high-reflective and conventional roof materials were not significantly different from one another. Panels installed over the green roof performed best, generating an average of 1.4 percent more energy than those over the white and black roofs.

"There is a lot of research in this area, but nothing as comprehensive as he has done," said Alshayeb's faculty adviser, Associate Professor of Architecture Jae D. Chang. "The next step is to see the effect of increasing the height of the panel over the roof."

Bending light

Another of Chang's students, Afnan Barri, wanted to see whether she could improve the performance of light shelves. A traditional light shelf is a fixed, horizontally mounted plane that can be placed either outside, inside or on both sides of a window in order to reflect and redirect sunlight inside a building. Light shelves can thus reduce the use of artificial lighting and electricity.

Traditional, fixed light-shelf systems have limited effectiveness, as they are only capable of functioning while the angle of the sun to the earth is just right. Previous experiments have shown that movable light shelves and ones with curved surfaces can diffuse sunlight with greater efficiency than traditional fixed, flat systems. This is where Barri's idea of a Dynamic Thermal-Adaptive Curved Lightshelf (DTACL) came about. She thought: "What if there were a system that could combine all these methods to enhance the delivery of natural light into buildings throughout the day without the use of mechanical and electrical controls, and unlike existing movable systems?"

Her project includes supercomputer simulations and a field experiment to collect a year's worth of data on the performance of the DTACL system across different weather conditions on the KU campus. She built four refrigerator-sized experimental rooms fitted with sensors and light shelves and placed them on the lawn of the CDR. Three of the rooms have fixed light shelves in various configurations, while one, the DTACL, uses an adaptive composite material called Thermadapt, invented by Ronald P. Barrett and commercialized by a company he runs with his son, KU Professor of Engineering Ron Barrett-Gonzalez. Thermadapt changes shape in response to heat and sunlight, curving upward. When it cools, it flattens back out.

Barri theorized that the DTACL system would transfer light inside a building more efficiently than the fixed systems, and her initial results support that prediction.

"I am still in the process of collecting, comparing and analyzing these data," she said. "However, based on a two-month pilot study and computer simulations, the indoor light intensity of the DTACL system is twice as great as the intensity of a fixed, traditional light shelf.

"I'd like to take it overseas and perform an experiment like this in more extreme temperatures," said the native of Saudi Arabia.

Michael Schwarz, Moritz Lipp and Daniel Gruss (from left) from TU Graz were chiefly involved in the recent discovery of devastating vulnerabilities in computer processors.

Researchers with Alphabet Inc's Google Project Zero, in conjunction with an international team including TU Graz in Austria (Nikola Tesla's alma mater), have revealed two new vulnerabilities in computer processors: Meltdown and Spectre. PCs, servers and cloud services are affected. A software patch could help, but the patch can bleed off system performance by as much as 30 percent. For supercomputers, this raises both security and performance concerns: patching systems could require significant expenditures of staff hours and could eat into critically needed processor resources.

Around the turn of the year, speculation was rife about new, serious vulnerabilities that could affect all modern processors. Now it is official: two newly discovered exploits, Meltdown and Spectre, could allow unauthorized users to gain direct access to kernel memory, at the heart of computer systems. The issue was discovered by a 10-strong international research team in which Graz University of Technology's Institute of Applied Information Processing and Communications played a central role. Both exploits take advantage of a central operating principle of fast processors. Intel, AMD and ARM chips are chiefly affected.

“Meltdown is a very simple exploit – only four lines of code are needed to gain access,” explained Moritz Lipp, Michael Schwarz and Daniel Gruss from TU Graz. “Spectre is significantly more labour-intensive, and consequently more difficult to protect against. It uses the code itself to trick the system into giving up its secrets.” The vulnerabilities affect private computers as well as most server infrastructure and cloud-based services currently in use.

In order for computer systems to work faster, modern processors perform calculations in parallel rather than sequentially. In parallel to lengthy tasks, the processor attempts to predict the next steps that will be required and prepare for them. “For performance reasons, at that point no check is made as to whether the program accessing data actually has permission to do so,” the Graz researchers explained. If a predicted step is not required, or if permissions are not present, the processor discards the preparations it has made. This preparation phase can be exploited to read data from the kernel – for example passwords saved in commonly used browsers.
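A toy software model of that idea (a conceptual sketch only; the real exploits manipulate speculative execution and CPU cache timing, none of which can be reproduced in a few lines of Python) looks roughly like this: a secret-dependent memory access leaves a trace that survives even after the processor discards its speculative work, and the attacker reads the trace back.

    # Conceptual model of the cache side channel behind Meltdown/Spectre.
    # Pure software simulation: the set below stands in for which cache
    # lines are "warm" after the processor's discarded speculative work.
    secret = 42                # byte the attacker has no permission to read
    probe_touched = set()      # stands in for the state of the CPU cache

    def transient_access():
        # The processor transiently uses the secret before the permission
        # check discards the result -- but the access pattern remains.
        probe_touched.add(secret)

    transient_access()

    # The attacker "times" access to all 256 probe slots; the warm one
    # (modelled here as set membership) reveals the secret byte.
    recovered = next(b for b in range(256) if b in probe_touched)
    print(recovered)           # -> 42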

Patch from Graz protects against Meltdown exploit

KAISER, a patch developed by the Graz researchers at the institute, is designed to help close the vulnerabilities. Because the attacks target hardware behavior but are carried out via software, developers from the major IT companies have adapted and further developed the Graz proposal and are delivering patches with their latest security updates. “The update affects the central functions of fast processors, and could make a difference especially in terms of speed,” explained Gruss, Lipp and Schwarz. “Nevertheless, we would appeal to all users to install these updates. The largest providers of cloud and server solutions will implement them in the coming days.” Manufacturers still have work to do to solve the problem in the hardware itself – especially since the patch is effective against Meltdown, but not against the Spectre exploit.

The research was done in the framework of the ERC-funded project “Sophia”. ERC grant awardee Stefan Mangard of TU Graz's Institute of Applied Information Processing and Communications played a central role.

CAPTION SMEAR II is a station for measuring environmental data in Hyytiälä, Finland. CREDIT Juho Aalto

We also need to share our data. So says one of the world's most prominent geoscientists, Markku Kulmala, professor of physics at the University of Helsinki, Finland, and head of the Aerosol and Haze Laboratory at the Beijing University of Chemical Technology, China.

Environmental challenges such as climate change, water and food security, and urban air pollution are all interlinked, yet each is studied separately. That is no longer a sustainable situation for anybody. To tackle this, professor Markku Kulmala calls for continuous, comprehensive monitoring of interactions between the planet's surface and atmosphere in his article published in Nature on January 4, 2018.

In his article "Build a global Earth observatory", he draws on his long experience of collecting environmental data. He has built stations, not just one, and probably the most impressive of them, SMEAR II (Station for Measuring Ecosystem-Atmosphere Relationships), sits in the boreal forests of Finland, showing how a well-rounded set of environmental measurements can be obtained.

Scaled up, the answer is a global Earth observatory: 1,000 or more well-equipped ground stations around the world that track environments and key ecosystems fully and continuously. Data from these stations would be linked to data from satellite-based remote sensing, laboratory experiments and supercomputer models.

"Incomplete coverage from ground stations is the main limit to observations of Earth's conditions. Satellites can monitor some compounds, such as CO2, ozone and aerosols, continuously and almost planet-wide. But they cannot resolve processes or fluxes, or trace the hundreds more compounds of interest. Satellite data must be 'ground-truthed'," professor Kulmala says.

This global observatory of 1,000 super stations needs to be established soon, within 10-15 years.

"The costs would be around €10 million (US$11.8 million) to €20 million per station, which can be compared to the building costs of the Large Hadron Collider at CERN, Geneva, Switzerland, or that of US President Donald Trump's proposed Mexican wall."

Nevertheless, a shift in how environmental data are collected and disseminated is needed, there is no question about that.

"There is a scientific interest, as well, in this data," professor Markku Kulmala says, "the researchers could find new mechanisms and feedback loops in this coherent data set." 

Nucleic acid sequencing methods, which determine the order of nucleotides in DNA fragments, are rapidly progressing. These processes yield large quantities of sequence data—some of which is dynamic—that help researchers understand how and why organisms function as they do. Sequencing also benefits epidemiological studies, such as the identification, diagnosis, and treatment of genetic and/or contagious diseases. Advanced sequencing technologies reveal valuable information about the time evolution of pathogen sequences. Because researchers can estimate how a mutation behaves under the pressure of natural selection, they are thus able to predict the impact of each mutation—in terms of survival and propagation—on the fitness of the pathogen in question. These predictions lend insight to infectious disease epidemiology, pathogen evolution, and population dynamics.

In a paper published earlier this month in the SIAM Journal on Applied Mathematics, Ryosuke Omori and Jianhong Wu develop an inductive algorithm to study site-specific nucleotide frequencies using a multi-strain susceptible-infective-removed (SIR) model. A SIR model is a simple compartmental model that places each individual in a population at a given time into one of the three aforementioned categories to compute the theoretical number of people affected by an infectious disease. The authors use their algorithm to calculate Tajima’s D, a popular statistical test that measures natural selection at a specific site by analyzing differences in a sample of sequences from a population. In a non-endemic situation, Tajima’s D can change over time. Investigating the time evolution of Tajima’s D during an outbreak allows researchers to estimate mutations relevant to pathogen fitness. Omori and Wu aim to understand the impact of disease dynamics on Tajima’s D, thus leading to a better understanding of a mutation’s pathogenicity, severity, and host specificity.
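For reference, in its simplest textbook form (not the multi-strain model of the paper itself), the SIR compartments evolve according to dS/dt = -bSI, dI/dt = bSI - gI and dR/dt = gI, where b is the transmission rate and g the recovery rate.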

The sign of Tajima’s D is determined by both natural selection and population dynamics. “Tajima's D equals 0 if the evolution is neutral — no natural selection and a constant population size,” Omori said. “A nonzero value of Tajima's D suggests natural selection and/or change in population size. If no natural selection can be assumed, Tajima's D is a function of the population size. Hence, it can be used to estimate time-series changes in population size, i.e., how the epidemic proceeds.”
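For readers who want to see the statistic itself, the sketch below computes Tajima's D for a set of aligned sequences using the standard formula from Tajima's 1989 paper (a generic implementation for illustration, not the authors' inductive algorithm):

    # Tajima's D for aligned sequences (strings of equal length), using the
    # standard normalization constants from Tajima (1989).
    from itertools import combinations

    def tajimas_d(sequences):
        n = len(sequences)                     # sample size
        sites = range(len(sequences[0]))
        # S: number of segregating (polymorphic) sites
        S = sum(1 for j in sites if len({s[j] for s in sequences}) > 1)
        if S == 0:
            return 0.0
        # pi: mean number of pairwise nucleotide differences
        pairs = list(combinations(sequences, 2))
        pi = sum(sum(a != b for a, b in zip(x, y)) for x, y in pairs) / len(pairs)
        a1 = sum(1.0 / i for i in range(1, n))
        a2 = sum(1.0 / i ** 2 for i in range(1, n))
        b1 = (n + 1) / (3.0 * (n - 1))
        b2 = 2.0 * (n ** 2 + n + 3) / (9.0 * n * (n - 1))
        c1 = b1 - 1.0 / a1
        c2 = b2 - (n + 2) / (a1 * n) + a2 / a1 ** 2
        e1 = c1 / a1
        e2 = c2 / (a1 ** 2 + a2)
        # D compares pi with the neutral expectation S/a1, scaled by its variance
        return (pi - S / a1) / ((e1 * S + e2 * S * (S - 1)) ** 0.5)

    print(tajimas_d(["ACGTACGT", "ACGTACGA", "ACCTACGT", "ACGTATGT"]))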

Differential equations, which model the rates of change of the numbers of individuals in each model compartment, can describe population dynamics. In this case, the population dynamics of hosts infected with the strain carrying a given sequence are modeled by a set of differential equations for that sequence, which include terms describing the mutation rate from one sequence to another. When setting up their multi-strain SIR model, Omori and Wu assume that the population dynamics of the pathogen are proportional to the disease dynamics, i.e., the number of pathogens is proportional to the number of infected hosts. This assumption allows the value of Tajima's D to change.
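Schematically (a generic form of such a system, not necessarily the paper's exact formulation), the infected class carrying sequence i then follows dI_i/dt = bSI_i - gI_i + sum_j (m_ji I_j - m_ij I_i), where m_ij is the rate at which sequence i mutates into sequence j.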

In population genetics, researchers believe that the sign of Tajima’s D is affected by population dynamics. However, the authors show that in the case of a SIR deterministic model, Tajima’s D is independent of the disease dynamics (specifically, independent of the parameters for disease transmission rate and disease recovery rate). They also observe that while Tajima’s D is often negative during an outbreak’s onset, it frequently becomes positive with the passage of time. “The negative sign does not imply an expansion of the infected population in a deterministic model,” Omori said. “We also found the dependence of Tajima's D on the disease transmission dynamics can be attributed to the stochasticity of the transmission dynamics at the population level. This dependence is different from the aforementioned existing assumption about the relation between population dynamics and the sign of Tajima's D.”

Ultimately, Omori and Wu prove that Tajima’s D in a deterministic SIR model is completely determined by mutation rate and sample size, and that the time evolution of an infectious disease pathogen’s genetic diversity is fully determined by the mutation rate. “This work revealed some dependence of Tajima's D on the (disease transmission dynamics) basic reproduction number (R0) and mutation rate,” Omori said. “With the assumption of neutral evolution, we can then estimate mutation rate or R0 from sequence data.”

Given the demand for tools that analyze evolutionary and disease dynamics, the observation that Tajima’s D depends on the stochasticity of the dynamics is useful when estimating epidemiological parameters. For example, if sequences of pathogens are sampled from a small outbreak in a limited host population, then Tajima’s D depends on both the mutation rate and R0; therefore, a joint estimate of these parameters from Tajima’s D is possible. “We are applying this theoretic result to analyze real-world epidemiological data,” Omori said. “We should also see if our approach can be used to investigate non-equilibrium disease dynamics with natural selection.”

Ryosuke Omori was funded by a Japan Society for the Promotion of Science (JSPS) talented expert project, led by Hiroshi Nishiura at Hokkaido University.

This year, Nicholas Reich's Lab is leading an effort to improve influenza forecasting by increasing the collaboration between groups participating in the Centers for Disease Control's flu forecasting challenge.

Research teams, including one led by biostatistician Nicholas Reich at the University of Massachusetts Amherst, are participating in a national influenza forecasting challenge to try to predict the onset, progress and peaks of regional flu outbreaks to aid prevention and control. This year, the Reich Lab is leading an effort to improve the forecasting by increasing the collaboration between groups.

Reich explains, "Every year the Centers for Disease Control host a flu forecasting challenge. It's the most organized and public effort at forecasting any infectious disease anywhere in the world. Our lab is now in our third year of participating, and we find that each year we get a little better and learn a bit more."

"This year, we wanted to take it to the next level, so we worked with other teams year-round to develop a way that our models could work together to make a single best forecast for influenza. This entire effort is public, so anyone can go to the website and see the forecasts."

While this flu season has started earlier than usual in the northeastern and southern U.S. according to the most recent data, the forecasts are still showing a fair amount of uncertainty about how big a season it will be, says Reich. "The holiday season is a notoriously difficult time to forecast because typically fewer people go to the doctor, and yet everyone is traveling around spreading or being exposed to infections such as flu."

Reich and colleagues at UMass Amherst's School of Public Health and Health Sciences collaborate with teams at Carnegie Mellon University, Columbia University and a group at Los Alamos National Laboratory, New Mexico, in a group they have dubbed the "FluSight Network." It issues a new flu season forecast every Monday for public health researchers and practitioners that compares the flu trajectory this year to past years.

In a recent publication, Reich and colleagues state that their aim is to "combine forecasting models for seasonal influenza in the U.S. to create a single ensemble forecast. The central question is, can we provide better information to decision makers by combining forecasting models and specifically, by using past performance of the models to inform the ensemble approach." Reich adds, "We are working closely with our collaborators at the CDC to determine how to improve the timeliness and relevance of our forecasts."

To prepare for this flu season, he and colleagues spent many hours designing a standard structure that each team needed to use when submitting models. This allowed for comparison of methods over the past seven years of flu data in the U.S. They also conducted a cross-validation study of data from the past seven flu seasons to compare five different methods for combining models into a single ensemble forecast. They found that four of their collaborative ensemble methods had higher average scores than any of the individual models.
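One simple way such a performance-weighted ensemble can be built (an illustrative sketch with made-up numbers, not the FluSight Network's actual method) is to turn each model's historical score into a weight and mix the component probability distributions:

    # Illustrative performance-weighted ensemble of probabilistic forecasts.
    # The forecasts and past scores below are hypothetical placeholders.
    import numpy as np

    # Each row: one model's predictive distribution over "weeks until peak" bins.
    component_forecasts = np.array([
        [0.10, 0.30, 0.40, 0.15, 0.05],
        [0.05, 0.25, 0.45, 0.20, 0.05],
        [0.20, 0.35, 0.25, 0.15, 0.05],
    ])

    # Hypothetical mean log scores from cross-validation over past seasons.
    past_log_scores = np.array([-1.8, -1.5, -2.1])

    # Softmax of past skill -> normalized weights (better score, more weight).
    weights = np.exp(past_log_scores - past_log_scores.max())
    weights /= weights.sum()

    ensemble = weights @ component_forecasts   # weighted mixture of distributions
    print(ensemble.round(3), ensemble.sum())   # still a valid probability vector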

The team is now submitting forecasts from its best-performing model, posting them once a week this season to the CDC's 2017-18 FluSight Challenge. Reich estimates that about 20 teams are participating in the CDC challenge nationwide this year, producing about 30 different models. Each model forecasts the onset of the flu season, how it will progress over the coming few weeks, when it will peak, and how intense the peak will be compared to other seasons.

In a heavy flu season, between 5 and 12 percent of doctor's visits are for influenza-like illness, and that number varies regionally in the U.S. This metric is one of the key indicators for the CDC of how severe the flu season is, and it is the measure used in the forecasting challenges.

Reich says, "Certainly for the CDC, there are policy decisions that could be impacted by these forecasts, including the timing of public communication about flu season starting and when to get vaccinated. Models can help with all of that. Also, hospitals often try to have enhanced precautions in place during a certain peak period for the disease. If you do that too early, or for too long, you run the risk of individuals getting tired of taking the extra time to comply with the policies."

Hospital epidemiologists and others responsible for public health decisions do not declare the onset of flu season lightly, Reich says. In hospitals, flu onset - a technical set of symptoms reported to physicians - triggers many extra time-consuming and costly precautions and procedures, such as added gloves, masks and gowns, donning and doffing time, special decontamination procedures, increased surveillance and reduced visitor access. There is also healthcare worker fatigue to consider. Hospitals want to be as effective and efficient as possible in their preparations and response to reduce time and money spent and worker burnout.

The public health effort to improve flu season forecasts is relatively recent, Reich says. "There has been tremendous progress in how we think about infectious disease forecasting in just the last five years," he notes. "If you compare that to something like weather forecasting, which has been going on for decades, we're in the middle of a long process of learning and improvement. Someday we might be able to imagine having a flu forecast on our smart phones that tells us, for example, it's an early season and I'd better get Mom to the clinic to get her vaccination early this year. We're close, but that's not here quite yet."

Credit: NRAO/AUI/NSF: D. Berry

Big data distinguish between different theoretical models

Three months of observations with the National Science Foundation's Karl G. Jansky Very Large Array (VLA) have allowed astronomers to zero in on the most likely explanation for what happened in the aftermath of the violent collision of a pair of neutron stars in a galaxy 130 million light-years from Earth. What they learned means that astronomers will be able to see and study many more such collisions.

On August 17, 2017, the LIGO and VIRGO gravitational-wave observatories combined to locate the faint ripples in spacetime caused by the merger of two superdense neutron stars. It was the first confirmed detection of such a merger and only the fifth direct detection ever of gravitational waves, predicted more than a century ago by Albert Einstein.

The gravitational waves were followed by outbursts of gamma rays, X-rays, and visible light from the event. The VLA detected the first radio waves coming from the event on September 2. This was the first time any astronomical object had been seen with both gravitational waves and electromagnetic waves.

The timing and strength of the electromagnetic radiation at different wavelengths provided scientists with clues about the nature of the phenomena created by the initial neutron-star collision. Prior to the August event, theorists had proposed several ideas -- theoretical models -- about these phenomena. As the first such collision to be positively identified, the August event provided the first opportunity to compare predictions of the supercomputer models to actual observations.

Astronomers using the VLA, along with the Australia Telescope Compact Array and the Giant Metrewave Radio Telescope in India, regularly observed the object from September onward. The radio telescopes showed the radio emission steadily gaining strength. Based on this, the astronomers identified the most likely scenario for the merger's aftermath.

"The gradual brightening of the radio signal indicates we are seeing a wide-angle outflow of material, traveling at speeds comparable to the speed of light, from the neutron star merger," said Kunal Mooley, now a National Radio Astronomy Observatory (NRAO) Jansky Postdoctoral Fellow hosted by Caltech.

The observed measurements are helping the astronomers figure out the sequence of events triggered by the collision of the neutron stars.

The initial merger of the two superdense objects caused an explosion, called a kilonova, that propelled a spherical shell of debris outward. The neutron stars collapsed into a remnant, possibly a black hole, whose powerful gravity began pulling material toward it. That material formed a rapidly-spinning disk that generated a pair of narrow, superfast jets of material flowing outward from its poles.

If one of the jets were pointed directly toward Earth, we would have seen a short-duration gamma-ray burst, like many seen before, the scientists said.

"That clearly was not the case," Mooley said.

Some of the early measurements of the August event suggested instead that one of the jets may have been pointed slightly away from Earth. This model would explain the fact that the radio and X-ray emission were seen only some time after the collision.

"That simple model -- of a jet with no structure (a so-called top-hat jet) seen off-axis -- would have the radio and X-ray emission slowly getting weaker. As we watched the radio emission strengthening, we realized that the explanation required a different model," said Alessandra Corsi, of Texas Tech University.

The astronomers looked to a supercomputer model published in October by Mansi Kasliwal of Caltech, and colleagues, and further developed by Ore Gottlieb, of Tel Aviv University, and his colleagues. In that model, the jet does not make its way out of the sphere of explosion debris. Instead, it gathers up surrounding material as it moves outward, producing a broad "cocoon" that absorbs the jet's energy.

The astronomers favored this scenario based on the information they gathered from using the radio telescopes. Soon after the initial observations of the merger site, the Earth's annual trip around the Sun placed the object too close to the Sun in the sky for X-ray and visible-light telescopes to observe. For weeks, the radio telescopes were the only way to continue gathering big data about the event.

"If the radio waves and X-rays both are coming from an expanding cocoon, we realized that our radio measurements meant that, when NASA's Chandra X-ray Observatory could observe once again, it would find the X-rays, like the radio waves, had increased in strength," Corsi said.

Mooley and his colleagues posted a paper with their radio measurements, their favored scenario for the event, and this prediction online on November 30. Chandra was scheduled to observe the object on December 2 and 6.

"On December 7, the Chandra results came out, and the X-ray emission had brightened just as we predicted," said Gregg Hallinan, of Caltech.

"The agreement between the radio and X-ray data suggests that the X-rays are originating from the same outflow that's producing the radio waves," Mooley said.

"It was very exciting to see our prediction confirmed," Hallinan said. He added, "An important implication of the cocoon model is that we should be able to see many more of these collisions by detecting their electromagnetic, not just their gravitational, waves."

Mooley, Hallinan, Corsi, and their colleagues reported their findings in the scientific journal Nature.

  1. Germany’s Tübingen University physicists are the first to link atoms and superconductors in key step towards new hardware for quantum supercomputers, networks
  2. How great is the influence, risk of social, political bots?
  3. Easter Island had a cooperative community, analysis of giant hats reveals
  4. Study resolves controversy about electron structure of defects in graphene
  5. AI insights could help reduce injuries in construction industry
  6. Supercomputational modeling key to designing supercharged virus-killing nanoparticles
  7. UW's Hyak supercomputer overcomes obstacles in peptide drug development
  8. Brigham and Women's Hospital's Golden findings show potential use of AI in detecting spread of breast cancer
  9. NUS scientist develops new toolboxes for quantum cybersecurity
  10. UCLA chemists synthesize narrow ribbons of graphene using only light, heat
  11. New machine learning algorithm recognizes distinct dolphin clicks in underwater recordings
  12. UB's Pokhriyal uses machine learning framework to create new maps for fighting extreme poverty
  13. SwRI’s Marchi models how massive collisions after moon formation remodeled early Earth
  14. Canadian swarm-based supercomputer simulation strategy proves significantly shorter
  15. German astrophysicist simulates the size of neutron stars on the brink of collapse
  16. Patkar’s new supercomputational framework shows shifting protein networks in breast cancer may alter gene function
  17. Army researchers join international team to defeat disinformation cyberattacks
  18. Johns Hopkins researcher Winslow builds new supercomputer model that sheds light on biological events leading to sudden cardiac death
  19. Italian scientist Baroni creates new method to supercompute heat transfer from optimally short MD simulations
  20. French researchers develop supercomputer models for better understanding of railway ballast
