
In the midst of an unseasonably warm winter in the Pacific Northwest, a comparison of four publicly available climate projections has shown broad agreement that the region will become considerably warmer in the next century if greenhouse gas concentrations in the atmosphere rise to the highest levels projected in the Intergovernmental Panel on Climate Change (IPCC) "business-as-usual" scenario.

In this scenario, carbon dioxide concentrations are projected to continue to rise and to exceed 900 parts per million, more than double today's level of just over 400 parts per million. Annual average global temperatures are projected to rise between 1.5 and 7 degrees Celsius (2.7 to 12.6 degrees Fahrenheit), and precipitation is expected to increase during the winter and decrease in the summer.

To examine projections of future climates in the Northwest, researchers in the College of Forestry at Oregon State University and the U.S. Forest Service obtained outputs from more than 30 climate models, known as general circulation models. These models simulate the Earth's climate at scales that are generally too large to be applied with confidence to local areas, such as the watersheds of small rivers and streams.

The scientists examined four different versions of the model outputs, each one translated for the region with data from weather stations in the Northwest through a process called "downscaling." While the resulting fine-resolution climate projections vary for parts of the Northwest, such as coastal watersheds and mountain crests, the general agreement among them gives scientists increasing confidence in using fine-resolution climate projections for exploring future climate change impacts. The differences among them were no more than 0.3 degrees Celsius (about 0.5 degrees Fahrenheit) for the region.

The results were published this week in the journal Scientific Data.

"From a regional perspective, the differences in projected future changes are minor when you look at how much each projection says climate will change for the business-as-usual scenario," said Yueyang Jiang, lead author and a postdoctoral scientist at OSU. "The climate projections were created using different downscaling methods, but the projected changes in climate among them are similar at the regional scale."

The researchers chose to analyze projections for the recent past as well as for three 29-year periods from 2011 to 2100. Their goal was to characterize the differences to inform and guide scientists and land managers who are evaluating the projected impacts of climate change on local resources.

The fine-resolution climate projections vary in downscaling techniques and in the choice of historically observed weather datasets used to calibrate their calculations. Jiang and his team confirmed that the methods used to downscale each of the models had little to no effect on the data. They showed instead that differences arose from the choice of historical observation datasets used, which vary due to highly variable weather patterns or due to a lack of data in areas where weather stations are far apart.

"These differences become enhanced in areas with strong geographic features, such as the coastline and at the crest of mountain ranges," said John Kim, co-author on the paper and a scientist with the Pacific Northwest Research Station of the U.S. Forest Service.

Nevertheless, Kim added, the analysis reveals "a fairly consistent high-resolution picture of climate change" under the highest greenhouse gas concentration scenario projected by the IPCC. "So, individuals and organizations that are interested in how much climate may change for most parts of the region can use any of the datasets we examined."
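As a concrete illustration of this sort of agreement check, here is a minimal sketch (not the study's code; the dataset names, grid, and values are hypothetical placeholders): compute each projection's regional mean change on a common grid, then take the largest pairwise difference, the kind of summary behind the "no more than 0.3 degrees Celsius" figure above.

```python
# Illustrative only: compare regional mean warming across four
# hypothetical downscaled projections on a common lat/lon grid.
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical end-of-century temperature change fields (degrees C),
# one 100x100 grid per downscaled dataset.
projections = {
    "dataset_A": 4.0 + 0.5 * rng.standard_normal((100, 100)),
    "dataset_B": 4.1 + 0.5 * rng.standard_normal((100, 100)),
    "dataset_C": 3.9 + 0.5 * rng.standard_normal((100, 100)),
    "dataset_D": 4.2 + 0.5 * rng.standard_normal((100, 100)),
}

# Regional mean change per dataset, then the largest pairwise spread.
means = {name: float(grid.mean()) for name, grid in projections.items()}
spread = max(means.values()) - min(means.values())

for name, value in means.items():
    print(f"{name}: {value:.2f} C")
print(f"max regional-mean difference: {spread:.2f} C")
```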

However, the researchers also caution against using only one projection to explore the effects of climate change at specific thresholds, such as how plants and animals might respond to a decrease in days with temperatures below freezing. Scientists interested in such climate effects should use several models, they added.

CAPTION Phagocytosis, or cell eating, is a key feature of eukaryotic cells that is not present in archaea or bacteria. Current genomic data provides hints at how this and other complex processes may have originated. CREDIT © AMNH/S. Thurston

New research disputes that 'eukaryote-like' Asgard archaea are cell-eating

A team of American Museum of Natural History researchers has created a computational model capable of predicting whether or not organisms have the ability to "eat" other cells through a process known as phagocytosis. The model may be a useful tool for large-scale microbe surveys and provides valuable insight into the evolution of complex life on Earth, challenging ideas put forward in recent studies. The model and the researchers' findings are featured in a paper published today in the journal Nature Ecology & Evolution.

"Phagocytosis is a major mechanism of nutritional uptake in many single-celled organisms, and it's vital to immune defenses in a number of living things, including humans," said co-author Eunsoo Kim, an associate curator in the American Museum of Natural History's Division of Invertebrate Zoology. "But lesser known is the idea that phagocytosis dates back some 2 to 3 billion years and played a role in those symbiotic associations that likely started the cascading evolution toward the more diverse and complex life we see on the planet today. Our research provides some hints as to how phagocytosis first arose."

Prokaryotes, a group that includes bacteria and archaea, are microscopic, mostly single-celled organisms with relatively simple internal structure. Eukaryotes, the group that comprises animals, plants, fungi, and protists (representing assemblages of diverse, unrelated lineages such as amoebozoans and green algae), have generally larger cells that are filled with a number of internal components, including the nucleus, where DNA is stored, and energy-generating organelles called mitochondria. Scientific theory holds that between about 2 and 3 billion years ago, a single bacterium merged with an unrelated prokaryotic microbe, leading to the evolution of mitochondria, a key feature of eukaryotic cells. This merger only happened once, and some scientists suggest that the process involved cellular "engulfment." Yet there are no known prokaryotes capable of phagocytosis today. So under what circumstances did the trait arise?

To investigate this long-standing question, the research team used genetic patterns common to phagocytic cells to build a computational model that uses machine learning to predict whether an organism feeds via phagocytosis.

"There's no single set of genes that are strongly predictive of phagocytosis because it's a very complicated process that can involve more than 1,000 genes, and those genes can greatly vary from species to species," said lead author John Burns, a research scientist in the Museum's Sackler Institute for Comparative Genomics." But as we started looking at the genomes of more and more eukaryotes, a genetic pattern emerged, and it exists across diversity, even though it's slightly different in each species. That pattern is the core of our model, and we can use it to very quickly and efficiently predict which cells are likely to be 'eaters' and which are not."

Because many eukaryotes, such as some green algae, can be "picky," only eating under certain conditions, it can be hard to determine whether they are capable of phagocytosis simply by watching them under a microscope. This new model can help microbiologists make a quick assessment, and it may also be valuable in large DNA sequencing projects that involve multiple unknown microbial species, such as surveys of sea water samples.
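As a rough illustration of this kind of genomic classifier (a sketch under stated assumptions, not the authors' actual model), one can train a standard classifier on gene-family presence/absence profiles of organisms whose feeding behaviour is known, then score unannotated genomes. All data below are random placeholders:

```python
# Illustrative only: predict phagocytosis from gene-family
# presence/absence profiles using a generic classifier.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_species, n_gene_families = 200, 1000

# Hypothetical training data: 1 = gene family present, 0 = absent.
X = rng.integers(0, 2, size=(n_species, n_gene_families))
# Hypothetical labels: 1 = known phagocyte, 0 = known non-phagocyte.
y = rng.integers(0, 2, size=n_species)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
# On real data this score would indicate how predictive the gene
# pattern is; on random placeholders it hovers around chance (~0.5).
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())

# Score new, unannotated genomes, e.g. from a sea water survey.
clf.fit(X, y)
new_genomes = rng.integers(0, 2, size=(5, n_gene_families))
print("probability of 'eater':", clf.predict_proba(new_genomes)[:, 1])
```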

The researchers applied the model to a group of "eukaryote-like" microbes called Asgard archaea. In recent years, some researchers have proposed that Asgard microbes might be living relatives of the long-sought-after microbes that merged with the bacterium that became mitochondria. But the researchers' new work finds that these microbes most likely do not use phagocytosis. They also used the gene set developed as part of the model-building process to look more closely at the ancient lineages of archaea and bacteria, and found that, as a whole, neither group on its own has the genes required for phagocytosis. While the new work eliminates one scenario for the birth of mitochondria (that an Asgard archaeon engulfed a bacterium), many other options remain.

"When you tease apart the components of these predictive genes, some have roots in archaea, some have roots in bacteria, and some are only unique to eukaryotes," Kim said. "Our data are consistent with the hypothesis that our cells are a chimera of archaeal and bacterial components, and that the process of phagocytosis arose only after that combination took place. We still have a lot of work to do in this field."

For 2017, Cray reported total revenue of $392.5 million, which compares with $629.8 million in 2016. Net loss for 2017 was $133.8 million, or $3.33 per diluted share, compared to net income of $10.6 million, or $0.26 per diluted share in 2016.  Non-GAAP net loss, which adjusts for selected unusual and non-cash items, was $40.5 million, or $1.01 per diluted share for 2017, compared to non-GAAP net income of $19.9 million, or $0.49 per diluted share in 2016.

Revenue for the fourth quarter of 2017 was $166.6 million, compared to $346.6 million in the fourth quarter of 2016.  Net loss for the fourth quarter of 2017 was $97.5 million, or $2.42 per diluted share, compared to net income of $51.8 million, or $1.27 per diluted share in the fourth quarter of 2016.  Non-GAAP net income was $9.2 million, or $0.22 per diluted share for the fourth quarter of 2017, compared to non-GAAP net income of $56.3 million, or $1.38 per diluted share for the same period in 2016.

The Company’s GAAP Net Loss for the fourth quarter and year ended December 31, 2017 was significantly impacted by both the enactment of the Tax Cuts and Jobs Act of 2017 and by its decision to record a valuation allowance against all of its U.S. deferred tax assets.  The combined GAAP impact totaled $103 million.  These items have been excluded for non-GAAP purposes.

For 2017, overall gross profit margin on a GAAP and non-GAAP basis was 33% and 34%, respectively, compared to 35% on a GAAP and non-GAAP basis for 2016.

Operating expenses for 2017 were $196.4 million, compared to $211.1 million in 2016.  Non-GAAP operating expenses for 2017 were $176.5 million, compared to $199.7 million in 2016.  GAAP operating expenses in 2017 included $8.6 million in restructuring charges associated with our recent workforce reduction.

As of December 31, 2017, cash, investments and restricted cash totaled $147 million.  Working capital at the end of the fourth quarter was $354 million, compared to $373 million at December 31, 2016.

“Despite difficult conditions in our core market we finished 2017 strong, highlighted by several large acceptances at multiple sites around the world, including completing the installation of what is now the largest supercomputing complex in India at the Ministry of Earth Sciences,” said Peter Ungaro, president and CEO of Cray.  “As we shift to 2018, we’re seeing signs of a rebound at the high-end of supercomputing as well as considerable growth opportunities in the coming years.  Supercomputing continues to expand in importance to both government and commercial customers, driving growth and competitiveness across many different disciplines and industries.  As the leader at the high-end of the market, we’re poised to play a key role in this growth and I’m excited about where we’re headed.”

Entangled qubits are sent to measurement devices which output a sequence of zeroes and ones. This pattern heavily depends on the type of measurements performed on individual qubits. If we pick the set of measurements in a peculiar way, entanglement will leave unique fingerprints in the measurement patterns. Credit: Juan Palomino

Quantum entanglement is a key feature of a quantum computer. Yet, how can we verify that a quantum computer indeed incorporates large-scale entanglement? Conventional methods are impractical here, since they require a large number of repeated measurements. Aleksandra Dimić from the University of Belgrade and Borivoje Dakić from the Austrian Academy of Sciences and the University of Vienna have developed a novel method where in many cases even a single experimental run suffices to prove the presence of entanglement. Their surprising results will be published in the online open access journal npj Quantum Information of the Nature Publishing Group.

The ultimate goal of quantum information science is to develop a quantum computer, a fully fledged controllable device which makes use of the quantum states of subatomic particles to store information. As with all quantum technologies, quantum computing is based on a peculiar feature of quantum mechanics, quantum entanglement. The basic units of quantum information, the qubits, need to be correlated in this particular way for the quantum computer to achieve its full potential.

One of the main challenges is to make sure that a fully functional quantum computer is working as anticipated. In particular, scientists need to show that its large number of qubits are reliably entangled. Conventional methods require a large number of repeated measurements on the qubits for reliable verification: the more often a measurement run is repeated, the more certain one can be about the presence of entanglement. Benchmarking entanglement in large quantum systems therefore demands resources and time that make it practically difficult or simply impossible. The main question arises: can we prove entanglement with only a low number of measurement trials?

Now researchers from the University of Belgrade, the University of Vienna and the Austrian Academy of Sciences have developed a novel verification method which requires significantly fewer resources and, in many cases, even only a single measurement run to prove large-scale entanglement with high confidence. For Aleksandra Dimić from the University of Belgrade, the best way to understand this phenomenon is to use the following analogy: "Let us consider a machine which simultaneously tosses, say, ten coins. We manufactured the machine such that it should produce correlated coins. We now want to validate whether the machine produces the anticipated result. Imagine a single trial revealing all coins landing on tails. This is a clear signature of correlations, as ten independent coins have only about a 0.1% chance of all landing on tails simultaneously. From such an event, we certify the presence of correlations with more than 99.9% confidence. This situation is very similar to quantum correlations captured by entanglement." Borivoje Dakić adds: "In contrast to classical coins, qubits can be measured in many, many different ways. The measurement result is still a sequence of zeros and ones, but its structure heavily depends on how we choose to measure the individual qubits. We realized that, if we pick these measurements in a peculiar way, entanglement will leave unique fingerprints in the measured pattern."
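The arithmetic behind the coin analogy is easy to check: the chance that ten fair, independent coins all land tails in one toss is (1/2)^10 = 1/1024, about 0.1%, which is why a single such observation supports correlations with roughly 99.9% confidence:

```python
# Worked check of the coin analogy used above.
p_all_tails = 0.5 ** 10        # 1/1024
confidence = 1 - p_all_tails

print(f"chance of ten tails: {p_all_tails:.4%}")        # ~0.0977%, about 0.1%
print(f"confidence in correlations: {confidence:.1%}")  # ~99.9%
```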

The method promises a dramatic reduction in the time and resources needed for the reliable benchmarking of future quantum devices.

The U.S. Army Research Laboratory (ARL) has selected ICF, a global consulting and digital services provider, as one of two awardees eligible to compete for scientific and engineering support services under a new indefinite delivery, indefinite quantity (IDIQ) contract.

The new Command, Control, Communications, Computers, Combat Systems, Intelligence, Surveillance, and Reconnaissance (C5ISR) IDIQ was awarded by the Computational and Information Sciences Directorate, the principal Army organization for basic and applied research in information sciences, network sciences, battlefield environment and advanced computing and computational sciences. The agreement has a shared ceiling value of $175 million and a possible term of up to eight years.

"ICF's selection for this work further distinguishes us as leaders in a broader spectrum of information sciences," said Randy James, senior vice president for ICF. "It allows ICF to promote the initiatives of ARL's research partners in academia, industry and across the federal government and to continue the lab's global scientific leadership and STEM outreach efforts. Through this effort, ICF will support ARL's goals in achieving global superiority to deliver the next generation of computer and security technologies."

ICF was also awarded its first task order under the IDIQ: an estimated $20 million engagement to perform full-spectrum defensive cyber operations and research and development. The firm will apply its expertise as a Cybersecurity Service Provider to create opportunities where new concepts and technologies can be researched, tested and applied in operational environments, then transitioned to the joint warfighter in support of the nation's defense. The task order has a three-year term, comprising one base year and two one-year options.

For more than two decades, ICF has partnered with ARL to develop new concepts and to research, test and apply new technologies for the nation's defense. In 2017, ICF was re-engaged to support ARL's Cybersecurity Service Program, where the firm is currently supporting the lab's efforts to develop cyber tools and techniques and advance state-of-the-art computer network defense.

ICF's cybersecurity specialists help military, national security and commercial clients build and successfully defend the most aggressively attacked infrastructures on the planet.

CAPTION Rosette Nebula image is based on data obtained as part of the INT Photometric H-Alpha Survey of the Northern Galactic Plane, prepared by Nick Wright, Keele University, on behalf of the IPHAS Collaboration CREDIT Nick Wright, Keele University

A hole at the heart of a stunning rose-like interstellar cloud has puzzled astronomers for decades. But new research, led by the University of Leeds, offers an explanation for the discrepancy between the size and age of the Rosette Nebula's central cavity and that of its central stars.

The Rosette Nebula is located in the Milky Way Galaxy roughly 5,000 light-years from Earth and is known for its rose-like shape and distinctive hole at its centre. The nebula is an interstellar cloud of dust, hydrogen, helium and other ionized gases with several massive stars found in a cluster at its heart.

Stellar winds and ionising radiation from these massive stars affect the shape of the giant molecular cloud. But the cavity observed at the centre of the Rosette Nebula is too small given the age of its central stars.

Through supercomputer simulations, astronomers at Leeds and at Keele University have found that the Nebula likely formed in a thin, sheet-like molecular cloud rather than in a spherical or thick disc-like one, as some photographs may suggest. A thin disc-like structure focusing the stellar winds away from the cloud's centre would account for the comparatively small size of the central cavity.

Study lead author, Dr Christopher Wareing, from the School of Physics and Astronomy, said: "The massive stars that make up the Rosette Nebula's central cluster are a few million years old and halfway through their lifecycle. For the length of time their stellar winds would have been flowing, you would expect a central cavity up to ten times bigger.

"We simulated the stellar wind feedback and formation of the nebula in various molecular cloud models including a clumpy sphere, a thick filamentary disc and a thin disc, all created from the same low density initial atomic cloud.

"It was the thin disc that reproduced the physical appearance - cavity size, shape and magnetic field alignment -- of the Nebula, at an age compatible with the central stars and their wind strengths.

"To have a model that so accurately reproduces the physical appearance in line with the observational data, without setting out to do this, is rather extraordinary.

"We were also fortunate to be able to apply data to our models from the ongoing Gaia survey, as a number of the bright stars in the Rosette Nebula are part of the survey.

"Applying this data to our models gave us new understanding of the roles individual stars play in the Rosette Nebula. Next we'll look at the many other similar objects in our Galaxy and see if we can figure out their shape as well."

The simulations, published today in the Monthly Notices of the Royal Astronomical Society, were run using the Advanced Research Computing centre at Leeds. The nine simulations required roughly half a million CPU hours, the equivalent of 57 years on a standard desktop computer.
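The conversion is straightforward to verify: at 24 x 365 = 8,760 hours per year, half a million single-core CPU hours is roughly 57 years of continuous computation on one desktop core:

```python
# Back-of-the-envelope check of the CPU-hour figure quoted above.
cpu_hours = 500_000
hours_per_year = 24 * 365          # 8,760
print(cpu_hours / hours_per_year)  # ~57.1 years
```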

Martin Callaghan, a member of the Advanced Research Computing team, said: "The fact that the Rosette Nebula simulations would have taken more than five decades to complete on a standard desktop computer is one of the key reasons we provide powerful supercomputing research tools. These tools enabled the simulations of the Rosette Nebula to be done in a matter of a few weeks."

CAPTION Molecular models of the lung cancer drug osimertinib in complex with EGFR. On the left, osimertinib assumes a conformation that allows it to react with Cys797, leading to effective inhibition of the enzyme and to cancer cell death. On the right, the presence within EGFR of a residue able to form additional interactions with osimertinib (as in the L718Q mutant form) prevents its reaction with Cys797, allowing EGFR to work and cancer cells to survive.

Scientists from the Universities of Bristol and Parma, Italy, have used molecular simulations to understand resistance to osimertinib - an anticancer drug used to treat types of lung cancer.

Osimertinib binds tightly to a protein, epidermal growth factor receptor (EGFR), which is overexpressed in many tumours.

EGFR is involved in a pathway that signals for cell proliferation, and so is a target for drugs. Blocking the action of EGFR (inhibiting it) can switch it off, and so is a good way to treat the disease.

Osimertinib is an effective anticancer drug that works in this way. It is used to treat non-small-cell lung cancer (NSCLC), in cases where the cancer cells have a particular (T790M) mutant form of EGFR.

It is a so-called 'third-generation' EGFR inhibitor, which was approved as a cancer treatment in 2017. Osimertinib is a covalent inhibitor: as such, it binds irreversibly to EGFR by forming a chemical bond with it.

Although patients generally respond well to osimertinib, most acquire drug resistance within one year of treatment, so the drug stops working.

Drug resistance arises because the EGFR protein mutates, so that the drug binds less tightly.

One such mutation, called L718Q, was recently discovered in patients in the clinic by the Medical Oncology Unit of the University Hospital of Parma.

In this drug-resistant mutant, a single amino acid is changed. Unlike for other drug-resistant mutants, it was not at all clear how this change stops the drug from binding effectively; that information is potentially crucial for developing new drugs to overcome resistance.

Now, a collaboration between medicinal and computational chemists and clinical oncologists has revealed exactly how subtle changes in the protein target cause drug resistance.

Using a range of advanced molecular simulation techniques, scientists from the Universities of Bristol and Parma, Italy, showed that the structure of the mutant protein changes in a way that stops the drug reacting and binding to it.

Adrian Mulholland, Professor of Chemistry at the University of Bristol, said: "This work shows how molecular simulations can reveal mechanisms of drug resistance, which can be subtle and non-obvious.

"In particular, here we've used combined quantum mechanics/molecular mechanics (QM/MM) methods, which allow us to study chemical reactions in proteins.

"This is crucial in investigating covalent inhibitors, which react with their biological targets, and are the focus of growing interest in the pharmaceutical industry."

His collaborators, Professor Alessio Lodola and Professor Marco Mor of the Drug Design and Discovery group at the University of Parma, added: "It was an exciting experience to work closely with clinical colleagues who identified the mutant, and to help analyse its effects.

"Now the challenge is to exploit this discovery in the development of novel drugs targeting EGFR mutants for cancer treatment in future." 

Data is vital in research that tackles diseases like cancer and heart disease. Now Swansea University's leading role in this field has been recognised again, with news today that it is to become one of six substantive sites of the newly-formed Health Data Research UK (HDR UK), in a strategic partnership with Queen's University Belfast.

The six sites in the HDR project will share an initial investment of £30M for the next 5 years. Swansea University's involvement stems from its world-leading expertise in health informatics. The project will also strengthen the UK's position at the forefront of population data science.

The Wales and Northern Ireland HDR UK site is led by Professor Ronan Lyons at Swansea and Professor Mark Lawler at Queen's. The team will focus on two major research initiatives: Modernising Public Health and Enabling Precision Medicine.

Both partner sites share a vision: to scale up the quantity and impact of research in scientific discovery and its translation, in patient and population health, and in policy and economic development by addressing major health challenges, and to lead interdisciplinary research aligned with HDR UK's mission.

On confirmation of the Award, Professor Lyons of Swansea University Medical School commented:

"I am thrilled that Swansea is to become a founding substantive site of HDR UK. One of Swansea University's key strengths has always been to work collaboratively both with UK and global partners and also across disciplines based in Medicine, Computer Science, Mathematics and Engineering.

"This will help strengthen our position at the forefront of rapidly advancing science to benefit the health, wellbeing and prosperity of the population. I look forward to working with colleagues at the other HDR UK substantive sites to apply cutting-edge science to address the most pressing health challenges."

This Award sees Swansea at the core of a collaborative research community working together to deliver the priorities of Health Data Research UK. This initial funding is awarded following a rigorous application process, which included interviews with an international panel of experts.

Professor Andrew Morris, Director of Health Data Research UK, said: "I am delighted to make today's announcement, which marks the start of a unique opportunity for scientists, researchers and clinicians to use their collective expertise to transform the health of the population.

"The six HDR UK sites, comprising 21 universities and research institutes, have tremendous individual strengths and will form a solid foundation for our long-term ambition. By working together and with NHS and industry partners to the highest ethical standards, our vision is to harness data science on a national scale.

"This will unleash the potential for data and technologies to drive breakthroughs in medical research, improving the way we are able to prevent, detect and diagnose diseases like cancer, heart disease and asthma.

"I am grateful to our funders who recognise the importance of collaboration at scale, and the pivotal contribution of health data research to the UK's ambition to be a global leader in life sciences, for health and economic benefit."

This is the first phase of investment to establish Health Data Research UK. A further £24 million will be invested in upcoming activities, including a Future Talent Programme and work to address targeted data research challenges through additional partnership sites.

Health Data Research UK is committed to the highest ethical standards and will work with experts in public engagement to ensure the public voice is central to its activity. It will work at scale and forge national and international partnerships to deliver:

  • New scientific discovery
  • A vibrant training environment for the next generation of data scientists
  • The creation of a trustworthy UK-wide research and innovation ecosystem for health data research.

Health Data Research UK is a joint investment co-ordinated by the Medical Research Council, working in partnership with the British Heart Foundation, the National Institute for Health Research, the Economic and Social Research Council, the Engineering and Physical Sciences Research Council, Health and Social Care Research and Development Division (Welsh Government), Health and Social Care Research and Development Division (Public Health Agency, Northern Ireland), Chief Scientist Office of the Scottish Government Health and Social Care Directorates, and Wellcome.

Other substantive sites will be led by consortia from Cambridge, Midlands, Scotland, London and Oxford. For further details, please visit the Health Data Research UK website.

Modeling of the evolution of the magnetic rope breaking through the cage when the cage is weaker than the rope.

Just one phenomenon may underlie all solar eruptions, according to researchers from the CNRS, École Polytechnique, CEA and INRIA in an article featured on the cover of the February 8 issue of the journal Nature. They have identified the presence of a confining 'cage' in which a magnetic rope forms, causing solar eruptions. It is the resistance of this cage to the attack of the rope that determines the power and type of the upcoming flare. This work has enabled the scientists to develop a model capable of predicting the maximum energy that can be released during a solar flare, an event that can have potentially devastating consequences for the Earth.

Just as on Earth, storms and hurricanes sweep through the atmosphere of the Sun. These phenomena are caused by a sudden, violent reconfiguration of the solar magnetic field, and are characterized by an intense release of energy in the form of light and particle emissions and, sometimes, by the ejection of a bubble of plasma. Studying these phenomena, which take place in the corona (the outermost region of the Sun), will enable scientists to develop forecasting models, just as they do for the Earth's weather. This should limit our technological vulnerability to solar eruptions, which can impact a number of sectors such as electricity distribution, GPS and communications systems.

In 2014, researchers showed that a characteristic structure, an entanglement of magnetic force lines twisted together like a hemp rope, gradually appears in the days preceding a solar flare. Until now, however, they had only observed this rope in eruptions that ejected bubbles of plasma. In this new study, the researchers studied other types of flare, the models of which are still being debated, by undertaking a more thorough analysis of the solar corona, a region where the Sun's atmosphere is so thin and hot that it is difficult to measure the solar magnetic field there. They did this by measuring the stronger magnetic field at the surface of the Sun and then using these data to reconstruct what was happening in the solar corona.

They applied this method to a major flare that developed over a few hours on October 24, 2014. They showed that, in the hours before the eruption, the evolving rope was confined within a multilayer magnetic 'cage'. Using evolutionary models running on a supercomputer, they showed that the rope had insufficient energy to break through all the layers of the cage, making the ejection of a magnetic bubble impossible. Despite this, the high twist of the rope triggered an instability and the partial destruction of the cage, causing a powerful emission of radiation that led to disruptions on Earth.

Thanks to their method, which makes it possible to monitor the processes taking place in the last few hours leading up to a flare, the researchers have developed a model able to predict the maximum energy that can be released from the region of the Sun concerned. The model showed that for the 2014 eruption, a huge ejection of plasma would have occurred if the cage had been less resistant. This work demonstrates the crucial role played by the magnetic 'cage-rope' duo in controlling solar eruptions. It is also a new step towards the early prediction of such eruptions, which can have significant societal impacts.

  1. HKUST scientists deploy big data methods to learn the fitness landscape of the HIV Envelope protein
  2. Brown researchers take terahertz data links around the bend
  3. University of Illinois at Chicago researchers measure the temperature of two-dimensional materials at the atomic level to allow engineers to design smaller, faster chips
  4. Japanese, American researchers develop an unbiased approach for sifting through big data
  5. Distant galaxy group contradicts common cosmological models, simulations
  6. UC Riverside's Kumar produces new research to advance spintronics technology
  7. Machine learning techniques generate clinical labels of medical scans
  8. Penn engineering research gives optical switches the 'contrast' of electronic transistors
  9. Five research teams across Columbia use data science to solve societal problems
  10. CaseMed researchers win $6.5 million NIH grant to use big data to tackle psoriasis
  11. Yale-NUS undergrads find two theoretical physics models to be equivalent
  12. Fujitsu president Tatsuya Tanaka receives the French government's prestigious Legion of Honor
  13. Dutch researchers accelerate the pace of development of silicon quantum computer chip
  14. Belgian Ph.D. student decodes DNA, wins a Bitcoin
  15. Artificial intelligence predicts corruption in Spain
  16. UK scientists run new simulations that show Sanchi oil spill contamination could reach Japan within a month
  17. BU med school uses new AI to significantly improve human kidney analysis
  18. KU researchers use machine learning to predict new details of geothermal heat flux beneath the Greenland Ice Sheet
  19. SETI project homes in on strange 'fast radio bursts'
  20. NYC Health Department spots 10 outbreaks of foodborne illness using Yelp reviews since 2012
