Modeling of the evolution of the magnetic rope breaking through the cage when it is weaker than the rope.

Just one phenomenon may underlie all solar eruptions, according to researchers from the CNRS, École Polytechnique, CEA and INRIA in an article featured on the cover of the February 8 issue of the journal Nature. They have identified the presence of a confining 'cage' in which a magnetic rope forms, causing solar eruptions. It is the resistance of this cage to the attack of the rope that determines the power and type of the upcoming flare. This work has enabled the scientists to develop a model capable of predicting the maximum energy that can be released during a solar flare, an event that can have potentially devastating consequences for the Earth.

Just as on Earth, storms and hurricanes sweep through the atmosphere of the Sun. These phenomena are caused by a sudden, violent reconfiguration of the solar magnetic field, and are characterized by an intense release of energy in the form of light and particle emissions and, sometimes, by the ejection of a bubble of plasma. Studying these phenomena, which take place in the corona (the outermost region of the Sun), will enable scientists to develop forecasting models, just as they do for the Earth's weather. This should limit our technological vulnerability to solar eruptions, which can impact a number of sectors such as electricity distribution, GPS and communications systems.

In 2014, researchers showed that a characteristic structure, an entanglement of magnetic force lines twisted together like a hemp rope, gradually appears in the days preceding a solar flare. Until recently, however, they had only observed this rope in eruptions that ejected bubbles of plasma. In this new study, the researchers examined other types of flare, whose models are still debated, by undertaking a more thorough analysis of the solar corona, a region where the Sun's atmosphere is so thin and hot that it is difficult to measure the solar magnetic field there. They did this by measuring the stronger magnetic field at the Sun's surface, and then using these data to reconstruct what was happening in the solar corona.

They applied this method to a major flare that developed over a few hours on October 24, 2014. They showed that, in the hours before the eruption, the evolving rope was confined within a multilayer magnetic 'cage'. Using evolutionary models running on a supercomputer, they showed that the rope had insufficient energy to break through all the layers of the cage, making the ejection of a magnetic bubble impossible. Despite this, the high twist of the rope triggered an instability and the partial destruction of the cage, causing a powerful emission of radiation that led to disruptions on Earth.

Thanks to their method, which makes it possible to monitor the processes taking place in the last few hours leading up to a flare, the researchers have developed a model able to predict the maximum energy that can be released from the region of the Sun concerned. The model showed that for the 2014 eruption, a huge ejection of plasma would have occurred if the cage had been less resistant. This work demonstrates the crucial role played by the magnetic 'cage-rope' duo in controlling solar eruptions, as well as being a new step towards early prediction of such eruptions, which will have potentially significant societal impacts.

This is a computational method to estimate the fitness landscape of the HIV envelope protein.

Despite significant advances in medicine, an effective vaccine for the human immunodeficiency virus (HIV) is still not available, although recent hope has emerged through the discovery of antibodies capable of neutralizing diverse HIV strains. However, HIV can sometimes evade known broadly neutralizing antibody responses via mutational pathways, which makes it all the more difficult to design an effective solution.

An ideal vaccine would elicit broadly neutralizing antibodies that target parts of the virus's spike proteins where mutations severely compromise the virus's fitness, that is, its ability to reproduce and replicate. This requires knowledge of the fitness landscape, a mapping from sequence to fitness. To achieve this goal, data scientists from HKUST have employed a supercomputational approach to estimate the fitness landscape of gp160, the polyprotein that comprises HIV's spike. The inferred landscape was then validated through comparisons with diverse experimental measurements.
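Fitness landscapes of this kind are commonly estimated as maximum-entropy (Potts-like) models, in which single-site fields and pairwise couplings are inferred from sequence data. As a rough illustration only (the fields and couplings below are random placeholders, not values from the study, and the sequence length is a toy size):

```python
import numpy as np

# A rough Potts-style maximum-entropy landscape. The fields h and couplings J
# would be inferred from sequence data; here they are random placeholders.
rng = np.random.default_rng(0)
L, q = 5, 2                        # toy size: 5 residues, 2 amino-acid states
h = rng.normal(size=(L, q))        # single-site fields
J = rng.normal(size=(L, L, q, q))  # pairwise couplings

def energy(seq):
    """Potts energy of a sequence; lower energy <-> higher inferred fitness."""
    e = -sum(h[i, seq[i]] for i in range(L))
    for i in range(L):
        for j in range(i + 1, L):
            e -= J[i, j, seq[i], seq[j]]
    return e

wild_type = [0, 0, 0, 0, 0]
mutant    = [1, 0, 0, 0, 0]        # single-site mutation at residue 0
# exp(-E) acts as a relative prevalence (fitness proxy), so the
# mutant-to-wild-type fitness ratio is exp(E_wt - E_mut):
fitness_ratio = np.exp(energy(wild_type) - energy(mutant))
```

In the real problem each of the 815 residues carries its own field and each residue pair its own coupling matrix, which is how the parameter count climbs into the millions.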

Their findings were published in the journal PNAS in January 2018 (doi: 10.1073/pnas.1717765115).

"Without big data machine learning methods, it is simply impossible to make such a prediction," said Raymond Louie, co-author, Junior Fellow of HKUST's Institute for Advanced Study and Research Assistant Professor in the Department of Electronic & Computer Engineering. "The number of parameters needed to be estimated came close to 4.4 million."

The data processed by the team consisted of 815 residues and 20,043 sequences from 1,918 HIV-infected individuals.

"The computational method gave us fast and accurate results," said Matthew McKay, co-author and Hari Harilela Associate Professor in the Departments of Electronic & Computer Engineering and Chemical & Biological Engineering at HKUST. "The findings can assist biologists in proposing new immunogens and vaccination protocols that seek to force the virus to mutate to unfit states in order to evade immune responses, which is likely to thwart or limit viral infection."

"While this method was developed to address the specific challenges posed by the gp160 protein, which we could not address using methods we developed to obtain the fitness landscapes of other HIV proteins, the approach is general and may be applied to other high-dimensional maximum-entropy inference problems," said Arup K. Chakraborty, co-author and Robert T. Haslam Professor in Chemical Engineering, Physics, and Chemistry at MIT's Institute for Medical Engineering & Science. "Specifically, our fitness landscape could be clinically useful in the future for the selection of combination bnAb therapy and immunogen design."

"This is a multi-disciplinary study presenting an application of data science, and big data machine learning methods in particular, for addressing a challenging problem in biology", said McKay.

CAPTION New research shows that non-line-of-sight terahertz data links are possible because the waves can bounce off of walls without losing too much data.  CREDIT Mittleman lab / Brown University

An off-the-wall new study by Brown University researchers shows that terahertz frequency data links can bounce around a room without dropping too much data. The results are good news for the feasibility of future terahertz wireless data networks, which have the potential to carry many times more data than current networks.

Today's cellular networks and Wi-Fi systems rely on microwave radiation to carry data, but the demand for more and more bandwidth is quickly becoming more than microwaves can handle. That has researchers thinking about transmitting data on higher-frequency terahertz waves, which have as much as 100 times the data-carrying capacity of microwaves. But terahertz communication technology is in its infancy. There's much basic research to be done and plenty of challenges to overcome.

For example, it's been assumed that terahertz links would require a direct line of sight between transmitter and receiver. Unlike microwaves, terahertz waves are entirely blocked by most solid objects. And the assumption has been that it's not possible to bounce a terahertz beam around--say, off a wall or two--to find a clear path around an object.

"I think it's fair to say that most people in the terahertz field would tell you that there would be too much power loss on those bounces, and so non-line-of-sight links are not going to be feasible in terahertz," said Daniel Mittleman, a professor in Brown University's School of Engineering and senior author of the new research published in APL Photonics. "But our work indicates that the loss is actually quite tolerable in some cases -- quite a bit less than many people would have thought."

For the study, Mittleman and his colleagues bounced terahertz waves at four different frequencies off of a variety of objects--mirrors, metal doors, cinderblock walls and others -- and measured the bit-error-rate of the data on the wave after the bounces. They showed that acceptable bit-error-rates were achievable with modest increases in signal power.
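The bit-error-rate itself is a simple quantity: the fraction of transmitted bits that arrive flipped. A minimal sketch, with a made-up per-bit flip probability standing in for the loss on a bounce:

```python
import numpy as np

rng = np.random.default_rng(1)
tx = rng.integers(0, 2, size=100_000)      # transmitted bits
p_flip = 1e-3                              # made-up per-bit error probability
rx = tx ^ (rng.random(tx.size) < p_flip)   # toy channel: flip bits at random
ber = np.count_nonzero(tx != rx) / tx.size # bit-error-rate
```

In the experiments, an acceptable value of this ratio after one or two bounces is what demonstrated that the links remain usable.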

"The concern had been that in order to make those bounces and not lose your data, you'd need more power than was feasible to generate," Mittleman said. "We show that you don't need as much power as you might think because the loss on the bounce is not as much as you'd think."

In one experiment, the researchers bounced a beam off two walls, enabling a successful link when transmitter and receiver were around a corner from each other, with no direct line-of-sight whatsoever. That's a promising finding to support the idea of terahertz local-area networks.

"You can imagine a wireless network," Mittleman explained, "where someone's computer is connected to a terahertz router and there's direct line-of-sight between the two, but then someone walks in between and blocks the beam. If you can't find an alternative path, that link will be shut down. What we show is that you might still be able to maintain the link by searching for a new path that could involve bouncing off a wall somewhere. There are technologies today that can do that kind of path-finding for lower frequencies and there's no reason they can't be developed for terahertz."

The researchers also performed several outdoor experiments on terahertz wireless links. An experimental license issued by the FCC makes Brown the only place in the country where outdoor research can be done legally at these frequencies. The work is important because scientists are just beginning to understand the details of how terahertz data links behave in the elements, Mittleman says.

Their study focused on what's known as specular reflection. When a signal is transmitted over long distances, the waves fan out, forming an ever-widening cone. As a result of that fanning out, a portion of the waves will bounce off of the ground before reaching the receiver. That reflected radiation can interfere with the main signal unless a decoder compensates for it. It's a well-understood phenomenon in microwave transmission. Mittleman and his colleagues wanted to characterize it in the terahertz range.

They showed that this kind of interference indeed occurs with terahertz waves, but to a lesser degree over grass than over concrete. That's likely because grass holds a lot of water, which tends to absorb terahertz waves. So over grass, the reflected beam is absorbed more strongly than over concrete, leaving less of it to interfere with the main beam. That means terahertz links over grass can be longer than those over concrete because there's less interference to deal with, Mittleman says.

But there's also an upside to that kind of interference with the ground.

"The specular reflection represents another possible path for your signal," Mittleman said. "You can imagine that if your line-of-sight path is blocked, you could think about bouncing it off the ground to get there."

Mittleman says that these kinds of basic studies on the nature of terahertz data transmission are critical for understanding how to design the network architecture for future terahertz data systems.

Robert Klie, professor of physics. Photo: Jenny Fontaine

Researchers at the University of Illinois at Chicago describe a new technique for precisely measuring the temperature and behavior of new two-dimensional materials that will allow engineers to design smaller and faster microprocessors. Their findings are reported in the journal Physical Review Letters.

Newly developed two-dimensional materials, such as graphene -- which consists of a single layer of carbon atoms -- have the potential to replace traditional microprocessing chips based on silicon, which have reached the limit of how small they can get. But engineers have been stymied by the inability to measure how temperature will affect these new materials, collectively known as transition metal dichalcogenides, or TMDs.

Using scanning transmission electron microscopy combined with spectroscopy, researchers at UIC were able to measure the temperature of several two-dimensional materials at the atomic level, paving the way for much smaller and faster microprocessors. They were also able to use their technique to measure how the two-dimensional materials would expand when heated.

"Microprocessing chips in computers and other electronics get very hot, and we need to be able to measure not only how hot they can get, but how much the material will expand when heated," said Robert Klie, professor of physics at UIC and corresponding author of the paper. "Knowing how a material will expand is important because if a material expands too much, connections with other materials, such as metal wires, can break and the chip is useless."

Traditional ways to measure temperature don't work on tiny flakes of two-dimensional materials that would be used in microprocessors because they are just too small. Optical temperature measurements, which use a reflected laser light to measure temperature, can't be used on TMD chips because they don't have enough surface area to accommodate the laser beam.

"We need to understand how heat builds up and how it is transmitted at the interface between two materials in order to build efficient microprocessors that work," said Klie.

Klie and his colleagues devised a way to take temperature measurements of TMDs at the atomic level using scanning transmission electron microscopy, which uses a beam of electrons transmitted through a specimen to form an image.

"Using this technique, we can zero in on and measure the vibration of atoms and electrons, which is essentially the temperature of a single atom in a two-dimensional material," said Klie. Temperature is a measure of the average kinetic energy of the random motions of the particles, or atoms, that make up a material. As a material gets hotter, the frequency of the atomic vibration gets higher. At absolute zero, the lowest theoretical temperature, all atomic motion stops.

Klie and his colleagues heated microscopic "flakes" of various TMDs inside the chamber of a scanning transmission electron microscope to different temperatures and then aimed the microscope's electron beam at the material. Using a technique called electron energy-loss spectroscopy, they were able to measure the scattering of electrons off the two-dimensional materials caused by the electron beam. The scattering patterns were entered into a computer model that translated them into measurements of the vibrations of the atoms in the material - in other words, the temperature of the material at the atomic level.
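One standard way a vibrational spectrum yields a temperature is through detailed balance: the ratio of the energy-gain peak to the energy-loss peak in the spectrum is exp(-E/kT), where E is the vibrational energy. A sketch with illustrative numbers (not values from the paper):

```python
from math import log

K_B = 8.617333262e-5   # Boltzmann constant in eV/K

def temperature_from_peaks(e_vib_ev, i_gain, i_loss):
    """Invert detailed balance, I_gain / I_loss = exp(-E / (k_B * T)),
    to recover the temperature T in kelvin."""
    return e_vib_ev / (K_B * log(i_loss / i_gain))

# illustrative numbers: a 60 meV vibration whose energy-gain peak
# is 10% as intense as its energy-loss peak
t_kelvin = temperature_from_peaks(0.060, i_gain=0.10, i_loss=1.0)
```

With these made-up intensities the formula lands near room temperature, which is the regime where such chips operate.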

"With this new technique, we can measure the temperature of a material with a resolution that is nearly 10 times better than conventional methods," said Klie. "With this new approach, we can design better electronic devices that will be less prone to overheating and consume less power."

The technique can also be used to predict how much materials will expand when heated and contract when cooled, which will help engineers build chips that are less prone to breaking at points where one material touches another, such as when a two-dimensional material chip makes contact with a wire.

"No other method can measure this effect at the spatial resolution we report," said Klie. "This will allow engineers to design devices that can manage temperature changes between two different materials at the nano-scale level."

CAPTION Nodes and lines represent the health-related variables and the strength of interdependence between two variables respectively. MENet helps build the Optimal Information Network (OIN) which indicates the most useful information to accurately characterize systemic health.  CREDIT Servadio J.L. and Convertino M, Science Advances, Feb. 2, 2018.

A new method could help researchers develop unbiased indicators for assessing complex systems such as population health.

Researchers have developed a complex system model to evaluate the health of populations in some U.S. cities based only on the most significant variables expressed in available data. Their unbiased network-based probabilistic approach to mine big data could be used to assess other complex systems, such as ranking universities or evaluating ocean sustainability.

Societies today are data-rich, which can both empower and overwhelm. Sifting through this data to determine which variables to use for the assessment of something like the health of a city's population is challenging. Researchers often choose these variables based on their personal experience. They might decide that adult obesity rates, mortality rates, and life expectancy are important variables for calculating a generalized metric of the residents' overall health. But are these the best variables to use? Are there other more important ones to consider?

Matteo Convertino of Hokkaido University in Japan and Joseph Servadio of the University of Minnesota in the U.S. have introduced a novel probabilistic method that allows the visualization of the relationships between variables in big data for complex systems. The approach is based on "maximum transfer entropy," which probabilistically measures the strength of relationships between multiple variables over time.

Using this method, Convertino and Servadio mined through a large amount of health data in the U.S. to build a "maximum entropy network" (MENet): a model composed of nodes representing health-related variables, and lines connecting the variables. The darker the line, the stronger the interdependence between the two variables. This allowed the researchers to build an "Optimal Information Network" (OIN) by choosing the variables that had the most practical relevance for assessing the health status of populations in 26 U.S. cities from 2011 to 2014. By combining the data from each selected variable, the researchers were able to compute an "integrated health value" for each city. The higher the number, the less healthy a city's population.
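Transfer entropy, the quantity underlying the network's edges, measures how much the past of one series improves prediction of another beyond that series' own past. A minimal plug-in estimator for discrete series (a sketch for illustration, not the authors' implementation):

```python
from collections import Counter
from math import log2

def transfer_entropy(target, source):
    """Plug-in estimate (in bits) of transfer entropy from source to target."""
    n = len(target) - 1
    triples = Counter(zip(target[1:], target[:-1], source[:-1]))
    pairs_ts = Counter(zip(target[:-1], source[:-1]))
    pairs_tt = Counter(zip(target[1:], target[:-1]))
    singles = Counter(target[:-1])
    te = 0.0
    for (t1, t0, s0), c in triples.items():
        p_joint = c / n                              # p(t1, t0, s0)
        p_cond_both = c / pairs_ts[(t0, s0)]         # p(t1 | t0, s0)
        p_cond_self = pairs_tt[(t1, t0)] / singles[t0]  # p(t1 | t0)
        te += p_joint * log2(p_cond_both / p_cond_self)
    return te

# toy series: y copies x with a one-step delay, so x "drives" y
x = [0, 1, 0, 1, 1, 0, 1, 0, 0, 1] * 50
y = [0] + x[:-1]
te_x_to_y = transfer_entropy(y, x)             # positive: x predicts y
te_const  = transfer_entropy([0] * len(x), x)  # no dynamics -> zero
```

Computing this quantity for every ordered pair of health variables, and keeping the strongest links, is the spirit of the MENet construction.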

They found that some cities, such as Detroit, had poor overall health during that timeframe. Others, such as San Francisco, had low values, indicating more favorable health outcomes. Some cities, such as Philadelphia, showed high variability over the four-year period. Cross-sectional comparisons showed tendencies for California cities to score better than other parts of the country. Also, Midwestern cities, including Denver, Minneapolis, and Chicago, appeared to perform poorly compared to other regions, contrary to national city rankings.

Convertino believes that methods like this, fed by large data sets and analyzed via automated stochastic supercomputer models, could be used to optimize research and practice, for example by guiding optimal decisions about health. "These tools can be used by any country, at any administrative level, to process data in real-time and help personalize medical efforts," says Convertino.

But it is not just for health - "The model can be applied to any complex system to determine their Optimal Information Network, in fields from ecology and biology to finance and technology. Untangling their complexities and developing unbiased systemic indicators can help improve decision-making processes," Convertino added.

UCI, other astronomers find plane of dwarf satellites orbiting Centaurus A

An international team of astronomers has determined that Centaurus A, a massive elliptical galaxy 13 million light-years from Earth, is accompanied by a number of dwarf satellite galaxies orbiting the main body in a narrow disk. In a paper published today in Science, the researchers note that this is the first time such a galactic arrangement has been observed outside the Local Group, home to the Milky Way.

"The significance of this finding is that it calls into question the validity of certain cosmological models and simulations as explanations for the distribution of host and satellite galaxies in the universe," said co-author Marcel Pawlowski, a Hubble Fellow in the Department of Physics & Astronomy at the University of California, Irvine. 

He said that under the lambda cold dark matter model, smaller systems of stars should be more or less randomly scattered around their anchoring galaxies and should move in all directions. Yet Centaurus A is the third documented example, behind the Milky Way and Andromeda, of a "vast polar structure" in which satellite dwarves co-rotate around a central galactic mass in what Pawlowski calls "preferentially oriented alignment."

The difficulty of studying the movements of dwarf satellites around their hosts varies according to the target galaxy group. It's relatively easy for the Milky Way. "You get proper motions," Pawlowski said. "You take a picture now, wait three years or more, and then take another picture to see how the stars have moved; that gives you the tangential velocity."

Using this technique, scientists have measurements for 11 Milky Way satellite galaxies, eight of which are orbiting in a tight disk perpendicular to the spiral galaxy's plane. There are probably other satellites in the system that can't be seen from Earth because they're blocked by the Milky Way's dusty disk.

Andromeda provides observers on Earth a view of the full distribution of satellites around the galaxy's sprawling spiral. An earlier study found 27 dwarf galaxies, 15 arranged in a narrow plane. And Andromeda offers another advantage, according to Pawlowski: "Because you see the galaxy almost edge-on, you can look at the line-of-sight velocities of its satellites to see the ones that are approaching and those that are receding, so it very clearly presents as a rotating disk."

Centaurus A is much farther away, and its satellite companions are faint, making it more difficult to accurately measure distances and velocities to determine movements and distributions. But "sleeping in the archives," Pawlowski said, were data on 16 of Centaurus A's satellites.

"We could do the same game as with Andromeda, where we look at the line-of-sight velocities," he said. "And again we see that half of them are red-shifted, meaning they are receding from us, and the other half are blue-shifted, which tells us they are approaching."
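The measurement behind those red- and blue-shifts is the ordinary Doppler relation: a shift in observed wavelength translates directly into a line-of-sight velocity. A quick illustration with made-up numbers:

```python
C_KM_S = 299_792.458   # speed of light in km/s

def radial_velocity(lambda_obs, lambda_rest):
    """Non-relativistic Doppler shift; positive = red-shifted (receding)."""
    return C_KM_S * (lambda_obs - lambda_rest) / lambda_rest

# made-up example: the H-alpha line (656.28 nm at rest) observed shifted
v_receding = radial_velocity(656.90, 656.28)   # km/s, positive (red-shift)
v_approach = radial_velocity(655.70, 656.28)   # km/s, negative (blue-shift)
```

Sorting the satellites by the sign of this velocity is what reveals the coherent rotation pattern.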

The researchers were able to demonstrate that 14 of the 16 Centaurus A satellite galaxies follow a common motion pattern and rotate along the plane around the main galaxy - contradicting frequently used cosmological models and simulations suggesting that only about 0.5 percent of satellite galaxy systems in the nearby universe should exhibit this pattern.

"So this means that we are missing something," Pawlowski said. "Either the simulations lack some important ingredient, or the underlying model is wrong. This research may be seen as support for looking into alternative models."

This is Sandeep Kumar, an assistant professor of mechanical engineering at UC Riverside.

Discoveries will help realize the promise of faster, energy-efficient spintronic supercomputers and ultra-high-capacity data storage

Engineers at the University of California, Riverside, have reported advances in so-called "spintronic" devices that will help lead to a new technology for computing and data storage. They have developed methods to detect signals from spintronic components made of low-cost metals and silicon, which overcomes a major barrier to wide application of spintronics. Previously such devices depended on complex structures that used rare and expensive metals such as platinum. The researchers were led by Sandeep Kumar, an assistant professor of mechanical engineering. 

UCR researchers have developed methods to detect signals from spintronic components made of low-cost metals and silicon.

Spintronic devices promise to solve major problems in today's electronic computers, in that the computers use massive amounts of electricity and generate heat that requires expending even more energy for cooling. By contrast, spintronic devices generate little heat and use relatively minuscule amounts of electricity. Spintronic supercomputers would require no energy to maintain data in memory. They would also start instantly and have the potential to be far more powerful than today's computers.

While electronics depends on the charge of electrons to generate the binary ones or zeroes of computer data, spintronics depends on the property of electrons called spin. Spintronic materials register binary data via the "up" or "down" spin orientation of electrons--like the north and south of bar magnets--in the materials. A major barrier to development of spintronics devices is generating and detecting the infinitesimal electric spin signals in spintronic materials.

In one paper, published in the January issue of the journal Applied Physics Letters, Kumar and colleagues reported an efficient technique for detecting spin currents in a simple two-layer sandwich of silicon and Permalloy, a nickel-iron alloy. All of the components are inexpensive and abundant and could provide the basis for commercial spintronic devices, and the devices operate at room temperature. The layers were created with sputtering, a widely used electronics manufacturing process. Co-authors of the paper were graduate students Ravindra Bhardwaj and Paul Lou.

In their experiments, the researchers heated one side of the Permalloy-silicon bi-layer sandwich to create a temperature gradient, which generated an electrical voltage in the bi-layer. The voltage was due to a phenomenon known as the spin-Seebeck effect. The engineers found that they could detect the resulting "spin current" in the bi-layer due to another phenomenon known as the "inverse spin-Hall effect."

The researchers said their findings will have application to efficient magnetic switching in computer memories, and "these scientific breakthroughs may give impetus" to development of such devices. More broadly, they concluded, "These results bring the ubiquitous Si (silicon) to forefront of spintronics research and will lay the foundation of energy efficient Si spintronics and Si spin caloritronics devices."

In two other scientific papers, the researchers demonstrated that they could generate a key property for spintronics materials, called antiferromagnetism, in silicon. The achievement opens an important pathway to commercial spintronics, said the researchers, given that silicon is inexpensive and can be manufactured using a mature technology with a long history of application in electronics.

Ferromagnetism is the property of magnetic materials in which the magnetic poles of the atoms are aligned in the same direction. In contrast, antiferromagnetism is a property in which the neighboring atoms are magnetically oriented in opposite directions. These "magnetic moments" are due to the spin of electrons in the atoms, and are central to the application of the materials in spintronics.

In the two papers, Kumar and Lou reported detecting antiferromagnetism in the two types of silicon--called n-type and p-type--used in transistors and other electronic components. N-type semiconductor silicon is "doped" with substances that cause it to have an abundance of negatively charged electrons, while p-type silicon is doped to have a large concentration of positively charged "holes." Combining the two types enables switching of current in such devices as transistors used in computer memories and other electronics.

In the paper in the Journal of Magnetism and Magnetic Materials, Lou and Kumar reported detecting the spin-Hall effect and antiferromagnetism in n-silicon. Their experiments used a multilayer thin film comprising palladium, nickel-iron Permalloy, manganese oxide and n-silicon.

And in the second paper, in the scientific journal physica status solidi, they reported detecting in p-silicon spin-driven antiferromagnetism and a transition of silicon between metal and insulator properties. Those experiments used a thin film similar to those with the n-silicon.

The researchers wrote in the latter paper that "The observed emergent antiferromagnetic behavior may lay the foundation of Si (silicon) spintronics and may change every field involving Si thin films. These experiments also present potential electric control of magnetic behavior using simple semiconductor electronics physics. The observed large change in resistance and doping dependence of phase transformation encourages the development of antiferromagnetic and phase change spintronics devices."

In further studies, Kumar and his colleagues are developing technology to switch spin currents on and off in the materials, with the ultimate goal of creating a spin transistor. They are also working to generate larger, higher-voltage spintronic chips. The result of their work could be extremely low-power, compact transmitters and sensors, as well as energy-efficient data storage and computer memories, said Kumar.

The study's findings will help train artificial intelligence to diagnose diseases

Researchers used machine learning techniques, including natural language processing algorithms, to identify clinical concepts in radiologist reports for CT scans, according to a study conducted at the Icahn School of Medicine at Mount Sinai and published today in the journal Radiology. The technology is an important first step in the development of artificial intelligence that could interpret scans and diagnose conditions.

From an ATM reading handwriting on a check to Facebook suggesting a photo tag for a friend, computer vision powered by artificial intelligence is increasingly common in daily life. Artificial intelligence could one day help radiologists interpret X-rays, computed tomography (CT) scans, and magnetic resonance imaging (MRI) studies. But for the technology to be effective in the medical arena, computer software must be "taught" the difference between a normal study and abnormal findings.

This study aimed to train this technology to understand text reports written by radiologists. Researchers created a series of algorithms to teach the computer to recognize clusters of phrases. Examples of the terminology included words like phospholipid, heartburn, and colonoscopy.

Researchers trained the computer software using 96,303 radiologist reports associated with head CT scans performed at The Mount Sinai Hospital and Mount Sinai Queens between 2010 and 2016. To characterize the "lexical complexity" of radiologist reports, researchers calculated metrics that reflected the variety of language used in these reports and compared these to other large collections of text: thousands of books, Reuters news stories, inpatient physician notes, and Amazon product reviews.
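Lexical-complexity metrics of this kind boil down to simple corpus statistics, such as vocabulary size relative to total token count. A toy sketch (the report text below is invented, and the study's actual metrics were more elaborate):

```python
import re
from collections import Counter

def lexical_stats(text):
    """Token count, vocabulary size, and type-token ratio for a text.
    The type-token ratio is a crude proxy for lexical variety."""
    tokens = re.findall(r"[a-z]+", text.lower())
    vocab = Counter(tokens)
    return {"tokens": len(tokens), "types": len(vocab),
            "ttr": len(vocab) / len(tokens)}

# invented stand-in for a radiology report
report = "No acute intracranial hemorrhage. No mass effect or midline shift."
stats = lexical_stats(report)
```

Comparing such statistics across the 96,303 reports and the reference corpora (books, news, notes, reviews) is what showed radiology language to be comparatively constrained and structured.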

"The language used in radiology has a natural structure, which makes it amenable to machine learning," says senior author Eric Oermann, MD, Instructor in the Department of Neurosurgery at the Icahn School of Medicine at Mount Sinai. "Machine learning models built upon massive radiological text datasets can facilitate the training of future artificial intelligence-based systems for analyzing radiological images."

Deep learning describes a subcategory of machine learning that uses multiple layers of neural networks (computer systems that learn progressively) to perform inference, requiring large amounts of training data to achieve high accuracy. Techniques used in this study led to an accuracy of 91 percent, demonstrating that it is possible to automatically identify concepts in text from the complex domain of radiology.

"The ultimate goal is to create algorithms that help doctors accurately diagnose patients," says first author John Zech, a medical student at the Icahn School of Medicine at Mount Sinai. "Deep learning has many potential applications in radiology -- triaging to identify studies that require immediate evaluation, flagging abnormal parts of cross-sectional imaging for further review, characterizing masses concerning for malignancy -- and those applications will require many labeled training examples."

"Research like this turns big data into useful data and is the critical first step in harnessing the power of artificial intelligence to help patients," says study co-author Joshua Bederson, MD, Professor and System Chair for the Department of Neurosurgery at Mount Sinai Health System and Clinical Director of the Neurosurgery Simulation Core.

Image caption: Ritesh Agarwal's research on photonic computing has focused on finding the right combination and physical configuration of materials to amplify and mix light waves in ways analogous to electronic computer components. Credit: Sajal Dhara

Current computer systems represent bits of information, the 1's and 0's of binary code, with electricity. Circuit elements, such as transistors, operate on these electric signals, producing outputs that are dependent on their inputs.

As fast and powerful as supercomputers have become, Ritesh Agarwal, professor in the Department of Materials Science and Engineering in the University of Pennsylvania's School of Engineering and Applied Science, knows they could be more powerful. The field of photonic computing aims to achieve that goal by using light as the medium.

Agarwal's research on photonic computing has been focused on finding the right combination and physical configuration of materials that can amplify and mix light waves in ways that are analogous to electronic computer components.

In a paper published in Nature Communications, he and his colleagues have taken an important step: precisely controlling the mixing of optical signals via tailored electric fields, and obtaining outputs with a near perfect contrast and extremely large on/off ratios. Those properties are key to the creation of a working optical transistor.

"Currently, to compute '5+7,' we need to send an electrical signal for '5' and an electrical signal for '7,' and the transistor does the mixing to produce an electrical signal for '12,'" Agarwal said. "One of the hurdles in doing this with light is that materials that are able to mix optical signals also tend to have very strong background signals as well. That background signal would drastically reduce the contrast and on/off ratios leading to errors in the output."

With background signals washing out the intended output, the necessary computational qualities for optical transistors, such as on/off ratio, modulation strength, and signal-mixing contrast, have all been extremely poor. Electronic transistors are held to high standards on these qualities to prevent errors.
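For concreteness, the figures of merit mentioned here can be computed from the device's on- and off-state output intensities. A small sketch with made-up numbers (the paper's actual measured values are not reproduced here):

```python
from math import log10

def on_off_ratio_db(i_on, i_off):
    """On/off ratio expressed in decibels: 10 * log10(I_on / I_off)."""
    return 10 * log10(i_on / i_off)

def contrast(i_on, i_off):
    """Michelson contrast: (I_on - I_off) / (I_on + I_off); 1.0 is perfect."""
    return (i_on - i_off) / (i_on + i_off)

# A device with a strong background signal (large I_off) has poor contrast...
print(round(contrast(10.0, 5.0), 2))   # 0.33
# ...while a background-free device approaches perfect contrast.
print(round(contrast(10.0, 1e-6), 2))  # 1.0
```

This is why eliminating the background signal, rather than merely boosting the output, is the decisive step described below.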

The search for materials that can serve in optical transistors is complicated by additional property requirements. Only "nonlinear" materials are capable of this kind of optical signal mixing.

To address this issue, Agarwal's research group started with a system that has no background signal at all: a nanoscale "belt" made out of cadmium sulfide. Then, by applying an electric field across the nanobelt, Agarwal and his colleagues were able to introduce optical nonlinearities that enable a signal-mixing output that would otherwise be zero.

"Our system turns on from zero to extremely large values, and hence has perfect contrast, as well as large modulation and on/off ratios," Agarwal said. "Therefore, for the first time, we have an optical device with output that truly resembles an electronic transistor."

With one of the key components coming into focus, the next steps toward a photonic computer will involve integrating them with optical interconnects, modulators, and detectors in order to demonstrate actual computation.
