
Naveen Vaidya

Around 37 million people worldwide live with Human Immunodeficiency Virus (HIV), which is responsible for roughly 1.1 million deaths from AIDS-related conditions each year.

The virus replicates by inserting itself into the genetic code of CD4+ T-cells, immune cells central to the body’s defense against infection. While antiretroviral therapy (ART) can interfere with this replication process, completely eliminating the virus remains a challenge, because HIV maintains latent viral reservoirs within the body that can re-establish infection. These reservoirs consist of resting CD4+ memory T-cells that harbor replication-competent HIV for extended periods, allowing the virus to persist despite immune surveillance and antiretroviral therapy.

As Naveen Vaidya, a mathematics professor at San Diego State University, explains, “Currently there is no cure for HIV, presumably due to the establishment of latently infected cells that cannot be destroyed by available antiretroviral therapy. Hence, the primary focus of current HIV research has been to destroy HIV latent infections, and as part of this effort, initiating antiretroviral therapy early in infection to avoid the formation of latently infected cells has been considered a potential means of achieving an HIV cure.”

While early treatment has been shown to limit and in some cases even eradicate the virus, as with pre-exposure and post-exposure prophylaxis, studies of its effect on viral rebound have yielded mixed results. Moreover, the factors determining the success of early treatment are poorly understood, and the exact timing of latent reservoir establishment in humans after infection is not known. This makes effective strategies for controlling latently infected cells all the more necessary. The pharmacodynamic properties of drugs and their effects on treatment success have so far received little attention.

In a paper publishing this week in the SIAM Journal on Applied Mathematics, Vaidya and Libin Rong, a mathematics professor at the University of Florida, propose a mathematical model that investigates the effects of drug parameters and dosing schedules on HIV latent reservoirs and viral load dynamics.

“Our research uses mathematical modeling to gain deeper insights into the effects of antiretroviral therapy on HIV latent infections, and highlights that the pharmacodynamics of drugs—and thus, the choice of drugs—used in treatment regimens can be a determinant factor for successful therapy,” Vaidya says.

While previous mathematical models have helped analyze the dynamics of latently infected cells, studies exploring how the pharmacodynamics of antiretroviral therapy shape latent reservoir dynamics are lacking.

“We have developed theories of infection threshold that help identify values of drug-related parameters for avoiding latent infections,” Vaidya says. “Our results on detailed analysis of pharmacodynamics can contribute significantly to the study of drug-related parameters for controlling HIV latent infection and possibly HIV cure.”

Their model focuses specifically on the ability of antiretroviral therapy administered early in infection to control latently infected cells. Using a realistic periodic drug-intake scenario to obtain a periodic model system, the authors study both local and global properties of the infection dynamics, described by differential equations. The model treats uninfected target cells, productively infected cells, latently infected cells, and free virus concentrations as mutually exclusive compartments.

Currently available antiretroviral drugs exert their antiviral activity by reducing either the infection rate or the viral production rate. Based on a classical dose-response relationship, the authors formulate the residual viral infectivity and residual viral production during antiretroviral therapy.
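To make the structure of such a model concrete, here is a minimal sketch in Python of the general setup described above: four compartments (target cells, productively infected cells, latently infected cells, and free virus) driven by a periodically dosed drug whose efficacy follows a Hill-type dose-response curve. It is not the authors’ model; every parameter value and functional form below is an illustrative assumption.

    import numpy as np
    from scipy.integrate import solve_ivp

    # --- pharmacokinetics / pharmacodynamics (illustrative assumptions) ---
    DOSE, INTERVAL = 300.0, 24.0             # dose (arbitrary units) taken every 24 hours
    HALF_LIFE, IC50, HILL = 14.0, 50.0, 1.2  # drug half-life (h), 50% inhibitory conc., dose-response slope

    def concentration(t):
        """Drug concentration from repeated dosing with first-order elimination."""
        k = np.log(2) / HALF_LIFE
        n_doses = int(t // INTERVAL) + 1
        times_since_dose = t - INTERVAL * np.arange(n_doses)
        return DOSE * np.exp(-k * times_since_dose).sum()

    def efficacy(t):
        """Hill (Emax) dose-response: fraction of new infections blocked at time t."""
        c = concentration(t)
        return c**HILL / (c**HILL + IC50**HILL)

    # --- within-host dynamics (placeholder parameter values) ---
    lam, d_T = 1e4, 0.01         # target-cell production and death rates
    beta = 2.4e-8                # infection rate constant
    f_lat, a_L = 0.01, 0.001     # fraction of infections entering latency; activation rate
    delta, c_V, p = 1.0, 23.0, 2000.0  # infected-cell death, virus clearance, virus production

    def rhs(t, y):
        T, I, L, V = y                        # target, productive, latent, virus
        beta_eff = (1 - efficacy(t)) * beta   # residual infectivity under therapy
        dT = lam - d_T * T - beta_eff * T * V
        dI = (1 - f_lat) * beta_eff * T * V + a_L * L - delta * I
        dL = f_lat * beta_eff * T * V - a_L * L - d_T * L
        dV = p * I - c_V * V
        return [dT, dI, dL, dV]

    # Simulate 60 days of once-daily dosing starting from a tiny viral inoculum.
    sol = solve_ivp(rhs, (0.0, 24.0 * 60), [1e6, 0.0, 0.0, 1e-3], max_step=0.5)
    print("latent reservoir at day 60 (illustrative units):", sol.y[2, -1])

Sweeping the dosing interval, half-life, 50% inhibitory concentration, or dose-response slope in a sketch like this mimics, numerically, the kind of parameter study the paper carries out analytically through its invasion threshold.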

Variations in specific drug parameters are shown to produce either an infection-free steady state or persistent infection. A viral invasion threshold, derived from the model, governs the global stability of the infection-free steady state and the persistence of the virus.

“Pharmacodynamic parameters and the dosing schedule can have a significant impact on the outcomes of infection dynamics in HIV patients. This effect is particularly pronounced in early and preventive therapy,” Vaidya points out. The authors show that the invasion threshold depends strongly on a few pharmacodynamic parameters. Vaidya continues, “These parameters can determine whether latent infection will establish or not; in general, treatment regimens containing drugs with a larger slope of the dose-response curve, a higher ratio of the maximum dosage to the 50% inhibitory concentration, a longer half-life, and a smaller dosing interval have the potential to prevent or postpone the establishment of viral infection. Thus, the choice of drugs is key to a successful cure via early therapy.”

Vaidya’s results demonstrate that prophylaxis or very early treatment using drugs with a good pharmacodynamic profile can potentially prevent or postpone the establishment of viral infection. Only drugs with suitable pharmacodynamic properties, given at proper intervals, can successfully combat infection. “However, once the latent infection is established, the pharmacodynamic parameters have less effect on the latent reservoir and virus dynamics,” Vaidya says. “This is because the latent reservoir can be maintained by homeostasis of latently infected cells or other mechanisms rather than by ongoing residual viral replication.”

Efforts to maximize the impact of HIV therapy and the performance of different treatment regimens are essential to curtail the disease’s burden on public health. Mathematical modeling offers a theoretical framework for evaluating drug pharmacodynamics and antiviral effects on HIV dynamics.

“Mathematical models can be used to analyze and simulate a large number of treatment scenarios, which are often impossible and/or extremely difficult to study in in vivo and/or in vitro experimental settings,” Vaidya explains. “The results from these models can also provide novel themes for further experiments. For example, our modeling results in this study suggest that drugs with a larger slope of the dose-response curve, such as protease inhibitors, are more effective in controlling latent infections, and thus such drugs should be included in treatment regimens in further experimental studies.”

While these theoretical results offer useful ideas for developing treatment protocols, in vivo and in vitro experimental studies are still needed to properly design treatment regimens for successful control of latent infections. “Our group and collaborators will continue to develop mathematical models to study the effects of pharmacodynamics on latent HIV infection, including models with the emergence of drug resistance,” says Vaidya. “Furthermore, our future modeling studies will include the effects of drug pharmacodynamics on treatment outcomes in HIV patients under the conditioning of drugs of abuse, and identifying optimal control regimens for successful reduction of latent infections.”

CAPTION The merger of two equal-mass neutron stars is simulated using the 3-D code SNSPH. As the two stars merge, their outer edges eject a spiral of neutron-rich material. The radioactivity in this ejected material is the primary power source for the optical and infrared light observed in the kilonova. A single hyper-massive neutron star remains at the center in a wide field of ejecta material; it will quickly collapse to a black hole.

Gravitational-wave observation confirms heavy-element theory

Astrophysicist Chris Fryer was enjoying an evening with friends on August 25, 2017, when he got the news of a gravitational-wave detection by LIGO, the Laser Interferometer Gravitational-wave Observatory. The event appeared to be a merger of two neutron stars -- a specialty of the Los Alamos National Laboratory team of astrophysicists that Fryer leads. As the distant cosmic cataclysm unfolded, fresh observational data poured in from the detection -- only the fifth published since the observatory began operating almost two years ago.

"As soon as I heard the news, I knew that understanding all of the implications would require input from a broad, multi-disciplinary set of scientists," said Fryer, who leads Los Alamos' Center for Theoretical Astrophysics. Fryer's colleagues, Ryan Wollaeger and Oleg Korobkin, outlined a series of radiation transport calculations and were given priority on Los Alamos' supercomputers to run them. "Within a few hours, we were up and running."

They soon discovered the LIGO data showed more ejected mass from the merger than the simulations accounted for. Other researchers at Los Alamos began processing data from a variety of telescopes capturing optical, ultraviolet, x-ray, and gamma-ray signals at observatories around the world (and in space) that had all been quickly directed to the general location of the LIGO discovery.

The theorists tweaked their models and, to their delight, the new LIGO data confirmed that heavy elements beyond iron were formed by the r-process (rapid process) in the neutron-star merger. The gravitational wave observation was having a major impact on theory.

They also quickly noticed that, within seconds of the gravitational-wave signal, the Fermi spacecraft reported a burst of gamma rays from the same part of the sky. This is the first time a gravitational-wave source has been detected in any other way. It confirms Einstein's prediction that gravitational waves travel at the same speed as gamma rays: the speed of light.

When neutron stars collide

The gravitational wave emission and related electromagnetic outburst came from the merger of two neutron stars in a galaxy called NGC 4993, about 130 million light-years away in the constellation Hydra. The neutron stars are the crushed remains of massive stars that once blew up in tremendous explosions known as supernovas.

With masses 10 and 20 percent greater than the sun's and footprints the size of Washington, D.C., the neutron stars whirled around each other toward their demise, spinning hundreds of times per second. As they drew closer, like a spinning ice skater pulling in her arms, their mutual gravitational attraction smashed the stars together in a high-energy flash called a short gamma-ray burst and emitted the tell-tale gravitational-wave signal. Although short gamma-ray bursts have long been theorized to be produced by neutron star mergers, this event -- with both gamma-ray and gravitational-wave observations -- provides the first definitive evidence.

With its cross-disciplinary, multi-science expertise, the Los Alamos team was geared up and ready for just such an event. Laboratory researcher Oleg Korobkin is the lead theory author on a paper released yesterday in Science, while the Lab's Ryan Wollaeger is the second theory author on a paper released yesterday in Nature.

Beyond that theory work, though, Los Alamos scientists were engaged in a broad range of observations, astronomy, and data analysis tasks in support of the LIGO neutron-star discovery. Because the Laboratory's primary mission centers on the nation's nuclear stockpile, Los Alamos maintains deep expertise in nuclear physics and its cousin astrophysics, the physics of radiation transport, data analysis, and the computer codes that run massive nuclear simulations on world-leading supercomputers. In other words, the Laboratory is a logical partner for extending LIGO discoveries into theories and models and for confirming the conclusions about what the observatory discovers.

Broadens Emerson’s software capabilities to serve global oil and gas industry; Positions Emerson as the largest independent provider of Exploration & Production software solutions

Emerson has agreed to acquire Paradigm for $510 million, reflecting a multiple of 13 times expected 2017 EBITDA. Paradigm, combined with Emerson’s existing Roxar software business, creates a best-in-class, end-to-end Exploration & Production (E&P) software portfolio with offerings spanning seismic processing and interpretation through production modeling. Paradigm’s technology offerings will enable Emerson to better help oil and gas operators increase efficiency, reduce costs and improve return on investment.

Expanding Emerson’s E&P software and solutions enables the company to help operators achieve Top Quartile Performance on investment and operational goals within new and established reservoirs. Interpreting data and generating high-fidelity representations of existing brownfield assets allows oil and gas companies to maximize production and avoid non-productive drilling and exploration spending. In addition, Emerson’s expanded services now enable oil and gas operators, through machine learning and cloud computing, to make decisions in the field leading to more efficient operations.

“This acquisition is a significant technology investment that meets our customers’ growing demand for an independent, global provider of E&P software solutions,” said Emerson Chairman and Chief Executive Officer David N. Farr. “Paradigm broadens our leadership in the upstream oil and gas market by adding a range of subsurface software tools that complement our growing Automation Solutions portfolio.”

Paradigm is headquartered in Houston and has more than 500 employees globally. The company provides an array of tools that enable customers to gain deeper insight into the subsurface, reduce uncertainty and support responsible asset management.

“When combined with Emerson’s Roxar Software Solutions portfolio, Paradigm expands the global upstream oil and gas capability of our Plantweb digital ecosystem, creating a more comprehensive digital portfolio for our customers from exploration to production,” said Mike Train, Executive President of Emerson Automation Solutions. “Our offering can now help customers better maximize the value of their existing investments and reach Top Quartile Performance.” Top Quartile Performance is defined as achieving operations and capital performance in the top 25 percent of peer companies.

CAPTION This is a brief overview of the six newly-found families of periodic three-body orbits. Blue line: orbit of Body-1; red line: orbit of Body-2; black line: orbit of Body-3

The famous three-body problem can be traced back to Isaac Newton in the 1680s, and was later studied by Euler, Lagrange, Poincaré and others. Studies of the three-body problem led to the discovery of the so-called sensitive dependence on initial conditions (SDIC) of chaotic dynamical systems. Chaotic dynamics is now widely regarded as the third great scientific revolution in 20th-century physics, comparable to relativity and quantum mechanics. Thus, studies of the three-body problem have great scientific importance.

In 1890, Poincaré revealed that trajectories of three-body systems are generally non-periodic, i.e. not repeating. This helps explain why it is so hard to find periodic orbits of a three-body system. In the 300 years after the three-body problem was first recognized, only three families of periodic orbits had been found, until 2013, when Suvakov and Dmitrasinovic [Phys. Rev. Lett. 110, 114301 (2013)] made a breakthrough by numerically finding 13 new distinct periodic orbits belonging to 11 new families of the Newtonian planar three-body problem with equal masses and zero angular momentum (see http://www.sciencemag.org/news/2013/03/physicists-discover-whopping-13-new-solutions-three-body-problem). Now, two scientists, XiaoMing Li and ShiJun Liao at Shanghai Jiaotong University, China, have found 695 families of periodic orbits of this Newtonian planar three-body system using the national supercomputer TH-2 in Guangzhou, China. The results are published online in SCIENCE CHINA-Physics Mechanics Astronomy, 2017, Vol. 60, No. 12: 129511. Movies of these orbits are available at http://numericaltank.sjtu.edu.cn/three-body/three-body.htm

These 695 periodic orbits include the well-known figure-eight family found by Moore in 1993, the 11 families found by Suvakov and Dmitrasinovic in 2013, and, notably, more than 600 new families that had never been reported. The two scientists used the so-called "Clean Numerical Simulation" (CNS), a numerical strategy for reliable simulation of chaotic dynamical systems proposed by the second author in 2009. The CNS is based on a sufficiently high-order Taylor series expansion and multiple-precision data with enough significant digits, plus a convergence/reliability check. It reduces truncation and round-off errors so greatly that numerical noise remains negligible over a long enough interval of time, allowing many more periodic orbits of the three-body system to be found.
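As a rough illustration of this combination of a high-order Taylor method with multiple-precision arithmetic (this is not the authors' CNS code; the initial conditions are the commonly quoted figure-eight values, and the precision setting is an arbitrary choice), the planar three-body equations can be integrated with mpmath's arbitrary-precision Taylor-series ODE solver in Python:

    # Sketch: planar three-body problem with equal masses and G = 1, integrated
    # with an arbitrary-precision Taylor-series method (mpmath.odefun).
    from mpmath import mp, mpf, odefun

    mp.dps = 30  # carry 30 significant digits to suppress round-off noise

    m = [mpf(1), mpf(1), mpf(1)]  # three equal masses

    def rhs(t, y):
        # y = [x1, y1, x2, y2, x3, y3, vx1, vy1, vx2, vy2, vx3, vy3]
        pos = [(y[0], y[1]), (y[2], y[3]), (y[4], y[5])]
        acc = []
        for i in range(3):
            ax = ay = mpf(0)
            for j in range(3):
                if i == j:
                    continue
                dx = pos[j][0] - pos[i][0]
                dy = pos[j][1] - pos[i][1]
                r3 = (dx * dx + dy * dy) ** mpf('1.5')
                ax += m[j] * dx / r3
                ay += m[j] * dy / r3
            acc += [ax, ay]
        return list(y[6:]) + acc

    # Commonly quoted initial conditions for the figure-eight orbit
    # (zero angular momentum, zero total momentum).
    y0 = [mpf('0.97000436'), mpf('-0.24308753'),
          mpf('-0.97000436'), mpf('0.24308753'),
          mpf('0'), mpf('0'),
          mpf('0.466203685'), mpf('0.43236573'),
          mpf('0.466203685'), mpf('0.43236573'),
          mpf('-0.93240737'), mpf('-0.86473146')]

    f = odefun(rhs, 0, y0)        # arbitrary-precision Taylor-series integrator
    print(f(mpf('6.3259'))[0:2])  # after roughly one period, body 1 returns near its start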

As pointed out by Montgomery in 1998, each periodic orbit of the three-body system in real space corresponds to a closed curve on the so-called "shape sphere", whose topology can be characterized by a "free group element". The average period of an orbit is equal to its period divided by the length of the corresponding free group element. These 695 families suggest that a quasi Kepler's third law should exist: the square of the average period multiplied by the cube of the absolute value of the total energy (kinetic plus potential) is approximately constant, as written out below. This generalized Kepler's third law reveals that three-body systems share a common regularity, which might deepen our understanding and enrich our knowledge of the three-body system.
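In symbols (following the description above, with the average period written as T-bar and E the total energy of the system; the value of the constant itself is not specified here), the proposed relation reads

    \[ \bar{T}^{2}\,\lvert E\rvert^{3} \approx \text{const} \quad\Longleftrightarrow\quad \bar{T}\,\lvert E\rvert^{3/2} \approx \text{const} . \]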

"The discovery of the more than 600 new periodic orbits is mainly due to the advance in computer science and the use of the new strategy of numerical simulation for chaotic dynamic systems, namely the CNS", spoke the two scientists. It should be emphasized that 243 more new periodic orbits of the three-body system are found by means of the CNS. In other words, if traditional algorithms in double precision were used, about 40% new periodic orbits would be lost. This indicates the novelty and originality of the Clean Numerical Simulation (CNS), since any new methods must bring something completely new/different.

As shown in Figure 1, many of these newly found periodic orbits of the three-body system are beautiful and elegant, like modern paintings. "We are shocked and captivated by how perfect they are," said the two scientists.

"How many different cell types are there in a human body? And how do these differences develop? Nobody really knows," says Professor Stein Aerts from KU Leuven (University of Leuven) and VIB, Belgium. But thanks to a new method developed by his team, that may be about to change.

Even though each of the cells in our body carries the exact same DNA sequence, there's a huge variety of cell types and functions. These differences stem from how the DNA sequence is interpreted: not all genes are 'switched on' in each cell. 

Recent advances in single-cell sequencing have already made it possible to measure which of our 20,000 genes are active in an individual cell. With over 30 trillion cells in the human body, these techniques provide an unprecedented level of detail that is revolutionising research in biology and medicine. But when this method is applied to thousands of cells from different tissues, it becomes increasingly challenging to process the enormous amounts of data and detect meaningful patterns.

Computational biologist Stein Aerts (KU Leuven and VIB) and his team joined forces with mathematicians, bioengineers, and IT specialists to rise to the challenge. "We developed SCENIC, a supercomputer program that identifies different cell types based on their gene expression patterns, quickly and accurately. This allows for a better understanding of how the fate of the cell types is regulated, and for the identification of master regulators, which could also be potential drug targets."

"This novel method doesn't just help us to learn more about the different cells in our body," says postdoctoral researcher Sara Aibar Santos. "It also tells us how cell activity changes in the course of time, or when we get sick." 

The team has already applied the method to brain tissue of mice and humans. They also used it to analyse and compare cancer cells from brain and skin tumours, which led to the identification of new cell types related to metastasis, the spreading of cancer cells to other parts of the body.

The method could help develop the Human Cell Atlas, a global effort aimed at mapping all different cell types and states in the human body. "This atlas would be an invaluable source of information for both research and medicine," Aibar Santos continues. "It would allow us to systematically study the biological changes associated with different diseases, to understand where disease-associated genes are active in our bodies, analyse the molecular mechanisms that govern the production and activity of different cell types, and sort out how different cell types combine and work together to form tissues."

CAPTION Spatial-derivative STM images of a 200 x 200 nm^2 area at Vs = +1.5 V. Flat terraces become brighter and edges darker. The downstairs direction runs from left ((110) top-surface) to right ((-1-10) back-surface).

Researchers image perfectly smooth side-surfaces of 3-D silicon crystals with a scanning tunneling microscope, paving the way for smaller and faster computing devices

A research collaboration between Osaka University and the Nara Institute of Science and Technology for the first time used scanning tunneling microscopy (STM) to create images of atomically flat side-surfaces of 3D silicon crystals. This work helps semiconductor manufacturers continue to innovate while producing smaller, faster, and more energy-efficient computer chips for computers and smartphones. 

Our computers and smartphones are each loaded with millions of tiny transistors. The processing speed of these devices has increased dramatically over time as the number of transistors that can fit on a single computer chip continues to grow. According to Moore's Law, the number of transistors per chip doubles about every two years, and so far that pace has largely held up. To sustain this rapid innovation, computer manufacturers are continually on the lookout for new methods to make each transistor ever smaller.

Current microprocessors are made by adding patterns of circuits to flat silicon wafers. A novel way to cram more transistors in the same space is to fabricate 3D-structures. Fin-type field effect transistors (FETs) are named as such because they have fin-like silicon structures that extend into the air, off the surface of the chip. However, this new method requires a silicon crystal with a perfectly flat top and side-surfaces, instead of just the top surface, as with current devices. Designing the next generation of chips will require new knowledge of the atomic structures of the side-surfaces.

Now, researchers at Osaka University and the Nara Institute of Science and Technology report that they have used STM to image the side-surface of a silicon crystal for the first time. STM is a powerful technique that allows the locations of individual silicon atoms to be seen. When a sharp tip is passed very close to the sample, electrons jump across the gap and create an electrical current. The microscope monitors this current to determine the locations of the atoms in the sample.

"Our study is a big first step toward the atomically resolved evaluation of transistors designed to have 3D-shapes," study coauthor Azusa Hattori says.

To make the side-surfaces as smooth as possible, the researchers first treated the crystals with a process called reactive ion etching. Coauthor Hidekazu Tanaka says, "Our ability to directly look at the side-surfaces using STM proves that we can make artificial 3D structures with near-perfect atomic surface ordering."

CAPTION This image shows the percent change in July precipitation (Figure 4b from the paper).

Researchers have struggled to accurately model the changes to the abundant summer rains that sweep across the southwestern United States and northwestern Mexico, known to scientists as the "North American monsoon."

In a report published Oct. 9 in the journal Nature Climate Change, a team of Princeton and National Oceanic and Atmospheric Administration (NOAA) researchers has applied a key factor in improving climate models - correcting for sea surface temperatures - to the monsoon.

The research was made possible by enhancements to NOAA's research supercomputing capability, including access to the Gaea and Theia systems.

The report's authors include Salvatore Pascale, an associate research scholar in atmospheric and oceanic sciences (AOS); Tom Delworth, a lecturer in geosciences and AOS and research scientist at NOAA's Geophysical Fluid Dynamics Laboratory (GFDL); Sarah Kapnick, a 2004 Princeton alumna and former AOS postdoc who is currently a research physical scientist at GFDL; AOS associate research scholar Hiroyuki Murakami; and Gabriel Vecchi, a professor of geosciences and the Princeton Environmental Institute.

When they corrected for persistent sea surface temperature (SST) biases and used higher-resolution data for the regional geography, the researchers created a model that accurately reflects current rainfall conditions and suggests that future changes could have significant consequences for regional water resources and hazards.

"This study represents fundamental science relating to the physics of the North American monsoon, but feeds back onto weather to climate predictions and building resiliency for our water supply and responses to hazards," said Kapnick. "I am excited about this leap forward to improve our models and for the potential applications that they will provide in the future to society." 

Their results highlight the possibility of a strong precipitation reduction in the northern edge of the monsoon in response to warming, with consequences for regional water resources, agriculture and ecosystems.

"Monsoon rains are critical for the southwest U.S. and northwest Mexico, yet the fate of the North American monsoon is quite uncertain," said Pascale, the lead author on the paper. "The future of the monsoon will have direct impacts on agriculture, on livelihoods."

Previous general circulation models have suggested that the monsoons were simply shifting later, with decreased rains through July but increased precipitation in September and October.

"The consensus had been that global warming was delaying the monsoon ... which is also what we found with the simulation if you didn't correct the SST biases," Pascale said. "Uncontrolled, the SST biases can considerably change the response. They can trick us, introducing artefacts that are not real."

Once those biases were corrected for, the researchers discovered that the monsoon is not simply delayed, but that the total precipitation is facing a dramatic reduction.

That has significant implications for regional policymakers, explained Kapnick. "Water infrastructure projects take years to a decade to plan and build and can last decades. They require knowledge of future climate ... to ensure water supply in dry years. We had known previously that other broadly used global models didn't have a proper North American monsoon. This study addresses this need and highlights what we need to do to improve models for the North American monsoon and understanding water in the southwest."

The new model also suggests that the region's famous thunderstorms may become less common, as the decreased rain is associated with increased stability in the lower-to-middle troposphere and weakened atmospheric convection.

"The North American monsoon is also related to extreme precipitation events that can cause flash floods and loss of life," Kapnick said. "Knowing when the monsoon will start and predicting when major events will happen can be used for early warnings and planning to avoid loss of life and property damage. This paper represents the first major step towards building better systems for predicting the monsoon rains."

The researchers chose to tackle the region in part because previous, coarser-resolution models had shown that this area would be drying out, a prediction that has been borne out in the droughts and wildfires of recent years. But most of those droughts are attributed to the change in winter storms, said Pascale.

"The storm track is projected to shift northward, so these regions might get less rain in winter, but it was very uncertain what happens to the monsoon, which is the other contributor to the rains of the region. We didn't know, and it's crucial to know," he said. 

In their model, the researchers were able to tease out the impacts of one factor at a time, which allowed them to investigate and quantify the monsoon response to the doubling of atmospheric carbon dioxide, increased temperatures and other individual changes.

Pascale stressed the limits of this or any other climate model. "They need to be used with an understanding of their shortcomings and utilized to their expected potential but no further. They can give us quite reliable information about the large scale atmospheric circulation, but if you want to look at the regional, small-scale effects, you have to be very careful," he said. "Models are critical but they are not perfect, and small imperfections can lead to big misunderstandings."

He continued: "We are not saying, 'We are sure that this is what will be,' but we wanted to point out some mechanisms which are key, and have to be taken into account in future research on the North American monsoon. This is a difficult region, so future research will point out if we were right, and to what extent."

Paradigm k can predict fluid flow in complex fractured environments in minutes, without time-consuming setup.

The new product will be unveiled and demonstrated at the 2017 SPE ATCE.

Paradigm (www.pdgm.com) has launched Paradigm k, its new Production Management solution. 

Paradigm k (www.pdgm.com/k) is a Cloud-based, collaborative workflow for production and reservoir engineers to support field operations in maintenance and management of the well system. 

Paradigm k applies fast and rigorous reservoir physics to real-time decision-making using advanced computational mathematics. With advanced tools for surveillance, optimization and collaboration, Paradigm k helps asset teams achieve an elevated level of situational awareness to meet field production targets and eliminate deferment.

"Paradigm k is the latest addition to our rapidly expanding offering of solutions for production management," said Indy Chakrabarti, Paradigm's senior vice president of Product Management. "The Paradigm k technology provides unprecedented speed and a holistic view of physical measurements and critical production performance metrics at the resolution of individual wells.  This helps detect production issues before they transpire, and enables truly proactive production management at the speed of today's oilfield operations."

CAPTION A zero-index waveguide compatible with current silicon photonic technologies. CREDIT Second Bay Studios/Harvard SEAS

Researchers directly observe infinitely long wavelengths for the first time

In 2015, researchers at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) developed the first on-chip metamaterial with a refractive index of zero, meaning that the phase of light could be stretched infinitely long. The metamaterial represented a new method to manipulate light and was an important step forward for integrated photonic circuits, which use light rather than electrons to perform a wide variety of functions.

Now, SEAS researchers have pushed that technology further - developing a zero-index waveguide compatible with current silicon photonic technologies. In doing so, the team observed a physical phenomenon that is usually unobservable -- a standing wave of light. 

The research is published in ACS Photonics. The Harvard Office of Technology Development has filed a patent application and is exploring commercialization opportunities.

When a wavelength of light moves through a material, its crests and troughs get condensed or stretched, depending on the properties of the material. How much the crests of a light wave are condensed is expressed as a ratio called the refractive index -- the higher the index, the more squished the wavelength.

When the refractive index is reduced to zero, the light no longer behaves as a moving wave, traveling through space in a series of crests and troughs, otherwise known as phases. Instead, the wave is stretched infinitely long, creating a constant phase. The phase oscillates only as a variable of time, not space.
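In textbook terms (this relation is standard optics, not a result from the paper itself), the wavelength inside a medium is the free-space wavelength divided by the index, so it diverges as the index approaches zero:

    \[ \lambda_{\text{medium}} = \frac{\lambda_{0}}{n}, \qquad \lambda_{\text{medium}} \to \infty \ \text{as} \ n \to 0 . \]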

This is exciting for integrated photonics because most optical devices use interactions between two or more waves, which need to propagate in sync as they move through the circuit. If the wavelength is infinitely long, matching the phase of the wavelengths of light isn't an issue, since the optical fields are the same everywhere.

But after the initial 2015 breakthrough, the research team ran into a catch-22. Because the team used prisms to test whether light on the chip was indeed infinitely stretched, all of the devices were built in the shape of a prism. But prisms aren't particularly useful shapes for integrated circuits. The team wanted to develop a device that could plug directly into existing photonic circuits and for that, the most useful shape is a straight wire or waveguide.

The researchers -- led by Eric Mazur, the Balkanski Professor of Physics -- built a waveguide but, without the help of a prism, had no easy way to prove if it had a refractive index of zero.

Then, postdoctoral fellows Orad Reshef and Philip Camayd-Muñoz had an idea.

Usually, a wavelength of light is too small and oscillates too quickly to measure anything but an average. The only way to actually see a wavelength is to combine two waves to create interference. 

Imagine strings on a guitar, pinned on either side. When a string is plucked, the wave travels through the string, hits the pin on the other side and gets reflected back -- creating two waves moving in opposite directions with the same frequency. This kind of interference is called a standing wave.

Reshef and Camayd-Muñoz applied the same idea to the light in the waveguide. They "pinned down" the light by shining beams in opposite directions through the device to create a standing wave. The individual waves still oscillated quickly, but because they oscillated at the same frequency while traveling in opposite directions, at certain points they canceled each other out and at others they added together, creating an all-light or all-dark pattern. And, because of the zero-index material, the team was able to stretch the wavelength enough to see the pattern.
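A small numerical illustration of this picture (not taken from the paper; the wavelength and index values below are assumptions): superposing two counter-propagating waves cos(kx - wt) and cos(kx + wt) gives 2 cos(kx) cos(wt), a spatial pattern that stands still, and the spacing of its bright and dark spots grows as the effective index, and hence k, shrinks toward zero.

    import numpy as np

    lambda0 = 1.55e-6                  # free-space wavelength (telecom band, an assumption)

    for n in (2.0, 0.5, 0.05):         # ordinary, low, and near-zero effective index
        k = 2 * np.pi * n / lambda0    # wavenumber inside the medium
        # Counter-propagating waves: cos(kx - wt) + cos(kx + wt) = 2 cos(kx) cos(wt).
        # The spatial factor 2*cos(kx) is frozen in place; its antinodes sit lambda0/(2n) apart.
        antinode_spacing = lambda0 / (2 * n)
        print(f"n = {n:4.2f}: antinode spacing = {antinode_spacing * 1e6:7.2f} micrometers")

As the effective index drops toward zero, the antinode spacing grows from sub-micron to tens of microns, which is why the pattern becomes visible with an ordinary microscope.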

This may be the first time a standing wave with infinitely-long wavelengths has ever been seen. 

"We were able to observe a breath-taking demonstration of an index of zero," said Reshef, who recently accepted a position at the University of Ottawa. "By propagating through a medium with such a low index, these wave features, which in light are typically too small to detect directly, are expanded so you can see them with an ordinary microscope."

"This adds an important tool to the silicon photonics toolbox," said Camayd-Muñoz. "There's exotic physics in the zero-index regime, and now we're bringing that to integrated photonics. That's an important step, because it means we can plug directly into conventional optical devices, and find real uses for zero-index phenomena. In the future, quantum supercomputers may be based on networks of excited atoms that communicate via photons. The interaction range of the atoms is roughly equal to the wavelength of light. By making the wavelength large, we can enable long-range interactions to scale up quantum devices."

  1. NICT demos world record 53.3 Tb/s switching capacity for data center networks
  2. AI set to revolutionize retail banking, says GlobalData
  3. China builds world's first space-ground integrated quantum communication network
  4. Russian researchers simulate the motion of incompressible liquid
  5. RIT's Lousto maps black hole collisions, gives astronomers hitchhikers guide to help LIGO-Virgo pinpoint mergers
  6. University of Vienna scientists use machine learning to accelerate MD simulation of infrared spectra
  7. Italian researchers reveal autism-related genes including genes related to cancer
  8. Lehigh's Rangarajan wins grant for supercomputational discovery of new catalysts for olefin production through quantum chemistry calculations
  9. CSHL supercomputational biologists Jesse Gillis develops new tools to analyze neuronal cell types defined by gene activity shaping their communication patterns
  10. Oxford researcher Steven Reece develops machine learning approach to help hurricane relief efforts
  11. Pitt researcher John Keith develops machine learning methods to search for a greener cleaner
  12. Solar wind impacts on giant 'space hurricanes' may affect satellite safety
  13. VLA Sky Survey begins huge project of cosmic discovery
  14. Facebook announces major AI commitment to CIFAR
  15. NBI astrophysicist Michael Küffmeier shows how star formation is influenced by local environmental conditions
  16. UC Riverside physicist Hai-Bo Yu offers explanation for diverse galaxy rotations
  17. WPI's Ji develops highly specific supercomputer models to better diagnose concussions in real time
  18. Hershberg lab develops new data analysis, visualization tools to create atlas of specialized defense cells in the human body
  19. Caltech scientists build first on-chip nanoscale optical quantum memory
  20. Georgia Tech scientists propose supercomputational method to optimally position sensor-based security system for maximum surveillance
