Blanket of light may give better quantum supercomputers

Researchers from DTU Physics describe in an article in Science how -- by simple means -- they have created a 'carpet' of thousands of quantum-mechanically entangled light pulses

Quantum mechanics is one of the most successful theories in natural science, and although its predictions are often counterintuitive, no experiment has yet been conducted that the theory could not adequately describe.

Along with colleagues at bigQ (Center for Macroscopic Quantum States - a Danish National Research Foundation Center of Excellence), center leader Prof. Ulrik Lund Andersen is working on understanding and utilizing macroscopic quantum effects:

"The prevailing view among researchers is that quantum mechanics is a universally valid theory and therefore also applicable in the macroscopic day-to-day world we normally live in. This also means that it should be possible to observe quantum phenomena on a large scale, and this is precisely what we strive to do in the Danish National Research Foundation Center of Excellence bigQ," says Ulrik Lund Andersen. 

In a new article in the prestigious international journal Science, the researchers describe how they have succeeded in creating entangled, squeezed light at room temperature -- a discovery that could pave the way for less expensive and more powerful quantum supercomputers. (Image: Artwork illustrating the cluster state generated in the work. Credit: Jonas S. Neergaard-Nielsen)

Their work concerns one of the most notoriously difficult quantum phenomena to understand: entanglement. It describes how physical objects can be brought into a state in which they are so intricately linked that they can no longer be described individually.

If two objects are entangled, they must be seen as a unified whole regardless of how far apart they are. They will still behave as one unit--and if the objects are measured individually, the results will be correlated to a degree that cannot be explained by the classical laws of nature. Such correlations are only possible in quantum mechanics.

Entanglement is not restricted to pairs of objects. In their efforts to observe quantum phenomena on a macroscopic scale, the researchers at bigQ managed to create a network of 30,000 entangled pulses of light arranged in a two-dimensional lattice distributed in space and time. It is almost like when a myriad of colored threads are woven together into a patterned blanket.

The researchers have produced light beams with special quantum mechanical properties (squeezed states) and woven them together using optical fiber components to form an extremely entangled quantum state with a two-dimensional lattice structure--also called a cluster state.

"As opposed to traditional cluster states, we make use of the temporal degree of freedom to obtain the two-dimensional entangled lattice of 30,000 light pulses. The experimental setup is surprisingly simple. Most of the effort was in developing the idea of the cluster state generation," says Mikkel Vilsbøll Larsen, the lead author of the work.
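The temporal trick can be sketched abstractly: if the pulse train is grouped into rows of N consecutive time slots, pulse k can be assigned lattice coordinates (k // N, k % N), so that nearest neighbours in the 2D lattice correspond to pulses separated by 1 and by N time slots -- physically realizable with two optical delay lines. A toy sketch (the macro-period N and the indexing convention here are illustrative assumptions, not the paper's exact scheme):

```python
def lattice_coords(k, n):
    """Map pulse index k in the time-ordered train to 2D lattice coordinates,
    assuming rows of n consecutive time slots (illustrative convention)."""
    return k // n, k % n

def neighbours(k, n, total):
    """Time indices of the pulses next to pulse k in the lattice:
    +/-1 slot (same row) and +/-n slots (adjacent rows).
    Row-boundary wrap-around is ignored for simplicity."""
    return [j for j in (k - 1, k + 1, k - n, k + n) if 0 <= j < total]

N = 100          # pulses per row (illustrative choice)
TOTAL = 30_000   # pulses in the whole lattice, as in the experiment

print(lattice_coords(0, N))       # (0, 0): first pulse, corner of the lattice
print(lattice_coords(205, N))     # (2, 5): third row, sixth column
print(neighbours(205, N, TOTAL))  # pulses 204, 206, 105 and 305
```

The point of the mapping is that entangling each pulse with its immediate predecessor and with the pulse N slots earlier is enough to weave the whole 2D "blanket" from a single one-dimensional stream.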

Creating such an extensive degree of quantum physical entanglement is--in itself--interesting basic research.

The cluster state is also a potential resource for creating an optical quantum computer. This approach is an interesting alternative to the more widespread superconducting technologies, as everything takes place at room temperature.

Also, the long coherence time of the laser light can be utilized--meaning that it is maintained as a precisely defined light wave even over very long distances.

An optical quantum computer will therefore not require costly and advanced refrigeration technology. At the same time, its information-carrying, light-based qubits will be much more robust than their ultra-cold electronic relatives in superconducting processors.

"Through the distribution of the generated cluster state in space and time, an optical quantum computer can also more easily be scaled to contain hundreds of qubits. This makes it a potential candidate for the next generation of larger and more powerful quantum computers," adds Ulrik Lund Andersen.

Japanese researchers demo petabit per second network node

Gathering the latest advancements in optical fiber telecommunications technology towards practical petabit-class backbone networks

The Network System Research Institute at the National Institute of Information and Communications Technology (NICT, President: Hideyuki Tokuda, Ph.D.) has developed and demonstrated the first large-scale optical switching testbed capable of handling 1 Petabit per second optical signals. 1 Petabit per second is equivalent to the capacity to send 8K video to 10 million people simultaneously.
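The headline comparison can be sanity-checked with back-of-the-envelope arithmetic: dividing 1 petabit per second among 10 million viewers leaves 100 megabits per second per stream, a plausible bitrate for compressed 8K video (the per-stream figure is an inference from the stated totals, not a number quoted by NICT):

```python
total_capacity_bps = 1e15   # 1 petabit per second
viewers = 10_000_000        # 10 million simultaneous 8K streams

per_stream_bps = total_capacity_bps / viewers
print(per_stream_bps / 1e6, "Mb/s per viewer")  # 100.0 Mb/s
```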

This demonstration made use of state-of-the-art large-scale and low-loss optical switches based on MEMS technology, three types of next-generation spatial-division multiplexing fibers, and included the routing of signals with capacities from 10 Terabit per second to 1 Petabit per second. This corresponds to more than 100 times the capacity of currently available networks.

This is a major step forward towards the early implementation of the petabit-class backbone optical networks capable of supporting the increasing requirements of internet services such as broadband video streaming, 5G mobile networks or the Internet of Things. As such, the results of this demonstration were acknowledged by the scientific community with a post-deadline presentation at the 45th European Conference on Optical Communication (ECOC 2019).

(Figure: Experimental setup)

How aerosols affect our climate

Using a massive NASA dataset, Yale researchers have created a framework that helps explain just how sensitive local temperatures are to aerosols

For many, the word "aerosol" might conjure thoughts of hairspray or spray paint. More accurately, though, aerosols are simply particles found in the atmosphere. They can be human-made, like from car exhaust or biomass burning, or naturally occurring, from sources such as volcanic eruptions or sea spray.

Aerosols account for one of the greater uncertainties in understanding the Earth's climate and, through a cooling effect, mask a significant portion of the warming caused by the increase in greenhouse gas concentrations.

One unresolved issue in understanding aerosol-climate interactions is why, for a unit change in the energy imbalance at the top of the atmosphere, the surface temperature change is higher for aerosols than for greenhouse gases. This is known as climate sensitivity. The conventional understanding is that the higher climate sensitivity to aerosols is due to their higher concentrations over land surfaces, which heat up and cool down faster than oceans.
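Climate sensitivity in this sense is simply the ratio of surface temperature change to the top-of-atmosphere forcing that caused it, ΔT = λ·ΔF. A minimal sketch with purely illustrative magnitudes (not values from the paper) shows the asymmetry in question:

```python
def climate_sensitivity(delta_t_kelvin, delta_f_wm2):
    """Sensitivity parameter lambda = dT / dF, in K per (W/m^2).
    Magnitudes only; aerosol forcing is in reality negative (a cooling)."""
    return delta_t_kelvin / delta_f_wm2

# Illustrative numbers: the same unit forcing produces a larger
# temperature response when it comes from aerosols than from GHGs.
lam_ghg = climate_sensitivity(delta_t_kelvin=0.8, delta_f_wm2=1.0)
lam_aer = climate_sensitivity(delta_t_kelvin=1.0, delta_f_wm2=1.0)

print(f"GHG sensitivity:     {lam_ghg:.2f} K per W/m^2")
print(f"Aerosol sensitivity: {lam_aer:.2f} K per W/m^2")
assert lam_aer > lam_ghg  # the asymmetry the Yale study seeks to explain
```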

In a recently published paper in the American Geophysical Union's journal Geophysical Research Letters, Yale researchers demonstrate that it is not only the geographic distribution of aerosols that explains the higher climate sensitivity but also the specific local-scale interactions with the land surface.

Using a theoretical framework to separate out the surface temperature response to external forcing, the study also provides mechanistic insight into the spatial patterns of local temperature change due to aerosols.

"With traditional climate models, there are huge uncertainties in how aerosols affect surface temperature," said T.C. Chakraborty, a Ph.D. student at F&ES who co-authored the paper with Xuhui Lee, the Sara Shallenberger Brown Professor of Meteorology. "This framework helps explain why and how some of these uncertainties are coming into play."

Aerosols are known to increase radiation in the longer wavelengths (longwave) and decrease radiation in the shorter wavelengths (shortwave). The strength of these effects depends on the size and chemical nature of the aerosol particles. Using the framework to analyze a massive dataset developed by NASA, Chakraborty found that although the longwave effect of aerosols has generally been considered by the scientific community to be less important, the climate is more sensitive to it than to the shortwave effect.

This is because of the absence of the shortwave effect at night, a time when the atmosphere is more stable -- and thus more sensitive to radiation. It is also the result of the high climate sensitivity in arid regions, where the longwave effect is prevalent due to the presence of aerosols from coarse mineral dust. Combined, the longwave and shortwave effects reduce the terrestrial diurnal temperature range by almost one degree Fahrenheit. Aggregating the eight major regions of interest used in the study, about half of this reduction is due to human-made aerosols.
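The quoted reduction in diurnal temperature range converts to metric units as follows (taking "almost one degree Fahrenheit" as 1 °F and "about half" as 50% for the human-made share; both are approximations of the article's figures):

```python
def fahrenheit_delta_to_celsius(delta_f):
    """Convert a temperature *difference* from Fahrenheit degrees to
    Celsius degrees (no 32-degree offset for differences)."""
    return delta_f * 5.0 / 9.0

total_reduction_f = 1.0                    # ~1 F reduction in diurnal range
total_reduction_c = fahrenheit_delta_to_celsius(total_reduction_f)
anthropogenic_c = 0.5 * total_reduction_c  # "about half" is human-made

print(f"Total reduction:     {total_reduction_c:.2f} C")
print(f"Human-made aerosols: {anthropogenic_c:.2f} C")
```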

There are also long-term trends, Chakraborty said, that show an intensification of the local climate sensitivity in the tropics due to deforestation between 1980 and 2018, demonstrating the importance of vegetation in regulating interactions between aerosols and the climate.