Voltaire has announced that it continues to grow its adVantage Partner Program, which now includes more than fifty partner organizations consisting of resellers, VARs and distributors. The program provides comprehensive sales, marketing, technical training and support services to help partners worldwide sell Voltaire's industry-leading portfolio of 10 Gigabit Ethernet (10GbE) and InfiniBand datacenter solutions.

New members of the program include Dasher Technologies and Torrey Point, which recommend Voltaire products to their customers because of Voltaire's differentiated, solutions-oriented approach to networking. In addition to 10GbE and InfiniBand switching platforms, the Voltaire portfolio includes unique software offerings that address key networking requirements such as managing networks in the age of cloud computing and virtualized datacenters, accelerating access to storage, and reducing latency for applications such as high-frequency trading.

"Our strong technical expertise and vendor independence allow us to integrate best-of-breed software, hardware and services into a custom solution that directly impacts the business," said Al Chien, Vice President of Sales and Marketing, Dasher Technologies. "Voltaire's approach to 10GbE networks includes, in addition to the switches, a software management layer that is unique in the industry and serves as the foundation for cloud networking. We are pleased to be part of the adVantage program and to offer Voltaire solutions to our customers."

"Torrey Point designs networks with a focus on maximizing the return on technology investment for our worldwide clientele," said Steve Fazio, chief executive officer, Torrey Point. "Voltaire's 10GbE solutions meet the new infrastructure requirements for datacenter and cloud computing environments of all sizes. We are pleased to add Voltaire products to our offering."

"Growing the adVantage Partner Program to more than fifty channel partners worldwide represents a significant milestone for Voltaire's continued expansion into the 10GbE market," said Patrick Guay, executive vice president of global sales and general manager at Voltaire. "We are continually expanding our product set while providing our partners with more technical and sales content via our partner portal to enhance the program. The adVantage program provides our partners with the fundamental tools to help differentiate their business and to deliver an outstanding customer experience. The formula is simple: provide leading-edge products that enable unique solutions -- at an extremely competitive price point, with up-to-date technical training, and outstanding support."

Modern biotechnology is a rapidly growing field of research. Worldwide, numerous laboratories and research groups are producing huge amounts of data that they analyze with supercomputers and then use to develop computational models. There are currently few generally binding standards for the laboratory experiments and the computer-aided processing of the results. ISO intends to change this by establishing standards for the formatting, transfer, and integration of the data and models generated by different methods. Such consistent standards would be of high value for industrial, agricultural, and medical applications.

During the last meeting in Shenzhen, China, the ISO Technical Committee ISO/TC 276 Biotechnology founded a new working group for "Data processing and integration." The objective is to standardize interfaces between different data formats in order to facilitate the exchange and combination of data and computer models. With this aim, the committee will integrate "de facto" standards currently used in science. Martin Golebiewski, from the Heidelberg Institute for Theoretical Studies (HITS), Germany, has been appointed convener of the new ISO working group. He is already coordinator of the German NORMSYS project, which focuses on standardizing models and data in systems biology. The project is funded by the German Federal Ministry for Economic Affairs and Energy and is carried out in collaboration with the University of Potsdam and the Berlin-based start-up LifeGlimmer GmbH, with the aim of bringing systems biologists in academia and industry together to agree on standards.

“The new ISO working group will help us to facilitate the transfer of scientific results into industrial applications by establishing international standards for data and computer models,” says Martin Golebiewski. “As the secretariat of this committee is at the German Institute for Standardization (DIN), and experts from Europe, Japan, the US and China have already joined it, this effort will strengthen the relationship between research initiatives in Germany and related efforts in other countries worldwide.” 

Interested scientists are invited to participate in the initiative through their national committees.

Professor Tanja Lange. Photo: Bart van Overbeeke

At this moment all bets are off, and it is unclear whether we will ever learn the outcome of this competition. Scientists at TU Eindhoven and elsewhere are developing technology that can resist attacks by quantum computers. These cryptosystems need to be in place before big quantum computers become a reality, which is expected some time after 2025. Even if the scientists win that race, quantum computers can still decrypt communication that we encrypt today with current technologies if the attacker has retained this data.

Tanja Lange, TU/e professor of Cryptology, is leading a research consortium of eleven universities and companies, funded with 3.9 million euros by the European Commission under the H2020 program, to develop cryptography that resists the unmatched power of quantum computers. Lange publicly announced the PQCRYPTO project at a meeting on this topic at the US National Institute of Standards and Technology (NIST).

The expectation is that large quantum computers will be built some time after 2025. Such computers surpass the abilities of current computers and enable new types of attacks. Currently used methods such as RSA and ECC use keys that would remain unbroken for 100 years with current computer technology -- but if quantum computers live up to their promise, they can break these systems in a matter of days, if not hours. "2025 still seems far away, but we might already be too late," warns Professor Lange, who has worked on alternative cryptosystems since 2006. "It takes 15 to 20 years to introduce and standardize new cryptosystems, and we are still in the research phase."

To make things worse, spy agencies are not expected to announce when they have successfully built quantum computers. They can even break encrypted messages from the past if they had the foresight to record them. Lange suggests deploying post-quantum cryptography now for data with confidentiality requirements of more than 10 years, such as health records or top-secret documents.

We already have cryptosystems that resist quantum computers, but they are so demanding in computing power that they are unsuitable for smartphones or contactless cards. The quest is thus to develop new techniques that run unnoticed on current devices while resisting the power of quantum computers. The European PQCRYPTO consortium will work on this for the next three years. The core targets are small devices, secure data storage in the cloud, and a secure Internet.

CAPTION: A photograph of the completed BGA trap assembly. The trap chip is at the center, sitting atop the larger interposer chip that fans out the wiring. The trap chip surface area is 1 mm x 3 mm, while the interposer is roughly 1 cm square. CREDIT: D. Youngner, Honeywell

Researchers at the Georgia Tech Research Institute have developed a microfabricated ion trap architecture that holds promise for increasing the density of qubits in future quantum computers.

Quantum computers are in theory capable of simulating the interactions of molecules at a level of detail far beyond the capabilities of even the largest supercomputers today. Such simulations could revolutionize chemistry, biology and materials science, but the development of quantum computers has been limited by the ability to increase the number of quantum bits, or qubits, that encode, store and access large amounts of data.

In a paper appearing this week in the Journal of Applied Physics, from AIP Publishing, a team of researchers at the Georgia Tech Research Institute and Honeywell International has demonstrated a new device that allows more electrodes to be placed on a chip -- an important step that could help increase qubit densities and bring us one step closer to a quantum computer that can simulate molecules or perform other algorithms of interest.

"To write down the quantum state of a system of just 300 qubits, you would need 2^300 numbers, roughly the number of protons in the known universe, so no amount of Moore's Law scaling will ever make it possible for a classical computer to process that many numbers," said Nicholas Guise, who led the research. "This is why it's impossible to fully simulate even a modest-sized quantum system, let alone something like the chemistry of complex molecules, unless we can build a quantum computer to do it."

While existing computers use classical bits of information, quantum computers use "quantum bits," or qubits, to store information. Classical bits hold either a 0 or a 1, but a qubit, exploiting a weird quantum property called superposition, can actually be in both 0 and 1 simultaneously, allowing much more information to be encoded. Since qubits can be correlated with each other in a way that classical bits cannot, they allow a new sort of massively parallel computation, but only if many qubits at a time can be produced and controlled. The challenge the field has faced is scaling this technology up, much like moving from the first transistors to the first computers.
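The exponential cost of classical simulation mentioned above follows directly from how qubit states are stored: an n-qubit state requires 2^n complex amplitudes, so memory doubles with every added qubit. The following sketch (illustrative only; the function names and the 16-bytes-per-amplitude figure for double-precision complex numbers are our assumptions, not from the article) makes the 300-qubit claim concrete:

```python
# Sketch: why classically simulating qubits does not scale.
# An n-qubit quantum state is a vector of 2**n complex amplitudes,
# so storage doubles with every qubit added.

def amplitudes(n_qubits: int) -> int:
    """Number of complex amplitudes needed to store an n-qubit state."""
    return 2 ** n_qubits

def memory_bytes(n_qubits: int) -> int:
    """Approximate memory, assuming 16 bytes per complex amplitude."""
    return amplitudes(n_qubits) * 16

for n in (10, 30, 50):
    print(f"{n} qubits: {amplitudes(n)} amplitudes, "
          f"{memory_bytes(n) / 2**30:.0f} GiB")

# The 300-qubit case quoted in the article:
print(len(str(amplitudes(300))), "decimal digits")  # a 91-digit number
```

Already at 50 qubits the state vector would need roughly 16 pebibytes, which is why even a "modest-sized" quantum system is out of reach for classical simulation.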

Creating the Building Blocks for Quantum Computing

One leading qubit candidate is individual ions trapped inside a vacuum chamber and manipulated with lasers. The scalability of current trap architectures is limited because the connections for the electrodes needed to generate the trapping fields are made at the edge of the chip, and their number is therefore limited by the chip perimeter.

The GTRI/Honeywell approach uses new microfabrication techniques that allow more electrodes to fit onto the chip while preserving the laser access needed. 

The team's design borrows ideas from a type of packaging called a ball grid array (BGA) that is used to mount integrated circuits. The ball grid array's key feature is that it can bring electrical signals directly from the backside of the mount to the surface, thus increasing the potential density of electrical connections.

The researchers also freed up more chip space by replacing area-intensive surface or edge capacitors with trench capacitors and strategically moving wire connections.

The space-saving moves allowed tight focusing of an addressing laser beam for fast operations on single qubits. Despite early difficulties bonding the chips, a solution was developed in collaboration with Honeywell, and the device was trapping ions from the very first day.

The team was excited with the results. "Ions are very sensitive to stray electric fields and other noise sources, and a few microns of the wrong material in the wrong place can ruin a trap. But when we ran the BGA trap through a series of benchmarking tests we were pleasantly surprised that it performed at least as well as all our previous traps," Guise said.

Working with trapped ion qubits currently requires a room full of bulky equipment and several graduate students to make it all run properly, so the researchers say much work remains to be done to shrink the technology. The BGA project demonstrated that it's possible to fit more and more electrodes on a surface trap chip while wiring them from the back of the chip in a compact and extensible way. However, there are a host of engineering challenges that still need to be addressed to turn this into a miniaturized, robust and nicely packaged system that would enable quantum computing, the researchers say.

In the meantime, these advances have applications beyond quantum computing. "We all hope that someday quantum computers will fulfill their vast promise, and this research gets us one step closer to that," Guise said. "But another reason that we work on such difficult problems is that it forces us to come up with solutions that may be useful elsewhere. For example, microfabrication techniques like those demonstrated here for ion traps are also very relevant for making miniature atomic devices like sensors, magnetometers and chip-scale atomic clocks."

The size of a quantum computer affects how quickly information can be distributed throughout it. The relation was thought to be logarithmic (blue). Progressively larger systems would need only a little more time. New findings suggest instead a power law relationship (red), meaning that the "speed limit" for quantum information transfer is far slower than previously believed.

If you're designing a new computer, you want it to solve problems as fast as possible. Just how fast is possible is an open question when it comes to quantum computers, but physicists at the National Institute of Standards and Technology (NIST) have narrowed the theoretical limits on where that "speed limit" lies. The research implies that quantum processors will work more slowly than some research has suggested.

The work offers a better description of how quickly information can travel within a system built of quantum particles such as a group of individual atoms. Engineers will need to know this to build quantum computers, which will have vastly different designs and be able to solve certain problems much more easily than the supercomputers of today. While the new finding does not give an exact speed for how fast information will be able to travel in these as-yet-unbuilt computers--a longstanding question--it does place a far tighter constraint on where this speed limit could be.

Quantum computers will store data in a particle's quantum states--one of which is its spin, the property that confers magnetism. A quantum processor could suspend many particles in space in close proximity, and computing would involve moving data from particle to particle. Just as one magnet affects another, the spin of one particle influences its neighbor's, making quantum data transfer possible, but a big question is just how fast this influence can work.

The NIST team's findings advance a line of research that stretches back to the 1970s, when scientists discovered a limit on how quickly information could travel if a suspended particle could communicate directly only with its next-door neighbors. Since then, technology has advanced to the point where scientists could investigate whether a particle might directly influence others that are more distant, a potential advantage. By 2005, theoretical studies incorporating this idea had increased the speed limit dramatically.

"Those results implied a quantum computer might be able to operate really fast, much faster than anyone had thought possible," says NIST's Michael Foss-Feig. "But over the next decade, no one saw any evidence that the information could actually travel that quickly."

Physicists exploring this aspect of the quantum world often line up several particles and watch how fast changing the spin of the first particle affects the one farthest down the line -- a bit like standing up a row of dominoes and knocking the first one down to see how fast the chain reaction proceeds. The team looked at years of others' research and, because the dominoes never seemed to fall as fast as the 2005 prediction suggested, developed a new mathematical proof that reveals a much tighter limit on how fast quantum information can propagate.

"The tighter a constraint we have, the better, because it means we'll have more realistic expectations of what quantum computers can do," says Foss-Feig.

The limit, their proof indicates, is far closer to the speed limits suggested by the 1970s result.

The proof addresses the rate at which entanglement propagates across quantum systems. Entanglement--the weird linkage of quantum information between two distant particles--is important, because the more quickly particles grow entangled with one another, the faster they can share data. The 2005 results indicated that even if the interaction strength decays quickly with distance, as a system grows, the time needed for entanglement to propagate through it grows only logarithmically with its size, implying that a system could get entangled very quickly. The team's work, however, shows that propagation time grows as a power of its size, meaning that while quantum computers may be able to solve problems that ordinary computers find devilishly complex, their processors will not be speed demons.
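The difference between the two scaling laws described above can be made concrete with a toy comparison. This sketch is purely illustrative (the exponent `alpha` and the function names are our assumptions, not values from the NIST proof): a logarithmic propagation time barely grows with system size, while any power law eventually dwarfs it.

```python
import math

# Toy comparison of the two scaling regimes for entanglement
# propagation time as a function of system size N:
#   2005-style bound:  t grows like log(N)   (very fast spreading)
#   NIST-style bound:  t grows like N**alpha (much slower spreading)

def t_log(n: float) -> float:
    """Logarithmic growth: large systems need only a little more time."""
    return math.log(n)

def t_power(n: float, alpha: float = 0.5) -> float:
    """Power-law growth with an illustrative exponent alpha."""
    return n ** alpha

for n in (10, 1_000, 1_000_000):
    print(f"N={n:>9}: log-law t={t_log(n):8.2f}, power-law t={t_power(n):10.2f}")
```

For a million particles the power-law time (with alpha = 0.5, about 1000) exceeds the logarithmic time (about 14) by nearly two orders of magnitude, which is the sense in which the new "speed limit" is far slower than previously believed.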

"On the other hand, the findings tell us something important about how entanglement works," says Foss-Feig. "They could help us understand how to model quantum systems more efficiently."
