Headlines

Supercomputing Online News is the only newspaper covering the rapidly evolving supercomputer marketplace.
  1. Amidst growing concerns over the low uptake of flu shots in Europe, scientists from the Italian National Research Council and the JRC confirm that vaccinations remain the best way forward when it comes to stopping the spread of infectious diseases.

    It's an option that is nearly always more effective than either doing nothing or attempting to contain an outbreak through quarantine.

    Under normal circumstances, the most effective way to prevent illness is to vaccinate according to national immunisation schedules.

    Widespread immunisation programmes in Europe have made previously deadly diseases such as smallpox and polio a thing of the past.

    This study looked specifically at epidemic outbreaks.
    The researchers found that in such cases, vaccinating carefully selected individuals can succeed in containing the outbreak, even when only a relatively small number of people receive the relevant shot.

    The scientists ran physics-based simulations on networks which sought to replicate the way individuals interact with one another in the real world, such as through the global air transportation network.

    The simulations are simplified versions of computational frameworks commonly used to investigate the global spread of real-world epidemics, such as Severe Acute Respiratory Syndrome (SARS).

    Nevertheless, they help researchers understand basic features of the more complicated and realistic models.

    In the simulations, individuals correspond to 'nodes' that can transmit an infection through the links between them.

    The scientists found that quarantining nodes after the outbreak of an epidemic very quickly becomes ineffective. Quite early on in a simulated outbreak, even the 'do nothing' (non-intervention) strategy becomes preferable to quarantine.

    Targeted vaccination was found to be the best option in nearly all epidemic cases. The scientists used a vaccination strategy based on 'optimal percolation', which consists of finding the least set of nodes that, when removed from a network, can fragment it into small clusters.

    The idea behind this approach is that fragmenting the network ensures infections are contained within small groups, hence preventing the occurrence of large outbreaks.
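
    For readers who want a concrete picture of the fragmentation idea, here is a minimal Python sketch (using the networkx library) in which a few well-connected nodes are 'vaccinated' (removed) until the largest remaining cluster is small. The contact network, the parameters and the greedy degree-based heuristic are illustrative assumptions only; the study's optimal-percolation method identifies a smaller, more carefully chosen set of nodes.

    ```python
    # Minimal sketch: fragmenting a contact network by "vaccinating" (removing)
    # a few well-connected nodes, as a crude stand-in for optimal percolation.
    # The network and all parameters are hypothetical illustrations.
    import networkx as nx

    def largest_cluster_fraction(graph):
        """Fraction of remaining nodes that sit in the largest connected cluster."""
        if graph.number_of_nodes() == 0:
            return 0.0
        biggest = max(nx.connected_components(graph), key=len)
        return len(biggest) / graph.number_of_nodes()

    def greedy_vaccination(graph, target_fraction=0.1):
        """Remove the most-connected nodes until the largest cluster is small.

        Optimal percolation looks for the *smallest* such set; this greedy
        degree-based heuristic only approximates that idea.
        """
        g = graph.copy()
        vaccinated = []
        while largest_cluster_fraction(g) > target_fraction:
            node = max(g.degree, key=lambda pair: pair[1])[0]  # most-connected node
            g.remove_node(node)
            vaccinated.append(node)
        return vaccinated

    # Hypothetical contact network: 2,000 individuals with hub-like connectivity.
    contacts = nx.barabasi_albert_graph(2000, 3, seed=1)
    shots = greedy_vaccination(contacts)
    print(f"Removing {len(shots)} of {contacts.number_of_nodes()} nodes "
          f"fragments the network into small clusters.")
    ```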

    This might all seem like common sense, but preventive vaccination is not common practice for all illnesses and for some, vaccines do not yet exist.

    The norovirus outbreak at this year's Winter Olympic Games is an example where quarantine has been the only option available to health officials.

    Medical professionals initially attempted to contain the outbreak by imposing quarantine on the hundreds of staff who were unlucky enough to catch the virus. Despite these measures, the illness is continuing to spread and has started to affect some of the athletes.

    In recent years, physicists have made significant advances in the field of network immunisation, developing increasingly efficient techniques to immunise a network by the 'removal' (vaccination) of a few nodes. This knowledge can help to support health policy as policymakers look to ensure increased global security against epidemics.

  2. Supercomputer simulations of jet stream behavior in a warming climate suggest that forecast ranges at mid-century will be similar to those of the present day, an MU study finds

    According to the Intergovernmental Panel on Climate Change, temperatures are expected to rise between 2.5 and 10 degrees Fahrenheit over the next century. This warming is expected to contribute to rising sea levels and the melting of glaciers and permafrost, as well as other climate-related effects. Now, research from the University of Missouri suggests that even as rising carbon dioxide levels in the atmosphere drive the climate toward warmer temperatures, the weather will remain predictable.

    "The jet stream changes character every 10 to 12 days, and we use this pattern to predict the weather," said Anthony Lupo, professor of atmospheric science in MU's School of Natural Resources, which is located in the College of Agriculture, Food and Natural Resources. "We were curious about how this would change in a world with higher carbon dioxide levels. We found that in that warmer world, the variability of the jet stream remained the same."

    Lupo and Andrew Jensen, who earned his doctorate at MU, used an existing climate model to simulate jet stream flow in the Northern Hemisphere. The supercomputer simulation monitored a variable that responds to jet stream flow changes and can indicate global-scale weather instability. Researchers used this variable to determine when the jet stream altered its flow. Since meteorologists can only accurately predict weather within the 10 to 12 days between jet stream flow changes, a shift in this time frame would directly impact weather predictability.

    Over the course of a simulated 31 years, their observations indicated the jet stream would change its character about 30 to 35 times per year, a number that is consistent with current jet stream patterns. As the time frame used to predict weather did not change, the researchers concluded that weather would likely remain as predictable in a warmer world as it is today. The results do not address the effects of climate change on the nature or frequency of weather events but instead focus on the range of predictability afforded by the jet stream. In addition, the researchers did not extend the simulation past the mid-century to ensure their data was as accurate as possible.
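
    As a rough illustration of the bookkeeping involved, the sketch below builds a synthetic 'jet stream index' that switches flow regime every 10 to 12 days and counts the changes over 31 simulated years. The index and its residence times are hypothetical stand-ins, not the physical diagnostic monitored in the MU study; the point is only that roughly 365/11 changes per year reproduces the 30 to 35 figure.

    ```python
    # Illustrative bookkeeping only: count regime changes in a synthetic
    # "jet stream index". The real study monitored a physically based
    # diagnostic; this toy series merely reproduces the counting step.
    import numpy as np

    rng = np.random.default_rng(0)
    years = 31
    days = years * 365

    # Hypothetical index that flips between two flow regimes roughly every
    # 10 to 12 days (residence times drawn uniformly from that range).
    series = np.empty(days)
    regime, t = 1, 0
    while t < days:
        duration = int(rng.integers(10, 13))   # 10, 11 or 12 days
        series[t:t + duration] = regime
        regime, t = -regime, t + duration

    changes = np.count_nonzero(np.diff(series) != 0)
    print(f"{changes / years:.1f} regime changes per simulated year")
    print(f"{days / changes:.1f} days between changes on average")
    ```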

    "Climate change will continue to create a lot of ripple effects, but this experiment provides evidence that the range of forecasting will remain the same," Lupo said.

  3. An international research consortium, with significant involvement of scientists from the Luxembourg Centre for Systems Biomedicine (LCSB), has developed the first supercomputer model of human metabolic processes to incorporate 3D structural data.

    To this end, the researchers integrated the three-dimensional structures of over 4,000 metabolic products, or metabolites as they are known, and nearly 13,000 proteins into an existing computer model. They also added an enormous volume of genetic and chemical information to the model on which the simulation runs. The name of this new computer-based tool, which was recently made available to the biomedical research community, is Recon3D. The researchers' results on Recon3D appear in the journal Nature Biotechnology (doi: 10.1038/nbt.4072).

    The importance of scientific supercomputer models continues to increase. They make existing knowledge tangible and thereby help scientists formulate and work on precisely targeted research problems. To create such models, researchers analyse all the publications and databases they can find on a topic and feed this information into their model.

    Human metabolism research is one example, as Prof. Ines Thiele, head of the Molecular Systems Physiology group at the LCSB of the University of Luxembourg and an ATTRACT fellow of the Luxembourg National Research Fund (FNR), says: "For the predecessor of Recon3D, Recon 2, a large team of research groups in different fields aggregated an enormous volume of data on the genome, chemical metabolic activities and physiological properties of the human organism." This data basis has now been considerably expanded yet again for Recon3D, Thiele reports.

    Supercomputer model of human metabolic processes

    What is unique about the new computer model, however, is the integrated three-dimensional structural data of proteins and metabolites. Dr Ronan Fleming, head of the LCSB Systems Biochemistry group, which was responsible for integrating the structural data of the metabolites into Recon3D, says: "So far, we have been able to say of a given metabolic reaction that substances A and B turn into substances C and D. Now, we know precisely which atoms each substance consists of, how the atoms are arranged in the starting materials and where those very same atoms can be found again in the products of the chemical reaction."

    To make this possible, the researchers first had to determine which computational program - or algorithm, as scientists call it - most accurately imports the three-dimensional structures of the molecules from the literature into Recon3D. To do so, Fleming's team researched the molecular structure of the starting materials of a series of chemical reactions and determined the whereabouts of every single atom after the metabolic reaction was completed. They then tested various algorithms on the same reactions to identify those with the best predictive power. Fleming reports: "With these results, we were then able to import very accurate structures of more than 4,000 metabolites into the computer model." Fleming's colleagues from the University of California in the United States expanded Recon3D with data on almost 13,000 protein structures, thereby bridging the research fields of structural biology and systems biology.
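
    The benchmarking step described above can be pictured with a small, purely schematic sketch: candidate atom-mapping algorithms are scored against reactions whose atom fates were curated by hand. The reaction, the curated mapping and both 'algorithms' below are invented placeholders, not the actual tools evaluated for Recon3D.

    ```python
    # Schematic benchmark: score candidate atom-mapping algorithms against a
    # reaction whose atom fates were curated by hand. The reaction, the curated
    # mapping and both "algorithms" are placeholders, not the Recon3D tools.

    # Curated ground truth for a made-up reaction with four tracked atoms:
    # each substrate atom is mapped to the product atom it ends up as.
    curated_mapping = {"C1": "C1'", "C2": "C2'", "O1": "O1'", "N1": "N1'"}

    def algorithm_a(_reaction):
        # Pretend predictor that places one atom incorrectly.
        return {"C1": "C1'", "C2": "C2'", "O1": "O1'", "N1": "O1'"}

    def algorithm_b(_reaction):
        # Pretend predictor that reproduces the curated mapping exactly.
        return dict(curated_mapping)

    def accuracy(predicted, truth):
        """Fraction of substrate atoms mapped to the correct product atom."""
        correct = sum(predicted.get(atom) == product for atom, product in truth.items())
        return correct / len(truth)

    for name, algo in [("algorithm A", algorithm_a), ("algorithm B", algorithm_b)]:
        score = accuracy(algo("toy reaction"), curated_mapping)
        print(f"{name}: {score:.0%} of atoms mapped correctly")
    ```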

    An expanded supercomputer model available to all researchers

    "Recon3D now allows us to study metabolic processes that run differently, for example, in Parkinson's patients than in healthy people, much better and with more accuracy," Thiele sums up the possibilities that the new computer model offers. Dr Andreas Dräger from the Center for Bioinformatics at the University of Tübingen (ZBIT) has put it into a standardised format so that other researchers can use the model for their scientific questions. "Recon3D lays the foundations for creating cell-type-specific models that could be used to simulate the function of tissues through to entire organs in the computer," explains Dräger and continues "it could help to better understand the interplay between pathogens, such as bacteria or viruses, and the human host." Fleming also adds that, thanks to the three-dimensional structures of the metabolites and proteins in Recon3D, it is now possible to follow, at atomic resolution, how a genetic mutation could affect the development of certain diseases. "This can open up entirely new avenues for research on therapeutic approaches," Fleming concludes.

  4. International research team makes important step on the path to solving certification problems

    Quantum computers may one day solve algorithmic problems which even the biggest supercomputers today can't manage. But how do you test a quantum computer to ensure it is working reliably? Depending on the algorithmic task, this could be an easy or a very difficult certification problem. An international team of researchers has taken an important step towards solving a difficult variation of this problem, using a statistical approach developed at the University of Freiburg. The results of their study are published in the latest edition of Nature Photonics.

    Their example of a difficult certification problem is sorting a defined number of photons after they have passed through a defined arrangement of several optical elements. The arrangement provides each photon with a number of transmission paths - depending on whether the photon is reflected or transmitted by an optical element. The task is to predict the probability of photons leaving the arrangement at defined points, for a given positioning of the photons at the entrance to the arrangement. As the size of the optical arrangement and the number of photons sent through it increase, the number of possible paths and final distributions of the photons rises steeply, as a result of the uncertainty principle that underlies quantum mechanics - so steeply that the exact probabilities cannot be predicted with the computers available today. Physical principles say that different types of particle - such as photons or electrons - should yield differing probability distributions. But how can scientists tell these distributions and differing optical arrangements apart when there is no way of making exact calculations?

    An approach developed in Freiburg by researchers from Rome, Milan, Redmond (USA), Paris and Freiburg now makes it possible for the first time to identify characteristic statistical signatures of otherwise unmeasurable probability distributions. Instead of a complete "fingerprint," the team distilled the relevant information from data sets that had been reduced to a usable size. Using that information, they were able to discriminate between various particle types and between distinctive features of optical arrangements. The team also showed that this distillation process can be improved by drawing upon established techniques of machine learning, with physics providing the key information about which data sets should be searched for the relevant patterns. And because this approach becomes more accurate for larger numbers of particles, the researchers hope that their findings take us a key step closer to solving the certification problem.
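
    To make the machine-learning angle concrete, here is a toy sketch of the general idea: a classifier is trained on a few reduced statistical features rather than on the full, intractable output distribution. The synthetic feature vectors below merely stand in for the correlation-based signatures used in the study, and scikit-learn's RandomForestClassifier is an arbitrary choice.

    ```python
    # Toy sketch of the general idea: classify "particle type" from a handful of
    # reduced statistical features instead of the full output distribution. The
    # synthetic feature vectors stand in for the study's correlation-based
    # signatures; the classifier choice is arbitrary.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(42)
    n_samples = 400

    def placeholder_features(shift):
        """Stand-in summary statistics (three numbers per simulated experiment)."""
        return rng.normal(loc=shift, scale=1.0, size=(n_samples, 3))

    # Class 0: e.g. distinguishable particles; class 1: e.g. interfering photons.
    X = np.vstack([placeholder_features(0.0), placeholder_features(1.0)])
    y = np.array([0] * n_samples + [1] * n_samples)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X_train, y_train)
    print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
    ```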

  5. State-of-the-art supercomputer techniques help narrow down genetic targets for diagnosis and treatment of autism

    Autism is a spectrum of closely related symptoms involving behavioral, social and cognitive deficits. Early detection of autism in children is key to producing the best outcomes; however, the search for the genetic causes of autism is complicated by the variety of symptoms found within the spectrum. Now, a multi-disciplinary team of researchers at the University of Missouri has created a new supercomputational method that connects several target genes to autism. The discoveries could lead to screening tools for young children and could help doctors determine the correct interventions when diagnosing autism.

    Unlocking the genetic causes of autism requires data-intensive computations. In 2014, the National Science Foundation (NSF) awarded $1 million in two grants to MU to install a supercomputer enabling data-intensive research and education in the fields of bioinformatics and data-driven engineering applications.

    "In this study, we started with more than 2,591 families who had only one child with autism, where neither the parents nor the siblings had been diagnosed with autism," said Chi-Ren Shyu, director of the Informatics Institute and the Paul K. and Dianne Shumaker Endowed Professor in the Department of Electrical Engineering and Computer Science in the MU College of Engineering. "This created a genetically diverse group composed of an estimated 10 million genetic variants. We narrowed it down to the 30,000 most promising variants, then used preset algorithms and the big data capabilities of our high-performance computing equipment at MU to 'mine' those genetic variables."

    The genetic samples were obtained from the Simons Foundation Autism Research Initiative. Samples were collected from children with diagnosed cases of autism and from their unaffected parents and siblings, amounting to more than 11,500 individuals. Using advanced computational techniques, Shyu and his team identified 286 genes, which were then grouped into 12 subgroups reflecting commonly seen characteristics of children on the spectrum. Of these, 193 are potentially new genes not found in previous autism studies.
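
    A highly simplified sketch of this kind of family-based filtering is shown below: variants seen in affected children but absent in their unaffected relatives are kept as candidates. The records and threshold are toy placeholders and do not reflect the MU team's preset algorithms or data.

    ```python
    # Highly simplified family-based filter: keep variants that recur in affected
    # children but never appear in their unaffected parents or siblings. The
    # records and threshold are toy placeholders, not the MU pipeline.
    from collections import defaultdict

    # Hypothetical records: (family_id, member_role, variant_id)
    records = [
        ("fam1", "child_affected", "varA"), ("fam1", "parent", "varB"),
        ("fam2", "child_affected", "varA"), ("fam2", "sibling", "varC"),
        ("fam3", "child_affected", "varD"), ("fam3", "parent", "varD"),
    ]

    affected_counts = defaultdict(int)
    unaffected_counts = defaultdict(int)
    for _family, role, variant in records:
        if role == "child_affected":
            affected_counts[variant] += 1
        else:
            unaffected_counts[variant] += 1

    # Keep variants seen in at least two affected children and in no relatives.
    candidates = [v for v, n in affected_counts.items()
                  if n >= 2 and unaffected_counts[v] == 0]
    print("candidate variants:", candidates)   # -> ['varA']
    ```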

    "Autism is heterogeneous, meaning that the genetic causes are varied and complex," said Judith Miles, professor emerita of child health-genetics in the MU Thompson Center for Autism and Neurodevelopmental Disorders. "This complexity makes it tough for geneticists to get at the root of what triggers the development of autism in more conventional ways. The methods developed by Dr. Shyu and the results our team identified are giving geneticists a wealth of targets we'd not considered before--by narrowing down the genetic markers, we may be able to develop clinical programs and methods that can help diagnose and treat the disease. These results are a quantum leap forward in the study of the genetic causes of autism."

  6. A technology developed by Brazilian researchers can help fight highly resistant agricultural pests by analyzing the connections between the pests' dispersal patterns in crops and different configurations of diversified intercropping systems.

    To help farmers fight pests that are resistant both to insecticides and to transgenic plants genetically engineered to express proteins with insecticidal action, a group of Brazilian scientists has developed computational tools that can provide clues about the pests' habits, enabling decision-makers to choose from a wider range of new and more efficient pest-control strategies.

    An article published in Scientific Reports presents results from a study supported by the São Paulo Research Foundation (FAPESP) in which researchers developed mathematical models describing the movements of agricultural pests, with the aim of better understanding how these pests disperse in agricultural areas.

    "The idea is to use computer models to design strategies capable of reducing the damage done to crops by pest populations and containing their expansion on plantations," said Wesley Augusto Conde Godoy, coordinator of the project and also a professor at the University of São Paulo's Luiz de Queiroz College of Agriculture (ESALQ-USP), institution where the project took place. The investigation featured collaboration from researchers from the São Paulo State University (UNESP) in Botucatu, São Paulo State, Brazil.

    First, the researchers modeled the movements of the cucurbit beetle, Diabrotica speciosa, which attacks several crops, such as soybeans, corn and cotton.

    Using supercomputer modeling, they found that spatial configurations in diversified intercropping systems (growing two or more crops in proximity) could favor or inhibit pest dispersal. "We observed that the presence of corn strips distributed across farmed fields could reduce spatial dispersal of the insects," Godoy said.

    Motivated by the results obtained with D. speciosa, they investigated possible applications of supercomputer modeling to describe the spatial dynamics of other agricultural pests, such as the fall armyworm (Spodoptera frugiperda), an insect that has developed resistance to Bt corn, cotton and soybeans.

    To delay the development of resistance to transgenic crops in S. frugiperda and other pests - crops modified to repel parasites through the inclusion of genetic material from the bacterium Bacillus thuringiensis Berliner (Bt) - technicians have advised farmers to create refuges: strips of the same crop, without the Bt trait, planted within Bt crop fields.

    Refuges are intended to ensure the maintenance of individuals susceptible to Bt technology within the pest population. They mate with resistant individuals, and this prevents the population as a whole from developing resistance to Bt toxins, Godoy explained. "It's been demonstrated that the larger the refuge area is, the lower the frequency of Bt-resistant individuals," he said.

    Using a cellular automata-based computer model that predicts the movements of insects, the researchers measured the effectiveness of three different refuge configurations: mixed seeds, random blocks, and strips.
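
    The sketch below gives a flavour of such a model: insects perform a random walk over a grid of Bt and refuge cells, and the time spent in refuge is compared for a strip layout versus a mixed-seed layout. The grid size, layouts and movement rule are invented for illustration and are not the published cellular automata model.

    ```python
    # Minimal cellular-automaton-style sketch: insects random-walk across a field
    # of Bt cells (1) and refuge cells (0), and we track how much time they spend
    # in refuge under two layouts. Layouts and movement rule are invented for
    # illustration; this is not the published model.
    import numpy as np

    rng = np.random.default_rng(7)
    size = 50

    def strip_layout():
        field = np.ones((size, size), dtype=int)
        field[:, :10] = 0                                     # one refuge strip (20% of area)
        return field

    def mixed_seed_layout():
        return (rng.random((size, size)) > 0.2).astype(int)   # 20% refuge, scattered

    def refuge_occupancy(field, steps=2000, insects=200):
        """Fraction of insect-steps spent on refuge (non-Bt) cells."""
        pos = rng.integers(0, size, size=(insects, 2))
        refuge_steps = 0
        for _ in range(steps):
            pos = (pos + rng.integers(-1, 2, size=pos.shape)) % size  # random walk
            refuge_steps += np.count_nonzero(field[pos[:, 0], pos[:, 1]] == 0)
        return refuge_steps / (steps * insects)

    for name, layout in [("strips", strip_layout()), ("mixed seeds", mixed_seed_layout())]:
        print(f"{name}: {refuge_occupancy(layout):.1%} of insect-steps spent in refuge")
    ```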

    "We succeeded in identifying the best refuge configuration and size to delay the development of Bt crop resistance in S. frugiperda," Godoy said.

    Comparing movement patterns

    The researchers combined the computer model with data on the insect's movements obtained in the laboratory to analyze and compare its behavior on leaves of Bt and non-Bt cotton. The results of the study showed that the insect moved around more on Bt cotton leaves than non-Bt cotton leaves.

    "We don't yet know what mechanisms may trigger this behavior," Godoy said. "However, the findings so far have important practical implications because they may correlate with faster development of Bt resistance."

    Less movement on non-Bt leaves could be associated with an adaptation cost, which is frequently the case for resistant populations of this insect in the absence of selection pressure, he explained.

    "We plan to continue investigating this problem, as continuation of the research could produce significant contributions to pest management programs by improving crop configuration to delay the development of resistance to GM crops among these and other insects," Godoy said.

  7. One of Europe’s most prestigious universities honors Siebel’s computer engineering achievements

    C3 IoT CEO Thomas M. Siebel received an honorary Ph.D. from Politecnico di Torino, one of Europe’s most esteemed educational institutions with a 150-year history in theoretical and applied research in architecture and engineering. The academic honor recognizes Siebel’s worldwide leadership in computer engineering, impact on the technology industry, and his extraordinary contributions to the development of science.

    “A renowned engineer and computer scientist, Tom Siebel is distinguished by his extraordinary contributions in applying new and disruptive technologies to design, develop, and commercialize innovative solutions that transform businesses for positive social and economic benefit,” said Marco Gilli, Rector and Professor of Electrical Engineering, Politecnico di Torino. “We are grateful for Tom’s close collaboration with the University and laud his commitment to higher education, research, and innovation.”

    “I strongly believe the conferment of this honorary degree elevates the prestige of the Italian Education and Research System. Mr. Siebel’s strong interdisciplinary education shows that cultural enrichment and strong professional success come from the deep intermingling of humanistic and scientific disciplines. This interdisciplinary approach represents a very simple yet underappreciated idea – that culture and knowledge are not detached and separate, but productive and outstanding in their whole,” added Ms. Valeria Fedeli, Minister of Education, University, and Research, Republic of Italy.

    Siebel is founder and CEO of C3 IoT, the leading software platform provider for rapidly developing and operating AI and IoT applications at industrial scale. He was founder, chairman, and CEO of Siebel Systems, a leader in application software with revenue exceeding $2 billion before merging with Oracle Corporation in 2006. He serves on the advisory boards of the engineering colleges at the University of Illinois at Urbana-Champaign and University of California, Berkeley. He is a former Princeton trustee. He was elected to the American Academy of Arts and Sciences in 2013.

    S. Shankar Sastry, Dean of Engineering at UC Berkeley, the Roy W. Carlson Professor of Engineering, and a member of the board of directors at C3 IoT, also received an honorary degree from Politecnico di Torino in telecommunications engineering in recognition of his research achievements in the fields of non-linear and adaptive control, autonomous robotic systems, and cyber-physical systems. The academic honor also acknowledges Sastry’s leading contributions to the study of information technology’s societal impact as Director of the Blum Center for Developing Economies and former Director of the Center for Information Technology Research in the Interest of Society (CITRIS) at Berkeley. Sastry is Director of the Siebel Energy Institute, a global consortium for innovative and collaborative energy research.

    The honorary degrees were conferred on Feb. 16 during an awards ceremony held at the Hall of Honour in the historic Castello del Valentino, a 16th century palace in Turin, Italy.

  8. In the midst of an unseasonably warm winter in the Pacific Northwest, a comparison of four publicly available climate projections has shown broad agreement that the region will become considerably warmer in the next century if greenhouse gas concentrations in the atmosphere rise to the highest levels projected in the Intergovernmental Panel on Climate Change (IPCC) "business-as-usual" scenario.

    In this scenario, carbon dioxide concentrations are projected to continue to rise and to exceed 900 parts per million, more than double today's level of just over 400 parts per million. Annual average global temperatures are projected to rise between 1.5 and 7 degrees Celsius (2.7 to 12.6 degrees Fahrenheit), and precipitation is expected to increase during the winter and decrease in the summer.

    To examine projections of future climates in the Northwest, researchers in the College of Forestry at Oregon State University and the U.S. Forest Service obtained outputs from more than 30 climate models, known as general circulation models. These models simulate the Earth's climate at scales that are generally too large to be applied with confidence to local areas, such as the watersheds of small rivers and streams.

    The scientists examined four different versions of the model outputs, each one translated for the region with data from weather stations in the Northwest through a process called "downscaling." While the resulting fine-resolution climate projections vary for parts of the Northwest, such as coastal watersheds and mountain crests, the general agreement among them gives scientists increasing confidence in using fine-resolution climate projections for exploring future climate change impacts. The differences among them were no more than 0.3 degrees Celsius (about 0.5 degrees Fahrenheit) for the region.
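
    The comparison itself amounts to computing regional summaries from each downscaled dataset and looking at the spread between them, as in the hypothetical sketch below; the grids and values are placeholders, not the four datasets analyzed in the study.

    ```python
    # Sketch of the comparison: regional-mean warming according to several
    # downscaled projections of the same scenario, and the spread between them.
    # The grids and values are placeholders, not the actual datasets.
    import numpy as np

    rng = np.random.default_rng(3)
    # Hypothetical regional temperature-change fields (deg C), one per dataset.
    datasets = {name: 4.0 + rng.normal(0.0, 0.1, size=(20, 30))
                for name in ["dataset_A", "dataset_B", "dataset_C", "dataset_D"]}

    regional_means = {name: float(grid.mean()) for name, grid in datasets.items()}
    spread = max(regional_means.values()) - min(regional_means.values())

    for name, mean in regional_means.items():
        print(f"{name}: regional mean change {mean:.2f} deg C")
    print(f"spread across datasets: {spread:.2f} deg C")
    ```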

    The results were published this week in the journal Scientific Data.

    "From a regional perspective, the differences in projected future changes are minor when you look at how much each projection says climate will change for the business-as-usual scenario," said Yueyang Jiang, lead author and a postdoctoral scientist at OSU. "The climate projections were created using different downscaling methods, but the projected changes in climate among them are similar at the regional scale."

    The researchers chose to analyze projections for the recent past as well as for three 29-year periods from 2011 to 2100. Their goal was to characterize the differences to inform and guide scientists and land managers who are evaluating the projected impacts of climate change on local resources.

    The fine-resolution climate projections vary in downscaling techniques and in the choice of historically observed weather datasets used to calibrate their calculations. Jiang and his team confirmed that the methods used to downscale each of the models had little to no effect on the data. They showed instead that differences arose from the choice of historical observation datasets used, which vary due to highly variable weather patterns or due to a lack of data in areas where weather stations are far apart.

    "These differences become enhanced in areas with strong geographic features, such as the coastline and at the crest of mountain ranges," said John Kim, co-author on the paper and a scientist with the Pacific Northwest Research Station of the U.S. Forest Service.

    Nevertheless, Kim added, the analysis reveals "a fairly consistent high-resolution picture of climate change" under the highest greenhouse gas concentration scenario projected by the IPCC. "So, individuals and organizations that are interested in how much climate may change for most parts of the region can use any of the datasets we examined."

    However, the researchers also caution against using only one projection to explore the effects of climate change at specific thresholds, such as how plants and animals might respond to a decrease in days with temperatures below freezing. Scientists interested in such climate effects should use several models, they added.

  9. New research disputes that 'eukaryote-like' Asgard archaea are cell-eating

    A team of American Museum of Natural History researchers has created a supercomputational model capable of predicting whether or not organisms have the ability to "eat" other cells through a process known as phagocytosis. The model may be a useful tool for large-scale microbe surveys and provides valuable insight into the evolution of complex life on Earth, challenging ideas put forward in recent studies. The model and researchers' findings are featured in a paper published today in the journal Nature Ecology & Evolution.

    "Phagocytosis is a major mechanism of nutritional uptake in many single-celled organisms, and it's vital to immune defenses in a number of living things, including humans," said co-author Eunsoo Kim, an associate curator in the American Museum of Natural History's Division of Invertebrate Zoology. "But lesser known is the idea that phagocytosis dates back some 2 to 3 billion years and played a role in those symbiotic associations that likely started the cascading evolution toward the more diverse and complex life we see on the planet today. Our research provides some hints as to how phagocytosis first arose."

    Prokaryotes, a group that includes bacteria and archaea, are microscopic, mostly single-celled organisms with relatively simple internal structure. Eukaryotes, the group that comprises animals, plants, fungi, and protists (representing assemblages of diverse, unrelated lineages such as amoebozoans and green algae), have generally larger cells that are filled with a number of internal components including the nucleus, where DNA is stored, and energy-generating organelles called mitochondria. Scientific theory holds that between about 2 and 3 billion years ago a single bacterium merged with an unrelated prokaryotic microbe, leading to the evolution of mitochondria, a key feature of eukaryotic cells. This merger only happened once, and some scientists suggest that the process involved cellular "engulfment." Yet there are no known prokaryotes capable of phagocytosis today. So under what circumstances did the trait arise?

    To investigate this long-standing question, the research team used genetic patterns common to phagocytic cells to build a supercomputational model that uses machine learning to predict whether an organism feeds via phagocytosis.

    "There's no single set of genes that are strongly predictive of phagocytosis because it's a very complicated process that can involve more than 1,000 genes, and those genes can greatly vary from species to species," said lead author John Burns, a research scientist in the Museum's Sackler Institute for Comparative Genomics." But as we started looking at the genomes of more and more eukaryotes, a genetic pattern emerged, and it exists across diversity, even though it's slightly different in each species. That pattern is the core of our model, and we can use it to very quickly and efficiently predict which cells are likely to be 'eaters' and which are not."

    Because many eukaryotes, such as some green algae, can be "picky," only eating under certain conditions, it can be hard to determine whether they are capable of phagocytosis simply by watching them under a microscope. This new model can help microbiologists make a quick assessment, and it may also be valuable for use in large DNA sequencing projects that involve multiple unknown microbial species, for example, in surveys of seawater samples.

    The researchers applied the model to a group of "eukaryote-like" microbes called Asgard archaea. In recent years, some researchers have proposed that Asgard microbes might be living relatives of the long-sought-after microbes that merged with the bacterium that became mitochondria. But the researchers' new work finds that these microbes most likely do not use phagocytosis. They also used the gene set developed as part of the model-building process to look more closely at the ancient lineages of archaea and bacteria, and found that, as a whole, neither group on its own has the genes required for phagocytosis. While the new work eliminates one scenario for the birth of mitochondria - that Asgard archaea engulfed a bacterium - many other options remain.

    "When you tease apart the components of these predictive genes, some have roots in archaea, some have roots in bacteria, and some are only unique to eukaryotes," Kim said. "Our data are consistent with the hypothesis that our cells are a chimera of archaeal and bacterial components, and that the process of phagocytosis arose only after that combination took place. We still have a lot of work to do in this field."

  10. For 2017, Cray reported total revenue of $392.5 million, which compares with $629.8 million in 2016. Net loss for 2017 was $133.8 million, or $3.33 per diluted share, compared to net income of $10.6 million, or $0.26 per diluted share in 2016.  Non-GAAP net loss, which adjusts for selected unusual and non-cash items, was $40.5 million, or $1.01 per diluted share for 2017, compared to non-GAAP net income of $19.9 million, or $0.49 per diluted share in 2016.

    Revenue for the fourth quarter of 2017 was $166.6 million, compared to $346.6 million in the fourth quarter of 2016.  Net loss for the fourth quarter of 2017 was $97.5 million, or $2.42 per diluted share, compared to net income of $51.8 million, or $1.27 per diluted share in the fourth quarter of 2016.  Non-GAAP net income was $9.2 million, or $0.22 per diluted share for the fourth quarter of 2017, compared to non-GAAP net income of $56.3 million, or $1.38 per diluted share for the same period in 2016.
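
    As a quick cross-check of the per-share arithmetic in the figures above, dividing each net result by its per-diluted-share figure implies a diluted share count of roughly 40 million; the short sketch below performs that division (the reported figures are rounded, so the implied counts are approximate).

    ```python
    # Cross-check of the per-share arithmetic above: net result divided by the
    # per-diluted-share figure approximates the diluted share count. Reported
    # figures are rounded, so the implied counts are approximate.
    figures = [
        ("FY2017 GAAP net loss",         133.8, 3.33),
        ("FY2017 non-GAAP net loss",      40.5, 1.01),
        ("Q4 2017 GAAP net loss",         97.5, 2.42),
        ("Q4 2017 non-GAAP net income",    9.2, 0.22),
    ]
    for label, amount_millions, per_share in figures:
        print(f"{label}: ~{amount_millions / per_share:.1f} million diluted shares implied")
    ```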

    The Company’s GAAP Net Loss for the fourth quarter and year ended December 31, 2017 was significantly impacted by both the enactment of the Tax Cuts and Jobs Act of 2017 and by its decision to record a valuation allowance against all of its U.S. deferred tax assets.  The combined GAAP impact totaled $103 million.  These items have been excluded for non-GAAP purposes.

    For 2017, overall gross profit margin on a GAAP and non-GAAP basis was 33% and 34%, respectively, compared to 35% on a GAAP and non-GAAP basis for 2016.

    Operating expenses for 2017 were $196.4 million, compared to $211.1 million in 2016.  Non-GAAP operating expenses for 2017 were $176.5 million, compared to $199.7 million in 2016.  GAAP operating expenses in 2017 included $8.6 million in restructuring charges associated with our recent workforce reduction.

    As of December 31, 2017, cash, investments and restricted cash totaled $147 million.  Working capital at the end of the fourth quarter was $354 million, compared to $373 million at December 31, 2016.

    “Despite difficult conditions in our core market we finished 2017 strong, highlighted by several large acceptances at multiple sites around the world, including completing the installation of what is now the largest supercomputing complex in India at the Ministry of Earth Sciences,” said Peter Ungaro, president and CEO of Cray.  “As we shift to 2018, we’re seeing signs of a rebound at the high-end of supercomputing as well as considerable growth opportunities in the coming years.  Supercomputing continues to expand in importance to both government and commercial customers, driving growth and competitiveness across many different disciplines and industries.  As the leader at the high-end of the market, we’re poised to play a key role in this growth and I’m excited about where we’re headed.”