Headlines

Trade publication covering breaking news in the rapidly evolving supercomputer marketplace.
  1. A new computing technology called "organismoids" mimics some aspects of human thought by learning how to forget unimportant memories while retaining more vital ones.

    "The human brain is capable of continuous lifelong learning," said Kaushik Roy, Purdue University's Edward G. Tiedemann Jr. Distinguished Professor of Electrical and Computer Engineering. "And it does this partially by forgetting some information that is not critical. I learn slowly, but I keep forgetting other things along the way, so there is a graceful degradation in my accuracy of detecting things that are old. What we are trying to do is mimic that behavior of the brain to a certain extent, to create computers that not only learn new information but that also learn what to forget."

    The work was performed by researchers at Purdue, Rutgers University, the Massachusetts Institute of Technology, Brookhaven National Laboratory and Argonne National Laboratory.

    Central to the research is a ceramic "quantum material" called samarium nickelate, which was used to create devices called organismoids, said Shriram Ramanathan, a Purdue professor of materials engineering.

    "These devices possess certain characteristics of living beings and enable us to advance new learning algorithms that mimic some aspects of the human brain," Roy said. "The results have far reaching implications for the fields of quantum materials as well as brain-inspired computing."

    Findings are detailed in a paper appearing on Aug. 14 in the journal Nature Communications.

    When exposed to hydrogen gas, the material undergoes a massive resistance change, as its crystal lattice is "doped" by hydrogen atoms. The material is said to breathe, expanding when hydrogen is added and contracting when the hydrogen is removed.


    "The main thing about the material is that when this breathes in hydrogen there is a spectacular quantum mechanical effect that allows the resistance to change by orders of magnitude," Ramanathan said. "This is very unusual, and the effect is reversible because this dopant can be weakly attached to the lattice, so if you remove the hydrogen from the environment you can change the electrical resistance."

    The research paper's co-authors include Purdue postdoctoral research associate Fan Zuo and graduate student Priyadarshini Panda. A complete list of co-authors is available in the abstract. 


    When the material is exposed to hydrogen, the hydrogen splits into a proton and an electron, and the electron attaches to the nickel, temporarily causing the material to become an insulator.

    "Then, when the hydrogen comes out, this material becomes conducting again," Ramanathan said. "What we show in this paper is the extent of conduction and insulation can be very carefully tuned."

    This changing conductance and the "decay of that conductance over time" is similar to a key animal behavior called habituation.

    "Many animals, even organisms that don't have a brain, possess this fundamental survival skill," Roy said. "And that's why we call this organismic behavior. If I see certain information on a regular basis, I get habituated, retaining memory of it. But if I haven't seen such information over a long time then it slowly starts decaying. So the behavior of conductance going up and down in exponential fashion can be used to create a new computing model that will incrementally learn and at same time forget things in a proper way."

    The researchers have developed a "neural learning model" they have termed adaptive synaptic plasticity.

    "This could be really important because it's one of the first examples of using quantum materials directly for solving a major problem in neural learning," Ramanathan said.

    The researchers used the organismoids to implement the new model for synaptic plasticity.

    "Using this effect we are able to model something that is a real problem in neuromorphic computing," Roy said. "For example, if I have learned your facial features I can still go out and learn someone else's features without really forgetting yours. However, this is difficult for computing models to do. When learning your features, they can forget the features of the original person, a problem called catastrophic forgetting."

    Neuromorphic computing is not intended to replace conventional general-purpose computer hardware, based on complementary metal-oxide-semiconductor transistors, or CMOS. Instead, it is expected to work in conjunction with CMOS-based computing. Whereas CMOS technology is especially adept at performing complex mathematical computations, neuromorphic computing might be able to perform roles such as facial recognition, reasoning and human-like decision making.


    Roy's team performed the research work on the plasticity model, while other collaborators concentrated on explaining the physics of the doping-driven change in conductance central to the paper. The multidisciplinary team includes experts in materials, electrical engineering, physics, and algorithms.

    "It's not often that a materials-science person can talk to a circuits person like professor Roy and come up with something meaningful," Ramanathan said.

    Organismoids might have applications in the emerging field of spintronics. Conventional computers use the presence and absence of an electric charge to represent ones and zeroes in a binary code needed to carry out computations. Spintronics, however, uses the "spin state" of electrons to represent ones and zeros.

    It could bring circuits that resemble biological neurons and synapses in a compact design not possible with CMOS circuits. Whereas it would take many CMOS devices to mimic a neuron or synapse, it might take only a single spintronic device.

    In future work, the researchers may demonstrate how to achieve habituation in an integrated circuit instead of exposing the material to hydrogen gas.

  2. A new machine learning program developed by researchers at Case Western Reserve University appears to outperform other methods for diagnosing Alzheimer's disease before symptoms begin to interfere with everyday living, initial testing shows.

    More than 5 million Americans may have Alzheimer's disease, according to estimates, and the numbers are growing as the population ages. The disease is an irreversible, progressive brain disorder that slowly destroys memory and thinking skills. And while there is no cure, several drugs can delay or prevent symptoms from worsening for up to five years or more, according to the National Institute on Aging and published research.

    Meanwhile, early diagnosis and treatment--the goal of the new computer-based program--are key to allowing those with the disease to remain independent longer.

    The computer program integrates a range of Alzheimer's disease indicators, including mild cognitive impairment. In two successive stages, the algorithm selects the indicators most pertinent for predicting who has Alzheimer's.


    "Many papers compare the healthy to those with the disease, but there's a continuum," said Anant Madabhushi, F. Alex Nason professor II of biomedical engineering at Case Western Reserve. "We deliberately included mild cognitive impairment, which can be a precursor to Alzheimers, but not always."

    In a study published in the journal Scientific Reports, Madabhushi, Asha Singanamalli, who recently earned her biomedical engineering master's degree, and Haibo Wang, a former postdoctoral researcher, tested the algorithm using data from 149 patients collected via the Alzheimer's Disease Neuroimaging Initiative.

    The team developed what it calls the Cascaded Multi-view Canonical Correlation (CaMCCo) algorithm, which integrates measurements from magnetic resonance imaging (MRI) scans, features of the hippocampus, glucose metabolism rates in the brain, proteomics, genomics, mild cognitive impairment and other parameters.

    Madabhushi's lab has repeatedly found that integrating dissimilar information is valuable for identifying cancers. This is the first time he and his team have done so for diagnosis and characterization of Alzheimer's disease.

    "The algorithm assumes each parameter provides a different view of the disease, as if each were a different set of colored spectacles," Madabhushi said. 

    The program then assesses the variables in a two-stage cascade. First, the algorithm selects the parameters that best distinguish between someone who's healthy and someone who's not. Second, for the subjects flagged as unhealthy, the algorithm selects the variables that best distinguish who has mild cognitive impairment and who has Alzheimer's disease.

    "The remaining views are combined to give the best picture," Madabhushi said.

    In predicting which patients in the study had Alzheimer's disease, CaMCCo outperformed individual indicators as well as methods that combine them all without selective assessment. It also was better at predicting who had mild cognitive impairment than other methods that combine multiple indicators.

    The researchers continue to validate and fine-tune the approach with data from multiple sites. They also plan to use the software in an observational mode: As a collaborating neurologist compiles tests on patients, the computer would run the data. If CaMCCo proves useful in predicting early Alzheimer's, Madabhushi expects to pursue a clinical trial for prospective validation.

  3. Machine learning scientists at Disney Research have developed a new model that uncovers how the meanings of words change over time.

    Dr. Robert Bamler and Dr. Stephan Mandt developed the dynamic word embeddings model by integrating neural networks and statistics used in rocket control systems. The result is a complex machine learning algorithm that automatically detects semantic change throughout history. 

    "Promoting innovation in machine learning is a vital part of Disney Research," said Markus Gross, vice president of Disney Research. 

    The authors will present the model on August 9 at the International Conference on Machine Learning (ICML) in Sydney.

    Their dynamic word embeddings represent semantic change over time through so-called semantic vector spaces. Words of similar meanings appear close to one another and reveal each other's relationships. Changes in these meanings appear as movements through the semantic space. 


    "Dynamic word embeddings could revolutionize how we investigate the evolution of language use," Mandt said. "The model uncovers otherwise latent semantic changes that become palpable through new dynamic visualizations."

    For instance, the model automatically discovered that the meaning of the word 'peer' drastically changed in the last 200 years. "In the past, a peer was an earl or viscount," Bamler said. "Today, it's someone of equal standing. That is practically the opposite." Other words whose meaning changed dramatically included 'computer,' 'radio,' and 'broadcast'.

    Bamler and Mandt integrated neural networks with classical statistical models. While neural networks operated on the data sets, the researchers used a Kalman filter as their transition model, a statistical technique originally developed to track, for instance, the position of rockets.

    "Previous approaches produce different word vectors every time they are trained," Mandt said. "Every output used to be intrinsically correct, but these outputs could not be compared to detect changes over time." 

    Added Bamler, "We developed a mechanism to train the model on a vast corpus without splitting the data into time slices. That way, we were able to leverage a very large data set while still allowing the word vectors to drift smoothly over time."

    The combination of classical statistical models with neural networks is an approach that could find applications well beyond word embeddings. To test their model, the researchers drew on a variety of data sets including 230 presidential speeches and millions of digitized books from over 200 years, as well as several months of public social media posts. In all experiments, the new model was consistently better at predicting missing words from these sources of text than existing models, confirming the validity of the observed changes.

    Beyond tracing language history, Bamler and Mandt also analyzed contemporary changes in language usage. Training their model on social media posts, they detected changes of word associations that reflected current events. 

    Combining creativity and innovation, this research continues Disney's rich legacy of leveraging technology to enhance the tools and systems of tomorrow.

  4. A project to develop new ultrafast laser transmitter technology at The University of New Mexico is expected to have a revolutionary impact on the physics of semiconductor lasers, with potential applications that could result in ten times the speed of current fiber optic networks.

    UNM Electrical and Computer Engineering (ECE) Professor Marek Osinski leads a multidisciplinary research team that includes Gennady Smolyakov, ECE Research Associate Professor, a senior collaborator on the project.

    For the past two and a half years, the laser transmitter project has centered on numerical simulations of proposed devices, funded by a $400,000 grant that will conclude in September. Now, the Office of Naval Research has awarded UNM a $1.3 million grant, funded for three and a half years, to continue with a proof-of-concept demonstration, moving from theory to the laboratory.

    The project addresses the need for faster data transfer by increasing the speed at which signals are first generated to send through a fiber optic network. Its goal is to demonstrate that semiconductor laser devices can send data at a bandwidth speed of over 100 gigabits per second (Gbps), or 100 million kilobits per second.

    To put this in perspective, current 4G networks may transfer data at a speed of 10 Gbps. While standards have not yet been established, the planned 5G revolution is expected to push data transfer speeds to at least 40 Gbps. 5G's speed and capacity are designed to facilitate the more rapid arrival of the "Internet of Things" (IoT).
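
    For a rough sense of what those rates mean in practice, here is a back-of-envelope calculation (illustrative only) of how long a one-terabyte transfer would take at each speed:

    ```python
    # Back-of-envelope transfer times for the data rates mentioned above (illustrative only).
    def transfer_seconds(gigabytes, gbps):
        """Seconds to move `gigabytes` GB of data over a link running at `gbps` gigabits/s."""
        return gigabytes * 8 / gbps   # 8 bits per byte

    for rate in (10, 40, 100):        # 4G-class, projected 5G, and the project's 100 Gbps target
        print(f"1 TB at {rate} Gbps: about {transfer_seconds(1000, rate):.0f} seconds")
    ```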

    Much of the fiber optic infrastructure is in place and continues to be built out as an investment in the connected future. With light as the data medium, the theoretical speed limit for fiber optics is the speed of light. The effective limits on fiber optic network speed lie in the way data signals are sent, encoded, modulated and received.

    "How we are sending the data is what is going to make the big difference," said Osinski.

    The laboratory work will be performed at UNM's Center for High Technology Materials (CHTM), an internationally recognized resource for research in optoelectronics. 

    While future applications for this technology could have a huge impact on data transmission and connectivity around the world, the Office of Naval Research (ONR) is funding the UNM research for its own specific applications. ONR generates enormous amounts of data in supercomputers operating in cryogenic environments (around 4 kelvin), with the data produced then needing to be processed by computers at room temperature. ONR is looking for a way to quickly transfer these huge quantities of data.

    The Office of Naval Research may be interested in applications aboard a ship, which has limited energy and room for maintaining a cryogenic environment. Removing extra heat from the cryostat (an apparatus for maintaining a very low temperature) takes lots of power.

    Since it is very expensive energetically to maintain the 4K temperature, normal electronic connections with copper wire are not feasible, as the copper conducts heat. Instead, high-speed optical communication methods such as those being developed at UNM are preferred, because glass does not conduct heat.

    Osinski's project explores the use of semiconductor ring lasers to provide energy-efficient high-speed optical data egress (exit) from a cryogenic environment to room temperature for further processing.

    Semiconductor lasers are a common source of the data-carrying light fed into optical communication systems.

    "Most people do not know -- your cellphone has semiconductor lasers, so do CD and DVD players, laser printers, and scanners. They are everywhere already," said Osinski.

    The project is expected to have a revolutionary impact on the field of fundamental and applied physics of semiconductor lasers, pioneering a new class of ring-laser-based, integrated, ultrafast sources.

    "Development of inexpensive ultrafast chips operating at speeds exceeding 100 Gbps will have a huge societal impact by increasing the transmission capacity of fiber-based networks," says Osinski. "We are conducting basic scientific research with the potential for many and much broader possible applications."

    "Society is supremely concerned with more data, more options," he says. "Imagine a doctor able to perform robotic surgery remotely -- the doctor would need very high quality images at very high resolution. Besides surgery, patients in remote communities without local medical resources may need to consult with a specialist online, perhaps with the doctor reading live data from on-site sensors. This is a concern even in New Mexico, with our vast rural areas."

    "A lot of things that were thought impossible before are going from a dream to an expectation," Osinski continued. "People are anticipating real time communication around the world, not delayed communication. And not only are the files larger, the user base gets larger and larger. The more capacity we provide by technological development, the more inventive are the inventors."

    "Ultra-high-speed (>100 GHz) on-chip modulation of integrated lasers is an important 'holy grail' for the ever-increasing demand for ultrafast communication," Osinski concluded.

    Ultrafast laser transmitter prototype fabrication and testing is an excellent example of UNM's role as a research institution at the forefront of high-speed semiconductor research.

  5. On July 5, 2017, NASA's Solar Dynamics Observatory watched an active region -- an area of intense and complex magnetic fields -- rotate into view on the Sun. The satellite continued to track the region as it grew and eventually rotated across the Sun and out of view on July 17.

    With their complex magnetic fields, sunspots are often the source of interesting solar activity:

    During its 13-day trip across the face of the Sun, the active region -- dubbed AR12665 -- put on a show for NASA's Sun-watching satellites, producing several solar flares, a coronal mass ejection and a solar energetic particle event.


    Such sunspots are a common occurrence on the Sun, but less frequent at the moment, as the Sun is moving steadily toward a period of lower solar activity called solar minimum -- a regular occurrence during its approximately 11-year cycle. Scientists track such spots because they can help provide information about the Sun's inner workings. Space weather centers, such as NOAA's Space Weather Prediction Center, also monitor these spots to provide advance warning, if needed, of the radiation bursts being sent toward Earth, which can impact our satellites and radio communications.

    On July 9, a medium-sized flare burst from the sunspot, peaking at 11:18 a.m. EDT. Solar flares are explosions on the Sun that send energy, light and high-speed particles out into space. Much as earthquakes are rated on a scale that describes their strength, solar flares are categorized according to their intensity. This flare was categorized as an M1. M-class flares are a tenth the size of the most intense flares, the X-class flares. The number provides more information about a flare's strength: an M2 is twice as intense as an M1, an M3 is three times as intense, and so on.
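
    In numerical terms, the letter sets the order of magnitude of a flare's peak X-ray flux and the number scales it linearly. The conversion below uses the standard GOES flux definitions, which are not quoted in this article.

    ```python
    # Peak soft X-ray flux (W/m^2) for each flare class letter; each letter step is 10x
    # the previous one, and the trailing number scales linearly (so M2 = 2 x M1).
    FLARE_BASE_FLUX = {"A": 1e-8, "B": 1e-7, "C": 1e-6, "M": 1e-5, "X": 1e-4}

    def flare_flux(flare_class):
        """Convert a class like 'M1' or 'X2.2' to approximate peak flux in W/m^2."""
        letter, number = flare_class[0].upper(), float(flare_class[1:])
        return FLARE_BASE_FLUX[letter] * number

    print(flare_flux("M1"), flare_flux("M2"), flare_flux("X1"))  # 1e-05 2e-05 0.0001
    ```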

    Days later, on July 14, a second medium-sized, M2 flare erupted from the Sun. The second flare was long-lived, peaking at 10:09 a.m. EDT and lasting over two hours.

    This was accompanied by another kind of solar explosion called a coronal mass ejection, or CME. Solar flares are often associated with CMEs -- giant clouds of solar material and energy. NASA's Solar and Heliospheric Observatory, or SOHO, saw the CME at 9:36 a.m. EDT leaving the Sun at a speed of 620 miles per second and eventually slowing to 466 miles per second.

    Following the CME, the turbulent active region also emitted a flurry of high-speed protons, known as a solar energetic particle event, at 12:45 p.m. EDT.

    Research scientists at the Community Coordinated Modeling Center -- located at NASA's Goddard Space Flight Center in Greenbelt, Maryland -- used these spacecraft observations as input for their simulations of space weather throughout the solar system. Using a model called ENLIL, they are able to map out and predict whether the solar storm will impact our instruments and spacecraft, and send alerts to NASA mission operators if necessary.

    By the time the CME made contact with Earth's magnetic field on July 16, the sunspot's journey across the Sun was almost complete. As for the solar storm, it took this massive cloud of solar material two days to travel 93 million miles to Earth, where it caused charged particles to stream down Earth's magnetic poles, sparking enhanced aurora.

  6. Virtual.Pyxis software has already been licensed to advanced research departments of major multinationals and MIT

    Computer-aided engineering (CAE) systems help manufacturers to design parts with the ideal topology (inner and outer shape and structure) to withstand the conditions under which they will operate, such as specific temperature and pressure conditions, vibrations, and various stresses and strains, and to produce them with as little raw material as possible. In sum, CAE enables industrial design software to optimize part topology.

    By deploying topology optimization software, manufacturers can virtually sculpt lighter parts from a given amount of raw material while monitoring their strength.

    These attributes are a function of the design. "You input parameters into the software with the properties and other characteristics the part needs to have, and the program shows the design path that has to be followed in order to achieve your goals," says mechatronic engineer Ricardo Doll Lahuerta, principal investigator for a research project aimed at significantly enhancing the quality of this type of tool, with joint funding from FAPESP's Small Business Innovative Research Program (PIPE) and VirtualCAE, a Brazilian company based in São Paulo State.

    Recently launched, and with several upgrades in progress, Virtual.Pyxis has already been licensed to advanced research departments of major multinationals and the Massachusetts Institute of Technology (MIT) in the United States, one of the world's leading research and education institutions.

    The tool enables engineers to design stronger and more versatile parts while shortening lead times and reducing development costs.

    This type of software typically requires information on the conditions under which the part will operate, such as stress, compression, vibration and temperature, as well as other design constraints, including maximum flexibility and deformation. Details of the manufacturing process to be used, such as polymer injection, casting or 3D printing, are also key inputs for CAE programs.
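
    As a rough illustration of the kind of problem definition such software consumes, the sketch below lists typical inputs. The field names are invented for illustration and are not Virtual.Pyxis syntax.

    ```python
    # Hypothetical topology-optimization problem setup (illustrative field names only).
    design_problem = {
        "objective": "minimize_mass",
        "loads": [
            {"type": "pressure", "surface": "inlet_face", "magnitude_mpa": 2.5},
            {"type": "vibration", "frequency_hz": 120.0},
        ],
        "operating_temperature_c": 150.0,
        "constraints": {
            "max_deflection_mm": 0.5,          # stiffness/flexibility limit
            "min_resonant_frequency_hz": 200,  # frequency constraint on the part
            "volume_fraction": 0.4,            # use at most 40% of the design-space material
        },
        "manufacturing_process": "casting",    # or "injection_molding", "3d_printing"
    }
    ```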

    Virtual.Pyxis is distinguished mainly by its next-generation algorithm, which gives it the capacity to process a far larger number of variables and constraints for considerably less cost than commercially available programs.

    "Our algorithm can process more variables without the need for much more computational capacity," Lahuerta explains. "As a result, any structural design can be developed with greater precision at a much lower cost. It's especially useful for designing mechanisms that need to be flexible, in which case many more variables have to be computed. Most of today's topological optimization programs pursue maximum rigidity, but that's not always ideal in a design project."

    Virtual.Pyxis is capable of more complex analysis and can work with non-linear materials. It also enables the design of very precise frequency constraints, an important attribute to ensure that adjacent parts vibrate at the same resonant frequency. It can also use different external solvers, including those most widely used by the machine and metalworking industries.

    VirtualCAE's software has immediate applications in a swathe of industries. It is already being used to design automotive components, farm implements, railroad equipment, and safety-critical vehicle parts. It can also be used to develop lower-cost production processes, as in the case of certain parts made by casting instead of creasing, folding and welding. Castings have better mechanical properties, last longer, use fewer components, and can be assembled faster because no welding is required. "FAPESP's support has been essential," says Valmir Fleischman, VirtualCAE's founding partner.

    Even more applications are foreseen in the future. For example, enhanced prosthetic and orthotic devices. "Excessive rigidity in bone prostheses, for example, tends to weaken the bone to which they're connected. The program enables prostheses to be designed with the ideal degree of flexibility," Lahuerta says.

    He adds that the development of 3D printing, which builds objects gradually, layer by layer, will enable parts to be produced from various materials combined in a structure with more advanced properties. These objects would be far harder to obtain with today's manufacturing processes.

    "In the future, we'll be able to design the inner structure of a part combining materials on an ever-smaller scale until we reach the atomic scale, significantly extending design possibilities. For example, we'll be able to design a part with certain intelligent characteristics, making it lighter and more efficient. With the software tools currently available, this would require prohibitively expensive computational capacity," Lahuerta says.

    "Our software is easier to customize, and with access to training in its use, our customers are often able to pay for their investment in Virtual.Pyxis out of their very first design," says Leandro Garbin, a founding partner of the firm that is putting all its chips on innovation.

    The program is still being upgraded, but VirtualCAE has already licensed it to important multinationals such as Thyssenkrupp (China unit), the American agricultural equipment manufacturer AGCO, and even one of MIT's research laboratories that provides services to the US Department of Defense. Thanks to innovation, the firm has opened branch offices in Germany and the US and has representatives in Mexico, Colombia, Turkey, China and Taiwan.

  7. New work offers fresh evidence supporting the supernova shock wave theory of our solar system's origin

    According to one longstanding theory, our Solar System's formation was triggered by a shock wave from an exploding supernova. The shock wave injected material from the exploding star into a neighboring cloud of dust and gas, causing it to collapse in on itself and form the Sun and its surrounding planets.

    New work from Carnegie's Alan Boss offers fresh evidence supporting this theory, modeling the Solar System's formation beyond the initial cloud collapse and into the intermediate stages of star formation. It is published in The Astrophysical Journal.

    One very important constraint for testing theories of Solar System formation is meteorite chemistry. Meteorites retain a record of the elements, isotopes, and compounds that existed in the system's earliest days. One type, called carbonaceous chondrites, includes some of the most-primitive known samples.

    An interesting component of chondrites' makeup is something called short-lived radioactive isotopes. Isotopes are versions of elements with the same number of protons, but a different number of neutrons. Sometimes, as is the case with radioactive isotopes, the number of neutrons present in the nucleus can make the isotope unstable. To gain stability, the isotope releases energetic particles, which alters its number of protons and neutrons, transmuting it into another element.

    Some isotopes that existed when the Solar System formed are radioactive and have decay rates that caused them to become extinct within tens to hundreds of millions of years. The fact that these isotopes still existed when chondrites formed is shown by the abundances of their stable decay products--also called daughter isotopes--found in some primitive chondrites. Measuring the amount of these daughter isotopes can tell scientists when, and possibly how, the chondrites formed.

    A recent analysis of chondrites by Carnegie's Myriam Telus was concerned with iron-60, a short-lived radioactive isotope that decays into nickel-60. It is only created in significant amounts by nuclear reactions inside certain kinds of stars, including supernovae or what are called asymptotic giant branch (AGB) stars.

    Because all the iron-60 from the Solar System's formation has long since decayed, Telus' research, published in Geochimica et Cosmochimica Acta, focused on its daughter product, nickel-60. The amount of nickel-60 found in meteorite samples--particularly in comparison to the amount of stable, "ordinary" iron-56--can indicate how much iron-60 was present when the larger parent body from which the meteorite broke off was formed. There are not many options for how an excess of iron-60--which later decayed into nickel-60--could have gotten into a primitive Solar System object in the first place--one of them being a supernova.
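
    The decay arithmetic behind this "extinct radionuclide" reasoning is straightforward. The sketch below uses the commonly cited laboratory half-life of iron-60 (roughly 2.6 million years), a value not taken from this article.

    ```python
    # Fraction of a radioactive isotope remaining after a given time, using the
    # half-life relation N(t)/N0 = 0.5 ** (t / half_life).
    def remaining_fraction(elapsed_myr, half_life_myr=2.6):
        """Fraction of the original isotope left after `elapsed_myr` million years."""
        return 0.5 ** (elapsed_myr / half_life_myr)

    for t in (2.6, 26, 100):
        print(f"after {t:5.1f} Myr: {remaining_fraction(t):.2e} of the iron-60 remains")

    # Within ~100 Myr essentially none is left, which is why only its stable daughter,
    # nickel-60, survives in meteorites to record how much iron-60 was originally there.
    ```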

    While her research did not find a "smoking gun" definitively proving that the radioactive isotopes were injected by a shock wave, Telus did show that the amount of iron-60 present in the early Solar System is consistent with a supernova origin.

    Taking this latest meteorite research into account, Boss revisited his earlier models of shock wave-triggered cloud collapse, extending his supercomputational models beyond the initial collapse and into the intermediate stages of star formation, when the Sun was first being created. This is an important next step in tying together Solar System origin modeling and meteorite sample analysis.

    "My findings indicate that a supernova shock wave is still the most-plausible origin story for explaining the short lived radioactive isotopes in our Solar System," Boss said.

    Boss dedicated his paper to the late Sandra Keiser, a long-term collaborator, who provided computational and programming support at Carnegie's Department of Terrestrial Magnetism for more than two decades. Keiser died in March.

  8. ICF to support research and develop solutions for defensive cyber operations

    The U.S. Army Research Laboratory (ARL) has awarded ICF, a global consulting and technology services provider, a contract valued at up to $93 million to support its Defensive Cyber Operations (DCO) and Defensive Cybersecurity Research. The contract terms include both base and option periods.

    ICF will support ARL’s Cybersecurity Service Provider (CSSP) program and both basic and applied research, working to develop cyber tools and techniques and advance the state of the art in computer network defense. ICF will support all cyber operations for ARL and ARL subscribers through onsite and remote reviews of network security and ensure alignment between policy, compliance and assessment functions. The company will also support the information assurance management office, which sets policy for ARL.

    “ARL is the tip of the spear when it comes to national security, responsible for continually monitoring, testing and defending the cyber operations of the U.S. armed forces,” said Samuel Visner, senior vice president for cybersecurity and resilience at ICF. “We are proud to work with ARL to guard against current and emerging threats to its information systems and technology infrastructure, and excited to contribute to the state of cyber research and development.” 

    The U.S. Army Research Laboratory, currently celebrating 25 years of excellence in Army science and technology, is part of the U.S. Army Research, Development and Engineering Command, which has the mission to provide innovative research, development and engineering to produce capabilities that provide decisive overmatch to the Army against the complexities of the current and future operating environments in support of the joint warfighter and the nation. RDECOM is a major subordinate command of the U.S. Army Materiel Command. 

    “As partners for over two decades, we look forward to continuing our support of ARL in developing new concepts and researching, testing and applying new technologies to our nation’s defense,” said Bill Christman, vice president at ICF. 

    ICF’s cybersecurity specialists help the warfighter, military and national security clients build and successfully defend the most aggressively attacked infrastructures on the planet. For over two decades, ICF has delivered cyber-innovations, ranging from support to the system that served as the model for network defense services within the U.S. Department of Defense, to improving machine-to-machine learning for better cyber defense and patenting a novel way to visualize cyber threats using virtual reality. 


  9. A little bit of "scruff" in scientific data 50 years ago led to the discovery of pulsars -- rapidly spinning dense stellar corpses that appear to pulse at Earth.

    Astronomer Jocelyn Bell made the chance discovery using a vast radio telescope in Cambridge, England. Although it was built to measure the random brightness flickers of a different category of celestial objects called quasars, the 4.5-acre telescope produced unexpected markings on Bell's paper data recorder every 1.33730 seconds. The pen traces representing radio brightness revealed an unusual phenomenon.

    "The pulses were so regular, so much like a ticking clock, that Bell and her supervisor Anthony Hewish couldn't believe it was a natural phenomenon," said Zaven Arzoumanian of NASA's Goddard Space Flight Center in Greenbelt, Maryland. "Once they found a second, third and fourth they started to think differently."

    The unusual stellar objects had been previously predicted but never observed. Today, scientists know of over 2,000 pulsars. These rotating "lighthouse" neutron stars begin their lives as stars between about seven and 20 times the mass of our sun. Some are found to spin hundreds of times per second, faster than the blades of a household blender, and they possess enormously strong magnetic fields.

    Technology advances in the past half-century allowed scientists to study these compact stellar objects from space using different wavelengths of light, especially those much more energetic than the radio waves received by the Cambridge telescope. Several current NASA missions continue to study these natural beacons.

    The Neutron star Interior Composition Explorer, or NICER, is the first NASA mission dedicated to studying pulsars. In a nod to the anniversary of Bell's discovery, NICER observed the famous first pulsar, known today as PSR B1919+21.

    NICER launched to the International Space Station in early June and started science operations last month. Its X-ray observations - the part of the electromagnetic spectrum in which these stars radiate both from their million-degree solid surfaces and from their strong magnetic fields - will reveal how nature's fundamental forces behave within the cores of these objects, an environment that doesn't exist and can't be reproduced anywhere else. "What's inside a pulsar?" is one of many long-standing astrophysics questions about these ultra-dense, fast-spinning, powerfully magnetic objects.

    The "stuff" of pulsars is a collection of particles familiar to scientists from over a century of laboratory studies on Earth -- neutrons, protons, electrons, and perhaps even their own constituents, called quarks. However, under such extreme conditions of pressure and density, their behavior and interactions aren't well understood. New, precise measurements, especially of the sizes and masses of pulsars are needed to pin down theories.

    "Many nuclear-physics models have been developed to explain how the make-up of neutron stars, based on available data and the constraints they provide," said Goddard's Keith Gendreau, the principal investigator for NICER. "NICER's sensitivity, X-ray energy resolution and time resolution will improve these by more precisely measuring their radii, to an order of magnitude improvement over the state of the art today."

    The mission will also pave the way for future space exploration by helping to develop a Global Positioning System-like capability for the galaxy. The embedded Station Explorer for X-ray Timing and Navigation Technology, or SEXTANT, demonstration will use NICER's X-ray observations of pulsar signals to determine NICER's exact position in orbit.

    "You can time the pulsations of pulsars distributed in many directions around a spacecraft to figure out where the vehicle is and navigate it anywhere," said Arzoumanian, who is also the NICER science lead. "That's exactly how the GPS system on Earth works, with precise clocks flown on satellites in orbit."

    Scientists have tested this method using supercomputer and lab simulations. SEXTANT will demonstrate pulsar-based navigation for the first time in space.
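
    The geometry behind pulsar navigation can be sketched simply (this is a toy illustration, not the SEXTANT algorithm): each pulsar's pulse-arrival delay, relative to a reference location, measures the spacecraft's offset along that pulsar's line of sight, and several such measurements pin down the position by least squares. The pulsar directions and position below are invented for the example.

    ```python
    import numpy as np

    C = 299_792.458                          # speed of light, km/s
    pulsar_dirs = np.array([                 # assumed unit vectors toward three pulsars
        [1.0, 0.0, 0.0],
        [0.0, 1.0, 0.0],
        [0.0, 0.0, 1.0],
    ])

    true_position = np.array([1200.0, -340.0, 75.0])   # km from the reference point
    delays = pulsar_dirs @ true_position / C           # measured pulse delays, in seconds

    # Solve (n_i . x) / c = delay_i for the position x.
    estimate, *_ = np.linalg.lstsq(pulsar_dirs / C, delays, rcond=None)
    print(np.round(estimate, 1))             # recovers [1200. -340.   75.]
    ```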

    NICER-SEXTANT is the first astrophysics mission dedicated to studying pulsars, 50 years after their discovery. "I think it is going to yield many more scientific discoveries than we can anticipate now," said Gendreau.

  10. Hundreds of scientists worldwide currently use an online application that accesses at least one terabyte of data to calculate everything from the spectrum of an exoplanet and the weather on Mars to the chemical makeup and orbit of a celestial object. It's now expected to get even better.

    A team of NASA scientists at the Goddard Space Flight Center in Greenbelt, Maryland, is further enhancing the Planetary Spectrum Generator, or PSG, which has attracted hundreds of expert and non-expert users worldwide, mainly through word-of-mouth advertising, since going online just a year ago.

    At the top of the team's list is incorporating additional databases that catalogue detector characteristics of current and future ground- and space-based instruments, said Principal Investigator Geronimo Villanueva, a Goddard planetary scientist who is leading the effort.

    With the enhancement, scientists will be able to use the tool to predict what an instrument might detect when observing a rocky planet, gas giant, comet, or even one of the thousands of exoplanets already identified by NASA's Kepler Space Telescope -- insights scientists value when reserving observation time on any telescope. It likewise will help them conceive and plan future missions, including the type of instruments they would have to fly to gather a certain type of measurement.

    "No tool out there does what it can do," said Villanueva, who conceived the application mainly out of frustration. "I remember trying to find out how many photons (light particles) I could collect from Mars. I thought everyone should know that, but I ended up spending a lot of time trying to get the right numbers."

    He thought, why not create an online tool that stores validated data on a server and then allows users to access and manipulate the measurements using embedded, behind-the-scenes supercomputer algorithms and a user-friendly Web-based interface that's accessible on any computing device, including tablets and smart phones? "Basically, I collected knowledge and put it here," Villanueva said.
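
    In practice, a tool of this kind is driven by sending a configuration to the server and reading back the computed spectrum. The snippet below is only a hedged sketch of that workflow; the URL, field names and configuration keywords are placeholders, not PSG's documented interface.

    ```python
    import requests

    PSG_URL = "https://example.gsfc.nasa.gov/psg/api"   # hypothetical endpoint, for illustration

    config = """
    <OBJECT>Mars
    <OBJECT-DATE>2017/08/14 00:00
    <GENERATOR-RANGE1>1.0
    <GENERATOR-RANGE2>5.0
    <GENERATOR-RANGEUNIT>um
    """  # illustrative configuration text only

    response = requests.post(PSG_URL, data={"file": config}, timeout=60)
    response.raise_for_status()
    print(response.text[:200])   # first lines of the simulated spectrum
    ```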

    Initially, PSG allowed users to synthesize spectra -- the analysis of light -- of planets, comets, asteroids, moons, and thousands of exoplanets over a broad range of wavelengths. The data came from several observatories, orbiters, and landers.

    The original tool also included a 3-D orbital calculator as well as programs that simulated non-data instrument "noise" and coronagraphs -- devices that block starlight to reveal orbiting objects. Users also could access profiles of temperature and chemical abundances for Venus, Earth, Mars, Titan, Neptune, and other smaller bodies, just to name a few features.

    PSG proved to be a "great asset" and relatively easy to use, as evidenced by the number of non-technical users who access the online tool, Villanueva said. Since its debut, scientists have used the tool to develop Mars mission concepts, plan James Webb Space Telescope observations, and define scientific and technical requirements for a proposed follow-on to the Webb Observatory, he added.

    Despite its success, Villanueva and his team believed it could be improved.

    "The foundational elements are there, but the tool didn't include performance data for detectors or sensors," he said, adding that he and his team plan to incorporate detector characteristics for eight assets, including the Webb Observatory, Hubble Space Telescope, New Horizons, Cassini, ExoMars, the NASA Infrared Telescope Facility, and the Stratospheric Observatory for Infrared Astronomy.

    With these performance specifications, coupled with the tool's already-existing databases, scientists will be able to define which missions to fly and develop accurate and precise models of how instruments would be expected to perform.

    "Once you know how to model the performance of different technologies, you can accurately design a wide range of future instruments," Villanueva said. "It could tell you, for example, if you needed a bigger mirror or a better detector. In the end, these developments will reduce costs, time, and resources when designing future instruments and missions."