CAPTION Projective Awareness Model: a. The field of consciousness incorporates beliefs and preferences, using projective geometry and free-energy minimization to motivate action. b. The subject imagines option A (buying an expensive cake) and option B (baking the cake) to reach imagined situation 2, in which the children at home are happy. The two options are very similar: both bring pleasure, but A demands an irreversible expense while B requires only a one-off effort. The anticipated final free energy is minimal for B, so the subject chooses B as the scenario to be realized.

An international team of experts led by the University of Geneva has developed a mathematical model of human psychology that can predict and analyse normal and pathological human behaviour.

A human being's psychological make-up depends on an array of emotional and motivational parameters, such as desire, suffering or the need for security. In addition, it includes spatial and temporal dimensions that also play a key role in rationalising the decisions we make and planning our actions. A team of researchers from the Universities of Geneva (UNIGE), Texas, Paris and University College London joined forces to create the first mathematical model of embodied consciousness. Their aim? To understand, study and predict human behaviour. The model, which is based on solid mathematical concepts and is demonstrated using simulations, makes it possible to anticipate and explain a host of cognitive phenomena and behavioural reactions. The research -- which you can read in full in the Journal of Theoretical Biology -- also paves the way for a wealth of industrial applications in robotics, artificial intelligence and the health sector.

An international, interdisciplinary team of researchers, headed by David Rudrauf, professor in UNIGE's faculty of psychology and educational science, was keen to produce a psychological theory built on the model of the hard sciences. The goal was to devise a mathematical model of human psychology for predicting and evaluating (normal and pathological) human behaviour. More than a decade of research -- combining maths, psychology, neuroscience, philosophy, computer science and engineering -- was required to construct this theoretical model of consciousness.

Free energy determines choices

We all are constantly faced with a range of choices, some of which are important, some not. But how do we make our decisions? There are many factors at work, conscious and unconscious, which are forever colliding whenever a decision is made. "We built a model to replicate decision-making based on the time, framework and perceptions (real and imaginary) that are linked to it," explains David Rudrauf. "The next step was to analyse the best solution that the mind would naturally opt for." Depending on an individual's personal preferences (such as security), and including different real and imaginary perspectives on the world, the mind calculates the probabilities of obtaining what it wants in the safest possible way. This probability calculation, which is derived from an individual's personal preferences and values, can be expressed as free energy. "Our consciousness uses free energy to actively explore the world and to satisfy its predilections by imagining the anticipated consequences of its actions," says Karl Friston from University College London. Depending on the free energy, the mathematical model can predict the states of consciousness and behaviour adopted by the individual and analyse the mechanisms. 
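The selection principle described above -- imagining each option, scoring it by anticipated free energy under one's preferences, and enacting the minimum -- can be illustrated with a drastically simplified sketch. The numbers and the two-outcome world below are hypothetical and not taken from the paper; the point is only that an option with a slightly lower success probability can still win once its cost is included:

```python
import math

# Prior preference over outcomes: the subject strongly prefers "happy".
# All probabilities and cost values here are invented for illustration.
preferences = {"happy": 0.95, "unhappy": 0.05}

options = {
    # option: (P(happy | option), extra cost term in nats)
    "A: buy expensive cake": (0.90, 0.8),  # reliable, but irreversible expense
    "B: bake the cake":      (0.85, 0.3),  # slightly riskier, one-off effort
}

def expected_free_energy(p_happy, cost):
    """Expected surprise of outcomes under the preference prior, plus cost."""
    g = 0.0
    for outcome, p in (("happy", p_happy), ("unhappy", 1 - p_happy)):
        g += -p * math.log(preferences[outcome])
    return g + cost

scores = {name: expected_free_energy(p, c) for name, (p, c) in options.items()}
best = min(scores, key=scores.get)  # the mind "naturally opts for" the minimum
```

With these toy numbers, baking the cake has the lower anticipated free energy, matching the caption's scenario.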

This Projective Consciousness Model analyses possible forms of behaviour according to events: if you spot a cake in a shop window, will you buy it or carry on your way? Based on your preferences -- whether you have a sweet tooth, for example, or are a penny-pincher -- the model will determine what best suits your state of mind: it will then predict your psychological state and behaviour using a combination of projective geometry and free-energy calculation.

Understanding and making a mathematical model of the phenomenology of the mind: projective geometry

As Kenneth Williford, professor of philosophy at the University of Texas, explains: "The aim was to understand and model the essential structures of conscious experience." Daniel Bennequin, professor in the mathematics department at the University of Paris 7, adds: "Perception, imagination and action are supported by unconscious mechanisms; we discovered that consciousness integrates them through a specific geometry: projective geometry." The researchers started with a synthesis of psychological phenomena, including basic perceptual phenomena: the illusion, for instance, that train tracks converge in the distance when they are actually parallel. The scientists were able to select the mathematical template for modelling this perception and the imagination associated with it. "It was then a question of understanding how this field of consciousness is related to affect, emotions and motivation as well as memory and intentions," says Rudrauf.

Virtual reality: a space for experimenting and research

"Once the theoretical components were defined," continues Rudrauf, "we implemented them in computer programs. We are now working on connecting them to virtual reality in order to reproduce a spatial, temporal and emotional environment that is as close as possible to our own." The research team is then able to make predictions about behaviour by playing with the model's mechanisms, perfecting it and bringing it closer to human psychology. It was long-term work: "But our aim is also to gradually direct the research towards psychopathological models," points out Rudrauf. "We found, for example, that if we deprive the model of the faculty of imagination, it behaves like a person with autism. This suggests research pathways on the importance of the imagination and its specific mechanisms in managing the illness." The model works on a concept of reciprocity: humans are used to test and reinforce the effectiveness of the model; and the model is used to experiment with different cases and sources of psychological illnesses in humans.

The initial results show that this first mathematical model of embodied consciousness, incorporating temporality, spatialisation and emotions, can predict a vast array of known human behaviours and understand the mechanisms behind them. There is still much work to be done, however, to replicate human consciousness identically, since every possible type of behaviour must be implemented in the mathematical system. The researchers are now working on an extension of the algorithm that will produce machines that can adapt to the reactions of their interlocutors and act according to the principle of empathic reciprocity.


CAPTION Purdue postdoctoral research associate Fan Zuo, at left, and materials engineering professor Shriram Ramanathan, used a ceramic "quantum material" to create the technology. CREDIT Rebecca Wilcox

A new computing technology called "organismoids" mimics some aspects of human thought by learning how to forget unimportant memories while retaining more vital ones.

"The human brain is capable of continuous lifelong learning," said Kaushik Roy, Purdue University's Edward G. Tiedemann Jr. Distinguished Professor of Electrical and Computer Engineering. "And it does this partially by forgetting some information that is not critical. I learn slowly, but I keep forgetting other things along the way, so there is a graceful degradation in my accuracy of detecting things that are old. What we are trying to do is mimic that behavior of the brain to a certain extent, to create computers that not only learn new information but that also learn what to forget."

The work was performed by researchers at Purdue, Rutgers University, the Massachusetts Institute of Technology, Brookhaven National Laboratory and Argonne National Laboratory.

Central to the research is a ceramic "quantum material" called samarium nickelate, which was used to create devices called organismoids, said Shriram Ramanathan, a Purdue professor of materials engineering.

"These devices possess certain characteristics of living beings and enable us to advance new learning algorithms that mimic some aspects of the human brain," Roy said. "The results have far-reaching implications for the fields of quantum materials as well as brain-inspired computing."

Findings are detailed in a paper appearing on Aug. 14 in the journal Nature Communications.

When exposed to hydrogen gas, the material undergoes a massive resistance change as its crystal lattice is "doped" by hydrogen atoms. The material is said to breathe, expanding when hydrogen is added and contracting when the hydrogen is removed.

"The main thing about the material is that when this breathes in hydrogen there is a spectacular quantum mechanical effect that allows the resistance to change by orders of magnitude," Ramanathan said. "This is very unusual, and the effect is reversible because this dopant can be weakly attached to the lattice, so if you remove the hydrogen from the environment you can change the electrical resistance."

The research paper's co-authors include Purdue postdoctoral research associate Fan Zuo and graduate student Priyadarshini Panda. A complete list of co-authors is available in the abstract. 

‘Organismic Learning’ Mimics Some Aspects of Human Thought

When hydrogen is exposed to the material, it splits into a proton and an electron, and the electron attaches to the nickel, temporarily causing the material to become an insulator.

"Then, when the hydrogen comes out, this material becomes conducting again," Ramanathan said. "What we show in this paper is the extent of conduction and insulation can be very carefully tuned."

This changing conductance and the "decay of that conductance over time" is similar to a key animal behavior called habituation.

"Many animals, even organisms that don't have a brain, possess this fundamental survival skill," Roy said. "And that's why we call this organismic behavior. If I see certain information on a regular basis, I get habituated, retaining memory of it. But if I haven't seen such information over a long time then it slowly starts decaying. So the behavior of conductance going up and down in exponential fashion can be used to create a new computing model that will incrementally learn and at the same time forget things in a proper way."
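The habituation dynamic Roy describes -- a trace that builds up under frequent exposure and decays exponentially during long gaps -- can be sketched as a toy model. The time constant and increment below are arbitrary illustrative values, not measurements from the samarium nickelate devices:

```python
import math

# Toy habituation trace: stimulation raises a conductance-like state;
# between stimuli it decays exponentially. Constants are hypothetical.
TAU = 5.0    # decay time constant, arbitrary units
STEP = 0.5   # increment per stimulus exposure

def update(g, dt, stimulated):
    g *= math.exp(-dt / TAU)       # exponential decay of the memory trace
    if stimulated:
        g = min(1.0, g + STEP)     # bounded increase on each exposure
    return g

g = 0.0
for _ in range(10):                # frequent exposure: the trace saturates
    g = update(g, dt=1.0, stimulated=True)
g_habituated = g

g = update(g, dt=50.0, stimulated=False)   # a long gap: the memory fades
g_forgotten = g
```

Frequent stimuli hold the trace near its ceiling (habituation), while a long quiet interval lets it decay toward zero (graceful forgetting).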

The researchers have developed a "neural learning model" they have termed adaptive synaptic plasticity.

"This could be really important because it's one of the first examples of using quantum materials directly for solving a major problem in neural learning," Ramanathan said.

The researchers used the organismoids to implement the new model for synaptic plasticity.

"Using this effect we are able to model something that is a real problem in neuromorphic computing," Roy said. "For example, if I have learned your facial features I can still go out and learn someone else's features without really forgetting yours. However, this is difficult for computing models to do. When learning your features, they can forget the features of the original person, a problem called catastrophic forgetting."

Neuromorphic computing is not intended to replace conventional general-purpose computer hardware, based on complementary metal-oxide-semiconductor transistors, or CMOS. Instead, it is expected to work in conjunction with CMOS-based computing. Whereas CMOS technology is especially adept at performing complex mathematical computations, neuromorphic computing might be able to perform roles such as facial recognition, reasoning and human-like decision making.

Roy's team performed the research work on the plasticity model, and other collaborators concentrated on the physics of how to explain the process of doping-driven change in conductance central to the paper. The multidisciplinary team includes experts in materials, electrical engineering, physics, and algorithms.

"It's not often that a materials-science person can talk to a circuits person like professor Roy and come up with something meaningful," Ramanathan said.

Organismoids might have applications in the emerging field of spintronics. Conventional computers use the presence and absence of an electric charge to represent ones and zeroes in a binary code needed to carry out computations. Spintronics, however, uses the "spin state" of electrons to represent ones and zeros.

It could bring circuits that resemble biological neurons and synapses in a compact design not possible with CMOS circuits. Whereas it would take many CMOS devices to mimic a neuron or synapse, it might take only a single spintronic device.

In future work, the researchers may demonstrate how to achieve habituation in an integrated circuit instead of exposing the material to hydrogen gas.

A new machine learning program developed by researchers at Case Western Reserve University appears to outperform other methods for diagnosing Alzheimer's disease before symptoms begin to interfere with everyday living, initial testing shows.

More than 5 million Americans may have Alzheimer's disease, according to estimates, and the numbers are growing as the population ages. The disease is an irreversible, progressive brain disorder that slowly destroys memory and thinking skills. And while there is no cure, several drugs can delay or prevent symptoms from worsening for up to five years or more, according to the National Institute on Aging and published research.

Meanwhile, early diagnosis and treatment--the goal of the new computer-based program--are key to allowing those with the disease to remain independent longer.

The computer program integrates a range of Alzheimer's disease indicators, including mild cognitive impairment. In two successive stages, the algorithm selects the most pertinent to predict who has Alzheimer's. 

"Many papers compare the healthy to those with the disease, but there's a continuum," said Anant Madabhushi, F. Alex Nason Professor II of biomedical engineering at Case Western Reserve. "We deliberately included mild cognitive impairment, which can be a precursor to Alzheimer's, but not always."

In a study published in the journal Scientific Reports, Madabhushi, Asha Singanamalli, who recently earned her biomedical engineering master's degree, and Haibo Wang, a former postdoctoral researcher, tested the algorithm using data from 149 patients collected via the Alzheimer's Disease Neuroimaging Initiative.

The team developed what it calls the Cascaded Multi-view Canonical Correlation (CaMCCo) algorithm, which integrates measurements from magnetic resonance imaging (MRI) scans, features of the hippocampus, glucose metabolism rates in the brain, proteomics, genomics, mild cognitive impairment and other parameters.

Madabhushi's lab has repeatedly found that integrating dissimilar information is valuable for identifying cancers. This is the first time he and his team have done so for diagnosis and characterization of Alzheimer's disease.

"The algorithm assumes each parameter provides a different view of the disease, as if each were a different set of colored spectacles," Madabhushi said. 

The program then assesses the variables in a two-stage cascade. First, the algorithm selects the parameters that best distinguish between someone who's healthy and someone who's not. Second, among the subjects flagged as unhealthy, it selects the parameters that best distinguish who has mild cognitive impairment and who has Alzheimer's disease.
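The cascade idea -- pick the most discriminative features for healthy-vs-unhealthy first, then re-select features for the harder MCI-vs-Alzheimer's split -- can be sketched with toy data. The subjects, feature names and values below are invented for illustration, and the separation score is a crude stand-in for CaMCCo's canonical-correlation machinery:

```python
# Toy two-stage cascade (illustrative only; CaMCCo itself fuses imaging,
# proteomic and genomic "views" via canonical correlation analysis).
subjects = [
    {"hippocampus": 0.90, "metabolism": 0.80, "label": "healthy"},
    {"hippocampus": 0.85, "metabolism": 0.82, "label": "healthy"},
    {"hippocampus": 0.60, "metabolism": 0.72, "label": "mci"},
    {"hippocampus": 0.55, "metabolism": 0.68, "label": "mci"},
    {"hippocampus": 0.30, "metabolism": 0.40, "label": "ad"},
    {"hippocampus": 0.25, "metabolism": 0.35, "label": "ad"},
]

def mean(xs):
    return sum(xs) / len(xs)

def separation(feature, group_a, group_b):
    """Absolute difference of group means for one feature (crude score)."""
    a = mean([s[feature] for s in subjects if s["label"] in group_a])
    b = mean([s[feature] for s in subjects if s["label"] in group_b])
    return abs(a - b)

features = ["hippocampus", "metabolism"]

# Stage 1: which feature best separates healthy subjects from everyone else?
stage1 = max(features, key=lambda f: separation(f, {"healthy"}, {"mci", "ad"}))
# Stage 2: among the unhealthy, which feature best separates MCI from AD?
stage2 = max(features, key=lambda f: separation(f, {"mci"}, {"ad"}))
```

Note that the two stages can settle on different features: the best healthy-vs-unhealthy discriminator need not be the best MCI-vs-Alzheimer's discriminator, which is exactly why the selection is done twice.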

"The remaining views are combined to give the best picture," Madabhushi said.

In predicting which patients in the study had Alzheimer's disease, CaMCCo outperformed individual indicators as well as methods that combine them all without selective assessment. It also was better at predicting who had mild cognitive impairment than other methods that combine multiple indicators.

The researchers continue to validate and fine-tune the approach with data from multiple sites. They also plan to use the software in an observational mode: As a collaborating neurologist compiles tests on patients, the computer would run the data. If CaMCCo proves useful in predicting early Alzheimer's, Madabhushi expects to pursue a clinical trial for prospective validation.

Machine learning scientists at Disney Research have developed a new model that uncovers how the meanings of words change over time.

Dr. Robert Bamler and Dr. Stephan Mandt developed the dynamic word embeddings model by integrating neural networks and statistics used in rocket control systems. The result is a complex machine learning algorithm that automatically detects semantic change throughout history. 

"Promoting innovation in machine learning is a vital part of Disney Research," said Markus Gross, vice president of Disney Research. 

The authors will present the model on August 9 at the International Conference on Machine Learning (ICML) in Sydney.

Their dynamic word embeddings represent semantic change over time through so-called semantic vector spaces. Words of similar meanings appear close to one another and reveal each other's relationships. Changes in these meanings appear as movements through the semantic space. 

"Dynamic word embeddings could revolutionize how we investigate the evolution of language use," Mandt said. "The model uncovers otherwise latent semantic changes that become palpable through new dynamic visualizations."

For instance, the model automatically discovered that the meaning of the word 'peer' drastically changed in the last 200 years. "In the past, a peer was an earl or viscount," Bamler said. "Today, it's someone of equal standing. That is practically the opposite." Other words whose meaning changed dramatically included 'computer,' 'radio,' and 'broadcast'.

Bamler and Mandt integrated neural networks with classical statistical models. While having neural networks work on their data sets, the researchers used a Kalman filter for their transition model. It is a statistical model originally developed to trace, for instance, the position of rockets. 

"Previous approaches produce different word vectors every time they are trained," Mandt said. "Every output used to be intrinsically correct, but these outputs could not be compared to detect changes over time." 

Added Bamler, "We developed a mechanism to train the model on a vast corpus without splitting the data into time slices. That way, we were able to leverage a very large data set while still allowing the word vectors to drift smoothly over time."
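The Kalman-filter idea behind the smooth drift can be shown in one dimension. The sketch below is a drastic simplification of the paper's model: it treats a single coordinate of one word's vector as a random walk and filters noisy per-slice estimates, with all variances and observations invented for illustration:

```python
# Minimal 1-D Kalman filter over a drifting embedding coordinate.
# Q, R and the observations are hypothetical values, not from the paper.
Q = 0.01   # process (drift) variance per time step
R = 0.25   # noise variance of each slice-wise estimate

observations = [0.1, 0.5, 0.2, 0.4, 0.9, 1.1, 1.0]  # noisy per-decade estimates

def kalman_filter(obs, q=Q, r=R):
    x, p = obs[0], r              # initialize at the first observation
    states = [x]
    for z in obs[1:]:
        p = p + q                 # predict: random-walk drift adds variance
        k = p / (p + r)           # Kalman gain
        x = x + k * (z - x)       # update toward the new observation
        p = (1 - k) * p
        states.append(x)
    return states

smoothed = kalman_filter(observations)
```

The filtered trajectory follows the upward trend while damping slice-to-slice jitter, which is what makes word vectors from different eras comparable instead of "intrinsically correct but incomparable."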

The combination of classical statistical models with neural networks is a progressive approach able to prompt applications beyond the scope of word embeddings. To test their model, the researchers drew on a variety of data sets including 230 presidential speeches and millions of digitized books from over 200 years, as well as several months of public social media posts. In all experiments, the new model was consistently better in predicting missing words from these sources of text than existing models, confirming the validity of the observed changes.

Beyond tracing language history, Bamler and Mandt also analyzed contemporary changes in language usage. Training their model on social media posts, they detected changes of word associations that reflected current events. 

Combining creativity and innovation, this research continues Disney's rich legacy of leveraging technology to enhance the tools and systems of tomorrow.

Marek Osinski, professor, Electrical and Computer Engineering.

A project to develop new ultrafast laser transmitter technology at The University of New Mexico is expected to have a revolutionary impact on the physics of semiconductor lasers, with potential applications that could result in ten times the speed of current fiber optic networks.

UNM Electrical and Computer Engineering (ECE) Professor Marek Osinski leads a multidisciplinary research team that includes Gennady Smolyakov, ECE Research Associate Professor, a senior collaborator on the project.

For the past two and a half years, the laser transmitter project has been centered on numerical simulations of proposed devices, funded by a $400,000 grant which will conclude in September. Now, the Office of Naval Research has awarded UNM a $1.3 million grant, funded for three and a half years, to continue with a proof-of-concept demonstration, moving from theory to the laboratory.

The project addresses the need for faster data transfer by increasing the speed at which signals are first generated to send through a fiber optic network. Its goal is to demonstrate that semiconductor laser devices can send data at a bandwidth speed of over 100 gigabits per second (Gbps), or 100 million kilobits per second.

To put this in perspective, current 4G networks may transfer data at a speed of 10 Gbps. While standards have not yet been established, the planned 5G revolution is expected to push data transfer speeds to at least 40 Gbps. 5G's speed and capacity is designed to facilitate the more rapid arrival of the "Internet of Things" (IoT).
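To make the quoted rates concrete, a back-of-the-envelope calculation (a hypothetical 50 GB dataset, using the article's figures) shows how transfer time shrinks at each speed:

```python
# Illustrative transfer-time comparison using the rates quoted above.
def transfer_seconds(size_gigabytes, rate_gbps):
    """Seconds to move the payload: gigabytes -> gigabits, then / Gbps."""
    return size_gigabytes * 8 / rate_gbps

size_gb = 50                               # hypothetical dataset size
t_4g   = transfer_seconds(size_gb, 10)     # 10 Gbps figure quoted for 4G
t_5g   = transfer_seconds(size_gb, 40)     # planned 5G floor of 40 Gbps
t_goal = transfer_seconds(size_gb, 100)    # the project's 100 Gbps target
```

At 100 Gbps the hypothetical 50 GB transfer takes 4 seconds rather than 40, a tenfold reduction consistent with the "ten times the speed" claim.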

Much of the fiber optic infrastructure is in place and continues to be built out as an investment in the connected future. With light as the data medium, the theoretical speed limit for fiber optics is the speed of light. The effective speed limitations for fiber optic networks are the way in which the data signals are sent, encoded, modulated and received.

"How we are sending the data is what is going to make the big difference," said Osinski.

The laboratory work will be performed at UNM's Center for High Technology Materials (CHTM), an internationally recognized resource for research in optoelectronics. 

While future applications for this technology could have a huge impact on data transmission and connectivity around the world, the Office of Naval Research (ONR) is funding the UNM research for their specific applications. ONR generates enormous amounts of data in supercomputers operating in cryogenic environments (around 4 kelvin), with the data produced then needing to be processed through computers at room temperature. ONR is looking for a way to quickly transfer these huge quantities of data.

The Office of Naval Research may be interested in applications aboard a ship, which has limited energy and room for maintaining a cryogenic environment. Removing extra heat from the cryostat (an apparatus for maintaining a very low temperature) takes lots of power.

Since it is very expensive energetically to maintain the 4K temperature, normal electronic connections with copper wire are not feasible, as the copper conducts heat. Instead, high-speed optical communication methods such as those being developed at UNM are preferred, because glass does not conduct heat.

Osinski's project explores the use of semiconductor ring lasers to provide energy-efficient high-speed optical data egress (exit) from a cryogenic environment to room temperature for further processing.

Semiconductor lasers are a common source of data in the form of light for input into optical communications.

"Most people do not know -- your cellphone has semiconductor lasers, so do CD and DVD players, laser printers, and scanners. They are everywhere already," said Osinski.

The project is expected to have a revolutionary impact on the field of fundamental and applied physics of semiconductor lasers, pioneering a new class of ring-laser-based, integrated, ultrafast sources.

"Development of inexpensive ultrafast chips operating at speeds exceeding 100 Gbps will have a huge societal impact by increasing the transmission capacity of fiber-based networks," says Osinski. "We are conducting basic scientific research with the potential for many and much broader possible applications."

"Society is supremely concerned with more data, more options," he says. "Imagine a doctor able to perform robotic surgery remotely -- the doctor would need very high quality images at very high resolution. Besides surgery, patients in remote communities without local medical resources may need to consult with a specialist online, perhaps with the doctor reading live data from on-site sensors. This is a concern even in New Mexico, with our vast rural areas."

"A lot of things that were thought impossible before are going from a dream to an expectation," Osinski continued. "People are anticipating real time communication around the world, not delayed communication. And not only are the files larger, the user base gets larger and larger. The more capacity we provide by technological development, the more inventive are the inventors."

"Ultra-high-speed (>100 GHz) on-chip modulation of integrated lasers is an important 'holy grail' for the ever-increasing demand for ultrafast communication," Osinski concluded.

Ultrafast laser transmitter prototype fabrication and testing is an excellent example of UNM's role as a research institution at the forefront of high-speed semiconductor research.

On July 5, 2017, NASA's Solar Dynamics Observatory watched an active region -- an area of intense and complex magnetic fields -- rotate into view on the Sun. The satellite continued to track the region as it grew and eventually rotated across the Sun and out of view on July 17.

With their complex magnetic fields, sunspots are often the source of interesting solar activity:

During its 13-day trip across the face of the Sun, the active region -- dubbed AR12665 -- put on a show for NASA's Sun-watching satellites, producing several solar flares, a coronal mass ejection and a solar energetic particle event. Watch the video below to learn how NASA's satellites tracked the sunspot over the course of these two weeks. 

Two Weeks in the Life of a Sunspot (VIDEO)

Such sunspots are a common occurrence on the Sun, but less frequent at the moment, as the Sun is moving steadily toward a period of lower solar activity called solar minimum -- a regular occurrence during its approximately 11-year cycle. Scientists track such spots because they can help provide information about the Sun's inner workings. Space weather centers, such as NOAA's Space Weather Prediction Center, also monitor these spots to provide advance warning, if needed, of the radiation bursts being sent toward Earth, which can impact our satellites and radio communications.

On July 9, a medium-sized flare burst from the sunspot, peaking at 11:18 a.m. EDT. Solar flares are explosions on the Sun that send energy, light and high-speed particles out into space. Much as earthquakes are rated on the Richter scale, solar flares are categorized according to their intensity. This flare was categorized as an M1. M-class flares are a tenth the size of the most intense flares, the X-class flares. The number provides more information about its strength: an M2 is twice as intense as an M1, an M3 is three times as intense and so on.
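The flare scale described above follows the standard GOES convention: each letter class corresponds to a factor-of-ten step in peak soft X-ray flux, and the trailing number multiplies the class's base flux. A short sketch (the convention is standard, though the example flux values here are chosen to match the article's M1 and M2 labels, not measured data):

```python
# GOES flare classification: letter = decade of peak 1-8 Angstrom X-ray
# flux in W/m^2 (X >= 1e-4, M >= 1e-5, C >= 1e-6, B >= 1e-7, A below),
# and the number is the flux divided by the class base.
def classify_flare(peak_flux):
    for letter, base in (("X", 1e-4), ("M", 1e-5), ("C", 1e-6), ("B", 1e-7)):
        if peak_flux >= base:
            return f"{letter}{peak_flux / base:.1f}"
    return f"A{peak_flux / 1e-8:.1f}"

label_july9  = classify_flare(1.0e-5)   # an M1-level peak flux
label_july14 = classify_flare(2.0e-5)   # an M2-level peak flux: twice as intense
```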

Days later, on July 14, a second medium-sized, M2 flare erupted from the Sun. The second flare was long-lived, peaking at 10:09 a.m. EDT and lasting over two hours.

This was accompanied by another kind of solar explosion called a coronal mass ejection, or CME. Solar flares are often associated with CMEs -- giant clouds of solar material and energy. NASA's Solar and Heliospheric Observatory, or SOHO, saw the CME at 9:36 a.m. EDT leaving the Sun at speeds of 620 miles per second and eventually slowing to 466 miles per second.

Following the CME, the turbulent active region also emitted a flurry of high-speed protons, known as a solar energetic particle event, at 12:45 p.m. EDT.

Research scientists at the Community Coordinated Modeling Center -- located at NASA's Goddard Space Flight Center in Greenbelt, Maryland -- used these spacecraft observations as input for their simulations of space weather throughout the solar system. Using a model called ENLIL, they are able to map out and predict whether the solar storm will impact our instruments and spacecraft, and send alerts to NASA mission operators if necessary.

By the time the CME made contact with Earth's magnetic field on July 16, the sunspot's journey across the Sun was almost complete. As for the solar storm, it took this massive cloud of solar material two days to travel 93 million miles to Earth, where it caused charged particles to stream down Earth's magnetic poles, sparking enhanced aurora.

Virtual.Pyxis software has already been licensed to advanced research departments of major multinationals and MIT

Computer-aided engineering (CAE) systems help manufacturers to design parts with the ideal topology (inner and outer shape and structure) to withstand the conditions under which they will operate, such as specific temperature and pressure conditions, vibrations, and various stresses and strains, and to produce them with as little raw material as possible. In sum, CAE enables industrial design software to optimize part topology.

By deploying topology optimization software, manufacturers virtually sculpt lighter parts using a given amount of raw material and monitoring their strength.

These attributes are a function of the design. "You input parameters into the software with the properties and other characteristics the part needs to have, and the program shows the design path that has to be followed in order to achieve your goals," says mechatronic engineer Ricardo Doll Lahuerta, principal investigator for a research project that is promoting a significant quality enhancement in this type of tool, with joint funding from FAPESP's Small Business Innovative Research Program (PIPE) and VirtualCAE, a Brazilian company based in São Paulo State.

Recently launched, and with several upgrades in progress, Virtual.Pyxis has already been licensed to advanced research departments of major multinationals and the Massachusetts Institute of Technology (MIT) in the United States, one of the world's leading research and education institutions.

The tool enables engineers to design stronger and more versatile parts while shortening lead time and development cost.

This type of software typically requires information on the conditions under which the part will operate, such as stress, compression, vibration and temperature, as well as other design constraints, including maximum flexibility and deformation. Details of the manufacturing process to be used, such as polymer injection, casting or 3D printing, are also key inputs for CAE programs.

Virtual.Pyxis is distinguished mainly by its next-generation algorithm, which gives it the capacity to process a far larger number of variables and constraints for considerably less cost than commercially available programs.

"Our algorithm can process more variables without the need for much more computational capacity," Lahuerta explains. "As a result, any structural design can be developed with greater precision at a much lower cost. It's especially useful for designing mechanisms that need to be flexible, in which case many more variables have to be computed. Most of today's topological optimization programs pursue maximum rigidity, but that's not always ideal in a design project."
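The resize-under-constraint loop at the heart of such tools can be illustrated with a classic, much simpler sizing heuristic: fully stressed design. The hanging-bar model, loads and allowable stress below are all invented for illustration, and real topology optimizers (including Virtual.Pyxis) solve a far richer finite-element problem:

```python
# Toy "fully stressed design" sketch: a hanging bar split into segments,
# each carrying the weight of everything below it. Each cross-section is
# resized until every segment sits exactly at the allowable stress,
# removing material wherever the part is under-stressed.
SIGMA_MAX = 100.0                        # allowable stress (hypothetical units)
point_loads = [10.0, 20.0, 5.0, 15.0]    # loads applied below each segment

# Segment i must carry every point load from i downward.
carried = [sum(point_loads[i:]) for i in range(len(point_loads))]

areas = [1.0] * len(point_loads)         # initial uniform (wasteful) design
for _ in range(20):                      # iterate areas toward sigma = SIGMA_MAX
    stresses = [c / a for c, a in zip(carried, areas)]
    areas = [a * s / SIGMA_MAX for a, s in zip(areas, stresses)]

material = sum(areas)                    # total cross-section, a "volume" proxy
```

The loop converges to a design where no segment is over- or under-stressed: heavily loaded segments keep material, lightly loaded ones are pared down, which is the "sculpting lighter parts while monitoring strength" idea in miniature.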

Virtual.Pyxis is capable of more complex analysis and can work with non-linear materials. It also enables the design of very precise frequency constraints, an important attribute to ensure that adjacent parts vibrate at the same resonant frequency. It is also capable of using different external numerical solvers, including those most used by the machine and metalworking industries.

VirtualCAE's software has immediate applications in a swathe of industries. It is already being used to design automotive components, farm implements, railroad equipment, and safety-critical vehicle parts. It can also be used to develop lower-cost production processes, as in the case of certain parts made by casting instead of creasing, folding and welding. Castings have better mechanical properties, last longer, use fewer components, and can be assembled faster because no welding is required. "FAPESP's support has been essential," says Valmir Fleischman, VirtualCAE's founding partner.

Even more applications are foreseen, such as enhanced prosthetic and orthotic devices. "Excessive rigidity in bone prostheses, for example, tends to weaken the bone to which they're connected. The program enables prostheses to be designed with the ideal degree of flexibility," Lahuerta says.

He adds that the development of 3D printing, which builds objects gradually, layer by layer, will enable parts to be produced from various materials combined in a structure with more advanced properties. These objects would be far harder to obtain with today's manufacturing processes.

"In the future, we'll be able to design the inner structure of a part combining materials on an ever-smaller scale until we reach the atomic scale, significantly extending design possibilities. For example, we'll be able to design a part with certain intelligent characteristics, making it lighter and more efficient. With the software tools currently available, this would require prohibitively expensive computational capacity," Lahuerta says.

"Our software is easier to customize, and with access to training in its use, our customers are often able to recoup their investment in Virtual.Pyxis with their very first design," says Leandro Garbin, a founding partner of the firm, which is betting heavily on innovation.

The program is still being upgraded, but VirtualCAE has already licensed it to important multinationals such as Thyssenkrupp (China unit), the American agricultural equipment manufacturer AGCO, and even one of MIT's research laboratories that provides services to the US Department of Defense. Thanks to innovation, the firm has opened branch offices in Germany and the US and has representatives in Mexico, Colombia, Turkey, China and Taiwan.

CAPTION The colors represent the relative amounts of short-lived radioactive isotopes, such as iron-60, injected into a newly formed protoplanetary disk (seen face-on, with the protostar being the light purple blob in the middle) by a supernova shock wave. CREDIT Image courtesy of Alan Boss.

New work offers fresh evidence supporting the supernova shock wave theory of our solar system's origin

According to one longstanding theory, our Solar System's formation was triggered by a shock wave from an exploding supernova. The shock wave injected material from the exploding star into a neighboring cloud of dust and gas, causing it to collapse in on itself and form the Sun and its surrounding planets.

New work from Carnegie's Alan Boss offers fresh evidence supporting this theory, modeling the Solar System's formation beyond the initial cloud collapse and into the intermediate stages of star formation. The work is published in The Astrophysical Journal.

One very important constraint for testing theories of Solar System formation is meteorite chemistry. Meteorites retain a record of the elements, isotopes, and compounds that existed in the system's earliest days. One type, called carbonaceous chondrites, includes some of the most primitive known samples.

An interesting component of chondrites' makeup is something called short-lived radioactive isotopes. Isotopes are versions of elements with the same number of protons, but a different number of neutrons. Sometimes, as is the case with radioactive isotopes, the number of neutrons present in the nucleus can make the isotope unstable. To gain stability, the isotope releases energetic particles, which alters its number of protons and neutrons, transmuting it into another element.

Some isotopes that existed when the Solar System formed are radioactive and have decay rates that caused them to become extinct within tens to hundreds of millions of years. The fact that these isotopes still existed when chondrites formed is shown by the abundances of their stable decay products--also called daughter isotopes--found in some primitive chondrites. Measuring the amount of these daughter isotopes can tell scientists when, and possibly how, the chondrites formed.
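The arithmetic behind this dating technique is simple exponential decay, N(t) = N₀ · 2^(−t/t½). A minimal sketch, using iron-60's half-life of roughly 2.6 million years (the elapsed time below is an arbitrary illustration, not a measurement from the article):

```python
def remaining_fraction(elapsed_myr: float, half_life_myr: float) -> float:
    """Fraction of a radioactive isotope surviving after elapsed_myr: 2**(-t / t_half)."""
    return 0.5 ** (elapsed_myr / half_life_myr)

HALF_LIFE_FE60_MYR = 2.6  # iron-60 half-life, roughly 2.6 million years

# After 10 million years only about 7% of the original iron-60 survives;
# the rest has become its stable daughter isotope, nickel-60.
surviving = remaining_fraction(10.0, HALF_LIFE_FE60_MYR)
daughter = 1.0 - surviving
print(f"surviving Fe-60: {surviving:.3f}, accumulated Ni-60: {daughter:.3f}")
```

This is why the daughter isotope is the useful signal: after a few tens of half-lives essentially none of the parent remains, but every decayed atom is still present as its stable daughter.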

A recent analysis of chondrites by Carnegie's Myriam Telus was concerned with iron-60, a short-lived radioactive isotope that decays into nickel-60. It is only created in significant amounts by nuclear reactions inside certain kinds of stars, including supernovae or what are called asymptotic giant branch (AGB) stars.

Because all the iron-60 from the Solar System's formation has long since decayed, Telus' research, published in Geochimica et Cosmochimica Acta, focused on its daughter product, nickel-60. The amount of nickel-60 found in meteorite samples--particularly in comparison to the amount of stable, "ordinary" iron-56--can indicate how much iron-60 was present when the larger parent body from which the meteorite broke off was formed. There are not many options for how an excess of iron-60--which later decayed into nickel-60--could have gotten into a primitive Solar System object in the first place--one of them being a supernova.
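The inference described here can be sketched in a few lines: since essentially all of the original iron-60 has decayed, each excess nickel-60 atom (beyond what ordinary nickel chemistry predicts) records one original iron-60 atom, while stable iron-56 is unchanged. The isotope counts below are made up for illustration, not real meteorite data:

```python
def initial_fe60_ratio(excess_ni60_atoms: float, fe56_atoms: float) -> float:
    """Infer the initial 60Fe/56Fe ratio of a sample: every excess Ni-60 atom
    was once an Fe-60 atom, and the stable Fe-56 abundance is unchanged."""
    return excess_ni60_atoms / fe56_atoms

# Hypothetical atom counts for a sample (illustration only).
ratio = initial_fe60_ratio(excess_ni60_atoms=1.0e7, fe56_atoms=1.0e14)
print(f"inferred initial 60Fe/56Fe ratio: {ratio:.1e}")
```

Comparing ratios inferred this way against the yields predicted for supernovae and AGB stars is what lets researchers like Telus test which stellar source is consistent with the data.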

While her research did not find a "smoking gun," definitively proving that the radioactive isotopes were injected by a shock wave, Telus did show that the amount of Fe-60 present in the early Solar System is consistent with a supernova origin.

Taking this latest meteorite research into account, Boss revisited his earlier models of shock wave-triggered cloud collapse, extending his supercomputer models beyond the initial collapse and into the intermediate stages of star formation, when the Sun was first being created. This is an important next step in tying together Solar System origin modeling and meteorite sample analysis.

"My findings indicate that a supernova shock wave is still the most plausible origin story for explaining the short-lived radioactive isotopes in our Solar System," Boss said.

Boss dedicated his paper to the late Sandra Keiser, a long-term collaborator, who provided computational and programming support at Carnegie's Department of Terrestrial Magnetism for more than two decades. Keiser died in March.

ICF to support research and develop solutions for defensive cyber operations

The U.S. Army Research Laboratory (ARL) has awarded ICF, a global consulting and technology services provider, a contract valued up to $93 million to support its Defensive Cyber Operations (DCO) and Defensive Cybersecurity Research. The contract terms include both base and option periods. 

ICF will support ARL’s Cybersecurity Service Provider (CSSP) program and both basic and applied research, working to develop cyber tools and techniques and to advance the state of the art in computer network defense. ICF will support all cyber operations for ARL and ARL subscribers through onsite and remote reviews of network security, and will ensure alignment between policy, compliance and assessment functions. The company will also support the information assurance management office, which sets policy for ARL.

“ARL is the tip of the spear when it comes to national security, responsible for continually monitoring, testing and defending the cyber operations of the U.S. armed forces,” said Samuel Visner, senior vice president for cybersecurity and resilience at ICF. “We are proud to work with ARL to guard against current and emerging threats to its information systems and technology infrastructure, and excited to contribute to the state of cyber research and development.” 

The U.S. Army Research Laboratory, currently celebrating 25 years of excellence in Army science and technology, is part of the U.S. Army Research, Development and Engineering Command (RDECOM), whose mission is to provide innovative research, development and engineering that gives the Army decisive overmatch against the complexities of current and future operating environments, in support of the joint warfighter and the nation. RDECOM is a major subordinate command of the U.S. Army Materiel Command.

“As partners for over two decades, we look forward to continuing our support of ARL in developing new concepts and researching, testing and applying new technologies to our nation’s defense,” said Bill Christman, vice president at ICF. 

ICF’s cybersecurity specialists help the warfighter, military and national security clients build and successfully defend the most aggressively attacked infrastructures on the planet. For over two decades, ICF has delivered cyber-innovations, ranging from support to the system that served as the model for network defense services within the U.S. Department of Defense, to improving machine-to-machine learning for better cyber defense and patenting a novel way to visualize cyber threats using virtual reality. 

