Numerical model takes us one step closer to understanding antihydrogen formation, and ultimately to explaining the prevalence of matter over antimatter in the universe

Antihydrogen is a particular kind of atom, made up of the antiparticle of an electron - a positron - and the antiparticle of a proton - an antiproton. Scientists hope that studying the formation of antihydrogen will ultimately help explain why there is more matter than antimatter in the universe. In a new study published in EPJ D, Igor Bray and colleagues from Curtin University, Perth, Australia, demonstrate that the two different numerical calculation approaches they developed specifically to study collisions agree with each other. Their numerical approach could therefore be used to explain antihydrogen formation.

There are several methods of explaining antihydrogen creation. These involve calculating what happens when positronium, a particle made up of an electron and a positron bound together, scatters on a proton or on an antiproton. The trouble is that devising numerical simulations of such collisions is particularly difficult, because the problem has two centres: one at the (anti)proton, where the atom forms, and one at the positronium.

The authors employed two very different calculations - one-centre and two-centre implementations of a method dubbed convergent close-coupling - for positron scattering on hydrogen and helium. Interestingly, the two approaches independently converged to the same results. Such convergence matters, as it is a way to ascertain the accuracy of their calculations for antihydrogen formation.

They then also compared estimates of the scattering cross section - the effective area around the atom within which the positronium would need to pass for a collision to occur. They found excellent agreement between the two methods for hydrogen. However, the agreement was not quite as good for helium. This indicates that there is further room for improvement in the theory for helium before the approach can be applied to more complex targets, such as magnesium and molecular hydrogen.
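As a schematic illustration of what convergence means here, the sketch below (a generic Python example, not the authors' code; `cross_section` is a hypothetical stand-in for one full close-coupling calculation) enlarges the basis of expansion states until successive cross-section estimates stabilize:

```python
# Generic convergence-study sketch, not the authors' code.
# `cross_section(n_basis)` is a hypothetical stand-in for one full
# close-coupling calculation using n_basis expansion states.

def converged_estimate(cross_section, tol=1e-3, n_start=4, n_max=512):
    """Double the basis size until successive results agree to tol."""
    n = n_start
    previous = cross_section(n)
    while 2 * n <= n_max:
        n *= 2
        current = cross_section(n)
        if abs(current - previous) <= tol * abs(previous):
            return current, n  # stable estimate and basis size used
        previous = current
    raise RuntimeError("did not converge within n_max basis states")
```

When one-centre and two-centre formulations each pass this kind of test and land on the same value, that agreement is the accuracy check the authors describe.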

Successful graduates will enter national service to protect vital assets such as utilities, water treatment and military defense systems

A team of cybersecurity researchers at the University of Massachusetts Amherst led by computer scientist Brian Levine has received a $4.2 million grant from the National Science Foundation (NSF) to bring a CyberCorps Scholarship for Service (SFS) program to the campus, making UMass Amherst the first public university in New England to receive such an award.

NSF's CyberCorps program, in partnership with the Department of Homeland Security, supports the educational and professional development of domestic students who will help the nation address threats to national security, including critical infrastructure such as utilities, water treatment, military defense systems and refineries. Levine and colleagues will train 28 students over the next five years.

Upon completing the training and graduating, students will join government agencies at full pay and benefits to work in cybersecurity, at employers such as the FBI, the National Institutes of Health, the Centers for Disease Control, and analogous agencies at the state or local level. Any government service involving cybersecurity fulfills the service requirement, ranging from protecting the nation's infrastructure from state-based hackers to joining a state university as a researcher or educator in cybersecurity.

The program will admit its first students in the fall 2016 semester, Levine says. Students must be U.S. citizens or permanent residents and can receive up to two years of support from the CyberCorps SFS. For each year they accept aid, they will serve for a year in a federal, state or local government position related to cybersecurity.

"The program offers very generous support," Levine notes, "and we will be actively recruiting women and people from underrepresented minority groups interested in security." Graduate students receive full tuition and fees per year, plus a $34,000 stipend, health insurance reimbursement up to $3,000, a $4,000 travel allowance and book allowance of $2,000; undergraduates received the same except the stipend is $22,500 year. In addition to financial benefits, Levine says students in the CyberCorps SFS program will receive support in extra mentoring groups, assistance in finding summer internships and permanent positions at federal and state agencies, and other professional development opportunities.

Students majoring in the College of Information and Computer Sciences, the Isenberg School of Management, the department of mathematics and statistics and the department of electrical and computer engineering at UMass Amherst are eligible to apply.

Katherine S. Newman, senior vice chancellor and provost, says, "This program answers a critical national shortage of highly trained experts in cybersecurity, and will prepare students for successful careers in this field through a combination of strong curricula, ample professional development, extensive advising, interdisciplinary enrichment and access to recruiting opportunities."

John McCarthy, senior vice provost and dean of the Graduate School, says, "This program will support and expand several of the campus's advanced programs that educate and train cybersecurity researchers and professionals. It will support interdisciplinary cohorts of seven scholars across four nationally recognized colleges and departments at UMass Amherst: computer science, electrical and computer systems engineering, mathematics and statistics, and business."

Levine notes, "Our role is to help students move swiftly into government security positions and other roles where there is a shortage." Some of the CyberCorps SFS scholars will come from schools and colleges not usually associated with cybersecurity such as business and statistics, he adds, because "we are trying to train people who know about management and statistics with added computer science expertise to broaden the government's ability to address cybersecurity. Problems in security and privacy are no longer only technical gaps. Multidisciplinary approaches have the best chance at addressing these issues."

Wayne Burleson, professor of electrical and computer engineering and co-principal investigator of the program with Levine, says, "Government agencies really need a program like this because they are competing with major tech and business corporations who offer very attractive financial and tuition incentives. The government needs that same talent and is trying to match recruitment. I think it's important to us as citizens to know that the government is seeking to retain the best talent in the field."

Levine adds, "Meanwhile, as we train these SFS students we're building up a stronger program here at UMass, which benefits all students on campus. Many of the professional development events will be open to all. At the end of five years we will have trained 28 students who have advanced experience and skills in the latest cybersecurity techniques and approaches, and we will help to bring them into the government."

New projects will build datasets to help test grid technologies

Say you have a great new theory or technology to improve the nation's energy backbone - the electric grid. Wouldn't it be great to test it against a model complete with the details that would tell you how your ideas would work? But that's a challenge: existing datasets are too small or outdated, and real data from the grid are off limits because of security and privacy issues.

To overcome this issue, the Department of Energy is helping to create open-access power grid datasets for researchers and industry. DOE's Advanced Research Projects Agency-Energy (ARPA-E) has awarded PNNL $3 million for two projects in a new program: Generating Realistic Information for the Development of Distribution and Transmission Algorithms, or GRID DATA.

In one project, PNNL will develop a sustainable data evolution technology, or SDET. First, researchers will gather features and metrics from many private datasets provided by the laboratory's industry partners; the team includes GE Grid Solutions (formerly Alstom Grid), an eastern regional transmission organization and a western utility, among others.

Datasets essentially describe the physical power grid and the transactions that occur on it. Data points can include how long it takes to ramp up a power plant, the resistance to flow on a power line, the maximum power a given plant can generate, connectivity - how electricity moves from point to point - the configuration of the grid at any given time, and much more.
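As a concrete illustration, a single entry in such a dataset might bundle these quantities roughly as sketched below. This is a minimal Python sketch; the types and field names are hypothetical, chosen to mirror the quantities just listed, and are not the actual GRID DATA schema.

```python
from dataclasses import dataclass

# Hypothetical record types mirroring the grid quantities described
# above; the field names are illustrative, not an actual schema.

@dataclass
class Generator:
    bus: str                     # grid node the plant connects to
    max_output_mw: float         # maximum power the plant can generate
    ramp_rate_mw_per_min: float  # how quickly output can be ramped up

@dataclass
class Line:
    from_bus: str
    to_bus: str
    resistance_ohms: float       # resistance to flow on the line
    capacity_mw: float           # limit on power transferred

# Connectivity - how electricity moves from point to point - is then
# captured by the set of (from_bus, to_bus) pairs over all Line records.
```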

The combination of these elements is vast, and together they determine the performance of the grid, which has grown more complex with the relatively recent addition of renewables and intelligent devices. So researchers are seeking new methods to make the grid operate reliably at the least expense to owners, operators and ratepayers.

"Creating algorithms to optimize the grid essentially comes down to a challenging mathematical problem," said Henry Huang, an engineer at PNNL. "It's like the old saying 'garbage in, garbage out.' We need to get the right numbers — realistic numbers — into the algorithms needed for modeling so that utilities and grid operators feel confident in adopting new technologies being developed to modernize the grid."

PNNL, together with partners, will develop data creation tools and use them to generate large-scale, open-access, realistic datasets. Finally, the team will validate the resulting datasets using industry tools provided by GE Grid Solutions.

The data creation tools, as well as the datasets, will be made available through a data repository, which will also be created by the second PNNL project awarded by ARPA-E.

PNNL will partner with the National Rural Electric Cooperative Association to build a power system model repository, which will host the open-access power grid models and datasets.

This Data Repository for Power System Open Models With Evolving Resources, or DR POWER, approach will review, annotate, and verify submitted datasets while establishing a repository and a web portal to distribute open-access models and scenarios. It will include the ability to collaboratively build, refine, and review a range of large-scale realistic power system models. It will also include datasets created by other GRID DATA projects.

For researchers, this represents a significant improvement over current small-scale, static models that do not properly represent the challenging environments encountered by present and future power grids. The repository and the web portal will be hosted and maintained in PNNL's Electricity Infrastructure Operations Center.

  • Siemens expands portfolio for industry software
  • CD-adapco a leader in computational fluid dynamics (CFD) simulation
  • Purchase price of $970 million

Europe's Siemens and CD-adapco have entered into a stock purchase agreement for the acquisition of CD-adapco by Siemens. The purchase price is $970 million. CD-adapco is a global engineering simulation company with software solutions covering a wide range of engineering disciplines, including computational fluid dynamics (CFD), computational solid mechanics (CSM), heat transfer, particle dynamics, reactant flow, electrochemistry, acoustics and rheology. Last fiscal year, CD-adapco had over 900 employees and revenue of close to $200 million, with the double-digit margins typical of software businesses. On average, CD-adapco increased its revenue at constant currencies by more than 12 percent annually over the past three fiscal years. Siemens expects this business to continue to experience strong growth in the future.

"As part of its Vision 2020, Siemens is acquiring CD-adapco and sharpening its focus on growth in digital business and expanding its portfolio in the area of industry software. Simulation software is key to enabling customers to bring better products to the market faster and at less cost. With CD-adapco, we're acquiring an established technology leader that will allow us to supplement our world-class industry software portfolio and deliver on our strategy to further expand our digital enterprise portfolio," said Klaus Helmrich, member of the Managing Board of Siemens.

CD-adapco is a global engineering simulation company with a unique vision for Multidisciplinary Design eXploration (MDX). Engineering simulation provides the most reliable flow of information into the design process, which drives innovation and lowers product development costs. CD-adapco simulation tools, led by the flagship product STAR-CCM+, allow engineers to discover better designs, faster. CD-adapco now has over 3,200 customers worldwide. Its software is currently used by 14 of the 15 largest carmakers, by all of the top ten suppliers to the aerospace industry and by nine of the ten largest manufacturers in the energy and marine sectors.

CD-adapco CEO and President Sharron MacDonald said, "I am pleased for both the employees and the customers of CD-adapco. The opportunities that come with the acquisition by Siemens are endless. The vision of our founders will be realized in the integration of these world-class engineering and manufacturing technologies and a business strategy that will allow engineering simulation to impact more products and companies than ever before."

CD-adapco is headquartered in Melville, New York, U.S., and has 40 locations worldwide. Siemens expects synergy impact on EBIT to be in the mid-double-digit million range within five years of closing, mainly from revenue. Closing of the transaction is subject to customary conditions and is expected in the second half of fiscal year 2016.

CD-adapco will be integrated into the PLM software business of Siemens' Digital Factory (DF) Division. DF is the industry leader in automation technology and a leading provider of Product Lifecycle Management (PLM) software. "By adding advanced engineering simulation tools such as CFD to our portfolio and experienced experts in the field to our organization, we're greatly enhancing our core competencies for model-based simulation that creates a very precise digital twin of the product," said Anton Huber, CEO of the Digital Factory Division.

The Digital Factory Division bundles all Siemens' businesses serving the discrete manufacturing sectors - for example, car and aircraft construction, machine construction and electronics. Its portfolio includes high-performance, fully integrated software and hardware technologies for seamlessly linking data between development, production and suppliers. Siemens is currently the only company offering technologies that comprehensively merge the virtual world of product development and the real world of manufacturing. New products can be designed, tested and optimized on the computer while the corresponding production processes are already being planned and implemented. As a result, customers profit from enhanced efficiency, greater flexibility and faster market readiness.

Image caption: Density maps of galaxy cluster distribution.

First observational evidence for assembly bias could impact understanding of the universe

An international team of researchers, including Carnegie Mellon University's Rachel Mandelbaum, has shown that the relationship between galaxy clusters and their surrounding dark matter halo is more complex than previously thought. The researchers' findings, published in Physical Review Letters today (Jan. 25), are the first to use observational data to show that, in addition to mass, a galaxy cluster's formation history plays a role in how it interacts with its environment.

There is a connection between galaxy clusters and their dark matter halos that holds a great deal of information about the universe's dark matter content and its accelerating expansion due to dark energy. Galaxy clusters are groupings of hundreds to thousands of galaxies bound together by gravity, and are the most massive structures found in the universe. Each cluster is embedded in a halo of invisible dark matter. Traditionally, cosmologists have predicted and interpreted clustering by calculating just the masses of the clusters and their halos. However, theoretical studies and cosmological simulations suggested that mass is not the only element at play -- something called assembly bias, which takes into account when and how a galaxy cluster formed, also could impact clustering.

"Simulations have shown us that assembly bias should be part of our picture," said Mandelbaum, a member of Carnegie Mellon's McWilliams Center for Cosmology. "Confirming this observationally is an important piece of understanding galaxy and galaxy cluster formation and evolution."

In the current study, the research team, led by Hironao Miyatake, Surhud More and Masahiro Takada of the Kavli Institute for the Physics and Mathematics of the Universe, analyzed observational data from the Sloan Digital Sky Survey's DR8 galaxy catalog. Using this data, they demonstrated that when and where galaxies group together within a cluster impacts the cluster's relationship with its dark matter environment.

The researchers divided close to 9,000 galaxy clusters into two groups based on the spatial distribution of the galaxies in each cluster. One group consisted of clusters with galaxies aggregated at the center; the other consisted of clusters in which the galaxies were more diffuse. They then used a technique called gravitational lensing to show that, while the two groups of clusters had the same mass, they interacted with their environments very differently. The group of clusters with diffuse galaxies was much more clumped together than the group of clusters whose galaxies sat close to the center.
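Schematically, the sample split works as in the sketch below. This is an illustrative Python outline with simulated placeholder values, not the team's pipeline; `concentration` stands in for whatever statistic measures how centrally each cluster's galaxies are distributed.

```python
import numpy as np

# Illustrative outline of the sample split, using simulated
# placeholder values rather than the SDSS DR8 catalog.
rng = np.random.default_rng(0)
n_clusters = 9000
mass = rng.lognormal(mean=0.0, sigma=0.3, size=n_clusters)
concentration = rng.normal(loc=0.5, scale=0.1, size=n_clusters)

# Split at the median concentration: compact vs. diffuse clusters.
diffuse = concentration < np.median(concentration)
compact = ~diffuse

# Step 1: check the subsamples have the same mean mass (done with
# gravitational lensing in the actual study).
print(mass[diffuse].mean(), mass[compact].mean())

# Step 2: with real data, each subsample's large-scale clustering
# amplitude would be measured here; the study finds the diffuse
# subsample clusters more strongly despite the equal masses.
```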

"Measuring the way galaxy clusters clump together on large scales is a linchpin of modern cosmology. We can go forward knowing that mass might not be the only factor in clustering," Mandelbaum said.
