Woolpert launches STREAM:RASTER to host imagery in the cloud

The subscription software service is geared toward organizations that manage GIS data and infrastructure.

Woolpert is launching STREAM:RASTER, a subscription software service that hosts geospatial imagery in the cloud and facilitates access to it. Customers upload raster data, and STREAM:RASTER creates tile caches of that data ready for use in GIS and web mapping software. Customers can then manage and control access to their proprietary imagery in STREAM:RASTER, with support and a service-level agreement from Woolpert.

“Every day we work with state and local governments who manage large amounts of GIS data to support multiple initiatives,” Woolpert Cloud Director Dylan Thomas said. “STREAM:RASTER allows them to focus on the application of that data without having to hire additional staff for its operational support. It’s their data; this service just deepens the effectiveness of that data, protects it and makes it much easier to use.”

Thomas said storing data in the cloud benefits municipalities around the world, and the volume of data continues to grow. STREAM:RASTER provides cost-effective and reliable storage of that data at scale.

“When any kind of disaster strikes, or a city or county’s database is flooded with activity from multiple sources at once—911, police, utility crews, fire and rescue, local citizens, etc.—that high demand can overload or even shut down systems when you need them the most,” Thomas said. “Not only does having your imagery in the cloud act as a cheap backup, but it can save the thousands of dollars that even a short outage can cost.”

Thomas said the concept for STREAM:RASTER was born from clients who had used SmartView Connect, a Woolpert application that enables clients to review orthoimagery and GIS data while a project is underway. After projects were completed, many clients said that they wanted to retain proprietary access to the data while Woolpert continued as host. Like SmartView Connect, the data hosted in STREAM:RASTER is accessible by Esri software on desktop and web, as well as on many open source and commercial web mapping platforms like OpenLayers and Google Maps.
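As a concrete illustration: tile caches consumed by platforms like OpenLayers and Google Maps typically follow the standard XYZ ("slippy map") web-mercator tiling scheme, in which each zoom level z divides the world into a 2^z-by-2^z grid of tiles. Below is a minimal sketch of the tile-index math; the function name is ours, and actual STREAM:RASTER endpoints and tile schemes may differ.

```python
import math

def deg2tile(lat_deg, lon_deg, zoom):
    """Map a WGS84 lat/lon to XYZ tile indices at a given zoom level."""
    n = 2 ** zoom                       # tiles per axis at this zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat_deg)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return x, y

# At zoom 0 the whole world is one tile; the point where the equator
# meets the prime meridian falls in tile (0, 0).
deg2tile(0.0, 0.0, 0)   # -> (0, 0)
```

A web-mapping client then requests tiles by substituting these indices into a URL template such as `.../{z}/{x}/{y}.png`.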

“STREAM:RASTER is the result of us scratching a real business itch based on customer feedback,” Thomas said. “It’s exciting to be able to fill that need. We understand imagery, we understand data, we are a Google partner and we are an Esri partner. This is just the next logical step in supporting our clients’ evolving geospatial needs.”

For more information about STREAM:RASTER, visit woolpert.com/stream-raster.

UM researcher's model of ancient climate suggests future warming could accelerate

The rate at which the planet warms in response to the ongoing buildup of heat-trapping carbon dioxide gas could increase in the future, according to new supercomputer simulations of a comparable warm period more than 50 million years ago.

Researchers at the University of Michigan and the University of Arizona used a state-of-the-art climate model to successfully simulate--for the first time--the extreme warming of the Early Eocene Period, which is considered an analog for Earth's future climate.

They found that the rate of warming increased dramatically as carbon dioxide levels rose, a finding with far-reaching implications for Earth's future climate, the researchers report in a paper scheduled for publication Sept. 18 in the journal Science Advances.

Another way of stating this result is that the climate of the Early Eocene became increasingly sensitive to additional carbon dioxide as the planet warmed.

"We were surprised that the climate sensitivity increased as much as it did with increasing carbon dioxide levels," said first author Jiang Zhu, a postdoctoral researcher at the UM Department of Earth and Environmental Sciences.

"It is a scary finding because it indicates that the temperature response to an increase in carbon dioxide in the future might be larger than the response to the same increase in CO2 now. This is not good news for us."

The researchers determined that the large increase in climate sensitivity they observed--which had not been seen in previous attempts to simulate the Early Eocene using similar amounts of carbon dioxide--is likely due to an improved representation of cloud processes in the climate model they used, the Community Earth System Model version 1.2, or CESM1.2.

Global warming is expected to change the distribution and types of clouds in the Earth's atmosphere, and clouds can have both warming and cooling effects on the climate. In their simulations of the Early Eocene, Zhu and his colleagues found a reduction in cloud coverage and opacity that amplified CO2-induced warming.

The same cloud processes responsible for increased climate sensitivity in the Eocene simulations are active today, according to the researchers.

"Our findings highlight the role of small-scale cloud processes in determining large-scale climate changes and suggest a potential increase in climate sensitivity with future warming," said U-M paleoclimate researcher Christopher Poulsen, a co-author of the Science Advances paper.

"The sensitivity we're inferring for the Eocene is indeed very high, though it's unlikely that climate sensitivity will reach Eocene levels in our lifetimes," said Jessica Tierney of the University of Arizona, the paper's third author.

The Early Eocene (roughly 48 million to 56 million years ago) was the warmest period of the past 66 million years. It began with the Paleocene-Eocene Thermal Maximum, which is known as the PETM, the most severe of several short, intensely warm events.

The Early Eocene was a time of elevated atmospheric carbon dioxide concentrations and surface temperatures at least 14 degrees Celsius (25 degrees Fahrenheit) warmer, on average, than today. Also, the difference between temperatures at the equator and the poles was much smaller.

Geological evidence suggests that atmospheric carbon dioxide levels reached 1,000 parts per million in the Early Eocene, more than twice the present-day level of 412 ppm. If nothing is done to limit carbon emissions from the burning of fossil fuels, CO2 levels could once again reach 1,000 ppm by the year 2100, according to climate scientists.

Until now, climate models have been unable to simulate the extreme surface warmth of the Early Eocene--including the sudden and dramatic temperature spikes of the PETM--by relying solely on atmospheric CO2 levels. Unsubstantiated changes to the models were required to make the numbers work, said Poulsen, a professor in the U-M Department of Earth and Environmental Sciences and associate dean for natural sciences.

"For decades, the models have underestimated these temperatures, and the community has long assumed that the problem was with the geological data, or that there was a warming mechanism that hadn't been recognized," he said.

But the CESM1.2 model was able to simulate both the warm conditions and the low equator-to-pole temperature gradient seen in the geological records.

"For the first time, a climate model matches the geological evidence out of the box--that is, without deliberate tweaks made to the model. It's a breakthrough for our understanding of past warm climates," Tierney said.

CESM1.2 was one of the climate models used in the authoritative Fifth Assessment Report from the Intergovernmental Panel on Climate Change, finalized in 2014. The model's ability to satisfactorily simulate Early Eocene warming provides strong support for CESM1.2's prediction of future warming, which is expressed through a key climate parameter called equilibrium climate sensitivity.

The term equilibrium climate sensitivity refers to the long-term change in global temperature that would result from a sustained doubling--lasting hundreds to thousands of years--of carbon dioxide levels above the pre-industrial baseline of 285 ppm. The consensus among climate scientists is that the ECS is likely to be between 1.5 C and 4.5 C (2.7 F-8.1 F).

The equilibrium climate sensitivity in CESM1.2 is near the upper end of that consensus range at 4.2 C (7.6 F). The U-M-led study's Early Eocene simulations exhibited increasing equilibrium climate sensitivity with warming, suggesting an Eocene sensitivity of more than 6.6 C (11.9 F), much greater than the present-day value.
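To make these numbers concrete: under the common textbook simplification that warming grows with the logarithm of the CO2 ratio (a fixed sensitivity per doubling), the relationship can be sketched as below. The function name and the log2 form are our simplifications, not taken from the paper -- and the study's central point is precisely that sensitivity does not stay constant, so a fixed-ECS sketch understates Eocene warmth.

```python
import math

PREINDUSTRIAL_PPM = 285.0  # pre-industrial CO2 baseline cited in the article
ECS_CESM12 = 4.2           # CESM1.2 equilibrium climate sensitivity, deg C

def equilibrium_warming(co2_ppm, ecs=ECS_CESM12, baseline=PREINDUSTRIAL_PPM):
    """Long-term warming (deg C) at a sustained CO2 level, assuming a
    constant sensitivity per doubling of CO2."""
    return ecs * math.log2(co2_ppm / baseline)

equilibrium_warming(2 * PREINDUSTRIAL_PPM)  # a doubling -> 4.2 C by definition
equilibrium_warming(1000.0)                 # ~7.6 C under this fixed-ECS assumption
```

With sensitivity rising toward the Eocene value of 6.6 C per doubling, the warming at 1,000 ppm would be substantially larger than this fixed-sensitivity estimate.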

Japanese scientists develop framework for 'extremely' energy-efficient circuits

Data centers are processing data and dispensing results at astonishing rates, and such robust systems require a significant amount of energy -- so much energy, in fact, that information and communication technology is projected to account for 20% of total energy consumption in the United States by 2020.

To meet this demand, a team of researchers from Japan and the United States has developed a framework that reduces energy consumption while improving efficiency.

They published their results on July 19 in Scientific Reports, a Nature journal.

[Figure: Microphotograph of a 32-bit AQFP bitonic sorter generated by the proposed automatic synthesis framework. The circuit contains 7,557 Josephson junctions, making it the largest auto-designed system-level AQFP circuit.]

"The significant amount of energy consumption has become a critical problem in modern society," said Olivia Chen, corresponding author of the paper and assistant professor in the Institute of Advanced Sciences at Yokohama National University. "There is an urgent requirement for extremely energy-efficient computing technologies."

The research team used a digital logic family called the Adiabatic Quantum-Flux-Parametron (AQFP). The idea behind the logic is to replace direct current with alternating current, which acts as both the clock signal and the power supply -- each time the current switches direction, it signals the next phase of the computation.

The logic, according to Chen, could improve conventional communication technologies with currently available fabrication processes.

"However, there is no systematic, automatic synthesis framework to translate from a high-level logic description to Adiabatic Quantum-Flux-Parametron circuit netlist structures," Chen said, referring to the lists of gates and interconnections that define a circuit. "In this paper, we mitigate that gap by presenting an automatic flow. We also demonstrate that AQFP can achieve a reduction in energy use by several orders of magnitude compared to traditional technologies."

The researchers proposed a top-down framework that automates these design decisions and can also analyze its own performance. To do this, they used logic synthesis, the process of translating a high-level description of a circuit's behavior into a network of logic gates. Each gate takes in one or more binary inputs and outputs a yes-or-no answer; that answer can trigger other gates to respond and move the computation forward, or stop it completely.

On this basis, the researchers developed a synthesis flow that takes a high-level description of the computation, together with how much energy the system uses and dissipates, and maps it onto an optimized assignment for each gate within the circuit model. From this, Chen and the research team can balance the estimated power needed to drive the system against the energy the system dissipates.

According to Chen, even after accounting for the cooling energy that superconducting technologies require, this approach reduces energy dissipation by two orders of magnitude.
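The kind of energy accounting described above can be sketched as a back-of-the-envelope comparison. Every number below -- the per-gate switching energies and the cryogenic cooling overhead -- is an illustrative placeholder, not a figure from the paper; only the gate count is borrowed from the team's 7,557-junction bitonic sorter.

```python
# Placeholder energy scales, chosen only to illustrate how a
# "two orders of magnitude" advantage can survive cooling costs.
E_AQFP_SWITCH = 1e-20      # J per switching event (assumed, zeptojoule scale)
E_CMOS_SWITCH = 1e-15      # J per switching event (assumed, femtojoule scale)
COOLING_OVERHEAD = 1000.0  # assumed room-temperature watts per watt at 4 K

def total_energy(num_gates, e_switch, overhead=1.0):
    """Energy to switch every gate in a netlist once, scaled by any
    cooling overhead charged against the technology."""
    return num_gates * e_switch * overhead

aqfp = total_energy(7557, E_AQFP_SWITCH, COOLING_OVERHEAD)
cmos = total_energy(7557, E_CMOS_SWITCH)
cmos / aqfp  # 100x with these placeholders, i.e. two orders of magnitude
```

The point of the sketch is structural: because the raw per-switch energy gap is so large, AQFP can absorb a heavy cooling penalty and still come out orders of magnitude ahead.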

"These results demonstrate the potential of AQFP technology and applications for large-scale, high-performance and energy-efficient computations," Chen said.

Ultimately, the researchers plan to develop a fully automated framework to generate the most efficient AQFP circuit layout.

"The synthesis results of AQFP circuits are highly promising in terms of energy-efficient and high-performance computing," Chen said. "With the future advancement and maturation of AQFP fabrication technology, we anticipate broader applications, ranging from space applications to large-scale computing facilities such as data centers."