UNH researchers find lower emissions vital to slow warming

Winters are warming faster than summers in North America, impacting everything from ecosystems to the economy. Global climate models indicate that this trend will continue in future winters but there is a level of uncertainty around the magnitude of warming. Researchers at the University of New Hampshire focused on the role of carbon dioxide emissions in this equation—looking at the effects of both high and low levels of carbon dioxide emissions on future climate warming scenarios—and found that a reduction in emissions could preserve almost three weeks of snow cover and below-freezing temperatures.

“The local ski hills of New England raised me to love winter and snow,” said Elizabeth Burakowski, research assistant professor in UNH’s Earth Systems Research Center. “But winters are vital to all of us and taking serious action now to limit, or slow, the warming of winter could mean preserving many core purposes of cold weather, including providing more winter protection for woodland animals, preventing the spread of invasive forest pests, and increasing the ability of ski resorts to make snow—protecting the economy by maintaining the area’s multimillion-dollar recreation industry.”

In their study, recently published in the journal Northeastern Naturalist, the researchers analyzed 29 different climate models to determine the effect of reducing emissions of carbon dioxide and other heat-trapping gases into the atmosphere. At the current pace, by mid-century (2040–2069) ski areas in North America will face up to a 50% decline in days where conditions would be favorable to make snow. Limiting emissions could slow that to only a 10 to 30% decline in the number of snowmaking days.

Cold days (below freezing) and sustained snow cover are also critical for providing winter habitat and protection for animals like porcupines and martens, a carnivorous member of the weasel family. At the current rate of warming, the researchers found that deep snowpacks could become increasingly short-lived, decreasing from the historical two months of subnivium, or beneath-the-snow, habitat to less than one month. The researchers say that maintaining a cold winter environment is also associated with greater soil carbon storage and helps prevent the spread of invasive and very destructive forest pests such as the Southern Pine Beetle, which was recently detected as far north as New Hampshire and Maine by UNH researchers.

“Emissions scenarios play a critical role in the loss of winter conditions, indicating a potential doubling of the loss of cold days and snow cover under higher emissions,” said Alexandra Contosta, research assistant professor at UNH’s Earth Systems Research Center. “These changes could disrupt and forever change some very significant social and ecological systems that have historically relied on cold, snowy winters for habitat, water resources, forest health, local economies, cultural practices, and human wellbeing.”

Historically (1980–2005), the Northeast averaged 95 snow-covered days per season. Under the low emissions scenario, that number would drop to 72 days; under the high emissions scenario, to just 56. Historically, New Jersey, Rhode Island, and Connecticut could expect to see 20–80 days of snow cover per season, but by the end of the century, under the higher emissions scenario, they are more likely to have a snow-free winter.

Co-authors include Danielle Grogan, also at UNH; Sarah Nelson, Appalachian Mountain Club; Sarah Garlick, Hubbard Brook Research Foundation; and Nora Casson, University of Winnipeg.

Increase in home delivery service usage during COVID-19 pandemic unlikely to last

Services like Instacart, Grubhub, DoorDash, and Amazon certainly existed before the COVID-19 pandemic. However, demand for groceries, food, and other products purchased online and delivered directly to your door substantially increased when the coronavirus forced many Americans to stay at home. But just how much has the demand for deliveries increased, who uses the services, what kind of products are being delivered, and perhaps most importantly, will this increase in usage last? Researchers at Rensselaer Polytechnic Institute are answering these questions to aid policymakers and transportation logistics planners. 

In the first comprehensive study investigating the initial adoption and continuance intention of delivery services during a pandemic, Cara Wang, an associate professor in the Department of Civil and Environmental Engineering at Rensselaer, found that over 90% of people who use online delivery services would likely revert to their original way of shopping.

“Likely, the increased use of e-commerce is not the result of market competition, where the most efficient competitor outperforms the others,” Dr. Wang said. “Rather, an external disruption — the pandemic — significantly altered the playing field. Once this external effect is removed, some of the gains made by the delivery services will likely fall off.”

Using a survey method and supercomputer modeling, Dr. Wang determined that not all delivery products are the same, nor do all consumers use delivery services in the same way.

Dr. Wang identified four distinct types of users: non-adopters, prior adopters, temporary new adopters, and permanent new adopters. And delivery-service users access products in four different categories: groceries, food, home goods, and other items.

The research showed that both the initial adoption of delivery services and the intent to continue using them vary by goods type. Grocery deliveries had the highest proportion of new adopters, followed by home goods, food, and, finally, other packages. These results imply that the COVID-19 pandemic had a larger impact on purchase opportunities for essential items than for less essential items.

The study also found that while the number of users of grocery delivery increased by 113% during the pandemic, almost half of these new adopters would not continue using it once the pandemic is over.

Temporary new adopters accounted for a larger portion than the permanent new adopters for essential items, while there were more permanent new adopters for less essential items.

These findings are essential for investigating the impacts of the pandemic and predicting future demand.

“Answering these questions is essential to estimate the current and future demand for deliveries,” said José Holguín-Veras, director of the Center for Infrastructure, Transportation, and the Environment at Rensselaer and a co-author of the paper. “Transportation professionals and researchers have assumed that people would still rely on delivery services even after the COVID crisis is over. However, in reality, consumers’ technology acceptance is much more dynamic and complex during a pandemic than during normal conditions. Understanding these nuanced behaviors is essential for sound transportation policymaking.” 

Entezari develops new technique that incorporates customer behavior into recommendation algorithms

Recommendation algorithms can make a customer’s online shopping experience quicker and more efficient by suggesting complementary products whenever the shopper adds a product to their basket. Did the customer buy peanut butter? The algorithm recommends several brands of jelly to add next.

These algorithms typically work by associating purchased items with items other shoppers have frequently purchased alongside them. If the shopper’s habits, tastes, or interests closely resemble those of previous customers, such recommendations might save time, jog the memory, and be a welcome addition to the shopping experience. 
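
For a concrete picture, here is a minimal sketch (in Python, with hypothetical transactions) of the kind of co-occurrence counting that conventional "frequently bought together" recommenders build on; production systems are far more elaborate, but the core idea is the same.

```python
# Toy co-occurrence recommender: count how often item pairs appear in the
# same basket, then suggest the most frequent partners of a given item.
from collections import Counter
from itertools import combinations

transactions = [
    {"peanut butter", "jelly", "bread"},
    {"peanut butter", "jelly"},
    {"peanut butter", "bananas"},
]

co_counts = Counter()
for basket in transactions:
    for a, b in combinations(sorted(basket), 2):
        co_counts[(a, b)] += 1
        co_counts[(b, a)] += 1

def recommend(item, k=2):
    """Items most often co-purchased with `item`, regardless of who bought them."""
    pairs = [(other, n) for (i, other), n in co_counts.items() if i == item]
    return sorted(pairs, key=lambda p: -p[1])[:k]

print(recommend("peanut butter"))  # jelly (2 co-purchases) ranks first
```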

But what if the shopper is buying peanut butter to stuff a dog toy or bait a mousetrap? What if the shopper prefers honey or bananas with their peanut butter? The recommendation algorithm will offer less useful suggestions, costing the retailer a sale and potentially annoying the customer. 

New research led by Negin Entezari, who recently received a doctoral degree in computer science at UC Riverside, together with collaborators at Instacart and her doctoral advisor Vagelis Papalexakis, brings a methodology called tensor decomposition—used by scientists to find patterns in massive volumes of data—into the world of commerce to recommend complementary products more carefully tailored to customer preferences.

Tensors can be pictured as multi-dimensional cubes and are used to model and analyze data with many different components, called multi-aspect data. Data closely related to other data can be connected in a cube arrangement and related to other cubes to uncover patterns in the data. 

“Tensors can be used to represent customers’ shopping behaviors,” said Entezari. “Each mode of a 3-mode tensor can capture one aspect of a transaction. Customers form one mode of the tensor, and the second and third modes capture product-to-product interactions by considering products co-purchased in a single transaction.”

For example, three hypothetical shoppers—A, B, and C—make the following purchases:

A: Buys hot dogs, hot dog buns, Coke, and mustard in one transaction.
B: Makes three separate transactions: Basket 1: Hot dogs and hot dog buns; Basket 2: Coke; Basket 3: Mustard
C: Buys hot dogs, hot dog buns, and mustard in one transaction.

To a conventional matrix-based algorithm, Customer A is identical to Customer B because they bought the same items. Using tensor decomposition, however, Customer A is more closely related to Customer C because their behavior was similar. Both had similar products co-purchased in a single transaction, even though their purchases differed slightly. 
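
A minimal sketch of this representation, using Python and the hypothetical shoppers above, shows how a customer-by-product-by-product tensor separates A and C (similar co-purchase behavior) from B (same items overall, but spread across baskets):

```python
# Build the 3-mode (customer x product x product) co-purchase tensor
# for the toy shoppers A, B, and C described in the article.
import numpy as np

items = ["hot dogs", "buns", "coke", "mustard"]
idx = {name: i for i, name in enumerate(items)}

baskets = {
    "A": [["hot dogs", "buns", "coke", "mustard"]],
    "B": [["hot dogs", "buns"], ["coke"], ["mustard"]],
    "C": [["hot dogs", "buns", "mustard"]],
}

customers = list(baskets)
T = np.zeros((len(customers), len(items), len(items)))

# Mark product pairs that appear together in the same transaction.
for c, transactions in baskets.items():
    ci = customers.index(c)
    for basket in transactions:
        for a in basket:
            for b in basket:
                if a != b:
                    T[ci, idx[a], idx[b]] = 1

# A's and C's co-purchase slices overlap far more than A's and B's,
# even though A and B bought exactly the same set of items overall.
overlap_AC = (T[0] * T[2]).sum()
overlap_AB = (T[0] * T[1]).sum()
print(overlap_AC, overlap_AB)  # 6.0 vs 2.0
```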

The typical recommendation algorithm makes predictions based on the item the customer just purchased, while tensor decomposition can make recommendations based on everything already in the shopper’s basket. Thus, if a shopper has dog food and peanut butter in their basket but no bread, a tensor-based recommendation algorithm might suggest a fillable dog chew toy instead of jelly if other users have also made that purchase.
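
Building on that representation, the sketch below factors a toy co-purchase tensor with CP (PARAFAC) decomposition and scores candidate complements against everything in a shopper's basket. It assumes the open-source tensorly library and random toy data; the authors' actual model, training data, and scoring rule may differ.

```python
# Minimal sketch: CP decomposition of a customer x product x product tensor,
# then basket-aware complementary-item scoring from the factor matrices.
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

rng = np.random.default_rng(0)
n_customers, n_items, rank = 20, 6, 3

# Toy binary co-purchase tensor, symmetric in the two product modes;
# in practice this would be built from transaction logs.
T = (rng.random((n_customers, n_items, n_items)) > 0.7).astype(float)
T = np.maximum(T, T.transpose(0, 2, 1))

# One factor matrix per tensor mode.
cp = parafac(tl.tensor(T), rank=rank, init="random", random_state=0)
cust_f, item_f, co_f = cp.factors
weights = cp.weights  # all ones unless factors are normalized

def score_complements(customer, basket):
    """Score every item as a complement to everything already in the basket."""
    scores = np.zeros(n_items)
    for b in basket:
        # Reconstructed affinity between basket item b and each candidate,
        # personalized by this customer's factor vector.
        scores += co_f @ (weights * cust_f[customer] * item_f[b])
    return scores

# Recommend complements for customer 0, whose basket already holds items 1 and 4.
basket = [1, 4]
ranked = np.argsort(-score_complements(0, basket))
print([i for i in ranked if i not in basket][:3])
```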

“Tensors are multidimensional structures that allow modeling of complex, heterogeneous data,” said Papalexakis, an associate professor of computer science and engineering. “Instead of simply noticing which products are purchased together, there’s a third dimension. These products are purchased by this kind of user and the algorithm tries to determine which kinds of users are creating this match.”

To test their method, Entezari, Papalexakis, and co-authors Haixun Wang, Sharath Rao, and Shishir Kumar Prasad, all researchers for Instacart, used an Instacart public dataset to train their algorithm. They found that their method outperformed state-of-the-art methods for predicting customer-specific complementary product recommendations. Though more work is needed, the authors conclude that big data tensor decomposition could eventually find a home in big business as well.

“Tensor methods, even though very powerful tools, are still more popular in academic research as far as recommendation systems go,” said Papalexakis. “For the industry to adopt them we must demonstrate it’s worthwhile and relatively painless to substitute for whatever they have that works already.”

While previous research has shown the benefits of tensor modeling in recommendation problems, the new publication is the first to do so in the setting of complementary item recommendation, bringing tensor methods closer to industrial adoption and technology transfer in the context of recommendation systems.

“Tensor methods have been adopted successfully by industry before, with chemometrics and food quality being great examples, and every attempt like our work demonstrates the versatility of tensor methods in being able to tackle such a broad range of challenging problems in different domains,” said Papalexakis.

Japanese scientists develop a chaos-based stream cipher that could withstand attacks from quantum supercomputers

While for most of us cryptographic systems are things that just run “under the hood,” they are an essential element in the world of digital communications. However, the upcoming rise of quantum computers could shake the field of cryptography to its core. Fast algorithms running on these machines could break some of the most widely used cryptosystems, rendering them vulnerable. Well aware of this looming threat, cryptography researchers worldwide are working on novel encryption methods that can withstand attacks from quantum supercomputers.

Chaos theory is actively being studied as a basis for post-quantum-era cryptosystems. In mathematics, chaos is a property of certain dynamic systems that makes them extremely sensitive to initial conditions. While technically deterministic (non-random), these systems evolve in such complex ways that predicting their long-term state with incomplete information is practically impossible, since even small rounding errors in the initial conditions yield diverging results. This unique characteristic of chaotic systems can be leveraged to produce highly secure cryptographic systems, as a team of researchers from Ritsumeikan University, Japan, showed in a recent study.

Led by Professor Takaya Miyano, the team developed an unprecedented stream cipher consisting of three cryptographic primitives based on independent mathematical models of chaos. The first primitive is a pseudorandom number generator based on the augmented Lorenz (AL) map. The pseudorandom numbers produced using this approach are used to create key streams for encrypting/decrypting messages, which take the stage in the second and perhaps most remarkable primitive—an innovative method for secret-key exchange.
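
As a rough illustration of the stream-cipher idea only (not the paper's construction), the toy Python sketch below derives a keystream from iterations of the classic logistic map, standing in for the augmented Lorenz map, and XORs it with the message; it is for intuition and is not cryptographically secure.

```python
# Toy chaos-driven stream cipher: a chaotic map seeded by the secret key
# generates a keystream, which is XORed with the message bytes.
def keystream(seed: float, n_bytes: int, r: float = 3.99) -> bytes:
    x = seed  # the secret key is the initial condition of the chaotic map
    out = []
    for _ in range(n_bytes):
        x = r * x * (1.0 - x)        # logistic-map iteration (illustrative only)
        out.append(int(x * 256) & 0xFF)  # crude quantization to one byte
    return bytes(out)

def xor_cipher(data: bytes, seed: float) -> bytes:
    return bytes(d ^ k for d, k in zip(data, keystream(seed, len(data))))

msg = b"meet at dawn"
ct = xor_cipher(msg, seed=0.123456789)
assert xor_cipher(ct, seed=0.123456789) == msg  # the same key decrypts
```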

This novel strategy for exchanging secret keys specifying the AL map is based on the synchronization of two chaotic Lorenz oscillators, which can be independently and randomly initialized by the two communicating users, without either of them knowing the state of the other’s oscillator. To conceal the internal states of these oscillators, the communicating users (the sender and the receiver) mask the value of one of the variables of their oscillator by multiplying it with a locally generated random number. The masked value of the sender is then sent to the receiver and vice-versa. After a short time, when these back-and-forth exchanges cause both oscillators to sync up almost perfectly to the same state despite the randomization of the variables, the users can mask and exchange secret keys and then locally unmask them with simple calculations.
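
The masked two-way protocol is the paper's own contribution, but the underlying phenomenon can be illustrated with the classic one-way, unmasked Pecora-Carroll coupling of two Lorenz systems, sketched below; after a short transient the receiver's variables lock onto the sender's even though they started from different states.

```python
# Toy numerical sketch of Lorenz synchronization (Pecora-Carroll coupling).
# This is the unmasked, one-way textbook case, not the paper's protocol.
import numpy as np

sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0
dt, steps = 1e-3, 20000

# Drive system (the sender) and response system (the receiver) start from
# different, independently chosen states.
x, y, z = 1.0, 1.0, 1.0
yr, zr = -5.0, 30.0

for _ in range(steps):
    # Drive: full Lorenz equations.
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    # Response: driven by the transmitted variable x instead of its own x.
    dyr = x * (rho - zr) - yr
    dzr = x * yr - beta * zr
    x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
    yr, zr = yr + dt * dyr, zr + dt * dzr

# After a short transient the receiver's (yr, zr) track the sender's (y, z).
print(abs(y - yr), abs(z - zr))  # both essentially zero
```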

Finally, the third primitive is a hash function based on the logistic map (a chaotic equation of motion), which allows the sender to send a hash value and, in turn, allows the receiver to ensure that the received secret key is correct, i.e., the chaotic oscillators were synchronized properly.
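
A toy digest along these lines might fold the key bytes into a logistic-map trajectory and compare the final state on both sides; the sketch below is purely illustrative, is not the paper's hash construction, and is not collision resistant.

```python
# Toy chaotic-map digest for checking that both sides hold the same key.
def chaotic_digest(key: bytes, rounds: int = 64, r: float = 3.99) -> int:
    x = 0.5
    for byte in key:
        x = (x + byte / 255.0) / 2.0   # absorb one key byte into the state
        for _ in range(rounds):        # mix with logistic-map iterations
            x = r * x * (1.0 - x)
    return int(x * 2**32)              # quantize the final state

sender_key = b"\x13\x37\xca\xfe"
receiver_key = b"\x13\x37\xca\xfe"
assert chaotic_digest(sender_key) == chaotic_digest(receiver_key)
```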

The researchers showed that a stream cipher assembled using these three primitives is extremely secure and resistant to statistical attacks and eavesdropping, since it is mathematically impossible for an eavesdropper to synchronize their own oscillator with either the sender’s or the receiver’s. This is an unprecedented achievement, as Prof. Miyano states: “Most chaos-based cryptosystems can be broken by attacks using classical computers within a practically short time. In contrast, our methods, especially the one for secret-key exchange, appear to be robust against such attacks and, more importantly, even hard to break using quantum computers.”

In addition to being secure, the proposed key exchange method can also be applied to existing block ciphers, such as the widely used Advanced Encryption Standard (AES). Moreover, the researchers were able to implement their chaos-based stream cipher on the Raspberry Pi 4, a small-scale computer, using Python 3.8. They even used it to securely transmit a famous painting by Johannes Vermeer between Kusatsu and Sendai, two places in Japan 600 km apart. “The implementation and running costs of our cryptosystem are remarkably low compared with those of quantum cryptography,” highlights Prof. Miyano. “Our work thus provides a cryptographic approach that guarantees the privacy of daily communications between people all over the world in the post-quantum era.”

With the power of chaos-based cryptography at hand, we may have less to fear from the dark side of quantum computing.

Moons may yield clues to what makes planets habitable

In the search for Earth-like planets, University of Rochester scientist Miki Nakajima turns to supercomputer simulations of moon formations.

Earth’s moon is vitally important in making Earth the planet we know today: the moon controls the length of the day and ocean tides, which affect the biological cycles of lifeforms on our planet. The moon also contributes to Earth’s climate by stabilizing Earth’s spin axis, offering an ideal environment for life to develop and evolve.

Because the moon is so important to life on Earth, scientists conjecture that a moon may be a potentially beneficial feature in harboring life on other planets. Most planets have moons, but Earth’s moon is distinct in that it is large compared to its planet: the moon’s radius is larger than a quarter of Earth’s radius, a much larger moon-to-planet ratio than for most other moons.

Miki Nakajima, an assistant professor of earth and environmental sciences at the University of Rochester, finds that distinction significant. And in a new study that she led, she and her colleagues at the Tokyo Institute of Technology and the University of Arizona examine moon formations and conclude that only certain types of planets can form moons that are large in respect to their host planets.

“By understanding moon formations, we have a better constraint on what to look for when searching for Earth-like planets,” Nakajima says. “We expect that exomoons [moons orbiting planets outside our solar system] should be everywhere, but so far we haven’t confirmed any. Our constraints will be helpful for future observations.”

The origin of Earth’s moon

Many scientists have historically believed Earth’s large moon was generated by a collision between proto-Earth—Earth at its early stages of development—and a large, Mars-sized impactor, approximately 4.5 billion years ago. The collision resulted in the formation of a partially vaporized disk around Earth, which eventually formed into the moon.

To find out whether other planets can form similarly large moons, Nakajima and her colleagues conducted impact simulations on the computer, with several hypothetical Earth-like rocky planets and icy planets of varying masses. They hoped to identify whether the simulated impacts would result in partially vaporized disks, like the disk that formed Earth’s moon.

The researchers found that rocky planets larger than six times the mass of Earth (6M) and icy planets larger than one Earth mass (1M) produce fully—rather than partially—vaporized disks, and these fully-vaporized disks are not capable of forming fractionally large moons.

“We found that if the planet is too massive, these impacts produce completely vaporized disks because impacts between massive planets are generally more energetic than those between small planets,” Nakajima says.

After an impact that results in a vaporized disk, over time, the disk cools and liquid moonlets—a moon’s building blocks—emerge. In a fully-vaporized disk, the growing moonlets in the disk experience strong gas drag from vapor, falling onto the planet very quickly. In contrast, if the disk is only partially vaporized, moonlets do not feel such strong gas drag.

“As a result, we conclude that a complete vapor disk is not capable of forming fractionally large moons,” Nakajima says. “Planetary masses need to be smaller than those thresholds we identified to produce such moons.”

The search for Earth-like planets

The constraints outlined by Nakajima and her colleagues are important for astronomers investigating our universe; researchers have detected thousands of exoplanets and possible exomoons, but have yet to definitively spot a moon orbiting a planet outside our solar system.

This research may give them a better idea of where to look.

As Nakajima says: “The exoplanet search has typically been focused on planets larger than six Earth masses. We are proposing that instead, we should look at smaller planets because they are probably better candidates to host fractionally large moons.”