A new ‘early warning’ system that automatically informs gamblers as soon as their behaviour shows signs of turning into an addiction is helping people engage in the pastime responsibly.

The system detects when a player’s gambling patterns show signs of risk and begin to match those of previous players who asked online gambling sites to block their accounts for a fixed period to stop themselves becoming ‘hooked’ – an option known as ‘self-exclusion’.

City University London has worked with software analytics company BetBuddy to improve the accuracy of the computer models underpinning the system, drawing on the latest understanding of the psychological pathways to gambling addiction.

The research was funded by the UK’s innovation agency, Innovate UK, under their Data Exploration programme supported by contributions from the RCUK Digital Economy Theme, Engineering and Physical Sciences Research Council (EPSRC), the Economic and Social Research Council (ESRC) and the Defence Science and Technology Laboratory (Dstl).  

“All UK gambling providers are legally obliged to offer customers a self-exclusion option,” says Dr Artur Garcez of City University London. “Our aim has been to help BetBuddy test and refine their system so that it gives providers an effective way of predicting self-exclusion at an earlier stage, as well as other signals or events that indicate harm in gambling. This enables customers to use online gambling platforms more securely and responsibly.”

Dr Garcez’s team found that by harnessing a machine learning method known as ‘random forests’ and applying it to a real-world online gambling dataset, the system could achieve an impressive 87 per cent accuracy in predicting playing patterns which were likely to evolve in an unhealthy direction.
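The article does not publish the model or its features, so the following is only a minimal sketch of the general technique it names – a random forest classifier trained on hypothetical behavioural features, using scikit-learn. All feature names, data and labels below are invented for illustration:

```python
# Minimal sketch of a random-forest risk classifier, assuming a table of
# per-player behavioural features (names and data here are hypothetical,
# not BetBuddy's).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Hypothetical features: sessions per week, average stake, stake volatility,
# night-time play ratio, deposit frequency. Label: 1 = player later self-excluded.
X = rng.random((5000, 5))
y = (X[:, 1] + X[:, 2] + 0.3 * rng.standard_normal(5000) > 1.2).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

print("hold-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
```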

Professor Philip Nelson, EPSRC’s Chief Executive, said: “This project is an example of how artificial intelligence and machine learning methods can be used to address an important social problem. The RCUK Digital Economy Theme is a fine example of inter-disciplinary and partnership working for the benefit of society.”

Armed with information on gambling patterns, providers can decide not to send players marketing material for a period or they can use the information to alert a player to a potential problem.

“Although systems of this kind are already in use, none are believed to have published peer-reviewed research that evidences the same levels of accuracy and reliability as the BetBuddy system,” says Dr Garcez. “Early detection and prevention of problem gambling is not only in the interest of those who engage in online gambling – it can also help deliver a more stable and growing marketplace for online gambling providers.”

Dovetailing with these very promising results, a key aspect of City’s work with BetBuddy has been to develop and harness a better understanding of the root causes of problem behaviour in online gambling. First presented in September 2015 at the National Center for Responsible Gaming (NCRG) conference held in Las Vegas, US, key work in this area has included an analysis of how to achieve a balanced approach that maintains high accuracy but avoids alarming gamblers unduly, and the development of a ‘knowledge extraction’ method capable of explaining results to gambling operators and gamblers themselves.
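The knowledge-extraction method presented at the NCRG conference is not reproduced here. As a much simpler stand-in for the general idea of explaining a ‘black box’ ensemble, one could rank the hypothetical features from the sketch above by the forest’s built-in importance scores. This continues the previous sketch and reuses its `model` object:

```python
# Rough illustration only: ranking the hypothetical behavioural features by the
# forest's impurity-based importance (a simpler proxy than the knowledge
# extraction method described in the research). Assumes `model` from the
# sketch above has already been trained.
feature_names = ["sessions_per_week", "avg_stake", "stake_volatility",
                 "night_play_ratio", "deposit_frequency"]

for name, score in sorted(zip(feature_names, model.feature_importances_),
                          key=lambda pair: pair[1], reverse=True):
    print(f"{name:>20s}: {score:.3f}")
```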

“City University London has enabled us to build more robust and accurate prediction models and apply new, creative algorithms to gambling data. By applying their expertise in knowledge extraction techniques to ‘black box’ prediction models, clinicians, regulators, and industry can better understand how these models predict behaviour and better protect consumers at risk of harm,” said Simo Dragicevic, Chief Executive of BetBuddy.

Online gambling is a global growth industry: this year in the EU alone, revenues are expected to reach €13 billion. But the problems of addiction it can bring are well known – according to NHS figures, there are 593,000 problem gamblers in Britain today.

The price fluctuation of fine wines can now be predicted more accurately using a novel artificial intelligence approach developed by researchers at UCL. The method could help fine wine investors make more informed decisions about their portfolios and encourage investors from outside the wine world to consider wine as an asset, increasing overall trade in wine. Similar techniques are expected to be applied to other 'alternative assets' such as classic cars.

Co-author, Dr Tristan Fletcher, an academic at UCL and founder of quantitative wine asset management firm Invinio, said: "People have been investing in wine for hundreds of years and it's only very recently that the way they are doing it has changed. Wine investment is becoming more accessible and is a continually growing market, primarily brokered in London: the world-centre of the wine trade. We've shown that price prediction algorithms akin to those routinely used by other markets can be applied to wines."

The study, published today in the Journal of Wine Economics with guidance from Invinio, found more complex machine learning methods outperformed other simpler processes commonly used for financial predictions. When applied to 100 of the most sought-after fine wines from the Liv-ex 100 wine index, the new approach predicted prices with greater accuracy than other more traditional methods by learning which information was important amongst the data.

Co-author, Professor John Shawe-Taylor, co-Director of the UCL Centre for Computational Statistics & Machine Learning and Head of UCL Computer Science, said: "Machine learning involves developing algorithms that automatically learn from new data without human intervention. We've created intelligent software that searches the data for useful information which is then extracted and used, in this case for predicting the values of wines. Since we first started working on machine learning at UCL, our methods have been used in a wide variety of industries, particularly medical and financial, but this is the first time we have entered the world of fine wine."

For this study, the team tested two machine learning methods: 'Gaussian process regression' and the more complex 'multi-task feature learning', which was first developed by UCL scientists in 2006 and has seen significant enhancements recently. These methods extract the most relevant information from a variety of sources, as opposed to their more standard counterparts, which typically assume every data point is of interest, spurious or otherwise.
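The study’s actual features, kernels and benchmarks are not given in this article, so the sketch below only illustrates Gaussian process regression itself on a synthetic price series, using scikit-learn. The kernel choice and data are assumptions made for the example, not the study’s:

```python
# Minimal sketch: Gaussian process regression on a synthetic wine-price series.
# The kernel, data and hold-out split are illustrative assumptions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)

days = np.arange(250).reshape(-1, 1)                # trading days
prices = (1000 + 0.5 * days.ravel()                 # synthetic price path
          + 20 * np.sin(days.ravel() / 15)
          + 5 * rng.standard_normal(250))

kernel = 1.0 * RBF(length_scale=30.0) + WhiteKernel(noise_level=25.0)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(days[:-20], prices[:-20])                    # hold out the last 20 days

pred, std = gp.predict(days[-20:], return_std=True)
print("mean absolute error on hold-out:", np.abs(pred - prices[-20:]).mean())
```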

Analysis shows that machine learning methods based on Gaussian process regression can be applied to all the wines in the Liv-ex 100, with an improvement in average predictive accuracy of 15% relative to the most effective of the traditional methods. Methods based on multi-task feature learning only worked for half of the wines analysed, as they require a stronger relationship between prices from one day to the next. However, where multi-task feature learning could be applied, prediction accuracy increased by 98% relative to more standard benchmarks.

Primary author and UCL MSc graduate, Michelle Yeo, said: "Other areas of finance already use automated processes for identifying meaningful trends but these haven't been tested on the fine wine market until now. We're pleased we were able to develop models applicable to fine wines and we hope our findings give the industry confidence to start adopting machine learning methods as a tool for investment decisions."

Invinio plans to continue its collaboration with UCL in order to refine the algorithms and improve the tools it provides for existing and potential wine investors through its site. In doing so, the team say they need to strike a balance between the complexity of algorithms being developed and the actual improvements they offer in terms of performance. They are considering applying these techniques to the world of classic cars next.

 

Using models that blend global economics, geography, ecology and environmental sciences is essential to understanding how changes in trade and natural systems in one part of the world affect those in another, a review concludes.

Tom Hertel, distinguished professor of agricultural economics, collaborated with an interdisciplinary team of experts led by Jianguo Liu of Michigan State University to determine how systems integration - using holistic frameworks to model many components of both human and natural systems - could shed light on how activities in one part of the world can have significant impacts on distant regions.

Hertel said these multivariable models could help policymakers make better-informed decisions, further scientists' understanding of the links between trade and the global environment, and aid natural resource managers in making management choices that will sustain resources such as clean water and air for future generations.

"We live in an increasingly integrated global economy and society," he said. "The planet itself is a single system - a change in one place affects another. Models that can capture these complex relationships between trade and the environment are key to creating solutions to our most pressing sustainability challenges."

The review of more than 100 studies concluded that analyses that do not take multiple dimensions - such as space and time - and disciplines into account risk overlooking important links between different system components. For example, current greenhouse gas emissions could have long-lasting effects on the climate of the future, altering agricultural productivity, sea levels and the transmission of pests and disease. These effects can in turn influence greenhouse gas emission levels - for instance, clearing more land for agriculture adds carbon dioxide to the atmosphere.

"Systems integration provides crucial global perspective on problems that previously may have only been studied at a local or regional level," Hertel said.

He offered the example of biofuels production in the U.S. as one that could have benefited from the insights systems integration can provide.

Previous "farm-to-fuel tank" analyses showed that production of corn ethanol could reduce annual carbon dioxide emissions in the U.S. Aided by this research, policymakers passed the Renewable Fuels Standard requiring that fuel sold in the U.S. contain a certain amount of renewable fuels, such as corn ethanol.

But the analyses had missed key components of global trade and land use, Hertel said: While land in the U.S. previously used to grow corn for food switched to corn grown for ethanol, the global demand for corn for food and livestock did not decrease. This could cause other countries to clear land for crops, releasing additional carbon dioxide through the land conversion process.

Hertel and fellow researchers used systems integration modeling to show that, over a 30-year production period, emissions from land use change would largely offset the direct gains from producing the mandated amount of corn ethanol.
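The study’s emission estimates are not reproduced in this article. Purely to make the offset logic concrete, a toy calculation with invented numbers might look like this:

```python
# Toy illustration of the offset logic with made-up numbers (not the study's
# estimates): a one-off pulse of land-conversion emissions is weighed against
# the annual savings from displacing gasoline with corn ethanol.
upfront_land_conversion_emissions = 800.0   # hypothetical, Mt CO2 released by clearing land
annual_direct_savings = 30.0                # hypothetical, Mt CO2 saved per year of production
production_period_years = 30

total_savings = annual_direct_savings * production_period_years
net_change = total_savings - upfront_land_conversion_emissions
payback_years = upfront_land_conversion_emissions / annual_direct_savings

print(f"direct savings over {production_period_years} years: {total_savings:.0f} Mt CO2")
print(f"net change after land-use emissions: {net_change:.0f} Mt CO2")
print(f"carbon payback time: {payback_years:.1f} years")
```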

"It turns out that the global effects of this U.S. policy are quite large - clearly large enough to call into question the regional benefits of ethanol production," he said. "Biofuels are a great example of how the absence of a systems integrated approach led to misleading advice and misguided policy."

The review cites the Global Trade Analysis Project, a free global database founded by Purdue, as an example of a modeling tool that can be used to show the interactions between economic and environmental systems. GTAP describes trade flows around the world and breaks down economic trade by sector. It also integrates trade with land use and the greenhouse gas emissions that can result from changing land use.

"We've been working to bring in more dimensions of bioscience and ecology to GTAP," Hertel said. "It's a workhorse for this kind of modeling."

Temperature differences, slow water could delay ocean entry

Temperature differences and slow-moving water at the confluence of the Clearwater and Snake rivers in Idaho might delay the migration of threatened juvenile salmon and allow them to grow larger before reaching the Pacific Ocean.

Photo caption: PNNL researchers place yellow acoustic receivers into the Columbia River. The receivers are part of the Juvenile Salmon Acoustic Telemetry System, which is helping track the movement of tagged fall Chinook salmon on the Clearwater River in Idaho.

A team of Northwest researchers is examining the unusual life cycle of the Clearwater’s fall Chinook salmon to find out why some of them spend extra time in the cool Clearwater before braving the warm Snake. The Clearwater averages about 53 degrees Fahrenheit in the summer, while the Snake averages about 71. The confluence is part of the Lower Granite Reservoir – one of several sections of slow water backed up behind lower Snake and Columbia river dams – which could weaken the cues that prompt fish to swim downstream.

The delayed migration could also mean Clearwater salmon are more robust and survive better when they finish their ocean-bound trek, said Billy Connor, a fish biologist with the U.S. Fish & Wildlife Service.

“It may seem counterintuitive, but the stalled migration of some salmon could actually help them survive better,” Connor said. “Juvenile salmon may gamble on being able to dodge predators in reservoirs so they can feast on the reservoirs’ rich food, which allows them to grow fast. By the time they swim toward the ocean the next spring, they’re bigger and more likely to survive predator attacks and dam passage.”

Scientists from the U.S. Geological Survey, the U.S. Fish & Wildlife Service, the Department of Energy’s Pacific Northwest National Laboratory and the University of Washington are wrapping up field studies this fall to determine if water temperature or speed encourage salmon to overwinter in the confluence and in other reservoirs downstream. The Bonneville Power Administration is funding the research to help understand how Snake and Columbia River dams may affect fish.

USGS and USFWS are tracking fish movement by implanting juveniles with radio tags, which are more effective in shallow water. PNNL is complementing that effort with acoustic tags, which work better in deeper water. PNNL is also contributing its hydrology expertise to measure the Clearwater and Snake rivers’ physical conditions. UW is providing the statistical analysis of the tagging data.

“Fall Chinook salmon on the Clearwater River have a fascinating early life history that may contribute to their successful return as adults,” said PNNL fish biologist Brian Bellgraph. “If we can support the viability of such migration patterns in this salmon subpopulation, we will be one step closer to recovering the larger fall Chinook salmon population in the Snake River Basin.”

Scientists used to think all juvenile fall Chinook salmon in the Clearwater River migrated to the ocean during the summer and fall after hatching in the spring. But researchers from USGS, USFWS and the Nez Perce Tribe began learning in the early 1990s that some stick around until the next spring. Similar delays have been found in a handful of other rivers, but they remain the exception rather than the rule. The Clearwater is unusual because a high proportion – as much as 80 percent in some years – of its fall Chinook salmon don’t enter the ocean before they’re a year old.

To better understand how fish react to the river’s physical conditions, scientists are implanting juvenile salmon with the two types of small transmitters that emit different signals. The transmitters – commonly called tags – are pencil eraser-sized devices that are surgically implanted into young fish 3.5 to 6 inches in length. Specially designed receivers record the tags’ signals, which researchers use to track fish as they swim. The gathered data helps scientists measure how migration is delayed through the confluence.

Radio tags emit radio waves, which travel well through shallow water and air. Acoustic tags emit higher-frequency sounds, or “pings,” that move more easily through deeper water. The acoustic tags being used are part of the Juvenile Salmon Acoustic Telemetry System, which PNNL and NOAA Fisheries developed for the U.S. Army Corps of Engineers.

Together, fish tagged with both acoustic and radio transmitters help create a more comprehensive picture of how the river affects fish travel. The location data can also indicate how well fish fare. If a tag’s signal stops moving for an extended period, the fish in which it was implanted might have died. Researchers examine the circumstances of each case to determine the fish’s fate.
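The researchers’ actual fate-assignment procedure is not detailed here. As a rough illustration of the kind of screening rule involved, detection records could be checked for tags that stop moving between receivers; the data, receiver names and threshold below are hypothetical:

```python
# Simple illustrative screening rule (hypothetical thresholds and data, not the
# study's method): flag a tagged fish for review if its tag is detected at only
# one receiver for longer than a chosen number of days.
from collections import defaultdict
from datetime import datetime, timedelta

# (tag_id, receiver_id, timestamp) detection records -- hypothetical sample data
detections = [
    ("A01", "R1", datetime(2011, 7, 1)),
    ("A01", "R1", datetime(2011, 7, 20)),
    ("A02", "R1", datetime(2011, 7, 1)),
    ("A02", "R3", datetime(2011, 7, 5)),
]

STATIONARY_LIMIT = timedelta(days=14)

by_tag = defaultdict(list)
for tag, receiver, ts in detections:
    by_tag[tag].append((ts, receiver))

for tag, events in by_tag.items():
    events.sort()
    receivers = {r for _, r in events}
    span = events[-1][0] - events[0][0]
    if len(receivers) == 1 and span > STATIONARY_LIMIT:
        print(f"tag {tag}: stationary for {span.days} days -- possible mortality, review case")
    else:
        print(f"tag {tag}: moving between receivers")
```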

This study is a unique example of how both tag technologies can jointly determine the survival and migration patterns of the relatively small juvenile fall Chinook salmon. The size of transmitters has decreased considerably in recent years; further size reductions would allow researchers to study even smaller fall Chinook salmon. This could provide further insight into this mysterious migration pattern.

Beyond the fish themselves, researchers will also examine water temperature and flow to determine what correlation the river’s physical conditions may have with the fish movement. Salmon use water velocity and temperature as cues to guide them toward the ocean. But the Lower Granite Dam’s reservoir, which extends about 39 miles upriver from the dam to Lewiston, makes the water in the Clearwater River’s mouth move slowly. Researchers suspect the slow water may encourage some fall juvenile Chinook salmon to delay their journey and spend the winter in the confluence.

To test this hypothesis, PNNL scientists take periodic velocity measurements in the confluence from their research boat. Submerged sensors have recorded water temperatures every few minutes between about June and January since 2007. Both sets of information will be combined to create a computational model of the fish’s river habitat.
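How the team builds its computational habitat model is not described in this article; the small sketch below only illustrates the data-alignment step, matching sparse boat-based velocity measurements to the continuous temperature record by nearest timestamp. Column names and values are hypothetical:

```python
# Small sketch: aligning sparse velocity measurements with the continuous
# temperature record by nearest timestamp (hypothetical data and columns).
import pandas as pd

temperature = pd.DataFrame({
    "time": pd.to_datetime(["2011-07-01 00:00", "2011-07-01 06:00",
                            "2011-07-01 12:00", "2011-07-01 18:00"]),
    "temp_c": [12.0, 12.4, 13.1, 12.8],
})
velocity = pd.DataFrame({
    "time": pd.to_datetime(["2011-07-01 05:00", "2011-07-01 17:30"]),
    "velocity_m_s": [0.21, 0.18],
})

merged = pd.merge_asof(velocity.sort_values("time"),
                       temperature.sort_values("time"),
                       on="time", direction="nearest")
print(merged)
```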

This study’s results could be used to modify river water flow to improve fish survival. The Clearwater’s Dworshak Dam already helps manage water temperature by strategically releasing cool water toward the Snake. The waters form thermal layers – with the Snake’s warm water on top and the Clearwater’s cool water below – that fish move through to regulate their body temperatures.

The Nez Perce Tribe began studying fall Chinook salmon in the lower Clearwater River in 1987. USGS and USFWS joined the effort in 1991, when the Snake River Basin’s fall Chinook salmon were first listed under the Endangered Species Act. PNNL and UW joined the study in 2007. The Bonneville Power Administration is paying for the study.

More information about the Juvenile Salmon Acoustic Telemetry System can be found at the JSATS webpage.

 

In this Letter to the Editor (LttE), Los Alamos National Laboratory's Jim Danneskiold responds to a previous LttE, which stated that the real news in a Supercomputing Online story posted May 20 on LANL's new Metropolis Center for Modeling & Simulation is that LANL's ASCI "Q" is “in trouble.”


Los Alamos National Laboratory is one of three National Nuclear Security Administration laboratories participating in the Advanced Simulation and Computing program, called ASCI. The ASCI program is responsible for developing predictive capability to support the performance, safety and reliability of the U.S. nuclear stockpile in the absence of nuclear testing. ASCI is partnering with various industry providers to accelerate development of more powerful computing hardware. ASCI also is investing in the creation of the necessary computational models and software environment.

Los Alamos, in partnership with Silicon Graphics Inc., brought the Blue Mountain machine on-line in November 1998. The machine has 6144 processors organized into 48 SMPs (each having 128 shared memory processors) with a peak performance of 3.1 TeraOps. After an initial stabilization period of 8-12 months (similar to experiences on other ASCI machines), the Blue Mountain machine has provided a very stable environment for simulations required for the stockpile stewardship mission. Los Alamos technical staff and management have had on-line access to system reliability and usage reports since November 1998. Frequent, regular reports on system reliability were provided to staff and management, as well as to NNSA. The Blue Mountain platform was used to successfully execute the ASCI 1999 and the ASCI 2000 Level 1 Milestones.

The Lawrence Livermore ASCI White (12 TeraOps) platform was used to execute the ASCI 2001 milestone. ASCI White is the next step in the ASCI roadmap and consequently is more capable than its three predecessors: ASCI Red, Blue Pacific and Blue Mountain. ASCI White has faster processors and more memory per processor. For these reasons ASCI White was the most appropriate platform for the ASCI 2001 milestone calculation. However, the Blue Mountain platform has been the major production machine for nuclear weapon simulations at Los Alamos for the past three years and will remain so until the workload moves to the ASCI Q machine in 2002.

The next generation ASCI flagship platform, the "Q" machine, is being installed at Los Alamos by Compaq (now Hewlett-Packard). The planned final system (30.72 TeraOps) consists of 3,072 AlphaServer ES45s, with a total of 12,288 EV-68 1.25 GHz CPUs, a total of 33 TeraBytes of memory, 709 TeraBytes of global disk, and a dual-rail Quadrics interconnect. The Q system is being installed in phases; 768 AlphaServer ES45 nodes using the 1.0 GHz CPUs are currently installed in the Metropolis Center, for a total of ~6 TeraOps. Another 256 AlphaServer ES45 nodes will be added to the 768 nodes to complete the 1024 nodes in the Nicholas C. Metropolis Center for Modeling and Simulation (8 TeraOps) in about a month.

The contract for Q was structured so that the majority of the payment is contingent on the Q system meeting technical performance milestones, as contrasted with a simple hardware purchase. The additional system capability is contained within three options. Option 1 is for upgrading the processors to 1.25 GHz to bring Q up to 10 TeraOps. Options 2 and 3 each include delivery of separate 10 TeraOps systems, with each being 1024 AlphaServer ES45 nodes using the 1.25 GHz CPUs. This will bring the Q total up to 30.72 TeraOps. If all goes according to plan, the 30.72 TeraOps Q system will be installed and tested by the end of the calendar year 2002.

Contrary to the assertions of the anonymous letter writer, the Q machine is definitely not a "bust."
The Department of Energy Inspector General conducted an audit of supercomputing at Los Alamos in December 2001. There were no findings or recommendations in the audit. Los Alamos National Laboratory is proud to be a part of the ASCI program and contribute to the national defense. Our accomplishments and contributions to the program are monitored and documented as part of our University of California operating contract.

Jim Danneskiold
Los Alamos National Laboratory Public Affairs Office
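For readers checking the peak ratings quoted in the letter, the figures follow from simple arithmetic if one assumes four CPUs per AlphaServer ES45 and two floating-point operations per cycle per Alpha EV68 CPU – a back-of-the-envelope sketch, not an official LANL calculation:

```python
# Back-of-the-envelope check of the peak TeraOps figures quoted in the letter,
# assuming 4 CPUs per AlphaServer ES45 node and 2 floating-point ops per cycle.
FLOPS_PER_CYCLE = 2
CPUS_PER_NODE = 4

def peak_teraops(nodes: int, clock_ghz: float) -> float:
    return nodes * CPUS_PER_NODE * clock_ghz * FLOPS_PER_CYCLE / 1000.0

print(peak_teraops(768, 1.0))    # currently installed phase   -> ~6.1 TeraOps
print(peak_teraops(1024, 1.0))   # full Metropolis Center set  -> ~8.2 TeraOps
print(peak_teraops(3072, 1.25))  # planned final system        -> 30.72 TeraOps
```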
