Academics used programmed models to accurately predict rises and falls in value from January 1986 to June 2012

Future fluctuations in oil prices could be forecast using a combination of previous statistics and complex computer algorithms, according to new research.

Academics from the Gulf University for Science and Technology and Plymouth University used a range of programmed models to accurately predict previous rises and falls in the commodity’s value over a period from January 1986 to June 2012.

They discovered that when provided with several years of data, a gene expression programming (GEP) model almost perfectly predicted subsequent years’ figures, outperforming traditional statistical techniques.

It also proved more accurate than artificial neural network (NN) models and the widely used autoregressive integrated moving average (ARIMA) technique.
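Forecasting models like these are typically scored by comparing their out-of-sample prediction errors. The toy below compares a naive "no change" forecaster with a least-squares AR(1) fit on a synthetic price series; the series and both models are illustrative assumptions, not the GEP or ARIMA setup used in the paper.

```python
# Toy comparison of one-step-ahead forecasters on a synthetic price series.
import math
import random

random.seed(0)
prices = [50.0]
for _ in range(199):                       # synthetic mean-reverting price path
    prices.append(0.9 * prices[-1] + 5.0 + random.gauss(0, 1))

train, test = prices[:150], prices[150:]

# Fit AR(1): x[t] ~ a * x[t-1] + b, by ordinary least squares on lag pairs.
xs, ys = train[:-1], train[1:]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
    sum((x - mx) ** 2 for x in xs)
b = my - a * mx

def rmse(pred, actual):
    """Root-mean-square error between forecasts and realized values."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(pred, actual)) / len(actual))

prev = [prices[149]] + test[:-1]           # last observed value at each step
naive = prev                               # "no change" forecast
ar1 = [a * x + b for x in prev]            # model-based forecast
print(f'naive RMSE: {rmse(naive, test):.3f}')
print(f'AR(1) RMSE: {rmse(ar1, test):.3f}')
```

A lower RMSE on held-out data is the usual sense in which one model "outperforms" another; the paper applies the same logic with GEP against NN and ARIMA forecasts of real oil prices.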

Crude oil holds an important and growing role in the world economy, with past studies demonstrating a close relationship between oil price and the GDP growth rate.

Yet oil price prediction has always proved an intractable task due to the intrinsic complexity of oil market mechanisms and outside influences such as weather, stock levels, political events and even people’s psychological expectations.

Dr Ahmed El-Masry, Associate Professor in Financial Management at Plymouth University, said:

“The price of oil affects people everywhere, whether they live in countries that are net importers or exporters of the commodity. The fluctuations of recent times have led to great economic uncertainty, and that will only continue as consumption – and therefore demand – increases. If policy makers and economists had a tool which could accurately predict future prices, it would enable them to plan for the future, while at the same time allowing consumers to have an idea of the rising or falling costs they might incur.”

GEP is one of the most recent developments in the field of artificial evolutionary systems, but has previously been shown to accurately forecast exchange rates, short-term electricity load and even daily evaporation in Turkish lakes.

It works through a complex tree structure that learns and adapts by changing its size, shape, and composition, much like a living organism. It also benefits from a simple genome to keep and transmit the genetic information and a complex phenotype to explore the environment and adapt to it.
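The "simple genome, complex phenotype" split is what distinguishes GEP from ordinary genetic programming: the genome is a flat string of symbols, which is decoded breadth-first into an expression tree (the phenotype) before evaluation. The sketch below shows that decoding step, known as Karva notation; the symbol set and the example gene are illustrative assumptions, not taken from the paper by Mostafa and El-Masry.

```python
# Minimal sketch of Karva-notation decoding in gene expression programming (GEP).
ARITY = {'+': 2, '*': 2, '-': 2}   # function set (all binary here)
OPS = {'+': lambda x, y: x + y,
       '*': lambda x, y: x * y,
       '-': lambda x, y: x - y}

def decode(gene):
    """Decode a linear gene into an expression tree, breadth-first.

    Each node is (symbol, [children]); terminals get no children.
    """
    nodes = [(s, []) for s in gene]
    queue = [nodes[0]]
    i = 1
    while queue:
        sym, children = queue.pop(0)
        for _ in range(ARITY.get(sym, 0)):
            child = nodes[i]
            i += 1
            children.append(child)
            queue.append(child)
    return nodes[0]

def evaluate(node, env):
    """Recursively evaluate the tree; terminals are looked up in env."""
    sym, children = node
    if not children:
        return env[sym]
    return OPS[sym](*(evaluate(c, env) for c in children))

tree = decode('+*abab')            # decodes to (b * a) + a
print(evaluate(tree, {'a': 2.0, 'b': 3.0}))   # (3 * 2) + 2 = 8.0
```

Because mutation and crossover act on the flat gene string while fitness is measured on the decoded tree, every genetic operation still yields a syntactically valid program, which is the property that makes GEP search efficient.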

The research paper – Oil price forecasting using gene expression programming and artificial neural networks by Mostafa and El-Masry – is being published in the April edition of the Economic Modelling journal, doi:10.1016/j.econmod.2015.12.014.

LA JOLLA, CA -- UC San Diego's Jacobs School of Engineering is launching a new center to foster commercialization of UCSD research, and to educate students about the process of moving innovations from the laboratory into the marketplace. The William J. von Liebig Center for Entrepreneurism and Technology Advancement is funded through a $10 million gift from the William J. von Liebig Foundation of Naples, Fla.

Approach could shed light on many complex diseases

Like many complex diseases, diabetes results from the interplay of genetic and environmental factors. To examine genetic risk factors, scientists pore over the human genome sequence. Environmental factors have been trickier to pin down because there is no way to evaluate them comprehensively.

Now, researchers at Stanford University present what they call an environment-wide association study (EWAS) to systematically examine the contributions of hundreds of factors in the development of Type 2 diabetes. This "enviromics" approach, which mirrors genome-wide association studies, harnesses high-speed computers and publicly accessible databases.
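The core of the approach is a screening loop: test every environmental factor against case/control status, then correct the significance threshold for the number of tests run, exactly as GWAS does across genetic variants. The sketch below uses a two-proportion z-test and a Bonferroni correction on made-up data; the factor names and counts are illustrative, and the Stanford study itself used survey-weighted regression on NHANES data, which this toy version does not reproduce.

```python
# Hedged sketch of an environment-wide association screening loop.
import math

def two_proportion_z_test(exposed_cases, cases, exposed_controls, controls):
    """Two-sided z-test for a difference in exposure rates; returns a p-value."""
    p1 = exposed_cases / cases
    p2 = exposed_controls / controls
    p = (exposed_cases + exposed_controls) / (cases + controls)
    se = math.sqrt(p * (1 - p) * (1 / cases + 1 / controls))
    z = (p1 - p2) / se
    return math.erfc(abs(z) / math.sqrt(2))   # two-sided normal tail

# factor -> (exposed among 500 cases, exposed among 500 controls); toy data
factors = {
    'pesticide_derivative': (220, 120),
    'PCB': (200, 130),
    'beta_carotene': (150, 230),
    'unrelated_factor': (250, 248),
}

alpha = 0.05 / len(factors)        # Bonferroni: divide by the number of tests
for name, (ec, eo) in sorted(factors.items()):
    p = two_proportion_z_test(ec, 500, eo, 500)
    flag = 'associated' if p < alpha else 'not significant'
    print(f'{name}: p = {p:.2e} ({flag})')
```

Factors more common in controls than in cases (here, beta_carotene) also register as significant; the direction of the difference is what separates risk factors from protective ones.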

The first-of-its-kind study, which was funded by the National Institutes of Health (NIH), appears in the May 20, 2010, issue of PLoS One. The article is titled “An Environment-Wide Association Study (EWAS) on Type 2 Diabetes Mellitus.”

The authors examined 226 separate environmental factors like nutrition and exposure to bacteria, viruses, allergens and toxins. They found that certain factors, notably a pesticide derivative and the environmental contaminant PCB, were strongly associated with the development of diabetes. Other factors, including the nutrient beta-carotene, served a protective role.

The scientists describe their work as a demonstration that computational approaches can reveal as much about environmental contributions to disease as about genetic factors. They posit that the technique could be applied to other complex diseases like obesity, hypertension and cardiovascular disorders.

The authors acknowledge that many challenges remain, including the fact that, unlike the genome, "the environment is boundless."

By Steve Fisher, Editor In Chief SAN DIEGO, CA -- The highly anticipated showdown between Red Hat CTO Michael Tiemann and Microsoft Senior Vice President Craig Mundie, entitled “Shared Source vs. Open Source,” took place yesterday at the O’Reilly Open Source convention. Mundie took the stage first, followed by Tiemann and an open panel discussion.

  • Spectra Logic continues its proven record of success supporting federal, state and local government agencies, ranking in the top 10 percent of GSA government contractors for the third year in a row.

  • Exemplifying this success, Spectra’s Federal sales comprised more than 20 percent of overall company revenue in 2009.

Spectra Logic today announced that it ranked in the top ten percent of U.S. General Services Administration (GSA) information technology (IT) Schedule 70 contractors for 2009. This is the third consecutive year Spectra Logic has ranked as a top vendor based on annual revenues of pre-approved GSA Schedule 70 IT products and services purchased by federal, state and local government agencies. Spectra Logic’s Federal sales division has a proven record of success supporting government organizations, and its sales comprise more than 20 percent of overall company revenue.

"Federal, state and local government agencies want backup and archive solutions that can easily handle large, fast-growing data volumes and high data availability, while helping to deliver greener IT environments that use less energy and minimize floor space," said Brian Grainger, vice president of worldwide sales, Spectra Logic. "Spectra Logic’s solutions are ideally suited for the government market – from high density, energy-efficient tape libraries to disk-based deduplication appliances that reduce stored data volumes."

Spectra Logic added several new products and services to the GSA schedule in 2009, including the Spectra T-Finity enterprise tape library, the Spectra T680 mid-range tape library, Spectra’s disk-based nTier Deduplication product line, media, backup application software and TranScale upgrade service options. Spectra Logic’s archive and backup products have been listed on GSA Schedule 70 since 2003 under GSA contract number GS-35F-0563K.

“The high-capacity Spectra T-Finity tape library enables large enterprise-class organizations to protect, archive and quickly access petabytes of classified and unclassified data,” said Mark Weis, director of federal sales, Spectra Logic. “T-Finity’s inclusion on the GSA Schedule 70 simplifies the purchasing process for our federal, state and local government customers.”

GSA establishes long-term government-wide contracts with commercial firms to provide access to more than 11 million commercial products and services that can be ordered directly from GSA Schedule contractors. The Information Technology Schedule 70 (a Multiple Award Schedule) grants agencies direct access to commercial experts who can thoroughly address the needs of the government IT community through 20 Special Item Numbers (SINs). These SINs cover general-purpose commercial IT hardware, software and services.

In addition to GSA, Spectra Logic’s products are also listed on several Government Acquisition Contracts including ITES, NETCENTS and SEWP.
