Europe's premier optical networking event moves to Monaco for 2010 and will once again bring together over 400 optical professionals for four days of structured networking and debate. The educational, operator-led program and professionally organized networking agenda make WDM & Next Generation Optical Networking an unmissable event in the industry calendar.
Cloud computing – Opportunities and challenges for service providers
This presentation will address:
• Cloud computing trends changing the ICT industry
• The role of the transport network for enabling cloud services
• Service provider opportunities to capitalize on the cloud revolution
• Ways to increase efficiency by evolving intra-cloud connectivity networks
• Enabling new applications through reliable and secure access to cloud resources
Grid computing can jet-propel research and development. An EU-funded programme that lets European and Chinese grids work together has already produced results in aircraft design, drug development and weather prediction.
In 2007, the EU-funded project, BRIDGE (for Bilateral Research and Industrial development enhancing and integrating Grid Enabled technologies), set out to link European and Chinese computing grids and enable researchers to carry out joint research.
The project was inspired by the realisation that China is rapidly becoming a world leader in research and development, as well as a booming market for European products. Developing the infrastructure to link computing grids was seen as a key step towards future scientific and industrial cooperation.
“If Europe does not want to lose ground, the response can only be to synchronise with these developments,” says Gilbert Kalb, BRIDGE project coordinator.
Building a shared infrastructure
The BRIDGE team’s first challenge was to make the software systems that manage the European and Chinese grids compatible. The European Grid infrastructure, GRIA, and the Chinese system, CNGrid GOS, provide comparable services, but are organised differently.
The team were able to get GRIA and GOS to work together by building a new software superstructure to access them and tap their capabilities. The system included new gateways into the two grids, plus a shared platform to manage overall workflow, access needed applications, and translate higher-level commands into steps that each grid could carry out.
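The gateway-plus-workflow idea described above can be sketched in a few lines. This is an illustrative adapter pattern only: the class and method names are invented for the sketch and are not the project's real API, and the "translation" step here simply reshapes a job description where a real gateway would call the GRIA or CNGrid GOS middleware.

```python
class GridGateway:
    """Adapter exposing one uniform submit interface per grid."""

    def __init__(self, name, translate):
        self.name = name
        self.translate = translate  # maps an abstract step to a native job

    def submit(self, step):
        native_job = self.translate(step)
        # A real gateway would invoke the grid middleware here;
        # for the sketch we just return the translated job.
        return {"grid": self.name, "job": native_job, "status": "done"}


def run_workflow(steps, placement, gateways):
    """Run each abstract step on the grid chosen by the placement map."""
    results = []
    for step in steps:
        gw = gateways[placement[step]]
        results.append(gw.submit(step))
    return results


# Two hypothetical gateways with different native job formats.
gria = GridGateway("GRIA", lambda s: {"gria_task": s})
gos = GridGateway("CNGrid GOS", lambda s: {"gos_task": s})

out = run_workflow(
    ["preprocess", "simulate"],
    {"preprocess": "GRIA", "simulate": "CNGrid GOS"},
    {"GRIA": gria, "CNGrid GOS": gos},
)
```

The point of the pattern is that the shared workflow layer never needs to know either grid's internals: each gateway hides its own translation, which mirrors how BRIDGE let two differently organised infrastructures serve one workflow.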
Not surprisingly, security was an important consideration on both sides. Kalb says that many of the scientific and industrial problems that BRIDGE was developed to address require intensive cooperation, yet involve highly sensitive information.
BRIDGE resolved this issue by letting selected processes remain private. That allows one group to contribute data or results to all collaborating parties without having to share proprietary software or analytic tools.
“You can interface in terms of the input and the output, while the algorithms remain hidden,” says Kalb.
Putting BRIDGE to work
The BRIDGE team tested the intercontinental grid they built by attacking three problems, each of which made different demands on the system.
Discovering new drugs remains an extremely costly process. One way to speed research is to use computers to simulate the chemical fit between millions of small molecules and proteins that play vital roles in disease-causing organisms. A molecule that binds strongly to a key protein has the potential to be turned into a potent new drug. This kind of research demands enormous computing power.
Researchers in Europe and China contributed four different docking tools – programs that calculate the binding between a small molecule and a particular protein. Each program used a different approach and produced somewhat different results.
The researchers then examined millions of molecules to see if they held promise against malaria or the H5N1 bird flu virus. By combining the results of the four different simulations, they were able to identify promising molecules more efficiently.
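One simple way to combine the outputs of several docking tools, sketched below, is consensus ranking: order the candidates by each tool's score, then sort by average rank so molecules that do well across all tools rise to the top. The tool names, scores and the rank-sum method are illustrative assumptions; the article does not state which combination method BRIDGE used.

```python
from collections import defaultdict


def consensus_rank(score_tables):
    """score_tables: list of {molecule: score} dicts, lower score = better fit.
    Returns molecules ordered by their summed rank across all tools."""
    rank_sums = defaultdict(float)
    for table in score_tables:
        ordered = sorted(table, key=table.get)  # best (lowest) score first
        for rank, mol in enumerate(ordered):
            rank_sums[mol] += rank
    return sorted(rank_sums, key=rank_sums.get)


# Hypothetical binding scores (kcal/mol-like, more negative = stronger).
tool_a = {"mol1": -9.2, "mol2": -7.5, "mol3": -8.1}
tool_b = {"mol1": -8.8, "mol2": -9.0, "mol3": -6.4}

best = consensus_rank([tool_a, tool_b])  # mol1 ranks well in both tools
```

A molecule that one tool scores highly but the others reject falls down the consensus list, which is how combining tools can filter candidates more efficiently than any single simulation.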
“Making the outcomes of these different docking tools comparable is very new,” says Kalb.
The four-pronged approach produced promising results. The BRIDGE infrastructure has already been adopted in Egypt to target the malaria parasite.
BRIDGE was also used to solve a complex aeronautic problem – designing and positioning wing flaps to maximise lift and minimise noise as an aircraft lands.
Like drug discovery, these aerodynamic simulations required huge computational resources. In addition, because different parts of each simulation took place in different research centres, optimising the flow of work from centre to centre was also challenging.
The BRIDGE team was able to meet these challenges, carry out intensive distributed computations, and determine optimal wing flap parameters. “It proved to be an effective method for solving multi-objective and multi-disciplinary optimisation in aircraft design,” Kalb says.
Weather data on the fly
Weather and climate represent a third area where international cooperation is vital. The BRIDGE researchers set out to link three large meteorological databases located in Europe, North America and Asia.
The key challenge they faced with this project was to handle enormous volumes of data efficiently.
“You could do a calculation in the United States and transfer the results to Europe, or you could fetch the data from the USA and do the calculations here,” says Kalb. “The best way to do it depends on what calculation and what data and what’s the best available way to transfer the data from place to place. Bridge does all this on the fly.”
“Because there was a big organisation behind it, and our work fits very well, it was taken up right away,” says Kalb. “I believe that meteorologists are already using it to access data and perform certain calculations.”
To Kalb, the importance of what BRIDGE accomplished goes far beyond any single piece of research. He feels that the project has built the foundation for the kind of multinational collaboration that is needed to tackle global problems.
“Problems like energy and climate change can only be attacked or really solved with efforts from different players around the world, and we’ve built a platform to do that,” he says. “We proved that this is feasible and useful. Now it’s time for other people to jump on this, develop it further, and use it.”
The BRIDGE project received funding from the Sixth Framework Programme for research.
By Steve Fisher, Editor in Chief -- This second installment in the Supercomputing Online Emerging Companies Series focuses on PetaSwitch Solutions, a small firm focused on building next-generation switch fabrics. The young company offers a switch fabric that scales up to 256 ports and up to a whopping 10 Tbps of switching capacity.
2009 is the year in which linked data has been heralded as a treasure map, as early adopters offer some very interesting pointers on how we can browse, explore, query and re-use data on the Web at a more fine-grained level and in exciting new ways. “We are already seeing examples of non-personal public data being opened up through initiatives spearheaded by the US and UK governments. Other countries will follow suit. This will be one of the tipping points for the growing web of linked data and the increasing number of applications that exploit the data sets,” says Professor Dame Wendy Hall, one of the pioneers of the Semantic Web, from the University of Southampton, speaking at the Online Information conference in London in early December.
Putting government data online increases citizen awareness of government functions, supporting greater accountability, and enables the government to operate more efficiently. The UK’s new policy for public data has been triggered by pioneering work in the arena of linked data over a relatively short time-frame.
“Available public data creates bottom-up pressure to improve public services. Data is essential in enabling citizens to choose between public service providers, helping them compare their local services with services elsewhere, and enables all of us to lobby for improvement,” says Nigel Shadbolt, a co-pioneer of the Semantic Web, also from the University of Southampton. data.gov.uk, a developer’s version of the government website that will be made public early next year, currently gives access to 1,100 datasets, ranging from traffic counts on the road network to reference data on schools and the Farm Survey. This illustrates the growth potential of the step-change towards the Web of Data, or Semantic Web, which encompasses technology development, politics, economics and social change.
Linked data principles provide a basis for realising the Web of Data by ensuring data is organised, structured and independent of any application programs, so it can serve a broad community of people and many applications. The main drivers behind linked data include the added value of structured content, a mission or mandate to make data linkable and, most importantly, low development barriers. Key enabling technologies span Web 2.0, mash-ups, open source, cloud computing and software-as-a-service.
Is it complicated? Not particularly. As Ian Davis, CTO at Talis, explained: “What is needed is a little extra effort when publishing data through the Resource Description Framework (RDF), the HTML of the web of data. This is done by giving each thing an identifier, or Uniform Resource Identifier (URI), just as we do with web pages. The next step is to describe these things using RDF, with links to other information so as to provide context, and to respond to requests on identifiers by sending the description of that thing. No special software is needed to use it. When we turn a site into an API using linked data, people can start building new things, reusing data in new and interesting ways.”
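The steps Davis outlines – mint a URI for a thing, describe it in RDF, link it to related resources – can be sketched as a tiny Turtle serialiser. The example URIs and property names below are invented for illustration; a real publisher would use its own namespace and established vocabularies.

```python
def turtle(subject_uri, properties):
    """Serialise one resource description as RDF Turtle triples.
    properties: list of (predicate, object) pairs; objects that look
    like URIs become links, everything else becomes a string literal."""
    lines = []
    for predicate, obj in properties:
        if obj.startswith("http"):
            obj = "<%s>" % obj      # a link to another resource
        else:
            obj = '"%s"' % obj      # a plain literal value
        lines.append("<%s> %s %s ." % (subject_uri, predicate, obj))
    return "\n".join(lines)


# Hypothetical programme page: one URI, a label, and a link outward.
doc = turtle(
    "http://example.org/id/programme/42",
    [("rdfs:label", "Example Programme"),
     ("dcterms:subject", "http://example.org/id/topic/science")],
)
```

The outgoing `dcterms:subject` link is what makes the data *linked*: anyone who dereferences the topic URI gets that resource's own description, and the web of data grows one triple at a time.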
The BBC programme catalogue and music site are a good example of linked data in action. Every BBC programme, every segment of a programme, every brand and every person has an identifier, a URI, with the data hidden behind the page displayed on the screen. But it doesn’t stop there. Linked data from the BBC programme pages is remixed on Twitter through fanhu.biz, a prototype service, creating a social space for fans of BBC programmes. While many of the early sites are read-only, linked data can also be used for powerful, fully interactive web applications with full editing capabilities, thanks to this very flexible technology.
“The Semantic Web will transform the World Wide Web into a more useful and powerful information source. In particular, it will revolutionise scientific and other web publishing by defining new web technologies that make more web content accessible to machines. These technologies will provide better tools that make it easier for people to create machine-readable content that is widely available. A good case in point is an initiative spearheaded by the Association for Computing Machinery (ACM), the world’s leading educational and scientific computing society. In early 2010, Semantic Web tools will be implemented in the ACM Digital Library (http://portal.acm.org/dl.cfm) to enable its hundreds of thousands of professional and student users to more easily find, share, and combine information on the Web,” said Dame Wendy.
How will the Web of Data evolve? For now we have only clues about the direction Web 3.0 will take in the next few years, when we can expect to see linked data services available on iPhones and other mobile devices, and new business models emerge. While there are still open questions about what tools are needed, the Semantic Web will emerge, and it will probably be a hybrid of data and documents side by side. How they will co-exist is the fruitful line of research and development. “We believe that linked data will be an even bigger sea change than the World Wide Web, not only because it plays a key role in assisting a wide range of people in government and research, but also socially. With linked data it will be much easier to tackle grand global challenges like climate change, energy and health issues, ageing, and world poverty, by sharing data to capture correlations and trends more effectively and more quickly, and to spot clues that enable someone to make that all-important step forward. The key thing is to arrive at universal standards, just as we did with the web, to understand that attitude is as important as the technology for reaching the critical mass that will boost innovation, and that linked data means lots more innovation,” explained Dame Wendy, who also serves as President of ACM, the first person from outside North America to hold this position.
Some of the early adopters have already started to unleash information that is of value socially, politically and economically. Looking ahead, other areas that could add to the treasure map of online information include globally aggregated services on university courses, encompassing league tables and citations, and even a global index for a world library collection. These bring fantastic new opportunities for knowledge and information professionals, who already have the skill sets needed to understand data management issues and, in some instances, experience of using metadata to describe the data in their collections, itself a rich information source for linked data. The bottom line is that this is an evolution, not a replacement, of the Web, and that future-gazing is part and parcel of being involved in bleeding-edge technologies.
MOUNTAIN VIEW, CA -- SGI (NYSE: SGI) and vizrt today announced that Wige Data delivered a virtual sets package powered by two Silicon Graphics® Onyx2® visualization systems and vizrt's viz|virtual studio(TM) software to German public television broadcasters ARD and ZDF for production and live coverage of the 2002 Winter Olympic Games. The Games are being held in Salt Lake City, Utah, Feb. 8 through Feb. 24.