NSF Funds Expedition into Software for Efficient Computing in the Age of Nanoscale Devices
- Written by: Doug Ramsey
- Category: SCIENCE
As semiconductor manufacturers build ever-smaller components, circuits and chips at the nanoscale become less reliable and more expensive to produce. The variability in their behavior from device to device and over their lifetimes – due to manufacturing, aging-related wear-out, and varying operating environments – is largely ignored by today’s mainstream computer systems.
Now a visionary team of computer scientists and electrical engineers from six universities is proposing to deal with the downside of nanoscale computer components by rethinking and enhancing the role software can play in a new class of computing machines that are adaptive and highly energy efficient.
The National Science Foundation (NSF) today awarded a $10 million, five-year grant to researchers who will explore “Variability-Aware Software for Efficient Computing with Nanoscale Devices.” The grant is part of the funding agency’s Expeditions in Computing program, which rewards far-reaching agendas that “promise significant advances in the computing frontier and great benefit to society.”
“We envision a world where system components – led by proactive software – routinely monitor, predict and adapt to the variability in manufactured computing systems,” said Rajesh Gupta, director of the Variability Expedition and a professor of computer science and engineering at the University of California, San Diego’s Jacobs School of Engineering. “Changing the way software interacts with hardware offers the best hope for perpetuating the fundamental gains in computing performance at lower cost of the past 40 years.”
Joining Gupta in this effort are the expedition’s deputy director, electrical engineering professor Mani Srivastava of the University of California, Los Angeles’ Henry Samueli School of Engineering and Applied Science, and a team of eleven other computer scientists and electrical engineers from the University of Michigan, Stanford University, the University of California, Irvine (UCI) and the University of Illinois at Urbana-Champaign (UIUC). The project is based in the UCSD division of the California Institute for Telecommunications and Information Technology (Calit2).
Researchers on the team include: Ranjit Jhala, Sorin Lerner, Tajana Simunic Rosing, Steve Swanson and Yuanyuan (YY) Zhou from UC San Diego; Lara Dolecek and Puneet Gupta from UCLA Engineering; Subhasish Mitra from Stanford; Dennis Sylvester from Michigan; Rakesh Kumar from UIUC’s Coordinated Science Laboratory; as well as Nikil Dutt and Alex Nicolau from UCI’s Center for Embedded Computer Systems. Their specialties span operating systems, compilers, languages, embedded and mobile systems, data center computing, integrated circuits, storage, statistical techniques, design automation and testing, and cross-cutting issues in reliability, energy efficiency and robust system design.
The research team seeks to develop computing systems that will be able to sense the nature and extent of variations in their hardware circuits, and expose these variations to compilers, operating systems, and applications to drive adaptations in the software stack.
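To make the sensing side of that vision concrete, here is a minimal sketch – not the expedition’s actual code – of how a runtime layer might collect per-core measurements into a compact “variability descriptor” and publish it to the rest of the software stack. The struct fields, the vdesc_read() helper and the sensor values are all hypothetical stand-ins for on-chip monitors.

```c
/*
 * Illustrative sketch only: one way a runtime might expose per-core
 * hardware variability to the software stack. Sensor readings are
 * simulated; all names here are hypothetical.
 */
#include <stdio.h>

#define NUM_CORES 4

/* Compact per-core "variability descriptor" the hardware layer exposes. */
struct vdesc {
    double max_freq_mhz;   /* fastest stable clock measured for this core  */
    double leakage_mw;     /* static power draw, which varies die to die   */
    double error_rate;     /* observed timing-error rate at max_freq_mhz   */
};

/* Simulated sensor read: a real system would query on-chip monitors. */
static struct vdesc vdesc_read(int core)
{
    /* Hard-coded sample data standing in for silicon measurements. */
    static const struct vdesc sample[NUM_CORES] = {
        { 2400.0, 120.0, 1e-9 },   /* a "fast" core                       */
        { 2100.0,  95.0, 1e-9 },   /* slower but lower-leakage core       */
        { 2300.0, 140.0, 5e-8 },   /* fast core that is aging/error-prone */
        { 1900.0,  90.0, 1e-9 },   /* slow, cool corner of the die        */
    };
    return sample[core];
}

int main(void)
{
    /* Publish the descriptors so schedulers and compilers can adapt. */
    for (int core = 0; core < NUM_CORES; core++) {
        struct vdesc v = vdesc_read(core);
        printf("core %d: %.0f MHz, %.0f mW leakage, err %.1e\n",
               core, v.max_freq_mhz, v.leakage_mw, v.error_rate);
    }
    return 0;
}
```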
“As the transistors on their chips get smaller, semiconductor makers are experiencing lower yields and more variability – in other words, more circuits have to be thrown away because they don’t meet the timing-, power- and lifetime-related specifications,” said Michigan’s Sylvester, an expert in designing computer circuits in nanoscale technologies. If left unaddressed, added UCLA’s Puneet Gupta, “this trend toward parts that scale in neither capability nor cost will cripple the computing and information technology industries. So we need to find a solution to the variability problem.”
Software experts on the team will develop models and abstractions that expose the hardware’s variable specifications accurately and compactly, along with mechanisms that let the software react to those specifications. Hardware researchers will focus on more efficient design and test methods to ensure that device designs exhibit well-behaved variability characteristics – ones that a well-configured software stack can easily sense and influence. “The resulting computer systems will work while using components that vary in performance or grow less reliable over time,” explained deputy director Srivastava. “A fluid software-hardware interface will mitigate the variability of manufactured systems and make them robust, reliable and responsive to the changing operating conditions.” Added professor Rakesh Kumar, who will lead the expedition efforts at UIUC: “Steering the effects of the variability will be particularly important.”
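As a companion to the sensing sketch above, here is an equally hypothetical example of the reacting side: a toy scheduling policy that consults the published descriptors and steers latency-critical work to fast, low-error cores while parking background work on the lowest-leakage core. The pick_core() policy and its threshold are illustrative assumptions, not the project’s method.

```c
/*
 * Illustrative sketch only: one way an operating system might react to
 * per-core variability descriptors. Policy and names are hypothetical.
 */
#include <stdio.h>

#define NUM_CORES 4

struct vdesc { double max_freq_mhz, leakage_mw, error_rate; };

enum task_class { TASK_LATENCY, TASK_BACKGROUND };

/* Choose a core whose measured characteristics suit the task:
 * latency-critical work goes to the fastest core with an acceptable
 * error rate; background work goes to the lowest-leakage core. */
static int pick_core(const struct vdesc v[], enum task_class cls)
{
    int best = -1;
    for (int c = 0; c < NUM_CORES; c++) {
        if (cls == TASK_LATENCY) {
            /* Skip cores whose timing-error rate is too high. */
            if (v[c].error_rate > 1e-7)
                continue;
            if (best < 0 || v[c].max_freq_mhz > v[best].max_freq_mhz)
                best = c;
        } else {
            if (best < 0 || v[c].leakage_mw < v[best].leakage_mw)
                best = c;
        }
    }
    return best;   /* -1 if no core meets the error-rate bound */
}

int main(void)
{
    /* Same simulated measurements as in the sensing sketch. */
    const struct vdesc v[NUM_CORES] = {
        { 2400.0, 120.0, 1e-9 }, { 2100.0,  95.0, 1e-9 },
        { 2300.0, 140.0, 5e-8 }, { 1900.0,  90.0, 1e-9 },
    };
    printf("latency-critical task -> core %d\n", pick_core(v, TASK_LATENCY));
    printf("background task       -> core %d\n", pick_core(v, TASK_BACKGROUND));
    return 0;
}
```

With the sample data, the policy sends latency-critical work to the 2400 MHz core (its error rate is acceptable) and background work to the 90 mW core, illustrating how one piece of hardware variability can be steered rather than thrown away.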
Variability-aware computing systems would benefit the entire spectrum of embedded, mobile, desktop and server-class applications by dramatically reducing hardware design and test costs for computing systems, while enhancing their performance and energy efficiency. Many in-demand applications – from search engines to medical imaging – would also benefit, but the project’s initial focus will be on wireless sensing, software radio and mobile platforms of all kinds – with plans to transfer advances in these early areas to the marketplace.
To ensure that the project reflects real-world challenges in the computing industry, organizers have recruited a high-powered Technical Advisory Board that initially includes top industry executives from HP, ARM, IBM and Intel. The Board also includes two senior academic researchers with expertise in modeling and manufacturing of nanoscale devices and circuits: Robert Dutton and Andrew Kahng, who are professors at Stanford and UC San Diego, respectively.
“If this project is successful, and the breakthroughs are transferred to industry,” observed Stanford’s Subhasish Mitra, “we will have contributed to the continued expansion and reach of the semiconductor and computing industries.”
Added Nikil Dutt, who leads the effort at UC Irvine: “Other significant societal benefits will include advances in server-class computing to benefit science itself, and improved energy efficiency of buildings, smart grids and transportation.”
Transforming the relationship between hardware and software also presents valuable opportunities to integrate research and education, and this Expedition will build on established collaborations with educator-partners in formal and informal arenas to promote interdisciplinary teaching, training, learning and research. Assisting the researchers in this will be three other members of the team: William Herrera, the Educational Coordinator for UCLA Engineering, and consultants Eric Arseneau and Shirley Miranda at UC San Diego, who are experts in science and technology education and outreach at the middle and high school levels.
More details about the project are available at http://www.variability.org.
The Expeditions in Computing program, now in its third year, was established by NSF’s Directorate for Computer and Information Science and Engineering (CISE) to provide the CISE research and education community with the opportunity to pursue ambitious, fundamental research. The grants represent some of the largest single investments currently made by the directorate.