Industry-Standard Performance Test from EEMBC Answers 'How Fast Does Your Mobile Device Browse?'

The Embedded Microprocessor Benchmark Consortium (EEMBC) announced that it has launched its standardized, industry-accepted method of evaluating portable connected devices, with a primary focus on Web-browser performance. The EEMBC benchmark, named BrowsingBench, benefits processor vendors, operating-system and browser developers, and system developers by providing an industry-standard, unbiased tool that determines how effectively their hardware and software products process and display Web pages. Unlike other browser benchmarks, BrowsingBench measures the complete user experience, from the click or touch on a URL to the final page rendered on the screen. Furthermore, BrowsingBench measures more than JavaScript execution: it also measures page-rendering speed and factors in the diversity of Internet content.

The EEMBC working group that produced this new benchmark was expertly chaired by Mansoor Chishtie, Chief Technologist of Web Technologies at Texas Instruments Incorporated (TI). "We developed BrowsingBench with a strong focus on real-world behavior – by using real Web site content, connecting the Web server to the test device via WLAN, and introducing client-server latency found on a typical Internet connection," said Mr. Chishtie. "Furthermore, the collaborative effort of our working group members has ensured that BrowsingBench provides an equitable, unbiased, and repeatable test for mobile devices – core capabilities critical to ensuring data can be used by technology providers and customers alike to fairly assess device performance."

BrowsingBench sets up its own client-server network to ensure repeatability and a close-to-real-world broadband profile. BrowsingBench allows the user to modulate the bandwidth and latency of the local server to simulate a variety of wireless and Wi-Fi scenarios, although official test results are generated using a standard 20 ms latency to simulate a broadband profile.
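For illustration only, the sketch below shows one way a test harness could inject a fixed client-server latency in front of a local content server. It is not part of BrowsingBench, and the listen and upstream addresses are hypothetical.

```python
# Illustrative sketch only -- not part of BrowsingBench. Forwards traffic to a
# local web server while delaying each chunk to emulate a fixed 20 ms latency.
import asyncio

LATENCY_S = 0.020                 # 20 ms delay, matching the broadband profile
UPSTREAM = ("127.0.0.1", 8080)    # local content server (hypothetical)
LISTEN = ("0.0.0.0", 8888)        # port the device under test connects to

async def pipe(reader, writer):
    # Copy bytes in one direction, delaying each chunk to emulate latency.
    try:
        while data := await reader.read(65536):
            await asyncio.sleep(LATENCY_S)
            writer.write(data)
            await writer.drain()
    finally:
        writer.close()

async def handle(client_reader, client_writer):
    server_reader, server_writer = await asyncio.open_connection(*UPSTREAM)
    await asyncio.gather(
        pipe(client_reader, server_writer),
        pipe(server_reader, client_writer),
    )

async def main():
    server = await asyncio.start_server(handle, *LISTEN)
    async with server:
        await server.serve_forever()

if __name__ == "__main__":
    asyncio.run(main())
```

A harness built this way can also throttle bandwidth by limiting how many bytes it forwards per interval, which is the same basic idea behind simulating different wireless and Wi-Fi scenarios.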

Another feature of BrowsingBench is that it measures actual browsing performance within a commercial, stand-alone browser, rather than requiring custom applications for each device platform. This approach makes BrowsingBench highly portable and allows testing to be automated. With BrowsingBench, a user can compare browsing performance on a level playing field across different hardware platforms running a wide variety of browser implementations.

EEMBC's director of technology, Shay Gal-On, provided a large portion of the detailed engineering work on BrowsingBench. "Creating a benchmark for mobile platforms requires more than just loading a series of Web pages, as you must carefully control caching effects, check for page-rendering compliance, and perform tasks such as page scrolling to ensure 100% rendering," said Gal-On. "In addition, the exhaustive testing by working-group members has helped ensure that BrowsingBench provides a rock-solid and reliable testing environment."
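As an illustration of the kind of steps Gal-On describes, the sketch below loads a page from a cold cache, scrolls to force full rendering, and reads the load interval from the browser's own Navigation Timing data. It is not EEMBC's harness; the use of Selenium with Chrome and the test URL are assumptions made for the example.

```python
# Illustrative sketch only -- not the EEMBC harness. Demonstrates cold-cache
# page loading, scrolling to force full rendering, and reading the browser's
# own timing of the load interval.
import time
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

URL = "http://127.0.0.1:8888/index.html"   # hypothetical local test page

options = Options()
options.add_argument("--incognito")        # fresh profile, so no warm cache
driver = webdriver.Chrome(options=options)

try:
    start = time.monotonic()
    driver.get(URL)                        # returns once the load event fires

    # Scroll to the bottom so lazily rendered content is actually drawn.
    driver.execute_script("window.scrollTo(0, document.body.scrollHeight);")

    # Navigation Timing gives the browser's view of the page-load interval.
    timing_ms = driver.execute_script(
        "var t = window.performance.timing;"
        "return t.loadEventEnd - t.navigationStart;"
    )
    print(f"wall-clock load: {time.monotonic() - start:.3f} s, "
          f"navigation timing: {timing_ms} ms")
finally:
    driver.quit()
```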

"Consumers will appreciate BrowsingBench because it will encourage manufacturers to test their systems and deliver high-performance connected devices that give good results on common Web-browser tasks," said Markus Levy, EEMBC's president. "As an industry association, we also encourage manufacturers to join the working group to help define BrowsingBench 2.0."