BIG DATA
Hearing Through Your Bones
War is dangerously noisy. Between explosions, gunfire, and jet take-offs and landings, armed forces personnel are frequently exposed to extreme decibel levels. This exposure has consequences: hearing loss is one of the most common medical problems facing soldiers and veterans.
A recent report from the Department of Veterans Affairs found that 58,000 of the 1.3 million soldiers who have served in Iraq and Afghanistan are on disability for hearing loss. And in 2006, the V.A. reportedly spent $539 million on payments to veterans with hearing-related ailments. This number is expected to rise in the coming years.
“People talk about dramatic battlefield injuries, but hearing loss is the number one medical issue for the Air Force,” said Margaret Wismer of the Bioacoustics Research Lab at the University of Illinois Urbana-Champaign. “A lot of people suffer from this problem.”
When we think of hearing, we usually imagine our ear canal. However, sound also travels through the bones of our skulls, and at high noise levels it can be just as damaging. Wismer studies this phenomenon on behalf of the Air Force, using computer simulations run at the Texas Advanced Computing Center (TACC) to understand the nature of bone-conducted hearing.
Movies of a 4 kHz transducer mounted on the right mastoid are available online: Movie 182 depicts the slice at the ear canals, and Movie 175 shows the slice 0.5 cm below the canal. The movies cover a timeframe of 4 ms.
“The Air Force is interested in reducing hearing loss, by building better hearing protection devices (HPDs) for people that work in very noisy environments, which means under airplanes,” Wismer said. “The Air Force wants to know all the pathways by which sound can reach the ear and cause damage to hearing, because even if you block the air pathways, noise can still reach the eardrum through these bones.”
The fact that sound travels through the skull has been recognized for ages. Beethoven, for example, found a way to hear music through his jaw after he became deaf, by biting a rod attached to his piano. Bone-conducted hearing also explains why we sound strange to ourselves on a recording: we’re used to hearing our voice through bone-conducted sound waves; when it comes exclusively through our ear canal, our voice seems distorted.
Despite extensive study, however, the phenomenon is not well understood. “I have whole folders of research from scientists who have researched some questions of how bone conducted sound reaches the ear,” Wismer said. “Does it vibrate the eardrum? Does it vibrate the small bones in the middle ear that are connected to the eardrum? Does it vibrate the cochlear fluid directly? If it goes through the top of your head, what’s the pathway? They are trying to reach a very basic understanding of bone-conducted sound, but even with my simulations, I’m not sure that it’s going to be so easy to resolve.”
Wismer used a CT scan of a skull to create a virtual model with the complex geometries of a real head. Her software simulates the pressure signal of compressed and rarefied air interacting with and traveling as an acoustic wave through the bones of the virtual skull, and eventually reaching the eardrum. By repeating the simulations at different frequencies and at different input points, she obtains a highly detailed picture of what’s actually happening with bone-conducted hearing.
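The article does not describe Wismer's numerical scheme, but the physics she models (a pressure wave launched in air and crossing into a much stiffer medium) can be sketched with a generic one-dimensional finite-difference simulation. Everything below, including the grid size, time step, and boundary handling, is an illustrative assumption rather than her actual code:

```python
import numpy as np

# Illustrative sketch only: a 1D second-order finite-difference model of an
# acoustic pressure wave crossing from air into a denser medium ("bone").
c_air, c_bone = 343.0, 3000.0      # approximate speeds of sound, m/s
nx, nt = 300, 4000                 # grid points, time steps (assumed)
dx = 1e-3                          # 1 mm spatial step
dt = 0.5 * dx / c_bone             # CFL-stable time step for the fastest medium

c = np.full(nx, c_air)
c[nx // 2:] = c_bone               # right half of the domain is "bone"

p = np.zeros(nx)                   # pressure at the current time level
p_prev = np.zeros(nx)              # pressure one time level back

for n in range(nt):
    # Drive the left boundary with a 4 kHz tone, echoing the transducer
    # frequency mentioned for the movies.
    src = np.sin(2 * np.pi * 4000.0 * n * dt)
    lap = np.zeros(nx)
    lap[1:-1] = p[:-2] - 2 * p[1:-1] + p[2:]   # discrete Laplacian
    p_next = 2 * p - p_prev + (c * dt / dx) ** 2 * lap
    p_next[0] = src
    p_next[-1] = 0.0               # pressure-release far boundary (simplification)
    p_prev, p = p, p_next

print(f"peak pressure reached in bone region: {np.abs(p[nx // 2:]).max():.3e}")
```

Even this toy model shows the wave speeding up and partially reflecting at the air-bone interface; Wismer's real simulations do the equivalent in three dimensions over CT-derived geometry, which is why they demand a parallel cluster.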
Her high-resolution, three-dimensional simulations require the parallel processing power of high-performance computing systems like Ranger, the nation’s most powerful academic computing system. “My typical simulation gets into the hundreds of millions of nodes or degrees of freedom,” Wismer said. “You need at least 10 to 20 gigabytes to run a simple, straightforward problem, and to run it over the timeframe that needs to be simulated is only feasible using a high-performance, parallel cluster.”
Experts in the Speech and Hearing Science department at the University of Illinois, with whom Wismer collaborates, conduct similar experiments on human subjects which validate Wismer’s computer program. Both experimental and computational approaches are necessary to understand human hearing, but Wismer’s virtual model has the advantage of showing the impact at a range of locations, frequencies and time-scales that are impossible with live subjects (or even with cadavers, which are frequently used for hearing studies as well).
“I can put a signal in anywhere on the skull and I can measure the signal at the eardrum, but I can also measure it inside the head,” Wismer explained. “I can look at time of flight, pressure levels and frequency shifts for a host of different signals. These are things that my program can do that are very difficult to measure experimentally.”
Margaret Wismer, research scientist at the Bioacoustics Research Lab at the University of Illinois Urbana-Champaign
Wismer is currently studying the output of her simulations on Ranger, looking at intensity plots, watching 2D and 3D movies of how sound waves travel through bone, and exploring how occlusion effects, such as when individuals wear protective earplugs, can actually make bone-conducted sound more damaging. All of these insights feed into the Air Force’s (and industry’s) ultimate goal of creating better hearing protection devices, with less leakage and less bone-conducted sound.
Using Ranger allows Wismer to do more complex simulations faster, and gain insights into this lesser-known hearing process that couldn’t be achieved with laboratory experiments. “I’m not sure you can actually say, ‘This is the pathway and this is what’s happening.’ But I think there are some things my simulation can answer, in terms of whether the eardrum is being affected more than the bone or the cochlear fluid, and what vibrates more, and whether the pressure travels inside the soft tissue or if it stays in the bony part of the solid,” she said. “It’s more of a very basic understanding we’re striving for that could be applied to hearing protection devices.”
But preventing hearing loss is only one reason to study sound transmitted through the skull. The armed forces and commercial companies have already begun to tap the bone pathway for alternative communication devices, so-called ‘bone phones,’ which have the advantage of keeping the ears open while transmitting cleaner sound waves. Such devices will rely on enhanced knowledge of how sound travels through the head: the type of insight that will come from Wismer’s software and the application of high-performance computing systems.
“With the computers going the way they are, I think it’s going to be a lot easier to simulate these phenomena, to visualize them and model more frequencies,” Wismer said. “There’s a segment of people out there who are interested in developing better hearing protection and communication systems, and applying this research is the best way to do that.”
******************************************************************************************************************************
Funding for Wismer's research has come principally from the Air Force Office of Scientific Research (FA9550-06-1-0128). To learn more about this and other related research, visit the University of Illinois Bioacoustics Research Lab homepage.
Aaron Dubrow
Texas Advanced Computing Center
Science and Technology Writer