Illustration of a cochlear implant
Image credit: BruceBlaus [CC BY-SA 4.0], via Wikimedia Commons

Today’s article was going to be a pretty straightforward technological exposition. I was going to describe a procedure that can improve hearing in ways that conventional hearing aids cannot, mention some of the limitations and risks involved, and pretty much leave it at that. Then I got an email from a friend wondering if I was planning to cover the political issues cochlear implants raise for the Deaf community. Um…political issues? I hadn’t known there were any. But after a bit of research, I discovered that the controversy surrounding this procedure is at least as interesting as the procedure itself, which has been called everything from a miracle cure to genocide.

Can You Hear Me Now?

First, a bit of background. There are many different types and causes of deafness. Some kinds of hearing loss can be compensated for very adequately with just a bit of amplification—namely, a hearing aid. However, if there is a defect or damage in the inner ear, a hearing aid may do no good. Our perception of sound results from the vibrations of tiny hairs lining the cochlea, a spiral, fluid-filled organ in the inner ear. When the hairs move, the hair cells convert the movement into nerve impulses, which are then sent to the brain for decoding. If the vibrations never reach the cochlea, or if the hair cells themselves are damaged, no neural stimulation occurs and deafness results.

However, if most of the underlying nerve fibers (and the neural pathways to the brain) are intact, they can be stimulated electrically, producing a sensation the brain interprets as sound. A cochlear implant places a series of electrodes inside the cochlea to do just that; a wire connects these electrodes to a small receiver, with its antenna, placed under the skin. Outside the skin, a device that looks somewhat like a hearing aid picks up sounds with a microphone, digitally processes them into signals meaningful to the electrodes, and transmits them via radio waves to the receiver. The net result is the perception of sounds picked up by the microphone, but because this apparatus completely bypasses the eardrum and middle ear, it’s really an artificial ear rather than a hearing aid. The multichannel version of the technology was developed by Dr. Graeme Clark at the University of Melbourne in the 1960s and 1970s; the first such implant was performed in 1978.
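For the technically curious, the processor’s job—turning a microphone signal into electrode signals—is easier to picture with a toy example. The sketch below illustrates the general filter-bank idea behind many processing strategies: split the audio into frequency bands and let each band’s energy set the stimulation level of one electrode. The channel frequencies, frame size, and function names here are illustrative assumptions, not the parameters of any real device.

```python
# Toy sketch of a filter-bank processing strategy (illustrative only):
# each frequency band's energy sets one electrode's stimulation level.
# Channel frequencies and frame size are made-up example values.
import math

FS = 16_000          # sample rate, Hz
FRAME = 400          # samples per analysis frame (25 ms)
CHANNELS = [250, 500, 1000, 2000, 4000]  # hypothetical electrode center freqs

def band_energy(frame, f_target, fs=FS):
    """Energy of `frame` near `f_target`, via the Goertzel algorithm."""
    n = len(frame)
    k = int(0.5 + n * f_target / fs)   # nearest DFT bin
    w = 2 * math.pi * k / n
    coeff = 2 * math.cos(w)
    s_prev = s_prev2 = 0.0
    for x in frame:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev ** 2 + s_prev2 ** 2 - coeff * s_prev * s_prev2

def electrode_levels(frame):
    """Map one audio frame to per-electrode stimulation levels (0..1)."""
    energies = [band_energy(frame, f) for f in CHANNELS]
    peak = max(energies) or 1.0
    return [e / peak for e in energies]

# A pure 1 kHz tone should drive the 1 kHz electrode hardest.
tone = [math.sin(2 * math.pi * 1000 * t / FS) for t in range(FRAME)]
levels = electrode_levels(tone)
loudest = CHANNELS[levels.index(max(levels))]
print(loudest)  # → 1000
```

Real processors add much more—compression to the electrical dynamic range, interleaved pulse timing, per-patient tuning—but the band-splitting step is the heart of why an implant can convey speech at all.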

Although numerous technological innovations have occurred in the decades since, cochlear implants are still by no means perfect. Their effectiveness varies greatly from one recipient to the next, depending on a large number of variables. And the effect they produce, while auditory in nature, is not identical to what a fully functional ear would deliver. In addition, patients with cochlear implants require months or years of training to associate their new perceptions with the sounds they represent. In the most successful cases, implant recipients can eventually understand someone talking on the phone—but there is no guarantee of that level of hearing. Still, tens of thousands of people around the world have received the implants, and the procedure is rapidly gaining in popularity.

You Will All Be Assimilated

To a hearing person such as myself, all this sounds very rosy and optimistic. Of course, the surgery is rather delicate and carries with it the usual risks associated with putting holes in one’s head; plus, the cost of the procedure and rehabilitative therapy is quite high. But these are not the primary concerns of the Deaf community. Although the controversy has diminished greatly in recent years, cochlear implants—particularly for children—were strongly opposed by many deaf people for some time because of a fear that they would destroy the Deaf culture in general and the use of sign language in particular.

On the surface, this argument may seem sort of silly to hearing persons. But the Deaf community has a unique culture and language that they rightly consider quite valuable; the thought of losing such a culture to technology is understandably offensive. One of the key beliefs of the Deaf community is that deafness is simply another perfectly valid way of life, not a problem that needs to be fixed. So the intimation that deafness is a “disease” for which cochlear implants are a “cure” smacks of assimilationism: “You must all be like us.” (The 2000 documentary film Sound and Fury examines the controversy over cochlear implants in detail as it follows members of two families through their decisions about whether or not to undergo the procedure.)

Even detractors of cochlear implants allow that this must be an individual decision, and that implants may be a reasonable choice for people who have lost hearing later in life (and who therefore may not have integrated themselves into the Deaf community). But when it comes to implants for children, the story is different. If a deaf child does not receive an implant, he or she is likely to learn sign language easily and adopt the Deaf culture. With an implant, the child is more likely to be treated as a hearing child. However, the imperfect nature of “hearing” provided by the implants may make it difficult to learn spoken English; meanwhile, because the parents have little incentive to raise the child as a deaf person, the child may never learn sign language. The result is that the child has less ability to communicate than if the implant had not been performed. In addition, if the child has partial hearing, the implant may eliminate any possibility of later using a conventional hearing aid by impeding normal functioning of the cochlea.

On the whole, decades of experience with cochlear implants in thousands of children have not borne out these worries, so resistance to implants in children is decreasing somewhat. Conventional wisdom holds that someone with a cochlear implant is still deaf, and many people with implants—children and adults alike—continue to learn and use sign language, participating actively in the Deaf culture. If cochlear implants, in a roundabout way, can promote both bilingualism and biculturalism, that may be their most compelling advantage.

Note: This is an updated version of an article that originally appeared on Interesting Thing of the Day on October 14, 2004.