When one listens to music or hears someone speak, the brain must process what it has heard. To be understood, sounds must first be converted to vibrations in the middle ear and then to electrical impulses in the inner ear. These electrical impulses are then relayed to different sites in the brain for interpretation.
The acoustic nerve -- also known as the cochlear nerve -- acts as a busy highway, transmitting electrical data from the inner ear to the brain stem, where the signals are relayed to other parts of the brain. The acoustic nerve also transmits information from the brain stem back to the inner ear. The back-and-forth transfer of information between the inner ear and brain regulates sound processing. This regulation helps filter out background noise and protects the inner ear from damage due to loud noise.
The first stop in relaying sound from the inner ear to the brain is the cochlear nucleus, located in the brain stem. There is one cochlear nucleus for each ear. The cochlear nucleus takes the bundle of electrical signals from the acoustic nerve and separates them from one another. It organizes the signals based on the pitch of the sound and sends the organized set of information to other parts of the brain for interpretation. It also sends feedback information to the inner ear.
The auditory cortex -- located in the temporal lobes of the brain, which sit above the ears -- gives meaning to the large amount of information sent to it by the inner ear and cochlear nucleus. It serves as the brain's sound-interpretation center, and its role is to interpret sounds so they are understood. For example, the auditory cortex allows a person to identify and recognize specific sounds, such as another person's voice, the bark of a dog or a particular musical instrument. It is also responsible for determining where a sound is coming from and how loud it is.
An area of the brain called the prefrontal cortex has a complicated role in processing what is heard. It receives information from the auditory cortex as well as from other sites in the brain and puts all of this information together. For example, during a conversation, the prefrontal cortex receives not only information about what is said but also information about the other person's facial expressions, along with memories and emotions that relate to the conversation. In this way, the prefrontal cortex enables a deeper understanding of the conversation by integrating what is said with how it is expressed and with past experience.