The “Sixth” Sense: Understanding the Mechanistic and Biological Properties Associated with Human Echolocation
Department of Biology
Lake Forest College
Lake Forest, Illinois 60045
It is a common conception that vision is the most necessary sense for understanding the world and living a fully functioning life. In current times, however, being blind is not synonymous with relying on others (Thaler & Goodale 2016). Human echolocation provides a way for the blind to gain information about objects in the environment. Whether the sound is produced by the person or by an outside source, the echo that bounces back from an object reveals a great deal about the surroundings. “Decoding” echoes may be an easier process for the blind because of heightened auditory sensitivity (Nilsson & Schenkman 2016), which may arise from sensory plasticity (Stevens 2013). Echolocation may also be possible in people who are blind because its mechanistic and neurological properties are related to sight. Although not fully understood, MRI scans have shown that brain regions typically used for sight are stimulated during echolocation in the blind (Milne, Arnott, Kish, Goodale & Thaler 2014). Specific physical movements that blind echolocators perform, such as rotating the head to detect echoes, are also hypothesized to be tied to vision (Milne, Goodale & Thaler 2014). Further research can lead to new technology and assist therapists in helping the blind live as autonomously as possible.
Introduction: Organisms use a wide variety of sensory modalities to adapt to their environment, survive, and pass on their genes to future generations. Although touch, hearing, sight, smell, and taste are certainly the most well-known senses, animals can perceive the outside world in other ways based on their neurological and biological make-up. To recognize the position of other organisms or objects, some animals use echolocation. For echolocation to be successful, an animal produces a sound and uses the modifications in the returning echo to determine the location and position of a designated target (Stevens 2013).
Bats commonly use echolocation to catch insects (Stevens 2013). The call a bat emits before listening for the echo reflected from an object is not only brief but also extremely high in frequency, ranging from 11 to 212 kilohertz (Stevens 2013). While a bat can produce this call with its nose or mouth, its pinna, which is both large and exposed to the outside environment, picks up the returning echoes (Stevens 2013). One key feature of bat echolocation is that the emitted call is high in intensity so that the corresponding echo can be heard with the greatest accuracy (Stevens 2013). Specific brain areas, such as the auditory cortex and midbrain, appear to be activated depending on the echo of a target of a particular width (Stevens 2013). In certain bat species, areas within the brain, including the auditory cortex, are also involved in deciphering the time elapsed between the emitted sound and its echo (Stevens 2013). It is clear that multiple brain areas are utilized in bats for particular components of echolocation. Although many are aware that bats are not the only organisms to use echolocation, human echolocation is far less well-known and discussed.
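The ranging principle underlying echo-delay processing can be illustrated with simple arithmetic: the round-trip delay between an emitted pulse and its echo, multiplied by the speed of sound and halved, gives the distance to the reflecting target. The sketch below is illustrative only and is not drawn from the cited studies.

```python
# Illustrative echo-ranging sketch (not from the cited studies):
# distance = (speed of sound x round-trip echo delay) / 2

SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at 20 C


def target_distance(echo_delay_s: float) -> float:
    """Distance to a reflecting object, given the round-trip echo delay in seconds."""
    return SPEED_OF_SOUND_M_S * echo_delay_s / 2.0


# A 10-millisecond round-trip delay corresponds to a target about 1.7 m away.
print(round(target_distance(0.010), 3))  # 1.715
```

The halving reflects the fact that the sound travels to the target and back, so the echo delay covers twice the distance.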
For humans who lack the sense of sight, echolocation is extremely useful because it allows people to understand where objects are placed and gives them the ability to distinguish objects based on their characteristics (Thaler & Goodale 2016). Because of echolocation, blind people do not have to rely on others to paint a picture of the world around them; despite visual impairments, echolocation gives people the ability to navigate their environment autonomously (Thaler & Goodale 2016). It was not until the second half of the 20th century that the relationship between man-made calls, echoes, and the interpretation of those echoes was studied in regard to human echolocation (Thaler & Goodale 2016). While humans can use a number of techniques to make a noise and listen for the returning echo, using the tongue to produce a click-like noise has attracted significant attention, allowing humans to reach maximum auditory frequencies between three and eight kilohertz (Thaler & Goodale 2016). For people who are blind, creating a noise and listening to the echo stimulates neurological activation that differs from that of people who have full function of all their senses. In fact, recent evidence supports the idea that the blind use visual areas of the cerebrum during echolocation (Koning 2014). Moreover, during echolocation, different parts of the brain are utilized for specific tasks (Milne, Arnott, Kish, Goodale, & Thaler 2014). Unfortunately, not everyone who is blind is able to use echolocation effectively. Unlike humans, bats echolocate by instinct because of the evolution of specific bodily structures that make this modality advantageous to them (Thaler & Goodale 2016). Humans who are blind must learn to echolocate, which limits its use to only a small portion of people (Thaler & Goodale 2016).
Because of its novelty and complexity, human echolocation is not fully understood. However, researchers do believe that human echolocation in the blind provides insight into the plasticity of the cerebrum (Thaler & Goodale 2016). Human echolocation utilizes different brain regions depending on the unique qualities of the object that produces an echo. Regions of the brain associated with hearing and sight are activated in the blind while using echolocation (Milne et al. 2014), which may explain why people who are blind are more skilled in echolocation and have a lower threshold for detecting echoes (Kolarik, Scarfe, Moore, & Pardhan 2017). Despite the different ways of obtaining information about the environment, human echolocation and sight are alike in that both show similar properties in regard to understanding object characteristics (Thaler & Goodale 2016). But, unlike vision, which has been studied for centuries, human echolocation requires much more in-depth research to identify which parts of the brain handle its different aspects (Milne et al. 2014). Answering the question of how human echolocation in the blind works is the first step toward a well-rounded explanation of the neurological regions and mechanisms that make this sensory modality successful in those who are blind.
Methods and results: To test the idea that echolocation in the blind works in a similar fashion to visual mechanisms, Buckingham, Milne, Byrne & Goodale (2015) examined whether blind echolocators would judge a small object to feel heavier than a large object of identical weight. This incorrect interpretation, the size-weight illusion, is a common misconception among sighted people and shows that the way an object looks can have an immense impact on how heavy it is perceived to be (Buckingham et al. 2015). The study's three groups, sighted people, blind people who did not use echolocation, and blind echolocators, each gained a sense of how much each cube weighed by pulling on a wire attached to each of six cubes (Buckingham et al. 2015). Three of the cubes weighed an equally small amount, while the other three weighed an equally large amount, and the cubes within each weight group differed in size from one another (Buckingham et al. 2015). The participants used various techniques, such as briefly viewing the cubes for those who could see and creating echoes for those who were blind, to infer each cube's size (Buckingham et al. 2015). The results showed that blind echolocators judged smaller objects to weigh less than larger objects of the same weight (Buckingham et al. 2015). The blind echolocators showed this bias significantly more than both the blind non-echolocators and the sighted participants (Buckingham et al. 2015). This shows that sight is not always necessary to detect properties of the environment that are typically connected to visual cues. Those who could not see fell for a well-known illusion of the eyes despite their lack of eyesight.
This study supports the idea that echolocation in the blind is linked to the visual processing used by sighted people: echolocation appears to operate in a way similar to vision.
Specific visual areas of the brain are stimulated in the blind during echolocation, which may explain why these people fall victim to illusions associated with visual perception. To understand how human echolocation works from a neurological perspective, three blind participants, three sighted participants, and three blind echolocation experts took part in a study analyzing differences in brain activation when hearing echoes (Milne et al. 2014). In this study, the blind participants skilled in echolocation created a noise to produce an echo in front of different materials and then in a vacant space, and the sounds were recorded; the control groups, those not experienced in echolocation, received training to decode the various echoes (Milne et al. 2014). Although the blind echolocation experts performed best at correctly identifying the properties of the object an echo bounced off of, all of the participants performed above chance (Milne et al. 2014). Neurological scans comparing listening to echoes with no sound at all found that Heschl's gyrus, which has an auditory function, was stimulated in all nine participants when hearing an echo (Milne et al. 2014). However, a more significant neurological finding showed that the blind participants who normally used echolocation displayed activity in the left section of the parahippocampal cortex, which typically functions in interpreting stimuli involving hearing and vision (Milne et al. 2014). The parahippocampal cortex is usually used in recognizing a specific environment, or the properties of specific objects (Milne et al. 2014). This study, as well as previous research, suggests that echolocation in the blind works by stimulating regions of the brain that have a visual connection.
Although echolocation in the blind stimulates the parahippocampal cortex when detecting specific characteristics of objects in the environment, different areas of the brain are utilized when a blind person uses echolocation to judge how large a room is (Flanagin, Schörnich, Schranner, Hummel, Wallmeier, Wahlberg & Wiegrebe 2017). Not only did personally generating noises to hear echoes lead to greater accuracy in judging a room's size than simply listening to a noise that produced an echo, but Flanagin et al. (2017) also found a large difference in the parts of the brain utilized when personally creating a noise for echolocation. In a blind person who typically uses echolocation, there was stimulation in multiple areas of the cerebrum, such as the middle occipital gyrus and the occipital cortex; both of these regions function in processing sight (Flanagin et al. 2017). The parietal cortex, which was also activated, as well as the middle occipital gyrus, are used to understand space in the environment (Flanagin et al. 2017). Multiple studies have indicated that echolocation in the blind works by stimulating parts of the brain associated with processing sight. It is also clear that different parts of the brain are active depending on the echolocation task. All of the various brain regions and their specific functions in regard to echolocation, though, need to be further analyzed.
In contrast to findings from Flanagin et al. (2017), a study conducted by Thaler & Castillo-Serrano (2016) found that echolocation is just as effective whether the noise that produces the echo comes from an external or an internal source. These results came from a study consisting of participants who were blind and skilled in echolocation and people who could normally see but had their vision momentarily blocked. Both groups of participants had to judge the placement of an object either by personally creating a noise to generate an echo or by listening to a noise, produced by an external speaker, that created an echo (Thaler & Castillo-Serrano 2016). In contrast to the study by Flanagin et al. (2017), this study shows that making a noise with the tongue to create an echo may not be necessary for echolocation to work (Thaler & Castillo-Serrano 2016). Although the study conducted by Thaler et al. (2013) reports that echolocation can be utilized both by the blind and by the sighted, it also seems to reject the idea that echolocation works only by making one's own tongue clicks and hearing an echo to accurately understand the environment. This concept is further supported by an experiment in which blind echolocators and sighted participants heard different echo noises and had to infer whether the surface the echo bounced off of was still or in motion (Thaler, Milne, Arnott, Kish, & Goodale 2013). The source of the click noise that created the echo made no significant difference in the participants' inferences (Thaler et al. 2013). Although there are restrictions on hearing a noise that creates an echo (Koning 2014), whether a click comes from an outside medium or is personally created does not seem to play a large role in human echolocation. Echolocation can work by creating a noise, or by hearing a noise, that leads to an echo.
Whether the noise that generates an echo comes from the individual or from an external location, heightened auditory function in the blind plays a large role in making echolocation effective. Kolarik, Scarfe, Moore & Pardhan (2017) conducted a study in which blind people, sighted people temporarily deprived of vision, and blind echolocators took part in specific motor tasks. When participants were able to make noise, those who were blind were better at the motor tasks than those who were briefly not allowed to see (Kolarik et al. 2017). However, there was no significant difference in motor task performance between the two groups of blind participants, echolocators and non-echolocators, when both were able to make noise (Kolarik et al. 2017). This study showed that echolocation may be most effective in the blind because other senses become more sensitive when one is inactive. It also supports the well-known idea that blind people have keener hearing than those with their sight intact. Those who are blind seem to have an advantage over those who can see when it comes to echolocation, which may stem from the heightened responses of their other senses. This idea is also shown in a study conducted by Nilsson & Schenkman (2016), in which differences in hearing between the two ears were measured in blind and sighted people. The study found that when participants listened to different clicks in both ears through headphones, blind participants were better able to hear differences in volume between the noises coming into the left and right ear (Nilsson & Schenkman 2016). In relation to these results, researchers have concluded that people who are blind lose acoustic processing more slowly than those with all their senses intact (Nilsson & Schenkman 2016). Echolocation seems to work best in the blind because of the increased intensity of the auditory sense.
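The binaural cue measured by Nilsson & Schenkman (2016), the difference in sound level between the two ears (the interaural level difference, or ILD), can be sketched as a decibel comparison of the signals arriving at each ear. The function names and sample values below are illustrative assumptions, not taken from the study.

```python
import math


def rms(samples):
    """Root-mean-square amplitude of a mono signal."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))


def interaural_level_difference_db(left, right):
    """Level difference between the ears in decibels.

    Positive values mean the left ear receives the louder signal.
    """
    return 20.0 * math.log10(rms(left) / rms(right))


# A click arriving at the left ear with twice the amplitude of the
# right ear yields an ILD of about 6 dB.
left = [0.8, -0.8, 0.8, -0.8]
right = [0.4, -0.4, 0.4, -0.4]
print(round(interaural_level_difference_db(left, right), 2))  # 6.02
```

The study's finding can be read in these terms: blind listeners detected smaller ILD values than sighted listeners, i.e., they could distinguish left/right level differences closer to 0 dB.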
An increased sense of hearing is not the only mechanism needed for echolocation in the blind to be effective. Milne, Goodale & Thaler (2014) tested the importance of changing the direction of the head during echolocation. Three groups, blind people who do not use echolocation, blind echolocators, and sighted people who were blindfolded, tried to identify the specific shapes of various objects (Milne et al. 2014). The movement of the participants' heads was manipulated in the experiment; when the head could be moved, blind echolocators were more accurate in identifying an object's shape than blind participants who do not use echolocation and blindfolded participants (Milne et al. 2014). But the blind echolocators' accuracy dropped significantly when they were not able to move their heads (Milne et al. 2014). Milne et al. (2014) concluded that, in blind echolocators, moving the head to detect stimuli in the environment may be similar to sighted people moving their eyes when viewing something large or detailed. This study further supports the idea that echolocation is similar to sight, and it shows that moving the head is critical for precision in echolocation by the blind. In addition to the stimulation of visual and auditory brain areas, specific mechanisms, such as moving the head toward a noise, seem to be critical for echolocation to work.
Discussion: In modern times, advancements in technology are at an all-time high. Understanding the way echolocation works is critical for biomedical engineering, as proposed by Thaler and Castillo-Serrano (2016). It is now obvious that keen hearing is not the only way in which people who are blind are able to sense the world around them. Learning more about the neurological and behavioral functions behind echolocation will allow for technology tailored to each blind echolocator and their needs. Even though many use echolocation without assistance, taking advantage of an external appliance may improve the performance of blind echolocators (Thaler & Castillo-Serrano 2016). Although not relevant to every study or circumstance, blind echolocators have been seen to benefit from an outside source that provides a noise from which to hear an echo. In the study conducted by Thaler and Castillo-Serrano (2016), for example, blind echolocators showed as much success while hearing speaker-generated clicks that produced echoes as they did while directly creating a noise themselves. This evidence opens a wide range of possibilities for biomedical technology to increase the accuracy and efficiency of echolocation.
For someone who is blind, echolocation may seem like an extremely daunting task. The difficulty stems not only from making a specific noise but also from interpreting the echo that follows (Koning 2014). Some people pick it up faster and use it more effectively than others. Understanding how echolocation in humans works would be beneficial for therapy purposes. If an occupational therapist knows the characteristics associated with echolocation, for example, they would be more skilled in helping those who are blind learn how to best adapt to their environment. Currently, no research has shown a specific way to teach echolocation (Koning 2014). However, due to sensory plasticity, blind people have increased sensitivity in their other senses, which may make picking up echolocation an easier task (Stevens 2013). Sensing the world around us seems to be critical for survival and for an improved quality of life; echolocation seems to be a perfect outlet for those who are blind to adapt to the world around them. Therapists who understand echolocation can provide blind patients with the skills and experience needed to live a fully functional life.
Although major advancements have been made in regard to human echolocation, there are still major gaps in knowledge about how it specifically works (Milne et al. 2014). More research into which specific brain regions are linked to different object characteristics would be critical for gaining a well-rounded understanding of this fairly new sensory modality in humans. It would also be interesting to study the increased sensitivity of senses besides hearing in the blind when vision is lost. Smell, taste, and touch may strengthen, which could be an adaptive trait in the blind for survival purposes. Future research can also delve into the effects of the age of onset of blindness on echolocation. Previously conducted studies indicate that stimulation of vision-linked brain areas during echolocation is apparent only in those who became blind at a young age (Koning 2014). Because of the incomplete information on brain area stimulation during echolocation, more research can unveil undiscovered information in this field (Milne et al. 2014).
Although human echolocation is a relatively new topic, it is now clear that animals like bats are not the only organisms to use echoes to their advantage. It is critical for a species to be able to make sense of its environment for survival reasons; echolocation facilitates the identification of surrounding stimuli for those lacking the sense of sight. The brain clearly plays a large role in making echolocation possible in the blind, and its ability to “adjust” in the blind is remarkable. The stimulation of areas associated with vision in the blind during echolocation shows the plasticity of the human brain (Thaler & Goodale 2016).
It is clear that current research involving blind echolocators draws a connection between how vision and echolocation work. The proposal by Milne et al. (2014) that moving the head during echolocation in the blind may be equivalent to a sighted person moving their eyes to take in all parts of the environment is one clear association between the two sensory modalities. That both blind echolocators and people with full vision misjudged an object's weight based on its apparent size in the study by Buckingham et al. (2015) is another clear sign that vision and echolocation have commonalities. However, the stimulation of brain areas typically associated with sight in blind echolocators is the most clear-cut evidence that sight and echolocation in the blind share similar mechanistic properties. Regardless of where the noise that produces the echo comes from, human echolocation seems to work in a very similar way to sight.
Buckingham, G., Milne, J. L., Byrne, C. M., & Goodale, M. A. (2015). The size-weight illusion induced through human echolocation. Psychological Science, 26(2), 237-.
Flanagin, V. L., Schörnich, S., Schranner, M., Hummel, N., Wallmeier, L., Wahlberg, M.,
… & Wiegrebe, L. (2017). Human exploration of enclosed spaces through echolocation. Journal of Neuroscience, 37(6), 1614-1627.
Kolarik, A. J., Scarfe, A. C., Moore, B. C., & Pardhan, S. (2017). Blindness enhances auditory obstacle circumvention: Assessing echolocation, sensory substitution, and visual-based navigation. PLoS ONE, 12(4), e0175750.
Koning, N. A. (2014). Human echolocation: How the blind and visually impaired can “see” with their ears (Master’s thesis).
Milne, J. L., Goodale, M. A., & Thaler, L. (2014). The role of head movements in the
discrimination of 2-D shape by blind echolocation experts. Attention, Perception,
& Psychophysics, 76(6), 1828-1837.
Milne, J. L., Arnott, S. R., Kish, D., Goodale, M. A., & Thaler, L. (2014). Parahippocampal cortex is involved in material processing via echoes in blind echolocation experts. Vision Research.
Nilsson, M. E., & Schenkman, B. N. (2016). Blind people are more sensitive than sighted people to binaural sound-location cues, particularly inter-aural level differences. Hearing Research, 332, 223-232.
Stevens, M. (2013). Sensory ecology, behaviour, and evolution. Oxford, United Kingdom: Oxford University Press.
Thaler, L., & Castillo-Serrano, J. (2016). People’s ability to detect objects using click-based echolocation: a direct comparison between mouth-clicks and clicks made by a loudspeaker. PLoS ONE, 11(5), e0154868.
Thaler, L., & Goodale, M. A. (2016). Echolocation in humans: an overview. Wiley
Interdisciplinary Reviews: Cognitive Science, 7(6), 382-393.
Thaler, L., Milne, J. L., Arnott, S. R., Kish, D., & Goodale, M. A. (2013). Neural correlates of
motion processing through echolocation, source hearing, and vision in blind
echolocation experts and sighted echolocation novices. Journal of Neurophysiology, 111(1), 112-127.
Eukaryon is published by students at Lake Forest College, who are solely responsible for its content. The views expressed in Eukaryon do not necessarily reflect those of the College.
Articles published within Eukaryon should not be cited in bibliographies. Material contained herein should be treated as personal communication and should be cited as such only with the consent of the author.