How Does Human Echolocation Work?
Blind since he was very young, Daniel Kish is the world’s foremost proponent of using vocal clicks to navigate
Daniel Kish, president of World Access for the Blind, developed his own method of generating vocal clicks and using their echoes to identify his surroundings and move about. Ever an advocate for this technique, which he calls “flash sonar,” Kish teaches in small groups or one-on-one in field exercises around the world, with an emphasis on training instructors who can further spread the method. This year Kish collaborated with researchers from six different universities on an in-depth analysis of the practice, published in PLOS Computational Biology.
“You could fill libraries with what we know about the human visual system,” says Kish. “But what we know about human echolocation could barely fill a bookshelf.”
The study sampled thousands of clicks from three different echolocators and examined their consistency, direction, frequency, and more, including describing a 60-degree “cone of perception” that radiates out from the mouth and provides the most detail about a scene.
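The numbers in the study map onto simple physics. Below is a minimal sketch, in Python, of two quantities that underlie any echolocation: how a round-trip echo delay translates into distance, and whether a reflector falls inside a roughly 60-degree cone ahead of the mouth. The 60-degree figure comes from the study as described here and the speed of sound is standard physics; the function names, the 12-millisecond example delay, and the specific angles are illustrative assumptions, not values from the paper.

```python
# A minimal sketch (not code from the study) of two quantities behind
# echolocation: the round-trip delay of an echo, and whether a reflector
# falls inside a roughly 60-degree "cone of perception" ahead of the mouth.
# Only the 60-degree cone width (from the study) and the speed of sound
# (standard physics) are real figures; the rest is illustrative.

SPEED_OF_SOUND = 343.0   # meters per second in air at about 20 C
CONE_WIDTH_DEG = 60.0    # cone of perception described in the study


def distance_from_echo(delay_s: float) -> float:
    """Convert a round-trip echo delay in seconds into distance in meters."""
    # Sound travels out to the surface and back, so divide the path by two.
    return SPEED_OF_SOUND * delay_s / 2.0


def inside_cone(angle_from_facing_deg: float) -> bool:
    """True if a reflector lies within the cone, i.e. within 30 degrees of center."""
    return abs(angle_from_facing_deg) <= CONE_WIDTH_DEG / 2.0


if __name__ == "__main__":
    # An echo returning 12 milliseconds after the click implies a surface about 2 m away.
    print(round(distance_from_echo(0.012), 2))  # 2.06
    # A hedge 20 degrees off the direction you are facing is inside the cone; 45 is not.
    print(inside_cone(20.0))   # True
    print(inside_cone(45.0))   # False
```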
“When people echolocate, it’s not like now they can see again. But echolocation does provide information about the space that’s around people, and that would otherwise not be available without vision. It allows them to orient themselves and so on,” says Lore Thaler, lead author of the paper. “You can think of it as an acoustic flashlight.”
So human echolocation is useful. But what is it like? We caught up with Kish to discuss his unique abilities and how they could be helpful to anyone, as he clicked his way around his neighborhood in Southern California.
Could you describe what you “see”? What do you tell people when you want them to understand what your experience with sonar is like?
We know from other studies that those who use human sonar as a principal means of navigation are activating their visual brain. It’s the visual system that processes all of this, so vision is, in that sense, occurring in the brain.
It’s flashes. You do get a continuous sort of vision, the way you might if you used flashes to light up a darkened scene. It comes into clarity and focus with every flash, a kind of three-dimensional fuzzy geometry. It is in 3D, it has a 3D perspective, and it is a sense of space and spatial relationships. You have a depth of structure, and you have position and dimension. You also have a pretty strong sense of density and texture, which are sort of like the color, if you will, of flash sonar.
It does not possess the kind of high definition detailed precision that vision has. There’s a big difference in size, for example, between sound and light waves. And then there’s a difference in how the nervous system processes auditory information versus visual information, in how information is sent to the brain through the eye, as opposed to the ear. So you are, in some ways, comparing apples to oranges. But they’re both fruit, they’re both edible, there are a lot of similarities between them.
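Kish’s point about the size difference between sound and light waves can be made concrete with a quick back-of-the-envelope calculation. The sketch below assumes a mouth click with most of its energy near 3 kHz, a plausible but assumed figure rather than one Kish cites, and compares its wavelength with that of green light at about 550 nanometers.

```python
# A back-of-the-envelope comparison (my numbers, not Kish's) of the size
# difference between sound and light waves. A click with most of its energy
# near 3 kHz is an assumption for the sketch; green light sits near 550 nm.

SPEED_OF_SOUND = 343.0    # meters per second
SPEED_OF_LIGHT = 3.0e8    # meters per second

click_wavelength = SPEED_OF_SOUND / 3_000     # about 0.11 m for a ~3 kHz click
light_wavelength = SPEED_OF_LIGHT / 5.45e14   # about 550 nm for green light

print(f"click wavelength: {click_wavelength:.3f} m")
print(f"light wavelength: {light_wavelength * 1e9:.0f} nm")
print(f"ratio: roughly {click_wavelength / light_wavelength:,.0f} to 1")
```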
Could you give one or two specifics of something you might perceive within that environment and how it appears?
I’m walking through my neighborhood, on the phone with you. Right now, I’m passing by a neighbor’s house, and she’s got a lot of trees surrounding her house. It’s very treed and hedged and heavily bushed. It’s very fuzzy, it’s kind of soft, it’s kind of wispy. Foliage has a particular effect, a particular signature. It puts out a very specific image. I can tell you that someone has done a lot of work on her yard, because her tree line and hedge line are thinned out. Now I’m aware of the fencing behind the tree line, which I always knew was there, but now it’s much more clear because the tree line is more transparent, acoustically. But you know, I also have one ear to a phone.
Some of it’s really clear and crisp, some of it’s probability, some of it’s context. Some of it, you don’t really care what it is, it’s just there, it’s to be avoided. Some of it I know I’ll recognize if I come back past it again. And then, some of it, I could actually sit and draw for you.
It’s relatively easy for someone using flash sonar to, for example, navigate an obstacle course, even a pretty complex one. You might not necessarily recognize what the objects were that you were navigating, but you could navigate them pretty precisely, and probably fairly quickly.
How did you learn to do this?
My parents really valued my freedom. They didn’t get hung up about the blindness, they were just more concerned about me growing up to be a relatively normal kid, to then emerge into becoming a relatively normal adult, which is to say someone who is able to enjoy the same freedoms and responsibilities as others. I was encouraged to get on with being a child, and being a boy of any given age was much more important to them than the fact that I was blind at any given time. Kids adapt to their conditions very quickly, and the more supported they are in that adaptation, the quicker it will happen. I taught myself to use flash sonar in much the same way that you taught yourself how to see.
How common is it for other blind people to make that journey on their own?
It’s not very common. There’s not a lot of research on that matter, but I would say that it’s less than 10 percent. It’s hard to generalize, because the research is really very scant. The reason isn’t that blind people don’t have the capacity; blind people do have the capacity. It has more to do with social barriers, imposed limitations. There’s nothing inherent about blindness that would keep a person from learning to be mobile, and learning to self-navigate. That’s not an artifact of blindness, it’s really a barrier imposed on blindness.
There are those who are highly capable, who either pulled themselves together or were well prepared, and who are doing very well. Many of those are echolocators—there is a certain correlation between blind people who are self-proclaimed echolocators and mobility and employment. The majority of blind people are caught up in this social construction whereby they are restricted and limited. All you have to do is look at the unemployment rates among blind people, and you have an unemployment rate of upwards of 70 percent. So that’s pretty dire. But unnecessary.
There are those who are opposed to our methods of echolocating, because they feel that blind people echolocate anyway, but the research doesn’t really support that. There are those who feel that the clicking draws negative social attention, and there are those who don’t. It kind of varies all over the map, in terms of how receptive and responsive blind people are.
What does a world look like that’s been built to accommodate or support people who use flash sonar? What goes into that kind of design?
A world that was conducive to the use of flash sonar wouldn’t be so noisy. There wouldn’t be a lot of sound clutter, as there is in today’s modern world. There would be less extraneous reverberation in indoor spaces. We tend not to pay much attention to the amount of reverberation in classrooms, auditoriums, even gymnasiums.
We have to keep in mind that a blind person integrates a lot into their navigation and movement process. It isn’t all about flash sonar; flash sonar is just one component of that. There are a lot of different systems that feed in. For the most part, I think of blindness as adapting to the world, I don’t think too much in terms of the world adapting to blindness. You sort of have to meet it halfway, at least. So yeah, it’d be nice if there was more Braille in public places. Imagine a world without signs. How do you get around in a world without signs? Blind people for the most part don’t have that.
Why was this study something you wanted to be a part of?
The visual system has been studied prodigiously. It has a lot of literature behind it, a huge body of knowledge. You could fill a library with what we know about the human visual system. But what we know about human echolocation could barely fill a bookshelf. And yet, human echolocation is as important to humans who use human echolocation as vision is to people who use vision.
I knew this study, in concert with other studies, would contribute to that knowledge. I, as a teacher, would expect to be able to use that knowledge to refine approaches to instruction, as well as potentially the development of devices or enhancement tools that might help people learn echolocation more quickly or use it more effectively.
What did you feel like you took away from the study, or learned from it?
The parameters of human clicking haven’t been studied that closely or precisely. Now we have a better idea of what those parameters are, and that there are similarities between the three subjects in the sample. Expert echolocators tend to favor certain kinds of signals, which I would describe as pulsed or flashed signals.
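To give a rough sense of what a “pulsed or flashed” signal looks like as audio, here is a small, purely illustrative synthesis of a click as a brief, exponentially decaying tone. The 3 kHz center frequency, 3-millisecond duration, and decay rate are assumptions chosen for the sketch, not the fitted parameters reported in the paper.

```python
# A purely illustrative synthesis of a "pulsed" click as a brief, decaying tone.
# The 3 kHz center frequency, ~3 ms duration, and decay rate are assumptions
# for this sketch, not the fitted parameters reported in the paper.
import math

SAMPLE_RATE = 44_100   # audio samples per second
DURATION_S = 0.003     # a few milliseconds: brief and impulsive
CENTER_HZ = 3_000.0    # energy concentrated in the low kilohertz range
DECAY = 1_500.0        # larger values make the click die away faster


def synthetic_click() -> list[float]:
    """Return one short, exponentially decaying sinusoid: a model 'flash'."""
    n_samples = int(SAMPLE_RATE * DURATION_S)
    return [
        math.exp(-DECAY * i / SAMPLE_RATE)
        * math.sin(2.0 * math.pi * CENTER_HZ * i / SAMPLE_RATE)
        for i in range(n_samples)
    ]


click = synthetic_click()
print(len(click))  # about 132 samples, i.e. roughly 3 ms of audio
```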
The cone of perception is interesting. They were able to get a fairly well-defined sense of what the acuity regions are with flash sonar, and that will contribute to refining the design of teaching protocols, and maybe what to expect from students as they’re learning.
As an educator, what’s your ultimate goal?
Really, it’s just to teach people how to see better. If seeing is perceiving, if seeing is being aware, and if it’s possible to help people who can’t see with their eyes to learn to see more effectively in other ways, why would we not do that? We work very hard to help people see better with their eyes. Why not work very hard to help people see better without their eyes?
This whole thing really boils down to freedom: freedom of movement and personal choice, the ability to use flash sonar effectively to enhance and expand one’s ability to move and navigate comfortably and freely through the environment and through the world. To develop your own relationship with your world in your own way, on your own terms, represents a basic definition of freedom, and to us what this all boils down to is helping individuals find their freedom.