Did Ancient Humans Use Echolocation? (atlasobscura.com)
69 points by Hooke on Nov 8, 2022 | hide | past | favorite | 68 comments


I can partially echo locate. With eyes closed and a little click I can "see" what shape of room I'm in and detect open windows or doors. It doesn't come up much. I've never thought about trying to train it, now I'm really curious to do so.

On the other hand, I can also smell people's scent trails. This comes up all the time, knowing who's in a room before I walk in, or who recently left it. This can be super annoying, like if someone smells really bad or doesn't brush their teeth. It can provide a big distraction when I'm trying to focus. Places like coffeeshops with big overwhelming smells help with this.

Yesterday at the gym, I walked through a scent trail so strong I almost gagged. The main lifting area had two dudes who both smelled terrible, and I could smell a third who'd just left. Thankfully the two moved on after ten minutes, but those ten minutes were rough.

I think this happened because I needed glasses very young but wasn't diagnosed for a year. I was able to keep reading, but wasn't in any sports or anything where it would matter. I think my brain just learned to compensate by paying more attention to my other senses.


My SO has a similar sense of smell and given her family's situation it's very likely that she went without a prescription for too long in childhood as well.

To me it's like living on a different plane of existence. Among other things she (almost) never gets food poisoning, even though she's no stranger to eating things I wouldn't touch.


I did not need glasses as a kid, but I can still smell things way before others do. As you describe it, I can smell the edges of scent trails as they fade and strengthen. I now understand why dogs sweep back and forth: they are trying to find the edges, which tells them the direction of the trail.


"Scent trails", I know exactly what you mean. I also have a very strong sense of smell. My friends call it my "dog nose". It was much more sensitive as a kid (I think), but still pretty good. I can follow people through buildings based on their smell (be it cologne or pungent b.o.). I can also identify people based on scent alone. Everyone has a very unique aroma. As a kid in school, I knew who had milk for breakfast. It was a very sour smell emanating off them -- probably why I find plain whole milk disgusting to this day.

Funny story, I found one of my favorite bars using this skill. I had just moved somewhere new and it was super late; everything was closed, but I was starving. I smelled tacos in the air. No idea where it was coming from, so I just followed the smell. After winding through a few small alleys and up a few flights of stairs, I ended up at a small late-night bar. The owner was making taco rice.

My hearing is also quite good for my age. I still get bugged by those anti-teen/mosquito buzzers, and I can still hear whether an electronic device is on by the high-frequency hum it emits. I attribute this to having actively taken care of my ears since childhood.

On the other hand, my vision is shit. Without contacts/glasses, the world is just a blur of color. I didn't know I had terrible vision until I got my first pair of glasses in middle school.


One of the first things I did as a professional programmer was make a game for blind people where you used audio for navigation. The game was developed in cooperation with a school for blind kids, and their use of echolocation to various degrees was quite prevalent. A common use case was identifying the direction of open doors when passing by. Some were quite reliant on it, but skill levels varied a lot.



I wonder what this says about the brain, and how it could be applied to AI. It seems to me like we have an innate knowledge that we exist in a 3D space, and that this model can somehow pull data from multiple sensory modalities: at least vision, hearing, and touch, oh, and the vestibular system (our accelerometer). Maybe not smell? (It's both very dissimilar from the others and wired up differently, not passing through the thalamus, though that may be a red herring.)

Edit: Another neat factoid: there were experiments where subjects wore glasses that turned the world upside down or mirrored left and right. After about a week, people could function normally while wearing them. The brain knows what the world is supposed to look like and is able to figure out the correct transform, and once it's applied, previously learned motor skills are once again fully available. This shows that there is some kind of stable higher-level representation that sight feeds into in a flexible way.


Right - I use echolocation (although rarely) and I can't work out why it's contentious to think that humans do.


Isn't it just a matter of listening to the ambient noise in your environment? Open space sounds different from a closed room. If there is any noise around, you can normally tell if you are near a wall just because one ear is getting a close echo from the wall and the other isn't.

I think most people use sound all the time to orient themselves, especially in the dark. It's just that we are so fixated on what we see that using hearing to tell what is around you gets overloaded by the higher resolution information from our eyes.

I think it would be very interesting to see good research done with blind people to see how they use their senses. Is anybody aware of such research?


Well, you could look at e.g. Daniel Kish (as mentioned above).

https://en.wikipedia.org/wiki/Daniel_Kish

"Kish's work has inspired a number of scientific studies related to human echolocation. In a 2009 study at the University of Alcalá in Madrid, Spain, ten sighted subjects were taught basic navigation skills within a few days. The study aimed to analyze various sounds which can be used to echolocate and evaluate which were most effective.[5][6] In another study, MRI brain scans were taken of Kish and another echolocation expert to identify the parts of the brain involved in echolocation, with readings suggesting "that brain structures that process visual information in sighted people process echo information in blind echolocation experts."

https://visioneers.org/daniel-kish/


I use echolocation myself,

> Isn't it just a matter of listening to the ambient noise in your environment. open space sounds different to a closed room.

Yes. Absent light and suitable ambient noise, you make the noise yourself. This is instinctive to me, so I would assume other people could do it, too.


> I can't work out why it's contentious

Echolocation is contentious because people assume it's to do with timing.

Humans can't time the echo in a room.

My understanding is that human echolocation works by using the head as a damper between the ears, getting direction and distance by comparing what the two ears hear (loudness and tone), not by timing.

You can see why people who think it works by timing, like classic submarine sonar, would assume it can't work for humans. You learn the speed of sound at a young age, and people know the brain can't resolve delays that short.

They might assume that in a dark room people are using ambient noise and ambient echoes, not echolocation. And in most articles, the visually impaired people using echolocation are not 100% blind, so you can't be sure what mix is going on.

Most articles aren't explaining this well.


Ah! That makes sense, and was not what I was expecting at all. Thank you for explaining!


There is a strong materialist bias to modern thinking that precludes thoughts about any sort of sensory perception beyond the most basic faculties of our five sense organs.


There are many modern thinkers who don't just think about our other senses but make their careers on studying them through rigorous (even materialistic) experiment, e.g. balance, spatial orientation, proprioception, nociception etc. The notion that there are only five senses (let alone five sense organs) is well established to be false.


Yes, but I didn't say there are no people who think about these things, just that there is a strong materialist bias in modern thinking against anything beyond the five senses. Progress is being made but we still have a long way to go.


Fascinating.

I find it hard to imagine what learning such a skill entails.


If you can make a clicking noise with the side of your tongue, you can easily sense the volume of space you’re in, how close to a wall, etc.


I’ve read about this, but I’ve never understood what that means. I can click my tongue but I can never hear any kind of echo. Is it a very particular click?


You will not distinguish an echo, just a change in the "tone" of the sound you make as it reverberates in your surroundings.

The echo pulses are there but just milliseconds after the sound you made. It will all blur into a softer or harder reverb depending on your surroundings.

To hear an actual separate echo pulse, it would need to be 20-100 milliseconds later than the original sound. That corresponds to a large hall tens of meters in size.
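As a quick sanity check on those numbers, here is the round-trip arithmetic, assuming the textbook speed of sound in air (~343 m/s):

```python
# Round-trip echo delay for a reflecting surface at a given distance.
C = 343.0  # speed of sound in air at ~20 degrees C, in m/s

def echo_delay_ms(distance_m: float) -> float:
    """Time for a click to reach a surface and come back, in ms."""
    return 2 * distance_m / C * 1000

for d in [1.0, 3.4, 17.0]:
    print(f"surface at {d:>4} m -> echo after {echo_delay_ms(d):5.1f} ms")
```

A wall 1 m away returns its echo in under 6 ms, well inside the blur; you need a surface roughly 17 m away before the echo arrives ~99 ms later and can be heard as a separate pulse.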


Doing a loud clap with your palms aligned and as flat as possible produces an adequate sound.


Sharp and short, alternating left and right at about 4 Hz, on the side of the tongue back by the molars. Feels like pulling the side of the tongue away from the teeth. It’s a much clickier click than a front of the tongue ‘click’.


See the videos in my other comment. Daniel Kish explains how he does it.


I have practiced echolocation for fun. Getting some basic awareness from your surroundings is not that difficult, like detecting if there is a wall on your left or right side.

The difference in the sound response is subtle but clear. It's like tweaking the knobs of a reverb effect in audio processing software.

It is not about detecting discrete echo pulses like a radar or a sonar does, it's about sensing a change in the "tone" of the sound you make.

In signal processing terms you're "measuring" the impulse response of the room around you.
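A toy sketch of that "impulse response" framing (the sample rate, decay times, and noise model here are all invented for illustration): model two rooms as exponentially decaying noise, convolve the same click with each, and compare how long the reverb tail rings.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 44_100  # sample rate in Hz (assumed)

# Stand-in for a tongue click: a 2 ms windowed noise burst.
n_click = int(0.002 * fs)
click = rng.standard_normal(n_click) * np.hanning(n_click)

def toy_room_ir(rt60_s: float, dur_s: float = 0.5) -> np.ndarray:
    """Toy room impulse response: white noise under an exponential
    decay. rt60_s is the time for the reverb to fall by 60 dB."""
    t = np.arange(int(dur_s * fs)) / fs
    return rng.standard_normal(t.size) * 10 ** (-3 * t / rt60_s)

# "Playing" the click in a room = convolving it with the room's IR.
small_room = np.convolve(click, toy_room_ir(0.3))  # fast decay
large_hall = np.convolve(click, toy_room_ir(2.0))  # slow decay

def tail_energy(x: np.ndarray, after_s: float = 0.1) -> float:
    """Energy remaining in the signal after the first 100 ms."""
    return float(np.sum(x[int(after_s * fs):] ** 2))

# The hall keeps ringing long after the small room has gone quiet;
# that lingering energy is the "tone" change you hear.
print(tail_energy(large_hall) > tail_energy(small_room))
```

The same click sounds "softer" or "harder" purely because of what it is convolved with, which is the sense in which clicking "measures" the room.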


Neural plasticity goes a long way.


Yeah, I find this almost unremarkable compared to the wide variety of other things humans already do. I think it's pretty obvious we don't have any specialized hardware for echo location like bats or dolphins do, but our brains are pretty big, and our hearing is actually quite good. We spend so much time talking about how this or that animal can hear ultrasonic sounds or whatever that we often end up with the idea that our hearing isn't very good, but even our base hardware is actually fairly good by animal kingdom standards, and it's hooked up to one heck of a brain on the backend. It's not surprising to me that we can bash together some echolocation with just the hardware we have and software we can learn.


People still use a limited form of echo navigation. Hearing sounds can give you some information about a space - compare listening to music in a room versus headphones, or the deeply unsettling feeling of a sound-dampened phonebooth.


Which is why I don't like headphones. Move your head and the sound doesn't move. Can't locate anything that way.


Spatial audio cues in videogames work just fine for me with headphones!

When I play Valorant and am trying to be sneaky, I will tune out whatever my teammates are saying to listen for footsteps, or other audio cues that might indicate some activity. Using my mouse to rotate my view changes how the sounds come in to my headphones, and allows for some amount of directional awareness.

This is geared towards using shotguns to surprise people, by jumping out and blasting them before they have much time to react. Unfortunately, I'm not perfect at this, and swing out early about 1/3rd to 3/8ths of the time.


In games that support head tracking and spatial sound (e.g. Arma 3), it's incredible the difference it makes to be able to move your head around to listen to things; spatial awareness increases tenfold.

I wonder why nobody has tried the same with audio players. Granted, you have to have your head in front of a webcam, so it is uniquely well suited to gamers in front of a PC rather than audiophiles on a sofa, but still.


Apple Music supports spatial audio for Dolby Atmos songs with Airpods Pro and Max. It is quite a different experience.


Ironically that's why I do like headphones. They help tune out environmental noise which would otherwise be distracting.


I've always wondered whether, if I blindfolded myself for a few days and clicked a lot, I'd pick up echolocation. The key, though, is that for it to be useful I'd have to be able to take off the blindfold and retain the skill.


Vision trumps sound in the human brain. See the McGurk effect.


This is fascinating. I just watched the BBC video on YouTube about it.


Do you spend enough time in pitch black (or otherwise without sight) for it to be useful? If so wouldn't you retain it once learned?


Blind man here. "for a few days"??? Seriously? This is like saying "If I only would wear a bra for a week, I could give birth to a child!"

I know blind people who wish their echolocation skills were better.

What I am trying to say is: expect months or years of training, not days. Besides, it is not only blind people who have excellent directional hearing; conductors are forced by their trade to be able to point at the fiddle player they want to correct. So it is definitely a skill that can be picked up, but it takes some years to get to a level where it is useful.


A few days was a random guess really, haha. There was that study where they found people's vision flipped after a week of wearing goggles that project the world upside down. Interesting to hear it's such a complex skill. I suppose I've never really used my sense of hearing to its full potential.


It might depend a lot on how sensitive your hearing is. I inherited very sensitive hearing and picked up echolocation unintentionally.


1- "The team hikes up the narrow trail to the large mouth of Bédeilhac Cave, which was used as a military hangar by the occupying Germans during World War II."

2- "I’ve not heard about anyone using singing to echolocate, at least successfully. I use a percussive click, using my tongue and the roof of my mouth, and this is the technique that I teach, and it seems to be the most effective."

3- "Through the process of gauging the resonance of the spaces, and therefore their shape and size, with his voice, he figured out that the most resonant parts of caverns are where the most cave paintings are located."

4- “There’s some evidence of the Navajo, the Iroquois, the Cherokee, the Acoma people of New Mexico, and the Nuxalk people of Canada using echolocation to determine ritual places”

https://en.wikipedia.org/wiki/Ari%C3%A8ge_(department)


I've wondered if you could make a device to help people with this - basically just a little clicker that would emit the ideal click once per second. I've also wondered if it would work to have the clicker emit a higher or lower frequency than humans can hear and to be accompanied by headphones or a hearing aid that could hear the click. That way, you could use the clicker without annoying or distracting everyone around you.


I don't think headphones would be as useful, or at least, it would tie you to that device.

Even the crinkles in our outer ears affect how we perceive and model the room in our brains.

So instead you would depend on the microphone's characteristics; if you changed it, you would have to re-learn until you got used to it.

If we are tying ourselves to a device, we might as well increase the smarts of the device itself. It could have cameras, lidar, and echolocation. It could output coded sounds into earphones representing small and large objects, speeds, etc. Eventually the brain would synthesize it all.


I expect you would want the sound source to be in a static position relative to your ears - if you're trying to determine the distance to a wall 3 feet away, moving the sound source by a couple of feet would change everything about the echo.


You could just build off a metronome, no?


What makes the ideal click?


Naively I would assume the closest you can get to an impulse function, aka a Dirac delta function, because it contains all frequencies at once. So, as loud and short of a sound as possible. This should give you the most information about your surroundings from listening to the echo, because different frequencies can reflect differently in a room. Also because mathematically an impulse convolved with a function returns the function, so an impulse gives you the most neutral echo with which to compare.
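The convolution identity in that last sentence is easy to check numerically (the "room response" here is just a made-up toy vector):

```python
import numpy as np

# Convolving a unit impulse with any response h returns h unchanged,
# which is why an ideal click is the most neutral probe of a room.
h = np.array([1.0, 0.5, 0.25, 0.125])   # toy room response
impulse = np.array([1.0, 0.0, 0.0, 0.0])

out = np.convolve(impulse, h)[: h.size]
print(np.allclose(out, h))  # -> True
```

Any shaping in the click itself would instead be "baked into" the echo you hear, coloring the measurement.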


Ideally something whose autocorrelation is a delta function: basically a random but known signal, so you can disambiguate the echo from background noise (look into matched filters).
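A minimal matched-filter sketch of that idea (the probe length, delay, and noise levels are all invented for illustration): cross-correlating a noisy recording with a known pseudo-random probe produces a sharp peak at the echo's arrival.

```python
import numpy as np

rng = np.random.default_rng(1)

# Known pseudo-random probe: its autocorrelation is sharply peaked
# (delta-like), which is the property a matched filter relies on.
probe = rng.choice([-1.0, 1.0], size=512)

# Bury one echo of the probe in heavy background noise.
delay = 3000
recording = rng.standard_normal(8192) * 2.0
recording[delay : delay + probe.size] += probe

# Matched filtering = cross-correlation with the known probe.
corr = np.correlate(recording, probe, mode="valid")
print(int(np.argmax(corr)))  # peak index = estimated echo arrival
```

The probe is inaudible as a pattern to a listener, but the correlation peak stands hundreds of standard deviations above the noise floor, which is exactly the disambiguation the comment describes.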


I don't know - if I were actually going to make this device, I'd try to figure it out by talking to people who can echolocate, people who do signal processing, scientists studying echolocation in bats and whales, etc.


Perhaps the clickers used in dog training.


I vaguely recollect watching a documentary years ago which featured a blind kid who'd taught himself echolocation using clicking noises he made with his mouth. I remember some grownups trying to discourage him from relying on it, wanting him to use a stick instead, but he wasn't keen. It seemed like the echolocation was giving him more information.


I recalled articles about blind mountainbikers using echolocation, and a search found several, including [0] and [1], there was also an article on HN [2] a few years ago.

[0]: https://www.discovermagazine.com/the-sciences/echolocation-t...

[1]: https://www.mbr.co.uk/news/blind-mountain-biker-echolocation...

[2]: https://news.ycombinator.com/item?id=21169414


My guess is whatever doc you saw involved Daniel Kish.

https://en.wikipedia.org/wiki/Daniel_Kish


A kid at my high school used clicks to navigate the building and find the right hallways. He carried a stick, but said using the clicks was much easier at school.


In the news, the ancient art of yodeling has been lost among the denizens of Switzerland! How will they adapt to life going forward?


https://www.carnifest.com/switzerlands-national-yodeling-fes...

Still enough of them around to plan a national yodeling festival for 2026!


I've read even modern humans can easily learn echolocation. If that's true, it seems almost obvious to me ancient humans, for whom that would be much more useful, also did.


It’s true. I saw a TikTok of a blind guy mapping things out on a bicycle using it. Wish I could find it again.

Brains are quite incredible.


You didn't happen to find the link? I'd be curious to watch.


Sadly not. I’ll post it to one of your future comments if I do run across it again.


Oh that would be super nice. Thank you!


I think it would have been a lot harder in the ancestral environment. They didn't have the luxury of many hard, large, straight surfaces.


I was gobsmacked when I learned that humans are capable of using sonar, like bats. Even the meta-knowledge seems to have disappeared, which is strange. I've sometimes wondered whether I could learn it, but I never got round to it; I guess there wouldn't be many courses anyway.

But I urge anyone who has a visually impaired person in their family to look into this seriously - as with any skill, the sooner you pick it up the better.


My blind friend and I were in our late teens when he asked me to punch him in the face. We were joshing around, so I threw a fast, light jab that clocked him on the cheek, not too hard. Then he said "again," and I kind of just wanted him to stop asking, so I went a little harder at him. Suddenly he did these rapid dolphin-like clicks and dodged my jab, Matrix style. I was impressed; from that day on I never took him lightly, and never felt I really had to help him see or anything. After this article, I'm guessing he honed his echolocation skills over the years.


I wrote a blog post years ago discussing the possibility of combining echolocation with augmented reality and binaural audio to help the visually impaired navigate their environment with the aid of just a cell phone. Now that some cell phones have LiDAR built in, this is totally feasible to build today. Anyone want to fund it and open-source it? I truly believe this should exist, and that it will, in some form.

https://blog.syllablehq.com/project-sonorous-a-proposed-navi...


Yes, lidar is probably a game changer for (indoor) navigation apps for blind people. However, I have yet to see a prototype which surpasses my own hearing. We shouldn't be too enthusiastic about upcoming technologies while forgetting the skill set a human already has built in... Tech is not always helpful; sometimes it is just an artificial gap between you and the rest of the world.


Totally agree about tech not always being helpful. Maybe this wouldn't be helpful for folks who are already skilled at echolocation. Maybe it would be a gateway to learning it, or would be helpful for folks who aren't skilled at it for whatever reason? Not sure; of course we'd need to test and work with folks as it's developed.

But I know I was able to "virtually echolocate" playing a game to navigate a maze with no previous experience, and I thought that was cool.



I still have a good sense of hearing.



