“I went from handicapped to cyborg”
Last fall I became a cyborg. Now, when I tap twice on my left ear, I answer a phone call or hang up. If I tap twice on my right ear, I activate Siri. My new hearing aids make me feel superpowered. Improving our senses and bodies will be the future for all of us, says Sven Størmer Thaulow.
As a kid, I handled the fact that I was born with reduced hearing by compensating: lip-reading and sitting in the front row of the classroom. Later, in the army and at university – exposed to the myriad of Norwegian accents, and most likely some effects of playing in bands – compensating became more tiresome, so I started using hearing aids.
Until last fall, it had been a mixed experience. The aids served as important support, but they lacked refinement – and always made me acutely aware of my handicap. But then something happened. For years we've talked about cyborgs and wearables, while techies implanted RFID chips under their skin to demo the future and numerous AR smart glasses were tested. But now I'm convinced that it's within the audio space that things will take off. Superpowered hearing aids will become consumer electronics – supplying the missing link in the audio interface ecosystem.
But the road to making me a cyborg has been long.
It all started with in-ear aids, devices that practically plugged my ears so that no natural sound could enter. In the mid-90s, I got the first programmable devices, which could amplify sound in six different frequency bands. Then, in the early 2000s, came the tiny aids that hung behind the ear, with a speaker connected by a nearly invisible cord. By then, the buds that sat in my ear were vented, and amplification was much better, resulting in a more natural sound picture. This is still the default design today. But all this time I still felt handicapped, and I didn't like that the aids were visible – so I kept my hair long. Being a vocalist in a band was a good combination.
Around 2014, the first mobile-connected hearing aid entered the scene. It connected to the phone through Bluetooth – not just via an app, but integrated into the operating system, iOS. The hearing aids worked like headphones and the voice audio was excellent – but they had no microphone link, so I had to hold the phone up to my mouth when speaking. You could also switch programs and control the volume, either through an app or directly from Control Center on the iPhone. I could stream music from my iPhone too – but the sound was optimised for voice, so music came out thin and tinny. Still, this was starting to become cool. So, I cut my hair.
Still, the potential was so much greater. When could I drop my AirPods? Why weren't the hearing aids truly connected two ways, so I could talk through them, talk to Siri, and get answers? Every year I asked about innovations, and I told my audiologist, Heidi, to call me whenever a great leap in functionality was made.
Then came the breakthrough. Heidi called me in November 2020, asking me to come over. She handed me a Phonak hearing device from the Swiss company Sonova Group. And now, after 15 years with hearing aids, I have officially become a cyborg. I don't feel like I have a handicap anymore. I feel superpowered. Privileged. And I am sure this is the future for all humans.
I can use the aids the way I use AirPods when I talk on the phone. It's almost a problem, since people have no visual cue that I'm on a call. The audio quality is also close to that of AirPods. I have now used these hearing aids for about twelve months, and I haven't touched my AirPods at all. And I listen to a lot of music.
But probably the coolest thing about my new hearing aids is that they are gesture-activated. If I tap twice on my left ear, I answer a phone call or hang up. If I tap twice on my right ear, I activate Siri and can ask whatever I want – hoping she will understand. I use it mostly for controlling Spotify, turning up the volume, setting a timer, jotting down a to-do, sending a simple text message, and so on. A few times I've asked questions like “who is xx”, but what you can do with this new audio interface is limited by the intelligence of Siri, not by the content or the value chain – that part is super smooth.
For 17 of the 24 hours in a day, I am connected to the internet – practically inside my brain. It saves me loads of friction during the day; I probably pick up the phone 40–50 percent less. And I don't need to charge or hunt for my AirPods either.
In my view, the true breakthrough of connected humans will come from medtech. And it starts here, with audio. And while I'm walking around with some fellow hearing-impaired cyborgs, waiting for my audiologist, Heidi, to reveal yet another breakthrough – here are some predictions about the near future of the audio interface:
Five years from now, you will be able to buy “Invisible AirPods” from Apple or an equivalent. They will be the priciest AirPods you can get hold of, but way cheaper than my Phonaks (which hover around 1,400 USD in Norway).
“Invisible AirPods” will be the primary audio interface to the internet. They're always on you and they're personal – so why bother with a Google Home?
Siri will become a lot smarter and more tailored to “non-screen” communication. This means you won't need to pick up your phone to browse through what Siri has found on the internet when you ask her a question. Just imagine you're about to have a meeting with someone and would like some information about them. Siri will be able to provide that, directly in your ear.
App providers will build in Siri support for loads of functions, so that it's possible to use functionality inside the apps without picking up the phone. Today, very few apps have exposed their functions to Siri, which is why “she” has limited reach on our phones. A rough sketch of what that looks like for a developer follows below.
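To make the prediction concrete, here is a minimal sketch of exposing one app function to Siri, using Apple's App Intents framework (iOS 16 and later). The PodcastPlayer class and its resume() method are hypothetical stand-ins for a real app's own logic:

```swift
import AppIntents

// Hypothetical stand-in for the app's real playback engine.
final class PodcastPlayer {
    static let shared = PodcastPlayer()
    func resume() { /* resume playback in the real app */ }
}

// Exposes one app function to Siri so it can be triggered
// entirely by voice, with the phone staying in the pocket.
struct ResumePodcastIntent: AppIntent {
    static var title: LocalizedStringResource = "Resume Podcast"

    // Run in the background, without opening the app.
    static var openAppWhenRun: Bool = false

    func perform() async throws -> some IntentResult & ProvidesDialog {
        PodcastPlayer.shared.resume()
        // Siri speaks this confirmation back into the ear.
        return .result(dialog: "Resuming your podcast.")
    }
}
```

Once an app ships an intent like this, Siri can discover and trigger it by voice alone – exactly the screenless interaction that hearing aids and “Invisible AirPods” invite.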
I can’t wait for the next innovation in this space!
Sven Størmer Thaulow
EVP Chief Data and Technology Officer
Years in Schibsted: 2.5