Whenever possible, I like to use FaceTime Audio or WhatsApp instead of making regular phone calls. To the people I’m calling, it feels like a normal phone call, but the audio quality is leaps and bounds better.
But! It turns out I wasn’t even doing my internet calls the right way. Today, I learned that there’s a newish feature buried in the Control Center that instantly improves the quality of your microphone during calls, whether you’re audio-only or on video.
It’s called Voice Isolation, and it works on most iPhones, iPads, and Macs from the last few years, as long as you’re running iOS 15 or macOS Monterey. (Anything that supports Spatial Audio seems to also support Voice Isolation.)

It’s weirdly hard to find, and you can only access the setting when you’re already in a call: swipe down from the upper-right corner (or click in the upper-right corner on a Mac) to open Control Center, then tap the button that says “Mic Mode.”

By default, it’s set to Standard, but there are two other options: Voice Isolation and Wide Spectrum. Wide Spectrum actually lets the other people on your call hear more background noise, which I guess is useful if you’re holding up your phone at a concert but mostly sounds like a horrible thing to do to the other people on the line. But Voice Isolation? Voice Isolation is where the magic happens.
i had no clue that a) Voice Isolation was a feature available on the new iPhones / Airpods and b) it worked so well. It’s incredible on the other end — you hear nothing but the person you are talking to. Surprised it’s not automatically turned on!
— can duruk (@can) May 16, 2022
Basically, when you enable Voice Isolation, your device begins to aggressively process the audio coming into your mic in order to remove background noise. When I turned on the setting on my iPhone 12, my dog barking 20 feet away completely disappeared — and so did nearly all of the sounds of traffic. When I turned it on on my MacBook, the sounds of both my laptop fan and my keyboard typing stopped coming through altogether.
In the process of isolating the voice, Apple also seems to bring it closer; there’s much less echo and room tone, so it sounds like you’re holding your phone to your face even when you’re not. The tradeoff is that your voice definitely sounds more processed, but it always sounds processed through apps like FaceTime or Zoom.
In my testing, the processing got overwhelmed only once: two cars revved their engines at the same time just a few feet from where I was standing, and the AI simply output a half-second of total silence. But it’s not like you’d have been able to hear me over the roar anyway, right? And, in general, a little more processing for a lot less background noise is an easy trade to make for most calls.
There are only two problems with Voice Isolation. One, it’s not a universal setting, so you’ll have to enable it in every app you use for calls. Two, it doesn’t work everywhere. Apple makes Voice Isolation available through an API on iOS, iPadOS, and macOS, but not every app supports it. On mobile devices, the track record is pretty good: Snapchat, WhatsApp, Slack, Signal, and Instagram all support it, though TikTok doesn’t. Zoom supports it on iOS but not on the Mac, and there’s no way to turn it on for in-browser apps as far as I can tell, which rules out Google Meet and a handful of others.
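For the curious, the developer side of this is fairly thin: apps can’t turn Voice Isolation on programmatically. The user picks the mode in Control Center, and an app that runs a voice-processing audio path can only read which mode is in effect (and, on the Mac, deep-link to the system picker). A minimal sketch using the iOS 15 / macOS 12 `AVCaptureDevice` API, with the helper function names being my own:

```swift
import AVFoundation

// Mic modes are user-controlled via Control Center; apps can only observe
// them. They surface when the app uses a voice-processing audio path
// (e.g. an AVAudioSession in .voiceChat mode on iOS).
func describeMicrophoneMode() -> String {
    switch AVCaptureDevice.preferredMicrophoneMode {
    case .voiceIsolation: return "Voice Isolation (background noise suppressed)"
    case .wideSpectrum:   return "Wide Spectrum (background noise let through)"
    case .standard:       return "Standard"
    @unknown default:     return "Unknown mode"
    }
}

#if os(macOS)
// On the Mac, an app can at least open the Control Center mic-mode picker
// for the user, since the mode can't be changed in code.
func openMicModePicker() {
    AVCaptureDevice.showSystemUserInterface(.microphoneModes)
}
#endif
```

That read-only design is presumably why support is so uneven: each app has to adopt a voice-processing audio pipeline before the Mic Mode button does anything for it.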
But the most glaring absence? Regular ol’ phone calls. There are no Mic Modes at all for phone calls even though that’s the place you could probably most use a little bit of improvement. I asked Apple why this is, but the company didn’t comment.
To be fair, Apple does some noise-cancellation work even in Standard mode. If you ever want to test it, hold a fan up to your phone and listen as the device takes a few seconds to identify and suppress the hum. But it doesn’t go far enough. I’ve now heard Voice Isolation, which means I’ve heard what better can sound like. And I want it everywhere, and I want it on all the time — for my sake and for everyone I talk to.