For quite some time, smartphone users have harbored suspicions that their devices might be spying on conversations for ad targeting. Recent leaks suggest those anxieties might be grounded in reality.
A leaked presentation from CMG Local Solutions, a division of Cox Media Group (CMG), described a capability it calls “active listening”: using AI to combine voice data with users’ online behavior to build hyper-targeted advertisements.
The document, obtained by 404 Media, claimed, “Advertisers can pair this voice-data with behavioral data to target consumers ready to buy.” In essence, CMG collected voice data from third parties to sharpen its advertising strategies. CMG later said, however, that the materials were “outdated” and that the active listening product had been discontinued to prevent misunderstandings.
While tech giants such as Google, Meta, and Amazon were mentioned as potential clients in the pitch, all three have denied any connection to this active listening initiative.
Google has since removed Cox Media Group from its Partners Program, citing the vetting process it uses to ensure ads comply with its policies. Meta, meanwhile, is investigating whether CMG violated its terms.
The data CMG referenced often comes from apps that gather information from users who have agreed to their terms of service. Interestingly, research indicates that 91% of people don’t read these agreements before giving consent—this number spikes to 97% for those aged 18-24.
Even if you think you know which permissions you’ve already granted, it’s wise to review your settings regularly.
“For an app to perform active listening, it needs access to the microphone. Both Android and iOS require explicit permission during installation or updates,” explains Luis Corrons, a researcher and Norton Security Evangelist. “Some apps may request background access, allowing them to listen even when not in use. On iOS, an orange or green dot appears in the status bar when the microphone or camera is in use, while Android has similar alerts.”
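As background to Corrons’ point: on Android, an app cannot touch the microphone unless it declares the RECORD_AUDIO permission in its manifest and, on modern Android versions, also requests it from the user at runtime. A minimal sketch of what that declaration looks like (the package name is a hypothetical placeholder):

```xml
<!-- AndroidManifest.xml: without this declaration, the OS blocks microphone access entirely -->
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.app">
    <uses-permission android:name="android.permission.RECORD_AUDIO" />
</manifest>
```

This is why auditing your per-app permission settings is effective: any app capable of listening must appear in the microphone permission list.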
Regularly checking app permissions can help users identify unnecessary access, according to Corrons. He also distinguishes between how voice assistants like Siri and Alexa operate versus other apps that may use “active listening” for marketing.
Voice assistants listen for specific trigger phrases such as “Hey Siri” and inform the user by activating a microphone indicator. Other apps also require microphone permission and will trigger visual alerts when active, Corrons clarifies.
To minimize unwanted background listening, Corrons recommends a few best practices: review the permissions granted to your voice assistants and choose the strictest options available, such as allowing activation only through a wake word and disabling listening while your device is locked.
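For Android users who want to audit this themselves from a computer, the platform’s standard `adb` debugging tool can show which apps hold the microphone permission. A rough sketch, assuming a device connected with USB debugging enabled; `com.example.app` is a placeholder package name:

```shell
# Show the permission entries (requested and granted) for one specific package
adb shell dumpsys package com.example.app | grep RECORD_AUDIO

# Check whether that package is currently allowed to use the microphone via app-ops
adb shell appops get com.example.app RECORD_AUDIO
```

Both commands only read device state; neither changes any settings.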