Health/Sci-Tech | Lifestyle | Volume 21, Issue #16

Your own voice could be your biggest privacy threat

If you know what to listen for, a person’s voice can tell you about their education level, emotional state, and even their profession and finances — more than you might imagine. Now, scientists warn that voice recordings processed by speech technology could be used for price gouging, unfair profiling, harassment or stalking.

While humans are attuned to obvious cues such as fatigue, nervousness and happiness, computers can pick up the same signals — along with far more information, and much faster. A new study claims that intonation patterns and word choice can reveal everything from your personal politics to the presence of health or medical conditions.

The research, published in the journal Proceedings of the IEEE, highlights grave concerns about the technology’s implications for privacy and unfair profiling. While voice processing and recognition technologies present opportunities, Tom Bäckström, an associate professor of speech and language technology at Aalto University and lead author of the study, sees the potential for serious risks and harms. If a corporation can infer your economic situation or needs from your voice, for instance, it opens the door to price gouging, such as discriminatory insurance premiums.

And when voices reveal details such as emotional vulnerability, gender and other personal information, cybercriminals or stalkers can identify and track victims across platforms and expose them to extortion or harassment. These are details we transmit subconsciously when we speak, and which listeners respond to unconsciously before anything else. Jennalyn Ponraj, founder of Delaire and a futurist working on human nervous-system regulation amid emerging technologies, told Live Science: “Very little attention is paid to the physiology of listening. In a crisis, people don’t primarily process language. They respond to tone, cadence, prosody, and breath, often before cognition has a chance to engage.”

While Bäckström told Live Science that the technology isn’t in use yet, the seeds have been sown. “Automatic detection of anger and toxicity in online gaming and call centers is openly talked about. Those are useful and ethically robust objectives,” he said. “But the increasing adaptation of speech interfaces towards customers, for example — so the speaking style of the automated response would be similar to the customer’s style — tells me more ethically suspect or malevolent objectives are achievable.”
