Image created by Kayla Doan

The surprising way AI deepfakes may shift etiquette norms and reduce audio recording in the workplace.

A couple of weeks ago, we all saw the story of a mother who received a ransom call that used AI deepfake technology to impersonate her daughter’s distressed voice.

That story was fantastical, but not a surprise to those who have watched AI capabilities improve drastically over the past six months. Today, it takes only a three-second audio clip to produce a deepfake. As the Center for Humane Technology warns, we will see a rise in more subtle, nuanced scams using this technology: for example, a college freshman calling a parent to verify their Social Security number while filling out aid forms, or a call to a relative to reminisce about a fun fact from childhood that happens to be the answer to a bank security question.

In response, a recommendation circulated in my circles to set passcodes with family members, which I urge everyone to do. I asked my network whether deepfake concerns had deterred others from posting video and audio publicly, as I have dialed back on podcasting and hesitated to publish TikTok or Instagram videos. One answer in particular made me pause:

“I was kind of surprised when I was asked to schedule an interview to be asked to record my name so they would know how to pronounce it. Which, given my [unique] name, is so nice in theory, but no thank you.”

Let’s unpack that: a company’s well-intentioned collection of user audio data, without any clear statement of how it would be stored, protected, or deleted, subtly turns off potential employees who don’t trust the company to do right by their data.

Think of all the times company meetings are recorded without asking participants for consent. The phone calls where you hear “this message may be recorded for training purposes.” The record button accessible to every participant on most Zoom calls. I even recently had a developer’s Otter bot join a call to record and transcribe notes for him, while the developer himself skipped the meeting entirely.

It would be perceived as rude to record a video of someone without asking their permission, because people generally want autonomy over how they are presented online. We certainly wouldn’t flippantly ask someone we don’t know well for personal details, such as their home address. We can therefore anticipate changes in societal and business norms around how and when it’s appropriate to record others’ voices, and what gathering consent looks like.

Startups whose core offering is background audio monitoring will need to get ahead of privacy concerns, or they will risk their technology becoming a faux pas among their customer base.

In a beautiful silver lining, perhaps the rise of deepfakes will prompt a shift toward meeting intentionally, with less technology. If Gen Z bringing back flip phones is any indication, societal norms can make or break technology adoption in surprising ways.

About the author: Kayla Doan is a thought leader on ethical technology, a fractional product manager, and the founder of Intentional Ventures. See services for product management and startup strategy support.

This article originally appeared on LinkedIn.