Today, a recruiter invited me to a call about a potential role I was very interested in learning more about. But less than an hour before the meeting, I received a sudden calendar update: “Fred from Fireflies will join to record and transcribe the conversation.”

- No prior request for consent.
- No explanation of how the recording would be stored.
- No clear details on how my data might be used.

What should have been a straightforward conversation instantly shifted into a scramble to protect my privacy (voice, image, and data). Recording an interview without clear, advance permission erodes trust before the first question is even asked. Consent is a deliberate agreement that lets everyone show up prepared and comfortable. This is an ethical issue.

No doubt, an AI note-taker could be valuable to this recruiter. But these tools also raise questions about data retention, confidentiality, and intellectual property. A candidate discussing career history, research, or sensitive client details deserves to know exactly how those records will be used and who will have access.

If you truly aim to build an inclusive hiring process, plan for ethical recording practices from the first email:

- State your intentions.
- Outline how the file will be stored and your data retention policies.
- Offer alternative accommodations.
- Secure explicit consent well before the call.

Anything less feels like surveillance disguised as efficiency. How are you making sure your use of AI tools in interviews respects privacy, consent, and accessibility?

*Note: I am fortunate to be able to walk away from situations that violate my privacy, and I did exactly that in this case. I recognize that many candidates cannot afford to decline and must navigate similar scenarios without the option to say no. If you are in that position, I see you and stand with you.

#CyberSecurity #DataPrivacy #Consent
Privacy Issues With Video Conferencing Tools
Summary
Privacy issues with video conferencing tools have become a growing concern, especially with the rise of AI transcription and recording features. These tools can unintentionally compromise sensitive information if not used transparently and ethically.
- Clearly communicate recording intentions: Always inform participants in advance if a meeting will be recorded or transcribed, including details on how the data will be stored and utilized.
- Seek explicit consent: Ensure all participants provide clear consent for recordings or AI transcription to avoid violating privacy or legal requirements, especially in all-party consent jurisdictions.
- Understand platform policies: Familiarize yourself with the data retention and usage policies of the video conferencing tools you're using to prevent mishandling of sensitive information.
With AI transcription tools, I think it's REALLY important to revisit the concept of "informed consent." UX researchers should be familiar with this term (to read more, 18F has a nice little explainer: https://lnkd.in/emPYefdf). For me, informed consent extends to ANY meeting you have, not just research sessions. That's how I have always operated.

What I'm most concerned about right now is this part (again, from the 18F explainer): "In order to give their informed consent, participants need to understand...what data you’ll collect, how you will use it and how long it will be kept."

This means informed consent involves two things on your part:

First, you MUST inform participants if you are recording and transcribing a meeting. There are a ton of AI transcription add-on tools right now that people are using WITHOUT INFORMING meeting participants. Some of these tools are visible on the meeting platform and some aren't, but either way, everyone involved should know they are being recorded and that an AI tool is creating a transcript. Consent to recording has always been a standard part of informed consent, and a transcript is a form of recording.

Second, and this is what makes it more complicated with AI tools: in order to ask for informed consent, you must know where and how that data is stored and how it is being used by the platform itself (e.g., whether it is being used as training data). If you do not know that, you cannot even ask for informed consent, because you cannot guarantee privacy.

I am not "anti-AI"; I am actually pro "actually understand how these tools work so you can use them thoughtfully."
I’ve had some serious concerns about AI notetakers for quite a while, but I want to take your temperature on them. Are we using these tools without thinking through all the implications?

AI notetakers are everywhere, humming in the background of our virtual calls, often unseen and unheard. (And just because you don’t see ‘otter.ai’ on a call doesn’t mean a bot isn’t quietly listening in.) Admittedly, they are super convenient, and if you’re running a lean business, you are looking for any way possible to save time for your team.

But let’s think about what’s actually happening: these tools are creating and storing audio recordings or word-for-word transcriptions of your conversations. Think about that for a moment: your discussions, your sensitive information, your private thoughts, all living on someone else’s computer, perhaps on a third-party server, potentially forever. This isn’t just a privacy concern; it’s a significant legal risk.

Let’s start with privilege. It is an open question whether simply having an AI notetaker on a call with your attorney destroys privilege. An OPEN QUESTION. Imagine facing a lawsuit and being compelled to hand over transcripts of your confidential discussions.

Then there’s the straight-up legality question. Is it illegal to bring your AI notetaker onto a call? That’s a strong maybe, especially since consent laws vary by state. If you or someone on your call is in an all-party consent state (like California or Florida), that is a real risk.

We have a policy against using AI notetakers at All Places, and we don’t allow notetakers to join our meetings. Of course, let’s be honest: there’s always a chance Granola is running silently on the other end and we’d never know.

Where is your head at on this? Are you comfortable with the risk? Have you not really thought about it? I’m curious how non-lawyers are thinking about it.

#dataprivacy #ainotetakers #ailaw #leanstartups