## The pretext stops working

For a decade, the smart-speaker industry hid behind a single sentence:
"the device only listens after the wake word." Multiple settlements have
now established, on the record, that this was not true.

The 2025 Apple Siri settlement was the watershed. Apple paid $95M to
resolve a class action alleging that Siri activated unintentionally and
recorded private conversations that were then shared with advertisers.
The 2026 follow-up wave hit harder.

## What the public record shows

**Apple Siri settlement (Jan 2025)** — Apple paid $95M to resolve a US class action alleging that Siri activated unintentionally and that recordings were shared with advertisers. Reuters has the filing details.
**FTC v. Amazon (May 2023)** — a $25M settlement over Alexa's retention of children's voice recordings in violation of COPPA, including a deletion order. The FTC press release documents the specific retention practices.
**CNIL guidance on smart speakers (2024)** — the French regulator's published opinion sets out the consent and retention rules every EU smart-speaker vendor is now expected to follow, and is the basis for ongoing enforcement work.

## Why it matters technically

A "wake word" is just a low-power neural net listening for a phoneme
pattern. It runs continuously. When it fires (correctly or not), the
device opens a buffer that includes a few seconds before the wake
word — to capture what you actually asked. That buffer is the source of
every "we didn't mean to record that" headline of the past six years. A 2020 Northeastern study cataloged dozens of phrases that reliably
misfire popular assistants. The vendors knew. Recordings were graded by
human reviewers in multiple jurisdictions, with the contractors often
hearing intimate, medical, or financial material.

## What you should do

**Mute the mic physically when you don't need it.** Most modern devices have a hardware switch. Use it.
Disable "help improve" / "voice review" opt-ins. They are usually on by default. Turn them off in the companion app.
**Auto-delete recordings at 3 months (or "never save").** Both Amazon and Google now expose this setting. Most users never set it.
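A three-month retention window is nothing more than date arithmetic, which is why "most users never set it" is so frustrating. Here is a minimal sketch of what the setting amounts to — the clip list and function names are illustrative stand-ins, not any vendor's actual API:

```python
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=90)  # "3 months" as vendors typically implement it

def apply_retention(recordings, now=None):
    """Drop any recording older than the retention window.

    `recordings` is a list of (timestamp, clip) tuples -- an
    illustrative stand-in for a vendor's stored voice clips.
    """
    now = now or datetime.now(timezone.utc)
    cutoff = now - RETENTION
    return [(ts, clip) for ts, clip in recordings if ts >= cutoff]

now = datetime(2026, 6, 1, tzinfo=timezone.utc)
clips = [
    (datetime(2026, 1, 10, tzinfo=timezone.utc), "old clip"),     # past cutoff
    (datetime(2026, 5, 20, tzinfo=timezone.utc), "recent clip"),  # kept
]
kept = apply_retention(clips, now)
print([clip for _, clip in kept])  # ['recent clip']
```

The point of the sketch: the deletion logic is trivial, so when recordings linger for years the reason is policy, not engineering.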
**Don't put a microphone-equipped device in a bedroom.** Just don't. The smart-display industry still hopes you will.

The "always-listening" device is now legally obligated to actually
listen less. Hold them to it.
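For the technically curious, the pre-roll buffer described above — the mechanism behind every accidental recording — fits in a few lines. This is a toy sketch with illustrative frame sizes, not any vendor's real audio pipeline:

```python
from collections import deque

FRAME_MS = 20       # one audio frame per 20 ms (illustrative)
PRE_ROLL_MS = 2000  # keep ~2 s of audio from before the wake word

class PreRollBuffer:
    """Ring buffer that always holds the most recent ~2 s of audio."""
    def __init__(self):
        self.frames = deque(maxlen=PRE_ROLL_MS // FRAME_MS)

    def push(self, frame):
        # Runs on every frame, wake word or not: the mic is never "off",
        # the oldest frame simply falls out of the ring.
        self.frames.append(frame)

    def on_wake(self, following_frames):
        # A detection (real or spurious) ships the buffered pre-roll
        # plus whatever follows -- the source of accidental recordings.
        return list(self.frames) + list(following_frames)

buf = PreRollBuffer()
for i in range(500):          # 10 s of ambient room audio
    buf.push(f"frame{i}")
clip = buf.on_wake(["query0", "query1"])
print(len(clip))              # 100 pre-roll frames + 2 query frames = 102
```

Note that `push` runs unconditionally: whether the audio leaves the device depends entirely on how trigger-happy the wake-word detector is, which is exactly why misfire phrases matter.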