Apple’s release of iOS 15.4 beta 2 completes the fix for a bug that may have recorded interactions with Siri without permission on some devices. The bug, introduced in iOS 15, accidentally kept some recordings regardless of whether users had opted out.

The bug was actually fixed in iOS 15.2, but users have only learned about it now that the latest beta has started asking for permission again.

The bug

The Improve Siri & Dictation setting was turned off in iOS 15.2 to fix a bug introduced in iOS 15. The bug enabled the setting for some users who had previously opted out. In other words, recordings were being kept for some users who had opted out of the setting, instead of being deleted. Once Apple discovered the bug, it turned off the setting for the affected users with the release of iOS 15.2, which also fixed the bug.
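
To make the failure mode concrete, here is a purely illustrative Swift sketch of an opt-out preference being silently re-enabled; the type and function names are hypothetical and bear no relation to Apple’s actual implementation.

```swift
import Foundation

// Hypothetical illustration only – not Apple's code. The bug, as described,
// amounted to the "Improve Siri & Dictation" preference ending up enabled
// for some users who had opted out, so their recordings were kept for
// review instead of being deleted.

struct SiriPrivacySettings {
    // Hypothetical flag standing in for "Improve Siri & Dictation".
    var improveSiriAndDictation: Bool
}

// Correct behavior: keep a recording for review only if the user opted in.
func shouldRetainRecording(_ settings: SiriPrivacySettings) -> Bool {
    settings.improveSiriAndDictation
}

// A user who explicitly opted out before updating to iOS 15.
var settings = SiriPrivacySettings(improveSiriAndDictation: false)
print(shouldRetainRecording(settings)) // false – recordings are deleted

// The buggy upgrade path: the opt-out preference is flipped back on,
// so the otherwise correct check now keeps recordings anyway.
settings.improveSiriAndDictation = true
print(shouldRetainRecording(settings)) // true – recordings are kept and reviewed
```

In these terms, iOS 15.2 turned the flag back off for affected users, and iOS 15.4 asks for consent again before it can be turned on.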

Now, in iOS 15.4, which is still in beta, users will be asked whether they want to opt in and help improve Siri and Dictation by allowing Apple to review recordings of their voice interactions. If you opt out, your voice interactions with Siri or the voice dictation tool on your iPhone aren’t recorded and shared with Apple.

Users who want the fix for the Improve Siri & Dictation bug should make sure they have iOS 15.2 or later.

Saved recordings

What is painful is that the bug mostly affected people who had deliberately opted out of being recorded. Since identifying the bug, Apple has stopped reviewing and started deleting audio received from all affected devices.

One thing that is, unfortunately, standard behavior for Apple is that it kept the information under its hat until the bug was fixed. It is clear from its statements that the company has known about the bug since at least before the release of iOS 15.2 on December 13, 2021.

Why not let your customers know what is going on? Let them know what happened and that you’re working on it. This is nothing like a vulnerability that you need to keep a lid on in case a cybercriminal abuses it. This is a privacy issue that users need to be informed about as soon as possible.

Why record at all?

Apple likes to review recordings of voice interactions to improve both Siri and Dictation. However, every user should be asked whether they want to submit their recordings. And if they decide to opt out, their voice interactions with Siri or the voice dictation tool shouldn’t be recorded and shared with Apple.

Apple’s regular privacy information outlines the default behavior for Siri and Dictation. If you opt in to Improve Siri and Dictation, additional data is collected, stored, and reviewed. For more information, visit www.apple.com/legal/privacy/data/en/improve-siri-dictation.

Not its first rodeo

In 2019, Apple contractors revealed to the Guardian that they regularly heard confidential medical information, drug deals, and recordings of couples having sex as part of their job providing quality control, or “grading”, for the company’s Siri voice assistant. At the time, it was found that a small proportion of Siri recordings were passed on to contractors working for the company around the world. These contractors were tasked with grading the responses on a variety of factors, including whether the activation of the voice assistant was deliberate or accidental, whether the query was something Siri could be expected to help with, and whether Siri’s response was appropriate.

That Apple appears not to have learned from that incident, and has taken a wrong turn on the same issue again, does not bode well for the future.