[Solution] Apple Admits Listening to Private Siri Recordings, Like Amazon & Google

Apple, a company known for its privacy stance, admitted earlier this week that it listens to private Siri recordings. Contractors working on Apple’s artificially intelligent voice assistant, Siri, are reported to be reviewing this data to improve the service.

The company says these recordings are not associated with users’ Apple IDs, so they remain unidentified. Still, someone is listening to private conversations you thought would stay between you and your AI assistant.

Amazon and Google were the first companies to face these privacy concerns.


What’s our take on this?

Voice assistants are still evolving, and they need more voice samples to train on before interactions feel as natural as talking to a human. If the recordings that reviewers listen to are stripped of any personal ID and encrypted before they are used to train the assistant, the practice would be easier to accept.

However, if you would rather not send your Siri conversations to Apple, you can turn off server-side logging with the configuration profile from this GitHub repository: https://github.com/jankais3r/Siri-NoLoggingPLS

How to install this profile?

Step 1: Open ‘Prevent server-side logging of Siri commands.mobileconfig’ in the GitHub repository and switch to the Raw view to download it to your device.

Step 2: Follow the on-screen instructions to finish installing the profile.


That’s it! Your conversations with Siri are no longer logged on Apple’s servers. You can always send Apple feedback about this concern.
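
Before installing, you may want to check what the profile actually changes. Here is a minimal sketch in Python, assuming the downloaded file is an unsigned XML plist in Apple’s standard configuration profile layout (a top-level ‘PayloadContent’ array); the filename is the one used in the repository:

import plistlib

PROFILE = "Prevent server-side logging of Siri commands.mobileconfig"

# Load the downloaded profile (assumes it is an unsigned XML plist).
with open(PROFILE, "rb") as f:
    profile = plistlib.load(f)

print("Profile:", profile.get("PayloadDisplayName"))
for payload in profile.get("PayloadContent", []):
    print("Payload:", payload.get("PayloadType"))
    for key, value in payload.items():
        # Keys that do not start with "Payload" are the actual settings being changed.
        if not key.startswith("Payload"):
            print("   ", key, "=", value)

Any key that doesn’t start with ‘Payload’ is a setting the profile toggles, so the output tells you exactly what you are agreeing to install.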

Alternate method:

You can also build your own configuration profile and disable the ‘Allow server-side logging for Siri commands’ option, as shown in the sketch below:
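
If you prefer not to install a profile downloaded from a third-party repository, you can generate an equivalent one yourself. The following is a rough sketch in Python using the standard plistlib module; the restriction key name allowSiriServerLogging is an assumption about what the ‘Allow server-side logging for Siri commands’ checkbox maps to, so verify it against Apple’s Configuration Profile Reference (or against the downloaded profile) before installing anything built from it.

import plistlib
import uuid

# Restrictions payload; "allowSiriServerLogging" is the assumed key behind the
# "Allow server-side logging for Siri commands" checkbox -- verify before use.
restrictions = {
    "PayloadType": "com.apple.applicationaccess",
    "PayloadIdentifier": "local.siri-no-logging.restrictions",
    "PayloadUUID": str(uuid.uuid4()).upper(),
    "PayloadVersion": 1,
    "allowSiriServerLogging": False,
}

# Top-level wrapper that iOS recognizes as a configuration profile.
profile = {
    "PayloadType": "Configuration",
    "PayloadDisplayName": "Prevent server-side logging of Siri commands",
    "PayloadIdentifier": "local.siri-no-logging",
    "PayloadUUID": str(uuid.uuid4()).upper(),
    "PayloadVersion": 1,
    "PayloadContent": [restrictions],
}

# Write the profile as an XML plist with the .mobileconfig extension.
with open("siri-no-logging.mobileconfig", "wb") as f:
    plistlib.dump(profile, f)

Transfer the generated .mobileconfig to your device (AirDrop or email works) and install it from the on-screen prompt, the same way as in Step 2 above.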

