Nine Years of Siri

Nick Heer, Pixel Envy:

Given its age, inconsistency, slow response times, and unreliability, there is little doubt in my mind that Siri is one of modern Apple’s greatest software failures. I do not understand how, after a decade of development, it still struggles with fundamental expectations.

My confidence in Siri has waned over the past few years, and the quote above sums up my sentiments exactly. It is no longer a feature I use with any enthusiasm, though I keep persisting.

Upcoming Changes to Siri Privacy Protection

Apple Newsroom:

We know that customers have been concerned by recent reports of people listening to audio Siri recordings as part of our Siri quality evaluation process — which we call grading. We heard their concerns, immediately suspended human grading of Siri requests and began a thorough review of our practices and policies. We’ve decided to make some changes to Siri as a result.

Following the suspension of its use of human contractors to listen to Siri audio snippets as part of the Siri grading program, which aimed to improve the accuracy of Siri's responses, Apple has now ended the program for the time being and apologised for failing to live up to its high ideals and to uphold the level of privacy its users are accustomed to.

However, the practice will resume in-house when upcoming software updates are released, and a few evaluation process changes have been made:

First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve.

Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.

Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.

I’m glad to see Apple continue to take ownership of its responsibilities by addressing the situation, offering its apologies, and putting forth changes that align with its strong privacy stance and the respect it has for its users. I will be sure to opt in to help improve Siri.

Source: Apple Newsroom.

Apple Suspends Siri Data Analysis by Contractors

In a statement to TechCrunch:

“We are committed to delivering a great Siri experience while protecting user privacy. While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading.”

Despite the delayed response to criticism of the potential privacy concerns surrounding the use of contractors to analyse and grade Siri query data, a concern made more prominent by The Guardian’s reporting, I am glad to see Apple take action nonetheless.

Admittedly, this is only a worldwide suspension of the program and not a total cancellation, probably because Apple needs time to assess better options for processing such data without relying on contracted human helpers.

It is great to see Apple take ownership of its responsibilities and hold itself accountable.

Siri Data Analysis by Humans

“Often masqueraded under the thin veil of ‘anonymous data collection to improve your experience’, every tech company is susceptible to using data in ways users might not be fully aware of; we are, after all, in a digital age of ubiquitous data harvesting. Whether users tolerate the unethical amassing of data to be sold off without consent is a decision a user should regularly review.”

- Excerpt from a post on Privacy published on Chambyte on 22 July 2019.

A week or so after publishing a post about Apple’s privacy stance, in which I stated why I trust the tech giant to defend our right to privacy, UK publication The Guardian published an article about how Apple contractors ‘regularly hear confidential details’ on Siri recordings.

The Guardian:

Although Apple does not explicitly disclose it in its consumer-facing privacy documentation, a small proportion of Siri recordings are passed on to contractors working for the company around the world. They are tasked with grading the responses on a variety of factors, including whether the activation of the voice assistant was deliberate or accidental, whether the query was something Siri could be expected to help with and whether Siri’s response was appropriate.

Apple says the data “is used to help Siri and dictation … understand you better and recognise what you say”.

But the company does not explicitly state that that work is undertaken by humans who listen to the pseudonymised recordings.

This is not the first time such a revelation has come to light. As Apple analyst Rene Ritchie points out on Twitter, Bloomberg published an article back in April 2019 citing Apple’s use of human helpers to listen to and assess Siri data.

A mention in another publication is not enough, however. Apple can and should do better in explicitly disclosing the use of sub-contracted human helpers in this process.

Relating to Siri, Apple’s Privacy Policy page states the following under ‘Collection and Use of Non-Personal Information’:

We may collect and store details of how you use our services, including search queries. This information may be used to improve the relevancy of results provided by our services. Except in limited instances to ensure quality of our services over the Internet, such information will not be associated with your IP address.

The glaring omission here: no explicit mention of who analyses this data in pursuit of improving the relevancy of results. In keeping with its tradition of accountability, Apple should rectify this omission and update its privacy documentation.

Given their journalistic responsibilities, The Guardian et al. are right to publicise this distinct lack of disclosure, along with the potential for misuse of such data by the people trusted to examine it, regardless of how anonymous the data is. It is no surprise, however, that The Guardian led with a sex-and-drugs lede designed to entice readers and raise undue concern rather than take an educative approach.

Sex and drugs sell in the world of tabloid, headline-grabbing ‘news’ for clicks.