
Siri

Apple contractors ‘regularly hear confidential details’ on Siri recordings

Why anyone in their right mind would use a “smart speaker” or digital assistant is beyond me. Apple got caught with their pants down and has “suspended” the practice. That said, can you trust them, or Google, or Amazon? No. Just say no to this technology. There will always be leaks. “What happens on your iPhone goes to our contractors” in this case.

Quote:

Apple contractors regularly hear confidential medical information, drug deals, and recordings of couples having sex, as part of their job providing quality control, or “grading”, the company’s Siri voice assistant, the Guardian has learned.

Although Apple does not explicitly disclose it in its consumer-facing privacy documentation, a small proportion of Siri recordings are passed on to contractors working for the company around the world. They are tasked with grading the responses on a variety of factors, including whether the activation of the voice assistant was deliberate or accidental, whether the query was something Siri could be expected to help with and whether Siri’s response was appropriate.

Apple says the data “is used to help Siri and dictation … understand you better and recognise what you say”.

But the company does not explicitly state that that work is undertaken by humans who listen to the pseudonymised recordings.

Apple told the Guardian: “A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.” The company added that a very small random subset, less than 1% of daily Siri activations, are used for grading, and those used are typically only a few seconds long.

A whistleblower working for the firm, who asked to remain anonymous due to fears over their job, expressed concerns about this lack of disclosure, particularly given the frequency with which accidental activations pick up extremely sensitive personal information.

Siri can be accidentally activated when it mistakenly hears its “wake word”, the phrase “hey Siri”. Those mistakes can be understandable – a BBC interview about Syria was interrupted by the assistant last year – or less so. “The sound of a zip, Siri often hears as a trigger,” the contractor said. The service can also be activated in other ways. For instance, if an Apple Watch detects it has been raised and then hears speech, Siri is automatically activated.

The whistleblower said: “There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data.”

That accompanying information may be used to verify whether a request was successfully dealt with. In its privacy documents, Apple says the Siri data “is not linked to other data that Apple may have from your use of other Apple services”. There is no specific name or identifier attached to a record and no individual recording can be easily linked to other recordings.

Accidental activations led to the receipt of the most sensitive data that was sent to Apple. Although Siri is included on most Apple devices, the contractor highlighted the Apple Watch and the company’s HomePod smart speaker as the most frequent sources of mistaken recordings. “The regularity of accidental triggers on the watch is incredibly high,” they said. “The watch can record some snippets that will be 30 seconds – not that long but you can gather a good idea of what’s going on.”

Sometimes, “you can definitely hear a doctor and patient, talking about the medical history of the patient. Or you’d hear someone, maybe with car engine background noise – you can’t say definitely, but it’s a drug deal … you can definitely hear it happening. And you’d hear, like, people engaging in sexual acts that are accidentally recorded on the pod or the watch.”

The contractor said staff were encouraged to report accidental activations “but only as a technical problem”, with no specific procedures to deal with sensitive recordings. “We’re encouraged to hit targets, and get through work as fast as possible. The only function for reporting what you’re listening to seems to be for technical problems. There’s nothing about reporting the content.”

As well as the discomfort they felt listening to such private information, the contractor said they were motivated to go public about their job because of their fears that such information could be misused. “There’s not much vetting of who works there, and the amount of data that we’re free to look through seems quite broad. It wouldn’t be difficult to identify the person that you’re listening to, especially with accidental triggers – addresses, names and so on.

“Apple is subcontracting out, there’s a high turnover. It’s not like people are being encouraged to have consideration for people’s privacy, or even consider it. If there were someone with nefarious intentions, it wouldn’t be hard to identify [people on the recordings].”

The contractor argued Apple should reveal to users this human oversight exists – and, specifically, stop publishing some of its jokier responses to Siri queries. Ask the personal assistant “are you always listening”, for instance, and it will respond with: “I only listen when you’re talking to me.”

That is patently false, the contractor said. They argued that accidental triggers are too regular for such a lighthearted response.

Apple is not alone in employing human oversight of its automatic voice assistants. In April, Amazon was revealed to employ staff to listen to some Alexa recordings, and earlier this month, Google workers were found to be doing the same with Google Assistant.

Apple differs from those companies in some ways, however. For one, Amazon and Google allow users to opt out of some uses of their recordings; Apple offers no similar choice short of disabling Siri entirely. According to Counterpoint Research, Apple has 35% of the smartwatch market, more than three times its nearest competitor Samsung, and more than its next six biggest competitors combined.

The company values its reputation for user privacy highly, regularly wielding it as a competitive advantage against Google and Amazon. In January, it bought a billboard at the Consumer Electronics Show in Las Vegas announcing that “what happens on your iPhone stays on your iPhone”.

My Friend Cayla

…Or is it My Friend Spy Cayla? And what is the difference between this and Google Voice and Siri? Not much.

Quote:

The My Friend Cayla doll has been shown in the past to be hackable

An official watchdog in Germany has told parents to destroy a talking doll called Cayla because its smart technology can reveal personal data.

The warning was issued by the Federal Network Agency (Bundesnetzagentur), which oversees telecommunications.

Researchers say hackers can use an insecure Bluetooth device embedded in the toy to listen and talk to the child playing with it.

But the UK Toy Retailers Association said Cayla “offers no special risk”.

In a statement sent to the BBC, the TRA also said “there is no reason for alarm”.

The Vivid Toy group, which distributes My Friend Cayla, has previously said that examples of hacking were isolated and carried out by specialists. However, it said the company would take the information on board as it was able to upgrade the app used with the doll.

But experts have warned that the problem has not been fixed.

The Cayla doll can respond to a user’s question by accessing the internet. For example, if a child asks the doll “what is a little horse called?” the doll can reply “it’s called a foal”.

Media caption: Rory Cellan-Jones sees how Cayla, a talking child’s doll, can be hacked to say any number of offensive things.

A vulnerability in Cayla’s software was first revealed in January 2015.

Complaints have been filed by US and EU consumer groups.

The EU Commissioner for Justice, Consumers and Gender Equality, Vera Jourova, told the BBC: “I’m worried about the impact of connected dolls on children’s privacy and safety.”

The Commission is investigating whether such smart dolls breach EU data protection safeguards.

In addition to those concerns, a hack allowing strangers to speak directly to children via the My Friend Cayla doll has been shown to be possible.

The TRA said “we would always expect parents to supervise their children at least intermittently”.

It said the distributor Vivid had “restated that the toy is perfectly safe to own and use when following the user instructions”.

Privacy laws

Under German law, it is illegal to sell or possess a banned surveillance device. A breach of that law can result in a jail term of up to two years, according to German media reports.

Germany has strict privacy laws to protect against surveillance. In the 20th Century Germans experienced abusive surveillance by the state – in Nazi Germany and communist East Germany.

The warning by Germany’s Federal Network Agency came after student Stefan Hessel, from the University of Saarland, raised legal concerns about My Friend Cayla.

Mr Hessel, quoted by the German website Netzpolitik.org, said a Bluetooth-enabled device could connect to Cayla’s speaker and microphone system within a radius of 10m (33ft). He said an eavesdropper could even spy on someone playing with the doll “through several walls”.

A spokesman for the federal agency told Sueddeutsche Zeitung daily that Cayla amounted to a “concealed transmitting device”, illegal under an article in German telecoms law.

“It doesn’t matter what that object is – it could be an ashtray or fire alarm,” he explained.

Manufacturer Genesis Toys has not yet commented on the German warning.