Apple says the data “is used to help Siri and dictation … understand you better and recognise what you say”.

But the company does not explicitly state that that work is undertaken by humans who listen to the pseudonymised recordings.

Apple told the Guardian: “A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.” The company added that a very small random subset, less than 1% of daily Siri activations, are used for grading, and those used are typically only a few seconds long.

A whistleblower working for the firm, who asked to remain anonymous due to fears over their job, expressed concerns about this lack of disclosure, particularly given the frequency with which accidental activations pick up extremely sensitive personal information.

Siri can be accidentally activated when it mistakenly hears its “wake word”, the phrase “hey Siri”. Those mistakes can be understandable – a BBC interview about Syria was interrupted by the assistant last year – or less so. “The sound of a zip, Siri often hears as a trigger,” the contractor said. The service can also be activated in other ways. For instance, if an Apple Watch detects it has been raised and then hears speech, Siri is automatically activated.

The whistleblower said: “There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data.”

That accompanying information may be used to verify whether a request was successfully dealt with. In its privacy documents, Apple says the Siri data “is not linked to other data that Apple may have from your use of other Apple services”. There is no specific name or identifier attached to a record and no individual recording can be easily linked to other recordings.

Accidental activations led to the receipt of the most sensitive data that was sent to Apple. Although Siri is included on most Apple devices, the contractor highlighted the Apple Watch and the company’s HomePod smart speaker as the most frequent sources of mistaken recordings.

“The regularity of accidental triggers on the watch is incredibly high,” they said. “The watch can record some snippets that will be 30 seconds – not that long but you can gather a good idea of what’s going on.”

Sometimes, “you can definitely hear a doctor and patient, talking about the medical history of the patient. Or you’d hear someone, maybe with car engine background noise – you can’t say definitely, but it’s a drug deal … you can definitely hear it happening. And you’d hear, like, people engaging in sexual acts that are accidentally recorded on the pod or the watch.”

The contractor said staff were encouraged to report accidental activations “but only as a technical problem”, with no specific procedures to deal with sensitive recordings. “We’re encouraged to hit targets, and get through work as fast as possible. The only function for reporting what you’re listening to seems to be for technical problems. There’s nothing about reporting the content.”

As well as the discomfort they felt listening to such private information, the contractor said they were motivated to go public about their job because of their fears that such information could be misused. “There’s not much vetting of who works there, and the amount of data that we’re free to look through seems quite broad. It wouldn’t be difficult to identify the person that you’re listening to, especially with accidental triggers – addresses, names and so on.

“Apple is subcontracting out, there’s a high turnover. It’s not like people are being encouraged to have consideration for people’s privacy, or even consider it. If there were someone with nefarious intentions, it wouldn’t be hard to identify.”

The contractor argued Apple should reveal to users this human oversight exists – and, specifically, stop publishing some of its jokier responses to Siri queries.