New privacy backlash isn’t hurting Apple
Apple doesn’t make many mistakes when it comes to security and privacy for its more than 1 billion iPhone owners. But when it does, the backlash hits all the harder. There has been a surprising uproar this week, as users began asking whether what happens on their iPhone really does stay on their iPhone.
We’re talking, of course, about photos, and Apple’s complex Enhanced Visual Search feature, which is enabled by default for all users and appears to “violate” its legendary privacy guarantees. As I reported when this first surfaced, Apple captures and masks portions of users’ photos, which are then analyzed centrally and matched against a dataset of known landmarks to identify locations. If it works as claimed, there is no privacy risk to users. But very few people understand the technical detail, and so the technical detail becomes an afterthought.
So does this really violate your privacy? No, but it isn’t nothing either. There are two major problems here for Apple. The first is optics: when you build a brand inside a privacy bubble, you never want anyone coming at that bubble with a pin. The second is the thin end of the wedge. This kind of hybrid device-cloud photo scanning has gotten Apple into trouble before, with its ill-fated CSAM proposal in 2021.
That child safety upgrade was designed to screen photos on your device against a hashed dataset of known, illegal CSAM imagery, flagging multiple matches and sending them for human review. As noted at the time, the problem was not the CSAM screening itself, but rather the door it opened to reviewing other material (religious, sexual, political) under local laws and regulations. As I said at the time, the solid defense that such reviews were technically impossible had suddenly collapsed.
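The mechanics of that proposal are simple to sketch: hash each local photo, compare against a database of known-bad hashes, and escalate only once multiple matches accumulate. Below is a minimal illustration; `KNOWN_HASHES`, `REVIEW_THRESHOLD`, and `scan_library` are hypothetical names, and Apple’s actual system used a perceptual NeuralHash rather than the SHA-256 stand-in shown here:

```python
import hashlib

# Stand-in database of hashes of known images (a real system would use
# perceptual hashes, which match visually similar images, not exact bytes).
KNOWN_HASHES = {
    hashlib.sha256(img).hexdigest()
    for img in (b"known-image-1", b"known-image-2")
}

REVIEW_THRESHOLD = 2  # only escalate once multiple matches accumulate

def scan_library(photos):
    """Count on-device matches; flag for human review past a threshold."""
    matches = [p for p in photos
               if hashlib.sha256(p).hexdigest() in KNOWN_HASHES]
    return len(matches) >= REVIEW_THRESHOLD, len(matches)

flagged, count = scan_library([b"holiday-photo", b"known-image-1", b"known-image-2"])
print(flagged, count)  # prints: True 2
```

The threshold is the key design choice: a single false positive never reaches a human reviewer, which was central to Apple’s defense of the 2021 design.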
And that risk lurks here too, perhaps. The idea that a user’s photos can be screened against a cloud dataset for any purpose is off-putting to many people, at least those who view their iPhones as repositories for their eyes only. As privacy expert Matthew Green posted on BlueSky, “Learning about a service two days before New Year’s and realizing it’s already enabled on your phone can be very frustrating.”
Apple says Enhanced Visual Search “lets you search for photos using landmarks and points of interest. Your device privately matches places in your photos to a global index Apple maintains on our servers. We apply homomorphic encryption and differential privacy, and use an OHTTP relay that hides your IP address. This prevents Apple from learning about the information in your photos.”
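Homomorphic encryption is the load-bearing claim in that statement: it means a server can compute on data it cannot read. The toy Paillier scheme below, using deliberately tiny primes, demonstrates the core property by adding two numbers while they remain encrypted. This is illustrative only; Apple’s production system reportedly uses a lattice-based scheme for encrypted lookups, not Paillier:

```python
import math
import random

# Toy Paillier keypair (additively homomorphic). Real keys use ~2048-bit primes.
p, q = 61, 53
n = p * q
n2 = n * n
g = n + 1
lam = math.lcm(p - 1, q - 1)
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)  # modular inverse (Python 3.8+)

def encrypt(m):
    """Encrypt integer m (0 <= m < n) with fresh randomness r."""
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    """Recover the plaintext using the private values lam and mu."""
    return ((pow(c, lam, n2) - 1) // n) * mu % n

# A server holding only the ciphertexts can combine them without decrypting:
a, b = encrypt(20), encrypt(22)
print(decrypt((a * b) % n2))  # prints: 42 -- the sum, computed under encryption
```

Multiplying two Paillier ciphertexts yields an encryption of the sum of the plaintexts, which is what lets a server answer queries over a user’s data without ever seeing it.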
However, Jeff Johnson, the blogger who sparked the controversy, warns that there is no way for users to personally assess the soundness of Apple’s implementation of Enhanced Visual Search: “Computing privacy is simple. If something happens entirely on my computer, it’s private, whereas if my computer sends data to the computer’s manufacturer, it’s not private, or at least not entirely private.”
Apple’s strict separation between on-device and off-device processing is already being tested by the new cloud-based AI services it is pushing out. A great deal of work went into developing and rolling out Apple’s “groundbreaking” Private Cloud Compute, which essentially extends the device’s secure enclave into the cloud, allowing centralized processing within the user’s private space. When a process leaves this enclave, as with ChatGPT, Apple explicitly calls it out to the user. Compare that to the quiet arrival of this update. As Green puts it, “Apple didn’t announce it; it was discovered.”
And that’s the problem here: the lack of transparency. Worse still, as Michael Tsai warns, “Not only is it not opt-in, but you can’t effectively opt out either, because metadata about your photos starts uploading before you ever use the search feature. This happens even if you’ve already opted out of uploading photos to iCloud… I don’t think the company is living up to its ideals here.”
The real issue here is optics and perception, and that’s a grave mistake. Had Apple presented this more openly, there would have been far less fuss, and most people would not have opted out. Everywhere else, the iPhone maker goes to great lengths to require opt-in for off-device data capture, which makes this stand out as odd. I wouldn’t be surprised to see a U-turn or a retroactive opt-in.
Apple says you can “turn off Enhanced Visual Search at any time” by going to Settings > Apps > Photos on an iOS or iPadOS device; on a Mac, open Photos and go to Settings > General. I have asked the company for a response to this episode, but so far there has been none.