The latest update to Apple’s iOS 18 has brought with it a new feature that has left some users scratching their heads – the “Enhanced Visual Search” toggle in the Photos app. This seemingly innocuous addition has raised concerns among developers and users alike, highlighting once again the importance of transparency when it comes to data sharing and user consent.
For those who may not be aware, Enhanced Visual Search is a feature that allows you to look up landmarks you’ve taken pictures of or search for images using the names of those landmarks. Sounds convenient, right? Well, it’s certainly useful – but at what cost?
The Fine Print
According to a blog post by developer Jeff Johnson, the Enhanced Visual Search toggle is enabled by default on iOS 18 devices. This means users’ devices are sharing data derived from their photos with Apple without those users ever being asked. Yes, you read that right – your photos.
Johnson’s post highlights a few examples of how this feature can be useful – but also raises some important questions about data sharing and user consent. For instance, when you swipe up on a picture you’ve taken of a building and select “Look up Landmark,” the Photos app will attempt to identify it. Sounds simple enough… except when it gets something wrong.
One of Johnson’s own test runs revealed exactly that: the feature got a landmark identification wrong, mistaking a city hall building for a Trappist monastery!
The Technical Details
So, how exactly does Enhanced Visual Search work? According to Apple’s machine learning research blog, the process starts with an on-device ML model that analyzes a given photo to determine whether it contains a “region of interest” (ROI) that may contain a landmark. If the model detects an ROI in the “landmark” domain, a vector embedding is calculated for that region of the image.
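Apple hasn’t published the Photos pipeline itself, but its public Vision framework exposes a similar building block: VNGenerateImageFeaturePrintRequest, which turns an image into a comparable embedding (a “feature print”). The sketch below is only an analogy for the embedding step – the landmark-specific model and ROI detection are Apple-internal, and the file paths are placeholders:

```swift
import Foundation
import Vision

// Compute a Vision "feature print" for an image: a vector embedding that
// can be compared against other embeddings. This is a public stand-in for
// the (private) landmark embedding model described in Apple's blog post.
func featurePrint(for url: URL) throws -> VNFeaturePrintObservation? {
    let request = VNGenerateImageFeaturePrintRequest()
    let handler = VNImageRequestHandler(url: url, options: [:])
    try handler.perform([request])
    return request.results?.first
}

do {
    // Placeholder paths: substitute real image files to try this out.
    let photo = try featurePrint(for: URL(fileURLWithPath: "photo.jpg"))
    let landmark = try featurePrint(for: URL(fileURLWithPath: "landmark.jpg"))
    if let photo, let landmark {
        // A smaller distance means more visually similar images.
        var distance: Float = 0
        try photo.computeDistance(&distance, to: landmark)
        print("Embedding distance: \(distance)")
    }
} catch {
    print("Vision request failed: \(error)")
}
```

Run on two photos of the same building, the distance comes out small; on unrelated scenes, it comes out large – which is exactly the property a landmark lookup relies on.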
The device then encrypts this vector embedding and sends it to Apple to compare against its database of landmark embeddings. Notably, per Apple’s research blog, the encryption is homomorphic, meaning Apple’s servers can run the comparison on the encrypted data without ever decrypting it. Still, every time you use Enhanced Visual Search, your device is sending a snippet of data about your photo to Apple’s servers for analysis.
Apple’s research blog goes into great detail about the technical aspects of vector embeddings – but the bottom line is that an embedding condenses an image’s visual content into a compact list of numbers that an ML model can compare for similarity. That’s what allows Apple to match places in your photos against its global index without ever having access to the original image data.
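To make the matching step concrete, here’s a minimal sketch of nearest-neighbor lookup against a tiny landmark index, in plain Swift with no encryption. The landmark names and three-dimensional vectors are invented for illustration; Apple performs the equivalent comparison on encrypted, much higher-dimensional vectors, so its servers never see the embedding in the clear:

```swift
// A toy landmark index: each entry pairs a name with an embedding vector.
// Real embeddings have hundreds of dimensions; these 3-D vectors are
// invented purely for illustration.
let landmarkIndex: [(name: String, embedding: [Float])] = [
    ("Eiffel Tower",       [0.12, 0.85, 0.51]),
    ("Golden Gate Bridge", [0.90, 0.10, 0.42]),
    ("Taj Mahal",          [0.33, 0.44, 0.81]),
]

// Cosine similarity: 1.0 means identical direction, 0 means unrelated.
func cosineSimilarity(_ a: [Float], _ b: [Float]) -> Float {
    let dot = zip(a, b).map(*).reduce(0, +)
    let normA = a.map { $0 * $0 }.reduce(0, +).squareRoot()
    let normB = b.map { $0 * $0 }.reduce(0, +).squareRoot()
    return dot / (normA * normB)
}

// Find the closest landmark for a query embedding. Apple does this
// comparison on encrypted vectors; here it is done in the clear.
func nearestLandmark(to query: [Float]) -> (name: String, score: Float)? {
    landmarkIndex
        .map { (name: $0.name, score: cosineSimilarity(query, $0.embedding)) }
        .max { $0.score < $1.score }
}

let queryEmbedding: [Float] = [0.30, 0.50, 0.78] // hypothetical photo embedding
if let match = nearestLandmark(to: queryEmbedding) {
    print("Best match: \(match.name), similarity \(match.score)")
}
```

The design choice worth noticing is that only the query embedding ever leaves the device – never the photo – and in Apple’s version even that vector travels encrypted.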
The Concerns
Now, we’re not suggesting that Apple is up to no good here. The company has clearly gone to great lengths to keep the data private – and it’s true that Enhanced Visual Search can be useful for users who want to identify landmarks in their photos.
However, Johnson’s concerns about transparency and user consent are valid ones. After all, if a feature is enabled by default on millions of devices worldwide, shouldn’t Apple clearly explain what that means for users – and ask them to opt in, rather than leaving them to discover the toggle on their own?
The Verdict
Ultimately, whether or not Enhanced Visual Search is a feature worth using is up to you. If default-on data sharing doesn’t sit right with you, it’s worth flipping the toggle off in the Photos settings.
On the other hand, if you’re looking for a convenient way to identify landmarks in your photos, Enhanced Visual Search could be just what you need. Just remember that every time you use it, your device is sending encrypted data derived from your photo to Apple’s servers – and even that may make some users uncomfortable.