
Tuesday, September 10, 2024

Apple Visual Intelligence – search using your iPhone camera

Apple demonstrated a new Apple Intelligence search experience named Apple Visual Intelligence. It looks and feels like Google Lens, but it uses the native iPhone camera and is built directly into Apple Intelligence.

Plus, it seems to use third-party search providers, such as Google, OpenAI’s ChatGPT and Yelp, for its search results, depending on the type of query.

What it looks like. Here are some screenshots I grabbed from yesterday’s Apple event. If you want to watch it, the segment starts at about the 57-minute mark in this video:

Looking to buy a bike you saw on your walk? It says “Searching with Google…” after you snap a photo of it:

That said, the example of the search results provided looks somewhat “doctored”:

Here is an example of a local search result when someone wants more details on a restaurant they came across while walking. This seems to pull up the local search results in Apple Maps, which I believe are powered by Yelp and OpenTable.

Here is a close-up showing OpenTable options in Apple Maps:

Then here is an example of taking a photo of a homework assignment, where it uses OpenAI’s ChatGPT for help:

Why we care. Apple seems to be using AI as a tool rather than as the foundation for its devices, integrating with Google, OpenAI and other search providers. There is obviously underlying AI and machine learning taking place on the Apple device, but the results seem to be coming from third parties.
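To make that division of labor concrete, here is a purely hypothetical Swift sketch of the routing pattern the demos suggest: an on-device step classifies what the camera sees, then the query is handed to a third-party provider based on that classification. Apple has not published a Visual Intelligence API, so every type, function and label below is invented for illustration only.

```swift
import Foundation

// Hypothetical sketch only: Apple has not published a Visual Intelligence API.
// This models the behavior described above: on-device classification decides
// the query type, and results come from a third-party provider.

enum QueryType {
    case shopping    // e.g. a bike you want to buy
    case localPlace  // e.g. a restaurant storefront
    case homework    // e.g. a worksheet or problem set
}

enum Provider: String {
    case google = "Google"
    case appleMaps = "Yelp / OpenTable via Apple Maps"
    case chatGPT = "OpenAI ChatGPT"
}

// Stand-in for on-device image classification; the real system presumably
// runs a local vision model, but here we just accept a text label.
func classify(photoLabel: String) -> QueryType {
    switch photoLabel {
    case "bike": return .shopping
    case "restaurant": return .localPlace
    default: return .homework
    }
}

// Route each query type to the provider the event demos appeared to use.
func provider(for type: QueryType) -> Provider {
    switch type {
    case .shopping: return .google
    case .localPlace: return .appleMaps
    case .homework: return .chatGPT
    }
}

let queryType = classify(photoLabel: "bike")
print("Searching with \(provider(for: queryType).rawValue)…")
// Prints: Searching with Google…
```

The point of the sketch is the architecture, not the code: the device only decides where to send the query, while the answer itself comes from Google, Apple Maps (Yelp/OpenTable) or ChatGPT.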

An early beta review from the Washington Post suggests Apple Intelligence has a long way to go. Specifically, it has issues with hallucinations, marking spam emails as priority, and other problems.



from Search Engine Land https://ift.tt/DFGwBe6
via IFTTT
