Last year, Google Lens was announced at I/O for Google’s Pixel devices and was later rolled out to both iOS and Android, though the Android experience was a bit awkward: Lens ran through the photos app, so you had to take a photo first and then run Lens against it, rather than using the camera in real time.

At this year’s Google I/O, Google announced that Lens is expanding to more devices and will work directly through the camera app, as originally intended. Google also announced some updates:

  • Lens can analyze text and provide explanatory pictures or background information. Google points out that this requires semantic understanding of the meaning and context of words and phrases.
  • Lens will now show you lookalikes. If you see shoes or furniture you like out in the world, Lens can pull up more information about that item (assuming it recognizes it) and can now also show you similar-looking alternatives.
  • Google also says that Lens now works in real time, “just by pointing your camera.”

Source –