Let me be clear: I have not found the feature I am writing about on any official list of changes in Android 12. However, it appeared on my Samsung Galaxy S21 Ultra after I updated it to the new version of the system, and I cannot use it on any smartphone running Android 11.
Google Assistant integrated with Google Lens opens up great possibilities
When I launch the Assistant while another application is open, a Google Lens icon with the caption "Search what is on the screen" now appears. The potential of this solution is huge.
This function allows you to select and copy text that normally cannot be copied, including text in photos.
All text, including text in photos, can be translated with a single tap.
If there are mathematical problems on the screen, tapping the "Homework" button makes Lens display a worked solution.
The Assistant can search for information about well-known places, objects, or people in photos.
It works similarly with products: Lens can find clothes or smartphones shown in photos.
All of this makes using the web a lot easier. What pleases me most is the ability to conveniently copy text from any source.
Google Assistant has been crying out for such an improvement for a long time
Google Lens itself is by no means new. Until now, however, using it this way was cumbersome: you had to take a screenshot first and then send it to the Lens application. An unnecessary complication.
The Assistant, in turn, already had an integrated "What can be seen on my screen?" mode, but it was context-dependent. If the program happened to recognize that, for example, a photo of a famous person was on the screen, it displayed search results; otherwise, the function could not be triggered manually.
Full integration of the Assistant and Lens is a killer feature. We can only hope that Google will decouple it from Android 12 and make it available to users of older versions of the system as well.