TL;DR
- Google has changed Circle to Search’s “Search using your entire screen” feature to “Ask about screen” in the latest Google app beta (v17.18.24).
- The feature now automatically sends URLs, page content, and PDF data to Google to provide more contextually relevant AI-generated results.
- This functionality is currently being released to beta users and is expected to arrive in the stable branch in the coming weeks.
Google offers a variety of ways to search across Android, and Circle to Search is one of its most useful additions. It’s quick to trigger and easy to use, but it’s limited in that it can only see what’s on your screen. If you’ve ever wished your queries could draw on more than just what you’ve circled, Google has heard your prayers. Changes to Circle to Search now let users add context beyond the visible screen content, in the form of URLs and PDFs, which is then searched through Google Search’s AI Mode.
Circle to Search on Android already lets users attach a screenshot of their screen contents to generate suggestions and search results via the “Search using your entire screen” option. Those results are processed directly in Google Search’s AI Mode.
With Google app v17.18.24 beta, Google is making some changes to this process. The search box now says “Ask about screen.” Screenshots are still attached as before, but the text description now reads “Search with more context.” It clarifies that “available context, including URLs, page content, and PDF documents, will be automatically sent to Google,” with AI Mode very likely processing these URLs and PDFs.
We were able to trigger uploads of URLs and PDFs when using “Ask about screen” in Google Chrome and Files by Google, but this could potentially work with more apps.
With the ability to upload URLs and PDFs, AI Mode via “Ask about screen” can give you more contextually relevant responses, rather than responses based only on what’s visible in the screenshot.
Yes, users can already do something similar through the Gemini overlay in Chrome and the Files app, and that route has its advantages, like letting you choose which model you’re using. For free users, though, or simply those who prefer Circle to Search’s workflow or AI Mode’s more summarized answers, this is a boon.
This change is already rolling out gradually to users on the beta branch. Assuming Google is satisfied with the test results, it should arrive in the stable branch in the coming days and weeks.
