First look: Gemini is learning to get smarter when you’re asking about your screen


TL;DR

- Google Gemini can answer questions about the contents of your device's screen, but only after you tap an "Ask about screen" button.
- Gemini appears to be working on a new mode that would infer when you're talking about your screen, then automatically analyze it.
- Google would also offer settings to disable this functionality, so Gemini would still need your explicit permission to look at your screen.

How many different ways does Google give you to make sense of what you're seeing on your Android phone's screen? You could open a screenshot in Lens if you were trying to be as clunky as possible, or just pull up Circle to Search for some quick analysis. Now we've also got Gemini ready to lend a hand, and its "Ask about screen" mode is already a deceptively powerful tool. Today we're checking out one small way Google could make this screen-searching solution even more streamlined.

Right now, getting Gemini to answer questions about your screen's contents takes a step or two. First you pull up the Gemini overlay, either with a hotword or a button shortcut. Then you have to manually tap "Ask about screen." Only then can you ask Gemini your question; if you start asking without hitting "Ask about screen" first, Gemini won't understand the context.