Google has rolled out a new Gemini feature for the Pixel 9 that lets you discuss the contents of your screen in real time with Gemini Live.
Google is giving its Gemini AI assistant a new way to see what's on your smartphone screen. Specifically, Gemini Live can now "Talk Live about this" on Pixel 9 series devices, as first spotted by 9to5Google, letting you chat about images, files, and YouTube videos in real time. It's ideal for asking for an explanation of a meme you don't quite get, or for clarifying a step midway through a cooking tutorial.
Until now, Gemini Live has played a role much like a standard AI voice assistant, just backed by a more ChatGPT-style conversational model. Now it can look at specific content on your screen and bring it into the discussion. Pixel 9 owners can access the feature by launching the floating Gemini overlay, which surfaces suggestions such as "Talk Live about video" on YouTube, "Talk Live about PDF" in Google Files, and "Talk Live about this" for images on the screen. You don't need to describe what's there; the screen itself gives Gemini the context, which is much faster than manually uploading images from your gallery.
When activated, Gemini Live opens with a preview of the screen you want to discuss. The AI might suggest ideas for your destination based on a YouTube travel video, summarize a contract PDF, or explain the Renaissance art you're researching on your phone. If it ever feels intrusive, you can stop Gemini Live from viewing your screen at any time. And if you don't have a Pixel 9, don't worry: Google says the feature is coming soon to Samsung Galaxy S24 and S25 smartphones, with other Android devices to follow.
Gemini, live
The feature fits Google's broader Gemini strategy. In case you hadn't noticed, Google wants to cement Gemini's place at the center of mobile devices in particular, as its ongoing integration with Android shows. And this isn't the end of the Gemini Live upgrades: Google is preparing Project Astra, a toolkit that will let users stream their screen in real time while talking to Gemini Live.
Google is pushing hard on real-time contextual assistance. Rather than generating responses to abstract queries, Gemini Live aims to be part of the moment, responding to what's on your screen with useful insight whenever it can. To keep pace with Apple Intelligence and the plans of OpenAI and Microsoft, Google wants Gemini to be as ubiquitous as possible.