Google's Gemini Live AI assistant learns to see the world through your smartphone - accurately and without lag

Through your smartphone's camera, the assistant recognizes objects you show it

Google has announced a series of new features for its virtual assistant Gemini Live, making it even smarter and more useful. It will be able to see the world through your phone's camera, better understand your intonation, and interact with other apps.

One of the most impressive new features is that Gemini Live will be able to recognize objects you show it through your smartphone's camera. For example, if you're not sure which tool to use for a given task, you can simply show the assistant several tools and it will help you choose the best option. This feature will arrive first on the Google Pixel 10, and later on other Android and Apple devices.

Gemini Live will now work more efficiently with apps like Phone, Messages, and Clock. If you're discussing a route with the assistant and realize you're going to be late, you can simply tell it to send a message to someone. Gemini Live will do this automatically, notifying your contact that you're running late.

With the new audio model, Gemini Live will be better at recognizing the intonation, rhythm, and pitch of your voice. Soon, the assistant will be able to change its own intonation depending on what you're discussing, and you'll be able to make it speak faster or slower. Gemini Live will even be able to tell stories more dramatically if you ask it to speak as a specific character.