Google's recently launched Live Search feature in the Gemini app lets users interact with the app through voice and camera input. Users can point their camera at an object and ask questions, and Google answers and clarifies conversationally. The feature is now also being tested in the main Google Search app, extending this conversational approach to standard search. Follow-up questions and context-aware responses enrich the search experience, although live camera streaming has not yet been enabled. The technology reflects Google's broader push to integrate AI-driven functionality into everyday tasks.
Google's new Live Search feature in the Gemini app lets users hold conversations grounded in real-time visual input, enabling a richer search experience.
The feature combines camera input with voice queries, creating a dynamic interaction in which Google refines its answers based on the user's follow-up questions.