Google notes that the app won't always work with 100% accuracy, and says it will continue to develop the app as it gets more feedback from users. The app helps visually impaired people learn about their surroundings. Lookout was originally announced at the I/O conference back in May 2018, and has since required a lot of time for "testing and improving the quality" of its results.
Google has launched a new app called Lookout, designed to give the visually impaired verbal information about their surroundings using artificial intelligence. For the app to work seamlessly, Google recommends that users hold their Pixel device, hang it around their neck, or place it in a shirt pocket with the camera pointing outwards.
Lookout is primarily created to work in "situations where people might typically have to ask for help"; Google cites examples like "learning about a new space for the first time, reading text or documents" and daily tasks like "cooking, cleaning, and shopping". Simply launch the app and keep it pointed forward, and it will identify objects seen by the camera and speak them out loud.
Google’s Lookout app comes in three modes: Explore, Shopping, and Quick read.
Lookout is built on technology similar to that previously used in Google Lens. The 'Shopping' mode helps with barcodes and currency, and the 'Quick read' mode is best for sorting mail and reading signs and labels. Google says it's working to bring the app to more devices, countries and platforms soon.
Judging by today's Google logo (which spells "Google" in Braille), the company is either (a) celebrating Louis Braille's birthday (he was born on January 4, 1809), (b) about to announce some new accessibility initiative, or (c) both.
Users only have to fire up the application once to use Lookout; they don't have to tap any other button in-app.