I can't imagine having to navigate today's world while visually impaired. From streets to people to items, and even trivial things like preparing a sandwich or finding the right restroom sign in a restaurant, it would all be infinitely more difficult without sight, and I have a lot of admiration for those who handle these situations every day. Smartphones can make some of this easier, especially with AI at the helm. If Google Lens can identify a dog's breed from a photo, there's nothing stopping Google from using the same technology to help visually impaired people, and that's where Lookout comes in.

Announced at last year's I/O, Lookout is finally available for users to try. It has three modes: one for exploring the world and assisting with cooking, one for shopping that reads barcodes and identifies currency, and a last one for reading text on mail, signs, labels, and more. The app is well designed from the start: when you launch it, it asks which mode you want to use, so as not to encumber you with navigating menus.

After testing it for a few minutes, I noticed it was quick to identify objects, tell me exactly where they were (12 o'clock, 3 o'clock, and so on), and say whether or not they had text on them. There's a menu of all recently identified items and a camera option to grab a snapshot and upload it, though the latter didn't seem to return any results for me. The voice was very quick to move from one item to the next, without pausing, and sounded very machine-like, so I hope Google can improve this, especially in visually crowded scenes.
Lookout is now available for Pixel phones in the US, in English, though Google says it plans to expand to more countries, languages, and platforms later. For now, you can download it from the Play Store, or manually grab it from APK Mirror if you don't live in the US or don't have a Pixel and want to try it out.