Latest Android Auto v14.2 seemingly dashes hopes of smart glasses navigation
What you need to know
- Google was spotted rolling out Android Auto v14.2, which reportedly features bug fixes and other subtle tweaks.
- The latest stable version no longer mentions “Glasses,” code that was previously speculated to point toward smart glasses integration for Android Auto.
- The “Glasses” strings were originally found in an overseas version of the app; however, there’s speculation that a translation mix-up could be to blame.
Google is busy rolling out its latest Android Auto update; however, previous glimpses of an intriguing new feature have reportedly vanished.
9to5Google spotted the update, version 14.2, rolling out to Android Auto drivers earlier today (Apr. 21). The patch doesn’t appear to be substantial, nor does it include anything particularly noteworthy for drivers to enjoy. The publication states v14.2 only offers “bug fixes and other minor tweaks,” so it seems Google is keeping things light in mid-April.
However, what is noteworthy is the apparent removal of references to a previously discovered feature that was once assumed to be in development.
The post says Google has removed all mentions of “Glasses” from Android Auto’s code with the v14.2 rollout. The publication conducted another teardown of the app’s APK and found that the barebones code spotted previously is gone. Speculation suggests a few “alternate translation” issues may have created the impression of “Glasses” support in Android Auto in the first place.
With things still woefully unclear, it’ll be worth playing it by ear and seeing whether this feature surfaces again in the future.
Google’s Smart Glasses Push
If you missed it, there were rumors that Google was developing a smart glasses-based feature for Android Auto. Digging through the Hindi version of the app, a tipster found two strings that mention “Glasses” and the ability to “start navigation to launch Glasses.” However, the way the strings are worded, as highlighted by 9to5, could point toward that “alternate translation” issue.
Its post states the English version didn’t match what was found in the overseas version. So, while it doesn’t seem like Google is working on smart glasses integration right now, maybe there’s a future where this changes.
Nothing else was found in that teardown earlier this month. What we did get to chew on was Google’s TED 2025 demo, which was all about its Android XR ambitions. At its core, the demo showcased the company’s “memory” feature for smart glasses. While worn, the glasses record and make mental notes about where items are placed, all in service of helping wearers rediscover them later.
If something goes missing, users can ask Gemini where it is, such as “Where did I leave my hotel room key?” In the demo, the AI was able to tell the user where their key card was, using descriptors and surrounding items to help them find it.