Unlike Android on phones, Android Things is not customizable by third parties. All Android Things devices use an OS image direct from Google, and Google centrally distributes updates to all Android Things devices for three years. Android Things doesn’t really have an interface of its own. It’s designed to get a device up and running and show a single app, which on the smart displays is the Google Smart Display app. Qualcomm’s “Home Hub” platform was purpose-built to run Android Things and this Google Assistant software – the SD624 is for smart displays, while the less powerful SDA212 is for speakers.
When it came time to build the Google Home Hub, Google didn’t use any of this. At the show, I had a quick chat with Diya Jolly, Google’s VP of product management, and learned that Google’s Home Hub doesn’t run Android Things – it’s actually built on Google’s Cast platform, so it’s closer to a souped-up Chromecast than a stripped-down Android phone. It also doesn’t use Qualcomm’s SD624 Home Hub Platform. Instead, Google opted for an Amlogic chip.
This is such an incredibly Google thing to do. Build an entire platform specifically for things like smart displays, and then build a smart display that does not use said entire platform. It’s a nerdy little detail that virtually no user will care about, but it just makes me wonder – why?
If I understand correctly, the defining feature of Google, organisation-wise, is that each project is a silo with pretty much full autonomy.
So they chose their own hardware, looked at the available stacks, decided the existing ones weren’t a good enough fit, and made their own.
I’d be somewhat worried if people from the team weren’t able to give exact reasons for the decision not to use the existing smart display stack.