After Android M, Google also talked a lot about Google Now – which is getting a major upgrade called Now on Tap. In short, Now on Tap is context-aware, and knows what’s going on in the application you’re using right now. If someone sends you a WhatsApp message that says “Want to have dinner at Chez Fred tonight?”, you can bring up a Google Now overlay without leaving WhatsApp that shows you a Google Now card with information pertaining to Chez Fred. Or, if you’re listening to a song on Spotify, you can just say “OK Google, who’s the lead singer?”, and Google Now will provide the answer.
We’re working to make Google Now a little smarter in the upcoming Android M release, so you can ask it to assist you with whatever you’re doing – right in the moment, anywhere on your phone. With “Now on tap,” you can simply tap and hold the home button for assistance without having to leave what you’re doing – whether you’re in an app or on a website. For example, if a friend emails you about seeing the new movie Tomorrowland, you can invoke Google Now without leaving your app, to quickly see the ratings, watch a trailer, or even buy tickets – then get right back to what you were doing.
Developers do not have to do anything to their applications to make them work with Now on Tap – they only need to be indexed by Google.
Sure, if Google would bother to actually make that feature work again. It’s been broken on my Moto G and for many Galaxy S6 owners ever since a few updates back, and it’s still not fixed. And yes, it was most definitely a Google update that broke it, though no one’s sure whether it was an update to the Google app or to Play Services. This is all over Google’s own product forums and, while largely an issue on the Galaxy S6, it has also affected 2nd-gen Moto G owners such as myself, and I have a friend with a Nexus 6 who has been affected too. Forget making Now smarter – make your developers smarter first.
On the one hand, cool.
On the other, why don’t you just put an NSA/Google/Stasi/KGB goon on my back 24/7?!
I agree. IMHO, the small benefit of these “services” that track everything you do is not worth it.
Google, may I suggest a new motto for you to try to live by? It could make things a lot easier for you.
Google: Don’t be creepy.
Aligning with the “Don’t be Creepy” suggestion
I’ve had a strong suspicion for a while that Google has been working on a context-aware layer and/or listening in the background – and integrating it.
I know the Omnibox can make some very well-educated guesses after a few keystrokes a lot of the time, just from people’s overall statistical search patterns.
But FAR too many times I have noticed that if I’m listening to something – particularly watching something on TV – and there’s a strong topic of conversation going on, Google nails that topic/person/etc. spot on with the first letter!
So I’ve been almost sure they are either actually listening to the background and preloading their search results/Omnibox auto-completes with the content, or preloading with data from TV listings/schedules – and that’s nearly as bad anyway.
Anyhow – this announcement pretty much convinces me.
And while it’s all creepy… we’re going to have a Star Trek-quality computer-control dialogue in a few years, methinks. Productivity for office chores should rocket.
My guess is it works in a different way. Think of it as similar to how “trending on Twitter” works.
If a lot of people search for something on Google at this time, Google will suggest that term first.
Same as Siri: get the answer wrong a few times and you won’t bother with it again.
Also, speaking to your phone still makes you look like an idiot.
I agree about the fail-and-desist. I notice the “still” and also agree – that might change. Simply talking on a cellphone made you look like an idiot not so long ago. Then talking with a headset. Then having a large phone…
I think taking pics with an iPad will always make you look like an idiot, though.