Google Goggles (IMO one of the all-time great product names) is a picture-based search tool: you take a picture of something and Google returns search results based on it. It could be useful for identifying something you don't recognize, or learning more about something you do. Lately I've been using the iPhone app Red Laser to archive a list of products I'm interested in and do a quick price comparison search. Goggles seems like a cool way to do the same thing for everything else in the world. This sounds like the first step in a Google-enabled augmented reality service, which I'm sure Google engineers are working on as we speak.
Search by Voice (which is available to me via the iPhone app but which I've never felt the need to use) has been expanded to work on more devices and in English, Mandarin, and Japanese.
What's Nearby is a location-based search that's part of an updated Google Maps app on Android devices, and soon to be available in the web-based Google Maps on other devices (no word on whether the iPhone's Maps app will get the update). What it does is simply give you a list of the ten closest places (restaurants, shops, points of interest) near your location.
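Under the hood, the "ten closest places" idea boils down to sorting candidate points by their great-circle distance from your coordinates. Here's a minimal sketch of that approach in Python, with made-up place data and a hypothetical `nearest_places` helper; this is just an illustration of the geometry, not Google's actual implementation:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two lat/long points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))  # 6371 km = mean Earth radius

def nearest_places(user_lat, user_lon, places, n=10):
    """Return the n places closest to the user's coordinates."""
    return sorted(
        places,
        key=lambda p: haversine_km(user_lat, user_lon, p["lat"], p["lon"]),
    )[:n]

# Hypothetical sample data -- not a real Google API or dataset.
places = [
    {"name": "Cafe", "lat": 37.7793, "lon": -122.4193},
    {"name": "Bookstore", "lat": 37.8000, "lon": -122.4400},
    {"name": "Museum", "lat": 37.7700, "lon": -122.4100},
]

# User standing in downtown San Francisco.
print([p["name"] for p in nearest_places(37.7749, -122.4194, places, n=2)])
```

A real service would of course pull candidates from a spatial index rather than sorting every point on Earth, but the ranking step is essentially this.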
In short, your location is a very important data point when it comes to mobile computing, and your lat/long coordinates, and even what direction you're facing and what particular object you're looking at, are important parameters for your searching. Google obviously wants to be at the hub of all your searching efforts, and they're trying to pull in all the relevant data to make that searching more effective, and, let's be honest, more flashy and more fun too. We'll see how these features evolve and expand over the next few years, and whether they're ever as integral to the mobile computing experience on other mobile platforms as they are on Android.