How Does Google Now Work?
Hey everyone! This week, I’m going to cover how Google Now works and some of the incredible features that it has, as well as how it can help you.
If you’re not familiar with Google Now, it’s similar to Apple’s Siri and Microsoft’s Cortana. It has many commands or questions that you simply say, such as “What is the weather like today?”, “How many ounces are in a pound?”, and “What are the names of the seven dwarves?”
However, a month ago, Google announced a new feature for Google Now called “Now on Tap.” Basically, this new feature, which will be available in Android’s next operating system, Android M, gives Google the ability to determine the context of the question you are asking.
The goal of this new feature is to deliver information to users proactively, that is, without them even asking for it. Google tries to anticipate your actions to better deliver information and to make you aware of potentially useful information ahead of time.
OK, so how does it actually work? If you’re using an Android device, you tap the ‘home’ button while you have an email, text message, or other program open, and Now on Tap will use that context to answer your question or deliver you information. It will scan the text on the screen, toss it into a Google search, and spit out what it thinks you may want to know.
It feeds this information back to you in the form of cards on the screen. If there are multiple things it thinks you may want, it will pull up multiple cards. For example, if you wanted information on a Hawaiian vacation, it might bring up a card on great resorts, a card for flight information, and another card for finding a babysitter for the extended trip.
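If you like to think in code, here’s a rough, purely illustrative sketch of that flow in Python. None of this is Google’s actual implementation: the function name, the keyword table, and the crude matching step are all made up, and the real feature runs a full Google search behind the scenes instead of a keyword lookup.

```python
# Toy sketch of the Now on Tap flow: read the on-screen text, figure
# out what it's about, and hand back suggestion "cards." Purely
# illustrative -- all names and data here are invented.

CARD_HINTS = {
    "hawaii": ["Resorts in Hawaii", "Flights to Honolulu"],
    "dinner": ["Recipe ingredients", "Nearby grocery stores", "Coupons"],
    "movie":  ["Showtimes", "Trailers", "Reviews"],
}

def now_on_tap(screen_text: str) -> list[str]:
    """Scan the on-screen text and return the 'cards' it suggests."""
    text = screen_text.lower()
    cards = []
    for keyword, suggestions in CARD_HINTS.items():
        if keyword in text:  # crude stand-in for Google's real search step
            cards.extend(suggestions)
    return cards

print(now_on_tap("Can you pick up what we need for dinner tonight?"))
# ['Recipe ingredients', 'Nearby grocery stores', 'Coupons']
```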
Let’s take a look, though, at how the whole process works. This is much easier to understand with a simple example. Say my wife were to text me what she wanted for dinner. With the text message open, a simple tap of the home button would tell Now on Tap that I may need to find out what ingredients I need, as well as the closest grocery store and maybe even some coupons.
The same might be the case if a friend were to email me about seeing a movie after work. A tap of the home button would activate Now on Tap. This would bring up trailers, movie times, movie reviews, and any other relevant information.
The really nice part about this feature, though, is that Google will only turn the feature on if you press the home button. This means that Google isn’t sitting there constantly watching and evaluating your data. It also doesn’t keep any information on your Now on Tap requests on its servers.
From a privacy standpoint, this is also a feature that you can opt out of as much as you want, to the point where you can just turn it off altogether. For a week or so, a coworker of mine let Google take complete hold of his life. He gave his calendar and email over to Gmail, along with his location and routine.
After about a week, Google learned his habits, such as when he goes to work, when he picks his kids up at school, and where he usually eats dinner. After that one week of learning, he started getting updates about traffic, telling him he should leave earlier or later, depending on traffic conditions. Google even told him whether he should take a different route to work based on construction in the area.
This is all a part of Google’s initiative to give you useful information before you ask for it. It can pull your appointments from Google Calendar and give you directions, contact information, and anything else you might need.
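To make the “leave earlier or later” idea concrete, here’s a tiny Python sketch of the arithmetic involved. This is an assumption-laden illustration, not how Google Now actually computes it: it simply subtracts a traffic-adjusted travel time (plus a small buffer) from the appointment time pulled from your calendar.

```python
# Illustrative only: when should the reminder tell you to leave?
# The numbers, buffer, and function name are invented for this example.

from datetime import datetime, timedelta

def suggest_departure(appointment: datetime,
                      normal_drive_min: int,
                      traffic_delay_min: int,
                      buffer_min: int = 10) -> datetime:
    """Return the latest sensible time to leave for an appointment."""
    travel = timedelta(minutes=normal_drive_min + traffic_delay_min)
    buffer = timedelta(minutes=buffer_min)
    return appointment - travel - buffer

meeting = datetime(2015, 9, 14, 9, 0)  # a 9:00 AM meeting from the calendar
print(suggest_departure(meeting, normal_drive_min=25, traffic_delay_min=15))
# 2015-09-14 08:10:00  -> "Leave by 8:10, traffic is heavier than usual"
```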
Overall, it looks like an amazing overhaul to the Google Now functionality that most Android phones have now. Even if you have another phone, such as a Microsoft phone or an iPhone, there is always the Google app, which will have many of the same features as the full-fledged Now on Tap.
Google Now will also have context as to where you are and what you’re looking at if you point your phone in a general direction. One of the famous examples can be seen in a popular Android commercial where the user points her camera at the Eiffel Tower and asks, “How tall is it?”
To do this, Google takes your location and the direction you are pointing your phone, and then determines that you are probably talking about the Eiffel Tower.
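Under the hood, that boils down to a bit of geometry: compute the compass bearing from your location to nearby points of interest and see which one lines up with where the phone is pointing. Here’s a small, self-contained Python illustration of the idea; the landmark table, tolerance, and function names are mine, not Google’s, and the real system is far more sophisticated.

```python
# Illustrative only: guess which landmark a phone is pointed at from
# the user's location and compass bearing.

import math

LANDMARKS = {
    "Eiffel Tower": (48.8584, 2.2945),
    "Arc de Triomphe": (48.8738, 2.2950),
}

def bearing_to(lat1, lon1, lat2, lon2):
    """Compass bearing in degrees from point 1 to point 2."""
    lat1, lon1, lat2, lon2 = map(math.radians, (lat1, lon1, lat2, lon2))
    dlon = lon2 - lon1
    x = math.sin(dlon) * math.cos(lat2)
    y = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(dlon)
    return (math.degrees(math.atan2(x, y)) + 360) % 360

def guess_landmark(user_lat, user_lon, phone_bearing, tolerance_deg=15):
    """Return the landmark whose direction best matches where the phone points."""
    best, best_diff = None, tolerance_deg
    for name, (lat, lon) in LANDMARKS.items():
        diff = abs((bearing_to(user_lat, user_lon, lat, lon) - phone_bearing + 180) % 360 - 180)
        if diff < best_diff:
            best, best_diff = name, diff
    return best

# Standing just north of the tower, pointing the phone roughly due south:
print(guess_landmark(48.8620, 2.2945, phone_bearing=185))  # -> "Eiffel Tower"
```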
Contrary to popular belief, this feature doesn’t directly understand that it is looking at the Eiffel Tower. That process is something Google is working on, but it’s much, much more difficult to teach a computer to understand images and the context around them. It is nonetheless an awesome feature that I’m sure will have many practical applications in the future.
For example, say you’re in an airport. You could simply ask Google many of the questions you would normally ask an attendant, such as “Where can I pick up my bags?”, “What gate do I need to go to?”, and “Where can I get a cup of coffee?” With the ability to determine where you are and what you’re looking for, it can pull from maps, reviews, and any number of other online sources to get you the information you need.
Developers will also have the option of integrating information into their pages so that Google can parse a page or an app more accurately. There may also be an option for Google to “always ignore” information that might be sensitive, such as medical information or banking information.
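To give a feel for what that “always ignore” option might look like from a developer’s point of view, here’s a purely hypothetical Python sketch. The tagging scheme and field names are invented and don’t represent any real Google or Android API; the point is just that sensitive fields would be stripped out before anything leaves the screen.

```python
# Hypothetical sketch: skip fields an app has marked as sensitive when
# assembling the on-screen text that gets analyzed.

SENSITIVE_TAGS = {"medical", "banking"}

def visible_text_for_tap(fields: list[dict]) -> str:
    """Join on-screen fields, leaving out any marked as sensitive."""
    return " ".join(f["text"] for f in fields
                    if f.get("tag") not in SENSITIVE_TAGS)

screen = [
    {"text": "Dinner at Luigi's at 7pm?", "tag": "message"},
    {"text": "Account balance: $1,234.56", "tag": "banking"},
]
print(visible_text_for_tap(screen))  # -> "Dinner at Luigi's at 7pm?"
```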
Come this fall when Google releases Android M, I’m sure that there will be some new features in the operating system, as well as other aspects of Now on Tap that have not been released. Be sure to check back for any updates!
Well, that’s it for today. Be sure to check out all my earlier episodes at techtalker.quickanddirtytips.com. And if you have further questions about this podcast or want to make a suggestion for a future episode, post them on Facebook at QDTtechtalker.
Until next time, I’m the Tech Talker, keeping technology simple!
Google and Eiffel Tower images courtesy of Shutterstock.