MOUNTAIN VIEW, Calif. — Google is going visual, audio, location-based, social and real-time starting immediately, the search behemoth said today, unveiling a wealth of new enhancements to its flagship search engine.
How soon all of its users will get access to all the new features remains to be seen, however.
In a fast-paced presentation for the press here at the Computer History Museum, Google vice presidents Marissa Mayer and Vic Gundotra, along with Google Fellow Amit Singhal, revealed a slew of enhancements to the world’s leading search engine.
Those include visual search via a product known as Google Goggles, audio search via Google Voice, real-time audio language translation (starting with Japanese), location-based “what’s nearby?” searches, and up-to-the-second, real-time searches incorporating Twitter and news feeds.
“Seconds matter in this information environment,” Singhal said as he unveiled Google’s integration of real-time search technology into its universal search home page. “In today’s world, information is coming from around the globe every second by tweeting, posting updates, creating Web pages, writing blogs. Information is being created at a pace I’ve never seen before.”
Rather than crawling the Web for new data once a month, as Google did at its outset — or even once every few minutes as it continues to do — Google must serve up new data instantly as it becomes available, Singhal said.
Now, the idea is that by integrating real-time searches of Twitter, commercial news organizations, blogs and, soon, public Facebook and MySpace pages, Google can lead users to the freshest possible data from moment to moment.
Additionally, by weaving together computing, connectivity and the cloud, Google hopes to create a seamlessly integrated information environment. The addition of visual search, voice search, location services and language translation creates what Gundotra calls “a mouse pointer for the real world.”
“We’ve already seen powerful demonstrations of what happens when you take a sensor-rich device and connect it to the cloud,” he said. “Devices will help us explore the world around us, help us understand our own speech or others’, and augment our own sense of sight by helping us to see even farther.”
While beta testing Google Goggles, Gundotra said he took a photo of a bottle of wine given to him for a dinner party. Using both the system’s image recognizer and optical character recognition to read the label, Google’s search engine returned results that indicated the wine had “hints of apricot and hibiscus blossom” — a bit of trivia far beyond Gundotra’s own wine expertise.
On stage, Gundotra took a photograph of an image of Japan’s Itsukushima Shrine, a familiar tourist attraction. Google Goggles instantly recognized the landmark and delivered a detailed description to his Android mobile phone.
But there’s a catch: Google Goggles works, for now, only on images taken with and submitted from an Android phone.
Google spokespeople declined to say when other phones, or even PCs, might be able to submit images for visual search.
Real-time comes to Google
The same limitation doesn’t apply to the real-time search functionality that Singhal also announced, calling it “one of the most exciting things I’ve seen in my career.”
While it may take until mid-week for real-time search to be enabled at all of Google’s global data centers, the function is available immediately by clicking on the searches listed beneath “Hot Topics” at http://www.google.com/trends.
Unlike Google’s current search results page, which shows only static results and requires the user to hit “refresh” to see any new ones, the real-time results will scroll along a section of the results page. A “pause” button allows users to stop and start the scrolling, while a scroll bar lets them check what they may have missed in previous minutes.
Google’s partnership with Twitter, announced in October, plays a key role in today’s roll-out of real-time search, Mayer said. She announced two new relationships, with Facebook and MySpace, that will extend the search engine’s capabilities even further by making those services’ public pages accessible via search.
Google also announced plans to make its voice recognition technology available on mobile phones during the first quarter of 2010, which will allow instant voice search as well as instant translation of more than 50 languages.
In addition, the new “nearby” search functionality, which will reveal which stores have products in stock within a given number of miles, as well as which restaurants or other public facilities are close by, opens search to entirely new markets.
“It’s a technological marvel,” Mayer said. “Nothing like this has ever been done before.”