At the exclusive SOEFEST conference for Tier 1a optimising professionals I had the pleasure of spending some time with a high-level Google hygiene engineer, who revealed details of a new development that the struggling search engine is hoping to launch this summer! According to the unnamed insider, engineers at Google’s top-secret research basement have created a new “find anything” service which will make the search engine portable inside a new device called Google-Oculars.

I had a sneaky preview of the device and managed to get some details about how it works.

The new Google-Oculars are a lightweight wearable computer which includes a backpack for the battery and processing units, and a pair of lightweight, subtly branded glasses which incorporate a camera and a new nano-Googler (more on this later…). The backpack battery is apparently good for almost 3.5 hours of constant wearing, and the glasses themselves are just 8cm thin and weigh less than 1kg on the face!

The device works by pulling in light information from the surrounding area and converting it into “data”, which is processed by the nano-Googler. This “data” is then transmitted to Google either by FM radio or, when in the home, by a wired connection, and includes the following information (a rough sketch of one such packet appears after the list):

  • Detailed visual data scrapes
  • Location
  • Altitude
  • Wearer’s heart rate
  • Biometric gamma signatures
  • Local temperature
  • Ambient eigenvector interaction
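
Purely to make the list above concrete, here is a minimal sketch (in Python) of what one of these telemetry packets might look like. Every field name, type and value below is my own guess for illustration; my source confirmed nothing about the actual format.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class OcularsPacket:
    """One hypothetical telemetry packet from the Google-Oculars.

    Every field name and type here is an illustrative guess based on the
    list above; nothing reflects a confirmed format.
    """
    visual_scrape: bytes                  # detailed visual data scrape (raw camera frame)
    location: Tuple[float, float]         # latitude, longitude
    altitude_m: float                     # altitude in metres
    heart_rate_bpm: int                   # wearer's heart rate
    gamma_signature: List[float]          # "biometric gamma signature" readings
    temperature_c: float                  # local temperature
    eigenvector_interaction: float        # ambient eigenvector interaction reading

# An example packet, built entirely from made-up values:
packet = OcularsPacket(
    visual_scrape=b"\x00" * 16,
    location=(51.5074, -0.1278),
    altitude_m=35.0,
    heart_rate_bpm=72,
    gamma_signature=[0.1, 0.4, 0.2],
    temperature_c=18.5,
    eigenvector_interaction=0.003,
)
print(packet.heart_rate_bpm)  # -> 72
```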

A complicated “data” processing algorithm is used to convert the light waves from the outside world into real computerised information that can be used to help users find things. Google has reportedly developed the ability to identify everyday objects in the real world based on its collection of almost 5,000 different pictures, which are accessible online at http://www.google.com/imghp.
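
To give a flavour of how matching a captured frame against a library of pictures could work in principle, here is a toy sketch. The PICTURE_LIBRARY, the crude feature vectors and the match_object helper are all invented for illustration and bear no relation to whatever Google actually does.

```python
import math
from typing import Dict, List

# A hypothetical picture library: object name -> a crude feature vector.
# Real image matching uses far richer features; this is only a toy.
PICTURE_LIBRARY: Dict[str, List[float]] = {
    "keys": [0.9, 0.1, 0.3],
    "phone": [0.2, 0.8, 0.5],
    "wedding ring": [0.7, 0.7, 0.9],
}

def match_object(frame_features: List[float]) -> str:
    """Return the library object whose feature vector is closest to the frame's."""
    def distance(a: List[float], b: List[float]) -> float:
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    return min(PICTURE_LIBRARY, key=lambda name: distance(PICTURE_LIBRARY[name], frame_features))

print(match_object([0.85, 0.15, 0.25]))  # -> keys
```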

The Google-Oculars combine the data captured using the camera-otimeter with the location data to establish where a user was when they last saw an item, and then use Google’s RelRank algorithm to measure the importance of the object to the wearer, based on their heart rate and the other biometric information that is collected constantly.
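
Here is a toy sketch of how an importance score along those lines might be computed. The name rel_rank_score, the weights and the inputs are all my own assumptions; the real RelRank algorithm was not described to me in any detail.

```python
def rel_rank_score(heart_rate_bpm: int, resting_rate_bpm: int = 60,
                   gaze_seconds: float = 0.0) -> float:
    """Toy 'RelRank'-style importance score for an observed object.

    The idea sketched in the article: objects seen while the wearer's heart
    rate is elevated (and, here, looked at for longer) are treated as more
    important. The exact weights are invented purely for illustration.
    """
    arousal = max(0, heart_rate_bpm - resting_rate_bpm) / resting_rate_bpm
    attention = min(gaze_seconds / 5.0, 1.0)   # cap attention at 5 seconds of gazing
    return 0.7 * arousal + 0.3 * attention

# A wedding ring glanced at during a tense moment scores higher
# than a coffee cup seen at rest:
print(rel_rank_score(heart_rate_bpm=95, gaze_seconds=3.0))  # ~0.59
print(rel_rank_score(heart_rate_bpm=62, gaze_seconds=0.5))  # ~0.05
```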

According to my source, each pair of Google-Oculars will come equipped with a new nano-Googler, which is fitted with a GPS-ometer and is less than 1% of the size of the regular Googler used for most of the search engine’s day-to-day information-getting tasks.

The nano-Googler that has been developed specially for these new devices benefits from vast advancements in technology instigated by Roger Moore:

  • It is less than 1% of the mass of a full-size Googler, making it easier to carry around inside the computer.
  • It is almost 83% transparent, meaning that most users won’t have their vision obstructed by it.
  • It is very flat, at just 0.8mm thin, meaning that it can slide easily through most data pipes.

By combining the technology of the new nano-Googler with the Google-Oculars and monitoring the various pieces of information in near real time, the device is able to track up to 17% of all the things that a wearer does.

So what’s all this incredible technology for?

As we all know, after wasps, the most irritating thing on the planet is losing important things like phones or wedding rings. The Google-Oculars have been developed to ensure that users will never lose anything again. By taking a photo every 3 microseconds, the device can track the instant when “you last had it”, and then feed that information back to the wearer when they need it.
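
A minimal sketch of the “you last had it” idea, assuming a simple last-seen index keyed by object name. The record_sighting and where_did_i_last_have helpers are hypothetical names invented for illustration, not anything my source described.

```python
from datetime import datetime
from typing import Dict, Tuple

# Hypothetical "last seen" index: object name -> (timestamp, location).
# Built up as the device recognises objects in its stream of photos.
last_seen: Dict[str, Tuple[datetime, Tuple[float, float]]] = {}

def record_sighting(obj: str, when: datetime, location: Tuple[float, float]) -> None:
    """Overwrite the previous sighting so only the most recent one is kept."""
    last_seen[obj] = (when, location)

def where_did_i_last_have(obj: str) -> str:
    """Answer the 'where did you last have it?' question from the index."""
    if obj not in last_seen:
        return f"No sighting of '{obj}' on record."
    when, (lat, lon) = last_seen[obj]
    return f"'{obj}' last seen at ({lat:.4f}, {lon:.4f}) on {when:%Y-%m-%d %H:%M}."

record_sighting("keys", datetime(2013, 3, 28, 8, 15), (51.5074, -0.1278))
print(where_did_i_last_have("keys"))
```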

According to tests conducted at the Googlingplex in America somewhere, a Google-Oculars wearer was able to locate a misplaced set of keys in just 7 minutes using the device, compared to 7 minutes and 17 seconds without: a saving of almost 4%.
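
For the curious, that saving checks out roughly as follows (a quick back-of-the-envelope calculation using the figures above):

```python
with_device = 7 * 60            # 7 minutes, in seconds
without_device = 7 * 60 + 17    # 7 minutes 17 seconds, in seconds

saving = (without_device - with_device) / without_device
print(f"Time saved: {saving:.1%}")  # -> Time saved: 3.9%
```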

As an added benefit, if the Google-Oculars are unable to find the object that the user has lost, they can provide useful advertisements for tangentially related services.

My unnamed source at Google informed me that the Google-Oculars will be available in time for people’s summer holidays, although a price is yet to be confirmed!

Some of the high-level SOE professionals at the conference talked about how excited they were at the prospect of optimising for the new device, and there was much discussion of using the Shatner Bassoon technique to correlate data sources and apply a transceptional algorithm across phase space to subvert the location-based technology and provide preferred results in near real time within a standard radius.

What’s certain is that this opens a new chapter in the SOE adventure that we’re all a part of!

Namaskara.