Getting to number one in Google is one of the hardest things in the world. Very few SEO professionals ever manage it for one keyword, let alone two, but it is possible, and the key to beating Google is to understand how the search engine works.
The Internet contains millions of pages, and the number grows every year. Until recently, every single page that went into Google had to be checked manually by Larry Page himself; since 2008, however, the search engine has used robots called Googlers to do the job.
Once a web page has been submitted or emailed to Google, it joins a queue to be processed by a Googler.
Each Googler can process more than a thousand web pages an hour. The web page follows a submission vector to the Googler, where it is categorised according to the semantic index structures held within each document space. The Googler uses more than ten indicators, including the meta keywords and the number of alt texts in the document, alongside other calculations such as keyword density.
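Of these indicators, keyword density is the one you can easily check for yourself. Here is a minimal sketch in Python; the function name and the sample text are my own illustration, not anything taken from Google:

```python
def keyword_density(text, keyword):
    """Share of words in the text that match the keyword (case-insensitive)."""
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# Illustrative page text: "widgets" appears 3 times out of 8 words.
page = "cheap widgets here buy cheap widgets today widgets"
print(keyword_density(page, "widgets"))  # 3 of 8 words
```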
Once Google has categorised the page into one of its three document storage areas, the relationship eigenvectors between the documents are calculated and added up to determine how many Page Ranks the web page should be awarded.
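The adding-up step can be sketched in miniature. Everything here, the page names, the link weights, and the idea of simply summing incoming relationship scores into a Page Rank total, is an invented illustration of the description above, not Google's real calculation:

```python
# Toy sketch: award Page Ranks by summing the incoming "relationship"
# scores for each page. Data and weights are entirely made up.
links = {
    "page_a": [],                                  # nothing links to page_a
    "page_b": [("page_a", 0.5)],                   # page_a -> page_b, weight 0.5
    "page_c": [("page_a", 0.5), ("page_b", 1.0)],  # two incoming links
}

def page_ranks(incoming):
    """Sum the weights of each page's incoming links."""
    return {page: sum(weight for _, weight in sources)
            for page, sources in incoming.items()}

print(page_ranks(links))  # page_c collects the most
```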
Once a web page has been copied into the Googler storage area, it can then be used to provide answers to people.
When a person searches for a keyword in Google, the Googler has a rummage around in the file store for all of the web pages that include that keyword, then performs some ranking sums to decide which should come first.
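The rummage-around step resembles a classic inverted index: a mapping from each keyword to the set of pages that contain it. A toy sketch, with made-up page names and contents:

```python
from collections import defaultdict

# Illustrative file store: URLs mapped to their page text.
pages = {
    "one.html": "buy cheap widgets",
    "two.html": "widgets review and widgets guide",
    "three.html": "gardening tips",
}

# Build the inverted index: keyword -> set of pages containing it.
index = defaultdict(set)
for url, text in pages.items():
    for word in text.split():
        index[word].add(url)

print(sorted(index["widgets"]))  # every page that mentions the keyword
```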
The main things that the Googler looks for are:
- Whether the meta ranking tag has been included for that keyword
- Whether the keyword is included in the Meta Keywords
- How close to 16.7% the keyword density is
- What the latent semantic eigenvectors for that page are
- Whether the alt texts are present
- How many Page Ranks the page has
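Taken together, the checklist above could be scored with something like the toy function below. Every field name and weight here is invented purely for illustration (and the latent semantic eigenvectors are left out entirely); nothing in it is Google's actual formula:

```python
def ranking_score(page):
    """Toy score combining the factors listed above (all weights invented)."""
    score = 0.0
    if page.get("meta_ranking_tag"):
        score += 1.0
    if page.get("keyword_in_meta_keywords"):
        score += 1.0
    # Reward keyword density for being close to the 16.7% figure.
    score += max(0.0, 1.0 - abs(page.get("keyword_density", 0.0) - 0.167) * 10)
    if page.get("alt_texts_present"):
        score += 0.5
    score += page.get("page_ranks", 0)
    return score

page = {"meta_ranking_tag": True, "keyword_density": 0.167,
        "alt_texts_present": True, "page_ranks": 2}
print(ranking_score(page))  # 1.0 + 1.0 + 0.5 + 2 = 4.5
```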
All of the ranking sums are completed in less than a minute, and the results are generated.
The process is the same for most searches; however, adult-type queries are handled slightly differently, because a separate Googler is used to prevent the delicate web-graph intelligence of the prime Googler robots from being corrupted.