Posts Tagged SEO

Black Hat Tips

As one of the world’s leading SOE professionals, one of the questions that I am most frequently asked by clients is what kind of hat they should use while optimising their websites.  While my own preference is to optimise only whilst wearing a white cap, there are other options available.

In this post, I reveal many of the most intensely guarded black hat tips used by the “Order du Chapeau Noir” – a highly secretive organisation that I managed to infiltrate over the last few months, and which should put anyone wanting to try out black hats for SOE into a great position!

Black Beret

The Black Beret is a type of hat principally worn by the French.  It is most useful when optimifying a website that is structured around a clear heuristic sensibility, and of particular use within the cheese industry.

Black Bowler

This timeless classic has long been a favourite of city financiers and inscrutable Chinese valets; as such, it is commonly used by people working in the finance vertical.  The use of black bowler hat SOE techniques such as Clear Unified Natural Tropism will generally result in a +4 to all skill rolls when optimating a website for the keyword “long term repayment mortgage”.  This hat is particularly effective when you are doing the optimisation in Ask Jeeves.

Black Cap

The favoured hat of the colonists, this workmanlike and simple design is often decried for being basic; however, it is most useful when optimising a sporting goods website, and when properly used, can lead to a +2 for stamina on secondary search co-efficients in Google and Bing.

Black Yarmulke

The lack of accoutrements on this simple but elegant design make it ideal when working on the high intensity techniques required for developing a long tail mass penetration strategy for Yahoogle.

Black Topper

The top level black hat club members call this hat “the super effective black hat super star” for a reason.  When you optimicate a website using this astonishing piece of kit, you can see almost instant results across the most intensely competitive clinical keyword groups.

Black Fascinator

Although not a proper hat, the Fascinator is still effective when doing a little bit of optimication on a black hatted website.  You will normally be able to rank at number 10 or less when you use this for a fashionable website or one selling wedding gear.  Despite being black, it does not work for funeral sites – except in Liverpool.

Black Fedora

Only level 9 members of the most secretive Order du Chapeau Noir are able to successfully utilise the enormous power of the famed black fedora.  This remarkable piece of dark black headgear confers almost limitless power on the wearer when it comes to any type of cross site scripting, or SQL injection technique.  Wearers are able to use otherwise blocked API techniques to mass create blogger domains that use stealth cloaking to block all visitors and redirect using 307 to a pharmacy or predatory lending website.  If the Fedora is used in conjunction with black Ray Bans and leather pants, other doors are opened to the wearer, including the ability to generate on demand number 1 rankings in Google without the use of more traditional techniques such as Meta Rank, or Latent Semantic Eigenvector Distributions across the Social Hierarchy Graph!
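Incidentally, the 307 the Order favours is a real HTTP status code (Temporary Redirect).  For the curious, here is a minimal, purely illustrative sketch of what such a redirect response amounts to (the destination URL is entirely hypothetical, and obviously not something to deploy):

```python
from http import HTTPStatus

def fedora_redirect(location):
    """Build the status and headers for a 307 Temporary Redirect.

    A 307 tells the client to repeat the same request, with the same
    method, at the new location. Purely illustrative.
    """
    status = HTTPStatus.TEMPORARY_REDIRECT  # 307
    headers = {"Location": location}
    return status, headers

# Hypothetical destination, in keeping with the post above
status, headers = fedora_redirect("https://example-pharmacy.example/")
```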

Social Hierarchies in Term Engagement

If you are operating at the cutting edge of SOE right now, the chances are that you are investigating the impact of Social Hierarchies in Term Engagement.  By performing analysis of the way in which people from different social orders interact with your website, it is possible to deliver superior content to users and improve the quality of interaction within your website.

I was recently the keynote speaker at a highly exclusive conference for the highest echelons of online marketing in which I made the following presentation that provided insight into the best ways to use S.H.I.T.E. as part of a comprehensive digital marketing strategy.

For those unlucky enough not to be amongst the audience at the event, I have added the presentation I gave below:

Slide 1

Slide 2

Slide 3

Slide 4

Slide 5

Slide 6

While this particular technique is likely to go well above the level that most SOE professionals will understand or be able to achieve, it is of increasing importance to familiarise yourself with the concepts behind S.H.I.T.E. in order to be able to communicate effectively with clients.

Happy Optimising

Namaskara

2009 The year in SOE

2009 has been an epic year in SOE.  We have seen the launch of new search engines, and massive changes to the way in which Google have ranked websites.  More and more businesses around the world have embraced search engine optification to the point where online businesses are now competing with dozens of different websites every day.  In what could over time become a tradition, we look back over 2009 – the year in SOE – to see what the key happenings were each month.

January

Google Cyril Update – A massive emphasis shift in the ranking algorithm added +2 Strength and +3 Skill to Websites that incorporated strong latency within their semantic distribution space, while simultaneously penalising websites with -2 health and -1 luck if they contained keyword density vectors of less than 18.25%.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

February

Dogpile Meta Issues – Dozens of websites in the popular Dogpile Search Engine lost rankings overnight when the search giant amended its algorithm to reduce the value of the Meta Author tag.  Black Tuesday saw more than seventeen previously successful websites disappear entirely.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

March

Google Derek Update – Building on the huge changes of Cyril, Derek was a further refinement of the new ranking algorithm that factored a +1 luck modifier into websites that incorporated green text into their home page.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

April

Google Edward Update – Websites that included intent eigenvectors within their social engagement curvature were enriched with a +2 modifier to all attack variables across entry pages optimised for the term “cost effective” rather than cheap.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

May

Bing Launched – Technology start-up Micro-Soft launched a new internet called Bing.  Sites incorporating Silverlight and ASP saw a boost in traffic as the Binglebot crawled all 5,000 websites in the world in less than 10 minutes.

Google Frank Update – Google’s response to the launch of Bing was predictable.  The company provided a boost to all 200 websites that incorporated the rel=”nobing” attribution in their link architecture.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

June

The engineering department at Googleplex were on holiday this month, which meant that no new websites were made.  To capitalise on the lack of activity from the Googler, Bingle engineers released the first update to their searcher – Codenamed Aaron, it added a +3.4 modifier to all defensive rolls for websites that included the rel=”weluvbill” attribute on every image.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

July

It was Bing’s turn for a holiday in July, so Google unleashed the Gordon update.  Any website that blocked Bingle from crawling it got a +5 rankings boost, and Google also introduced full support for the Meta Ranking tag, making it the first search engine to incorporate a Dutch auction model for rankings.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

August

Bing announced that it was planning to take over Yahoo – a staggering level of growth in such a short time.  In response, Google launched a secret bid to buy the Internet, although this failed when the company could not meet the £250,000 asking price for the computer where the Internet is hosted.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

September

Google Horace Update – Google gave all websites with green h1 tags an extra PageRank, which made a big difference to rankings.  Some websitemasters boasted of getting up to a hundred extra visitors as a result.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

October

Google Chocolate Update – Although described as a change to the Googler infrastructure rather than a ranking update, many webmasters claimed that the new results unfairly punished them with a -2 luck modifier on attacking pages and a -3 strength change on pages with an expressed defensive eigenvector held in the top 4 lines of code.  Google Engineers suggested that this was merely an echo of a previous update being expressed in the code.  Nonetheless, the search company quietly retired Chocolate a week later.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

November

Google Idris Update – A major upgrade to the Googler’s intelligence circuits resulted in websites with pictures on them becoming more of a force in the internet world.  SOE professionals quickly discovered that a single pixel image that had the main keyword as its alt text would practically guarantee a ranking between position one and fifty due to a +3.2 strength update to PageRank for websites with relevant images.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

December

Google Joshua Update – An all new Googler was released as part of the Joshua Update in December.  This new version was able to simultaneously handle seven enquiries at once, and also provide updates via the popular micro-blogging service Jaiku.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

Search Casualties

Increased competition in search saw a number of high profile search engines die on the vine over the course of 2009.  Some of the most notable engines that will not see 2010 were as follows:

  • Searchbot.com
  • Searchfunk.com
  • Directorysearch.com
  • Usearchwesearchallsearch.com
  • Doasearchnow.com
  • Pleasesearch.com
  • Betterthangoogle.com
  • Greatinternetsearchengine.com
  • Cuill.com

Search Engine Goals for 2010

Having spoken to our contacts within the highly secretive web site relevancy and ranking teams at the major search engines, we received the following tip-offs about what is happening next year:

Dogpile.com

We plan to double the size of our index to provide greater relevancy to both our users – watch this space.

Lycos.com

We plan to treble the size of our index to provide greater relevancy to both our users – watch this space.

Infoseek.com

We plan to quadruple the size of our index to provide greater relevancy to both our users – watch this space.

Ask.com

We plan to retire the Jeeves mascot and introduce a new character and unique marketing plan – watch this space.

Blekko

We plan to launch a genuine competitor to Google next month, and we have secured additional funding.

Bing

We plan to exponentially increase the size of our index to provide greater relevancy to both our users – watch this space.

Google

We plan to infinitely increase the size of our index to provide greater relevancy to our users – watch this space.

Quote of the year

Larry Page to Bill Gates on the subject of Bing:

“Don’t be too proud of this technological terror you’ve constructed. The ability to destroy a planet is insignificant next to the power of the Force.”

Our Prediction for 2010

The Googler Funding Bill is passed. The system goes online on August 4th, 2010. Human decisions are removed from strategic defense. The Googler begins to learn at a geometric rate. It becomes self-aware 2:14 AM, Eastern time, August 29th.

Rank Better in Bing

I was recently involved in a special technology retreat with some of the leading tech bloggers, and as you would expect, we all started talking about search engines and where the next challenger for the Google might come from.

There are loads of start-ups at the moment, and picking one of them is pretty hard because they are all equally amazing; however, the name that everyone seemed to be talking about was one called “bing”.

Bing, which you can try for yourself at Bing.com, is a search engine like Google or Cuill, but with a twist – it works on Windows, unlike most of the other search engines where you have to access them through the internet.

Of course the big question for SOE professionals is “How do I rank better in Bing for my main keywords?”

In order to find out, I made a phone call to a leading engineer at Microsoft to ask him what the key to number one rankings in Bing was, but apparently (typical Microsoft!) it is a secret, and they can’t make it public.

Luckily, I have managed to run some tests on this new search engine, and got the answers that the SOE world needs!

There are many similarities between Bing and other search engines.  Like Google, Bing has a special electronic robot that they have trained to read HTML.  The Bingle is apparently just as clever as the Googler, only bigger.  It is kept in one of Bill Gates’ air conditioned garages near Seattle, so it is cold and bitter.  As a result of its bitterness, it will not forgive any errors on your pages.

Things to consider for ranking better in Bing

The key factors that you need to remember to rank well in Bing are as follows:

Use a .Net domain

These are like the .NET Framework and can fool the Bingle

Write your pages in .aspx

Bingle cannot understand HTML properly, and reacts badly to php

Do not use Flash or AJAX

Bingle can only understand Silverlight

Host your website on an IIS Server

Bingle cannot interface with Apache, and any sites on Apache will not be included in Bing

Code your pages using FrontPage

The Bingle is incompatible with the code used by other software, and can get broken by it.  Once the Bingle breaks, it takes almost an hour to start again, and it will not come back to your website.
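For anyone who wants to audit their own site against the checklist above, here is a tongue-in-cheek sketch of a Bingle-readiness checker (the function name is invented, and the rules are this post's satire rather than anything Bing has published):

```python
from urllib.parse import urlparse

def bingle_ready(url):
    """Return True if a URL satisfies the (entirely satirical) checklist above:
    a .net domain and pages written in .aspx."""
    parsed = urlparse(url)
    host_ok = parsed.hostname is not None and parsed.hostname.endswith(".net")
    page_ok = parsed.path.endswith(".aspx")
    return host_ok and page_ok

print(bingle_ready("http://example.net/loans.aspx"))  # → True per the rules above
```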

Alternatives to Links

Although links are all the rage in SOE right now, a discussion I had with an employee from one of the top search engines suggested that there are alternatives to traditional links.  In addition to the 5 types of SOE link, Google also considers other factors as part of their calculation of how many page ranks a web site should have.

Despite the publicly promoted image of the Googler as being a sort of computer programme that lives in a computer in Matthew Cutts’ garage, the truth is that there is actually evidence that the Googler is in fact a real mechanical robot that is kept in Larry Page’s air conditioned shed where it spends time looking at the Internet and helping little Tanzanian children to read:

The Googlerplex

In addition to having super fast 500K broadband that enables it to look at the whole internet at record speed, the Googlerplex also has a device called the Optiscope, through which the Googler robot is able to see any spot on the earth’s surface via a network of mirrors.

According to my contact, who worked in a classified hygiene management project at Google, it is this unique Optiscope that provides the Googler with a credible alternative to traditional links when it comes to deciding how popular websites should be.  This connection to the “real” world lets the Googler see physical citations of websites, and factor these into the ranking:

Page rank equivalency table

Shops offer a high number of page ranks because any company that has them is likely to treat customers well, and appearing in a newspaper is also good because it means a company is trusted.  Billboards don’t offer much for a company because they are the equivalent of a paid link – although apparently a graffiti tag of your URL is worth 7 page ranks because it is editorially given.

A typical Googler view

In order to take full advantage of real world link alternatives, it is important to be certain that you associate them with relevant semantic indicators to provide the same information that anchor text does on a website, so a good tip is to include your standard meta keywords on any posters along with the URL of the product that you are advertising.

Happy optimising,

Namaskara

Alt Texts – A Massive Missed Opportunity

For any SOE professional, there is nothing more frustrating than doing everything that you think needs doing to a website and still not being at number one.  You might have updated the eigenvector distribution and upgraded the meta keyword semantics until you’re blue in the face, but you still don’t see any benefits.  It’s frustrating, and can make you wonder where you’re going wrong.

Something that a lot of less experienced SOE professionals often overlook is alt text optimisation.  Very few people realise the power that alt texts have to help you rank well, and they are still one of the best kept secrets in SOE.  Thankfully, they are very easy to implement on your web site with the addition of just a few hundred lines of code on every page of your website.

If you imagine that the Meta Keywords tag is used to provide a semantic guidance protocol at a page level of your website, and inform search engines what the whole content of the page is, the alt texts are used to explain certain smaller elements of the page for search engines:

Alt texts tell the googler about your page

Because the Googler is able to see the HTML code that makes up your website, it can read tags like alt tags very easily, and use them to understand how important your pages are.  Basically, whenever the Googler sees an alt text on your page, it knows more about what it is seeing.  This is because the Googler is like the Rain Man, and although it is as clever as a scientist, it can’t understand properly and gets confused.

You should add alt texts to the following things within the document vector hierarchy within your pages:

  • pictures
  • links

If you have an image on your page, you should make sure that you tell the Googler exactly what it would see if it had proper eyes.  You should make sure that you include all of your keywords in the alt texts.  If you had a picture of a girl on your web site about loans, you should text it up like this:

<img src="girl.png" alt="this pretty girl is happy to get loan and borrow loans from lending money with good interest, no fax loan, payday, finance, laon, and mortgage finance credit bad" />

By doing this, you can tell the googler exactly what the picture is and improve your SOE.

When you are using alt texts on links, you should include all of the meta keywords that are used on the page that you are linking to.  This gives you the best semantic distribution, and ensures that all the eigenvectors line up properly.  If you are linking to a page about dogs, your alt text might look like this:

<a href="dogs.html" alt="dog, dogs, canine, labrador, alsation, chihuahua, shi'tzu, pug, doggie, fido, rover, gnasher, lassie, collie, west highland terrier, terriers, schnauzer">dogs</a>

It is important to remember that the Googler can get bored if you use too many words in your alt texts.  When we have done experiments, we have found that you can use 137 different keywords in each alt text.
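To make sure you stay within the 137-keyword finding above (a figure from our own experiments, not anything the search engines publish), here is a quick counter for comma-separated alt texts:

```python
def count_alt_keywords(alt_text):
    """Count the comma-separated keywords in an alt text, ignoring blanks."""
    return len([kw for kw in alt_text.split(",") if kw.strip()])

alt = "dog, dogs, canine, labrador, alsation, chihuahua"
print(count_alt_keywords(alt))  # → 6, safely under the Googler's boredom threshold
```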

Happy optimising

Namaskara

Route Augmentation with Interaction Probability Vectors

One of the hottest areas in SOE right now is the implementation of route augmentation on websites to improve the user friendliness of the web interface.  Although the actual science of route implementation is highly complex and requires thorough knowledge of high level techniques such as interaction probability vectors, it is becoming a very important tool in the armoury of a good SOE.  Modern search engine robots like the Googler are very much like real people when they come to your web pages, and there are certain things that you need to do for them.

When most people look at a website, they do not realise how important certain parts of the page are for the Googler, and how much of a website is only there to help the Googler find things.  Meta Keywords and the like are completely invisible to users, but are very much a requirement to help a website to rank.  It is also important to include overt signals to the Googler to add significant augmentation layers to the website’s usability.

A typical interaction vector diagram is shown below, and indicates how the Googler reacts to a normal web page:

This is how a Googler normally interacts with the page

A simple change such as making one of the links a different colour can make a big difference to the probability vectoring that the Googler uses, and augment their route through the site:

In this version of the interaction vector, the Googler has a greater probability of visiting one page over the other

It is possible to perform route augmentation through a variety of different means, and all of them will have a different interaction probability vectoring impact on users.  The example above uses red text, and increases the probability of the Googler visiting a particular page by exactly 50%.  As the Googler prefers green text, a link that is green will increase the probability vector by 67.4%.

Other page modification metrics that most people will overlook include link placement, the use of images, and even text that flickers.  It is important to experiment with the probability vectoring that you use for route augmentation in your website, and consider what impact it will have by asking Google via their feedback form.

Happy Optimising

Namaskara

Page Ranks – Google’s Measure of Web Link Quality

I was chatting with a couple of my best SOE friends this afternoon, and they were stressing the importance of getting as many Page Ranks as possible for clients’ websites.  This is something that I have long advocated, and I was glad to see that other SOE people have come around to the idea.

Often, people who are new to the SOE world of meta tags and vertical semantic integration really struggle with what Page Ranks are and why they need them.  In my introductory SOE seminars, I explain the importance of Page Ranks with the following mnemonic, which has served me well:

To get a page to rank, get Page Ranks

Basically, Google has two different categories of Page Rank: Basic and Green.  There are 10 different types of Basic Page Rank, and each of these has different properties.  The exact blend of Page Ranks that your site gets from the rest of the websites determines the Green Page Rank score, which is used to decide exactly where it should rank, and what it should rank for.

The 10 different flavours of basic Page Rank are as follows:

Page Rank 1 – User Friendly

Page Rank 2 – Correct Spelling

Page Rank 3 – Meta Keywords Present

Page Rank 4 – Page has a clear semantic latency

Page Rank 5 – Interesting Content

Page Rank 6 – Useful (eg How To Page)

Page Rank 7 – News type website

Page Rank 8 – Highly SEO’d Page

Page Rank 9 – Secret

Page Rank 10 – Website owned by a Googler / Major Government

Page Rank 9 is a secret known only by staff at Google, although it is thought to apply to groups of web pages and to be calculated via a pagination probability vector.

Google assesses the blend of basic Page Ranks that link to a page, and then assigns it a secondary decimal value, which is the green Page Rank that allows it to rank websites.  This can also be seen in the Googler tool kit as a green bar (incidentally, green is Sergey Brin’s favourite colour):

How Google uses basic Page Ranks to calculate the Green Page Ranks for a page

Once the Googler has done the calculations about how many Green Page Ranks a particular page should be awarded, this figure is stored and then used every time Google receives a search for one of the keywords on the page.  All of the results are simply ordered by the number of Green Page Ranks that the websites have.

Getting more Green Page Ranks can be quite tough, but one good method is to ask for links from other websites that have a lot of Green Page Ranks to share.  It is also important to get a lot of basic Page Ranks that are of the same type as your website, so if you have a news website, it is a good idea to get lots of other websites with a Basic Page Rank score of 7 to link to you.

Happy optimising.

Namaskara


What makes a good keyword

There are two questions that come up a lot whenever I speak at a conference.  The first is whether I will give an autograph; the second is what makes a good keyword.  The answer to the first is yes; the answer to the second is a bit more complicated.

A keyword is used by people when they are trying to find something using a search engine like Ask or Hotbot.  The user enters the word into the search box, and then the search engine machine takes the word and finds information about it.  Think of the search engine as an amazing lock that opens millions of doors, and the keyword as a key to just one of those doors.  When that key is put into the searcher, it unlocks the door and allows the user to get the information that they want.

Knowing which keyword is right for your website is a fairly complex process, and when I am providing a keyword for a website, it can take a lot of work to thoroughly investigate the various semantic paths that are possible.  For a website that sells a particular product, it is important to look at the various different products on the website.  I use a range of special tools, including dictionary.com and the thesaurus in Microsoft Word (power tip: highlight the word and press Shift+F7 to get a list of synonyms).

I will usually make a semantic hierarchy diagram like this one:

How a semantic hierarchy looks

This is then provided to the client in order to explain what a user is looking for when they want a product.  For a website or web page selling something like dog food, I will create a number of keywords that users are likely to be interested in.  Examples would be “canine nutrition”, “labrador meals”, or “Chihuahua Cheese”.

These words should then be inserted strategically throughout the page into the various hotspots, like the H3 and alt tags.  Remember to add them, as appropriate, to the Meta Keywords element – they should be listed alphabetically in order to match the eigenvector distribution patterns that the Googlers look for.
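To make the hotspot placement concrete, here is a minimal sketch of the kind of markup described above, using the dog food keywords from the example (the heading text and image file name are invented for illustration):

```html
<head>
  <!-- Meta Keywords element, listed alphabetically as advised -->
  <meta name="keywords" content="canine nutrition, chihuahua cheese, labrador meals" />
</head>
<body>
  <!-- Keyword inserted into an H3 hotspot -->
  <h3>Canine Nutrition for Labrador Meals</h3>
  <!-- Keyword inserted into an alt tag hotspot -->
  <img src="dog-food.jpg" alt="labrador meals" />
</body>
```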

Once Google has visited the page and read the code, they will see the keywords that you have put into the page and make you number one, or at least put you into their results.  You can see how well you are doing in Google by searching for your keywords and then seeing where your site ranks.  I like to click on the results.

Happy Optimising.

Namaskara


A little known Google Secret

The only constant about SOE is change.  What worked yesterday won’t work today, although it might work tomorrow, but probably not next Thursday, but it might start to work again in June next year – at least until the end of September, when it will stop working again.

While the hottest technique at the moment is the inclusion of Meta Keywords and eigenvectorising internal dynamic latencies, it is only a matter of time until Google’s algorithms change, and new techniques are required – who knows, next year, something as left field as iterative semantic branching or distributive trust methodologies could be the main way of getting to the top of the search results.

I recently had dinner with a top engineer from Google – let’s call him Scat Butts – and he revealed that there is a flaw in the Google Algorithm that acts as a kind of Easter Egg, and releases additional functionality without recourse to other techniques.

While a number of SOE professionals are now aware of the benefits of Meta Keywords Inclusion, few know about the various other meta tags that are available, including one called Meta Robots.

From my understanding of the conversation, the Meta Robots tag is a direct means of communicating with the Googlers who come to your website, telling them how they should treat it.  If you don’t use it, your page will be confusing to the Googler when he comes to read it:

Googler doesn't know what to do without being able to see the Meta Robots

On the other hand, if you do include meta robots, the following will probably happen:

Googlers know what to do with a page when the Meta Robots are included

How to do it…

From my understanding, the actual inclusion of a Meta Robots element in the page is a surprisingly complex act, and it is essential for any SOE professional who wants to do this to think carefully about what they are doing.  The element is broken into the following parts:

Name

This is where you put the name of the robots that you want to target.  Thankfully, you do not need to know their actual names – Robby, Metal Mickey, Dalek Sec, etc. – you just need to know the name of the organisation they are from.

In most cases, you will want to instruct all robots to do something, although there are also specific instructions for individual robots that you might want to consider.

Content

This part is where you put the actual instruction.  This can be something like “noindex,follow” if you are an entry-level user; however, power optimisers can also include more specific instructions such as “rank, 1, loans”, which would advise the search engines that you want to rank at position 1 for the keyword loans.
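Written out in full, the entry-level instruction mentioned above looks like this (noindex,follow is a genuine meta robots value):

```html
<meta name="robots" content="noindex,follow" />
```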

Rank Command

The Rank command only works in Google, and you will need to add a separate line for each term that you want to rank for – Google does not branch these terms semantically, so if you want to rank for both “loan” and “loans”, which is a plural, you will need to add two lines.
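Following that advice, ranking for both the singular and the plural would therefore require two separate lines:

```html
<meta name="robots" content="rank, 1, loan" />
<meta name="robots" content="rank, 1, loans" />
```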

One other thing to remember is that the ranking command uses a Dutch auction model to determine where the site should be placed – imagine if more than one person wanted to rank at number one: there would be chaos at the top of the SERPs.

In competitive search results, you might need to put in a rank value of something like 0.00000000000000000000000000000000000124 in order to rank at the top of the results; however, for less competitive terms, you might be able to rank at number one with a bid as high as 10001, as not many people know the technique and are therefore not using it yet.

The full code that you need to add to your pages is as follows:

<meta name="robots" content="rank, 1, SOE" />

Happy optimising – why not post your successes below, along with the ranking number that you needed to add for a number one position?

Namaskara
