SEO: See a website as Googlebot does

Hi everybody, and again sorry for my poor English, but I hope the following will be easy to understand! 🙂

Every SEO expert (… and not only « search engine optimization » guys) really has to care about how Googlebot sees web pages. To many people, seeing a web document like Googlebot only means looking at the page in its plain-text version, without images, CSS and JavaScript… Yes, that is a good way to get an idea of how a search engine spider sees your content, but are you 100 % sure Googlebot always sees it that way?

In many cases Googlebot sees totally different content. Unfortunately, disabling cookies and JavaScript and switching your user agent is not enough to browse as Google's spider; you really need a reliable way to make sure you see the content exactly as Googlebot sees it.
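
A quick first check you can script yourself is to request the same URL with a normal browser user agent and with Googlebot's user agent, without cookies, and compare the answers. Below is a minimal sketch using Python and the requests library (my choice, not something from this post); the URL is a placeholder, and keep in mind that smarter cloaking scripts also verify the visitor's IP with a reverse DNS lookup, so identical answers here prove nothing.

```python
# Minimal sketch (assumes the third-party `requests` library is installed).
# It fetches the same URL with a normal browser user agent and with
# Googlebot's user agent, without cookies and without following redirects,
# then prints the status code, any Location header and the body length so
# obvious differences stand out.
import requests

URL = "http://example.com/"  # placeholder: the page you want to check

USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
    "googlebot": ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                  "+http://www.google.com/bot.html)"),
}

for name, ua in USER_AGENTS.items():
    resp = requests.get(
        URL,
        headers={"User-Agent": ua},
        allow_redirects=False,  # keep any raw 301/302 visible
        timeout=10,
    )
    print(name, resp.status_code,
          resp.headers.get("Location", "-"), len(resp.content))
```

If the two requests come back with different status codes, a Location header or very different body sizes, the site is clearly treating crawlers differently.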

Why is it so important to look at a website as Googlebot?

Nowadays on Google results pages, your competitors are becoming more and more sneaky. As ranking higher gets harder, some of them, instead of trying to rank better, use a new tactic: making your website rank lower. Negative SEO really exists and it can hurt your search engine rankings.

One of the most common negative SEO issues webmasters face is getting tons of backlinks from cloaked sites. You may think it is easy to protect yourself by monitoring new incoming links in your Majestic SEO, Open Site Explorer or Ahrefs reports, but there are several ways to hide these new links from those tools.

This tactic is just one example of what these bad SEOs can do. They cloak a website they own so that it shows at least three different versions: one special version for Googlebot, one version for link analysis tools (Majestic SEO, OSE, Ahrefs and many others), and one version for human visitors. In a second step they create tons of spammy backlinks (using your targeted keywords) pointing to their cloaked website. Third step: once their website has been dropped from the search results (because of the Google bombing), they set a 301 redirect to YOUR website. Just a few days later you can start to see your website dropping in the search results without understanding where the smoking gun is: while Googlebot sees a 301 redirect, human users just see a non-suspicious website.
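
To make the mechanics concrete, here is a purely illustrative sketch, in Python, of the kind of user-agent branching such a cloaking script relies on. The victim domain is hypothetical; the bot user-agent substrings (Googlebot, Majestic's MJ12bot, AhrefsBot, Moz's rogerbot) are the ones those crawlers really send.

```python
# Purely illustrative: the user-agent branching behind the three versions
# described above. The victim domain is a placeholder.
def cloaked_response(user_agent):
    ua = user_agent.lower()
    if "googlebot" in ua:
        # Version 1: Googlebot only. Once the attacker's domain is burned,
        # this becomes a 301 pointing at the victim's website.
        return 301, {"Location": "http://victim-example.com/"}
    if any(bot in ua for bot in ("mj12bot", "ahrefsbot", "rogerbot")):
        # Version 2: backlink-analysis crawlers get a normal page with no
        # redirect, so the attacker's domain never shows up in the victim's
        # Majestic / OSE / Ahrefs reports.
        return 200, "Clean page, nothing pointing at the victim"
    # Version 3: human visitors see a harmless, ordinary website.
    return 200, "Normal page for human visitors"

print(cloaked_response(
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"))
```

Since the decision is usually made on the user agent and/or the visitor's IP, this also explains why simply switching your browser's user agent is not always enough to see what Googlebot sees.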

How to prevent this negative SEO tactic?

The following can help most of the time, although it sometimes depends on the cloaking script. The first thing to do is to regularly check whether a Google bombing is happening around your keywords. Majestic SEO offers a very nice solution with its « Search Explorer » feature, which makes you aware if a Google bombing is going on. If it is, you have to check the domain that is suddenly getting these tons of links… but once you visit that website, you need to look at it as Googlebot. Changing your user agent and disabling cookies and JavaScript is not enough; the best way to make the script believe you are Google is to use a Google tool.

The first idea that came to my mind is the « fetch as Googlebot » feature in Google Webmaster Tools. Unfortunately this feature only lets you fetch pages on websites that you own… so how can you use it to see a third-party website? Very simple: just create a page on a website you own, redirect this page to the domain you want to check, then go to your GWT account and submit the URL of that redirecting page to the fetch as Googlebot feature… Magic!
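
Just as an example of such a throw-away redirect page, here is a minimal sketch written as a tiny Flask app. That is only one possible way to do it (an assumption on my side): any 301/302 you can set up on a domain you own and have verified in GWT works just as well. The path name and the target domain are placeholders.

```python
# Minimal sketch (assumes Flask is installed and this runs on a domain you
# own and have verified in Google Webmaster Tools).
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/gbot-check")
def gbot_check():
    # Temporary redirect to the suspicious domain you want to fetch as Google.
    return redirect("http://suspicious-example.com/", code=302)

if __name__ == "__main__":
    app.run()
```

You would then submit the URL of that page (here, http://your-own-domain.com/gbot-check) to the fetch as Googlebot feature, exactly as described above.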

If you discover that the malicious redirect points to your website, the next step is to alert the search engines with a spam report. The webspam team receives tons of reports every day, but the people there are more receptive when it is about cloaking. So, in the description field, make sure to give the most important details so that they can move fast… and if you do everything the right way, they surely will. Hope it helps!
