
How Googlebot Ticks: Optimizing Your Website Content


To many, Google is an absolute wonder. They have the infrastructure and the manpower necessary to build a highly advanced algorithm that regularly scours the web, indexing and understanding the content it finds.

If you're a website owner, you know that Google comes by regularly to inspect your site's pages, judging and analyzing the quality of everything it finds. This is how Google determines search result relevance when visitors are looking for information on a certain term, so naturally, you want to make a great impression when it sees your content.

There are thousands of ways to improve the onsite optimization of your website, but in this article, I want to dig more into the logic and psychology behind Googlebot, a topic that isn't covered very often in the SEO world.

Let's dig right in.

Google knows which pages are seen the most

Google has a very good understanding of your website's pages and your site architecture. The pages in your navigation help users move throughout your site, while less visible areas of your site get clicked less often. Google understands this and pays close attention to user experience and browsing behavior.

Let's take a website owner with a blog, for example. He knows his blog is important for SEO, but he doesn't want it in his main navigation bar, as it isn't as important as his money pages. He decides to put it in his footer instead, as there will still be a link there, and he knows Google will still find it.

However, this also makes it far less likely that the bulk of his visitors will ever find his blog. As a result, his blog (a very important part of his site) will likely be considered less important than if it was a core part of his website architecture. Can you blame Google?

It's important to take user experience into consideration when optimizing your website.

Don't try to hide anything

Sometimes, there's a good reason for hiding content and code from search engines. More often than not, however, people do it to hide content that Google may not like to see, such as affiliate links or landing pages.

Certain pages and files, like JavaScript and CSS code, can easily be hidden from search engines by disallowing crawling access in your site's robots.txt file. However, Google has been known to crawl them anyway; after all, it needs to determine that the code and links aren't hurting user experience. Google is regularly sending its visitors to your website, and it needs to be able to trust it.
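For illustration, here is the kind of robots.txt rule people use to block crawlers from script and stylesheet directories (the directory names are hypothetical, and Google's own guidelines advise against blocking the CSS and JavaScript it needs to render your pages):

```text
# Hypothetical robots.txt — blocks all crawlers from CSS and JS directories.
# Google's guidelines recommend AGAINST this: Googlebot needs these files
# to render and evaluate your pages.
User-agent: *
Disallow: /css/
Disallow: /js/
```

A `Disallow` rule only asks compliant crawlers not to fetch those paths; it is a request, not an access control, which is part of why relying on it to hide content is a poor strategy.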

So why bother hiding your CSS and JavaScript at all? Some studies suggest it can actually hurt your rankings; the more you try to hide or obfuscate your content and code, the less trustworthy your website appears.

Thin sites look weak

Would you buy a book that had 6 pages? Probably not. Would you buy a book that had over 600 pages? More than likely.

Google is far less likely to hand out authority to "thin" sites, meaning sites with very little actual content or very few pages. These sites are very easy to create, and most spam websites are very thin (because little effort was put into creating them).

Having a rich, content-filled site shows Google that you care about your site and put effort into it, which can make a world of difference in your site's performance.

Google knows when your backlink isn't organic

Googlebot is looking for links that are naturally earned. A website owner linking to your site of their own goodwill is a sign that your content is high quality and worth sharing, and Google treats that as a vote of confidence.

However, this has been heavily abused over the years, and as such, Google is eerily efficient at determining if a link was organic or not.

Put yourself in a webmaster's shoes when building links. Would you really link to someone's homepage using the exact keyword they're trying to rank for as the anchor text? That's generous. A little too generous. In fact, it almost never happens naturally in the real world.

Link to pages of high value, and do so in a way that actually adds value to your content. Otherwise, Google is going to spot it from a mile away.

Final thoughts

Google's hardest job is maintaining search relevance, and to do that, it needs high-quality, trustworthy websites packed with solid content that helps visitors find what they're looking for.

This is the reason Wikipedia is on the front page of Google for nearly every search term; it's trustworthy, regularly updated, full of great content, and linked to from millions of authoritative websites.

Create a site that is easy to navigate, maintain a link profile of quality, natural links, and avoid doing anything deceptive with your content and links, and you'll be on the right path toward improving your website's presence.
