SEO 101 by Clarity: Crawlability or why my website does not appear on Google – PART 2

Last week we looked into crawlability and why it is important. We also discussed that whenever a page, or the whole website, does not appear in Google's search results, even when we search for the brand's name, the first and most important thing to do is to check for crawlability issues.

CLICK HERE to see Part 1.

So, what does that mean, and where should you look?

  1. Check the robots.txt file. This file is the place where Google's bots look for instructions. Sometimes it is necessary to block part of the website from crawling, for example, because that part is still under construction and you don't want it to show up on Google just yet. The respective instruction can be placed in the robots.txt file, so when the crawler comes to the website, it checks robots.txt, sees the instruction, and skips the "under construction" part as intended.

Now, you can already imagine what will happen if there is an outdated instruction in that file or, God forbid, a line restricting crawlers from your website as a whole, right?!

So, here’s what you need to do:

- open your server files;

- open your domain root folder;

- find robots.txt file (if it exists);

- check it for any "Disallow" directives and, if necessary, correct them (see the example below).
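
To give you an idea of what to look for, here is a minimal, hypothetical robots.txt (the "/under-construction/" path is just an illustration). The first version blocks only one section of the site:

User-agent: *
Disallow: /under-construction/

while this second, dangerous version blocks the entire website from all crawlers:

User-agent: *
Disallow: /

You can also view the live file in any browser by going to yourdomain.com/robots.txt.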

Make sure you fully understand what you're doing before editing the file, as you can literally block Google from your website by mistake. If you're not sure, ask for professional help.

  2. Further to the robots.txt, there is one more way to give instructions to Google's bots, this time at the single-page level. It is possible to add a special meta tag to the page – the "noindex" tag. In this case, Google's bots will still crawl the page, but Google won't add it to its index, so the page will never appear in the search results.

That’s why it’s so important to check that the pages that are supposed to appear on Google don’t have that tag.
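
For reference, the tag usually sits in the page's <head> section and looks like this (the exact attributes can vary, but this is the standard form):

<meta name="robots" content="noindex">

A variant addressed only to Google's crawler, <meta name="googlebot" content="noindex">, has the same effect on Google's results.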

So, here’s what you need to do:

- Open the page in question in your browser;

- Right-click and choose "View page source" (or press Ctrl+U in most desktop browsers);

- Once the code opens, press Ctrl+F and search for "noindex";

- If nothing is found, everything is OK. But if you find a noindex tag on a page that is supposed to be indexed, ask your IT team to remove it.

Most likely, after checking these two points, you will find and solve the problem, and after some time, when Google re-crawls your website, the pages will be added to its index and start appearing in search results.

If the problem remains, seek professional help to investigate the issue in more detail.

For more information, bespoke strategies and efficient digital marketing solutions, just contact the Clarity girls at info@yourdigitalclarity.com or visit our website at www.yourdigitalclarity.com.

 
