SEO 101: What is crawlability or why my website does not appear on Google

So, your website doesn’t appear on Google? Even when you search for your brand name, there are no results? Or maybe the problem is that your newly made page, the one you spent long hours preparing, is nowhere to be found… Feeling desperate, right? OK, even though this is rather scary, take a deep breath and don’t panic. The problem can most likely be solved with a quick fix. However, in order to succeed, you first need to understand what crawlability is.

Google has robots (crawlers) that, without ever resting, follow links on the internet 24/7, going from page to page, discovering new websites and pages, and checking existing ones for changes. This process is called crawling.
Crawlability, then, is the Google bots’ ability to follow a link, enter a page and ‘see’ it. If Google’s crawlers are blocked from your website or page, they cannot know it exists or discover its content; Google won’t add the website or page to its index, and therefore it won’t appear on the search results page. If you notice that some of your pages aren’t showing up in the search results, your first reaction should be to check the website’s crawlability.

Now, this is when things get a little technical, so, if possible, ask your marketing agency or IT team for help checking your website for crawlability issues. But if you’re curious and want to learn more, or if you’re a fan of the DIY approach, let’s dive into the details of where to look when checking a website’s crawlability.

1. Check the robots.txt file. This is the ‘place’ where Google bots go to look for instructions. Sometimes it is necessary to block part of the website from crawling, for example pages that are still under construction and that you don’t want shown on Google just yet. That instruction is placed in the robots.txt file, so when the crawler comes to the website and checks robots.txt, it sees the instruction and does not crawl the ‘under construction’ part, as intended. You can already imagine what happens if there is an outdated instruction in that file or, God forbid, a teeny-tiny line of code restricting crawlers from your website as a whole, right?!
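For illustration, a robots.txt carrying such an instruction might look like this (the paths here are made-up examples):

```
# Let every crawler in, but keep the unfinished section out of search
User-agent: *
Disallow: /under-construction/

# DANGER: uncommented, this single line blocks the ENTIRE site from crawling
# Disallow: /
```

That last directive, `Disallow: /`, is exactly the kind of teeny-tiny line that can make a whole website vanish from Google.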

If this is the case, here’s what you need to do:
- open your server files;
- open your domain root folder;
- find the robots.txt file (if it exists);
- check it for any ‘Disallow’ directives and correct them if necessary.
Make sure you fully understand what you’re doing before editing this file, as one mistake can literally block Google from your whole website. If you’re not sure, ask for professional help.
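If you prefer to test the rules from code rather than by eye, Python’s standard library ships a robots.txt parser. A minimal sketch, assuming the hypothetical rules and example.com URLs below:

```python
from urllib.robotparser import RobotFileParser

# Feed the parser the same lines a crawler would read from robots.txt
# (these rules and the example.com URLs are hypothetical)
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /under-construction/",
])

# Googlebot is blocked from the unfinished section...
print(rp.can_fetch("Googlebot", "https://example.com/under-construction/draft"))  # False
# ...but free to crawl everything else
print(rp.can_fetch("Googlebot", "https://example.com/blog/my-post"))  # True
```

This is the same logic a well-behaved crawler applies, so it is a quick way to confirm which of your URLs a given Disallow rule actually covers.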

2. Beyond robots.txt, there is one more way to give instructions to Google bots, this time at the level of a single page: adding a special meta tag to the page, the ‘noindex’ tag.
In this case, the Google bots will crawl the page but won’t add it to the index, so the page will never appear in the search results. That’s why it’s so important to check that the pages that are supposed to appear on Google don’t have that tag.
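For reference, the noindex directive is a one-line meta tag placed inside the page’s head section:

```
<head>
  <!-- Tells search engines: you may crawl me, but keep me out of the index -->
  <meta name="robots" content="noindex">
</head>
```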

In this case, what you need to do is:
- open the page in question in your browser;
- right-click on the page and choose the option ‘view page source’;
- once the code is opened, hit Ctrl+F and search for ‘noindex’;
- if nothing is found, everything is OK; but if you find a noindex tag on a page that is actually supposed to be indexed, contact your IT team so they can remove it.
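The manual Ctrl+F check above can also be scripted. Here is a rough sketch in Python (`has_noindex` is a made-up helper, and the naive regex assumes the `name` attribute comes before `content`, which is common but not guaranteed on every page):

```python
import re

# Matches <meta ... name="robots" ... content="...noindex...">
# Naive: assumes name= appears before content=, as it usually does
NOINDEX_RE = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex',
    re.IGNORECASE,
)

def has_noindex(html: str) -> bool:
    """Return True if the page source carries a robots 'noindex' tag."""
    return bool(NOINDEX_RE.search(html))

blocked = '<head><meta name="robots" content="noindex, nofollow"></head>'
indexed = '<head><meta name="robots" content="index, follow"></head>'
print(has_noindex(blocked))  # True
print(has_noindex(indexed))  # False
```

For anything beyond a spot check, a proper HTML parser would be more robust than a regex, but this mirrors what the Ctrl+F search is doing.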

Alternatively, if the issue is specific to a certain page, you can use the URL Inspection tool in Google Search Console. Checking the page in question there will give you information, straight from Google, on whether the page was crawled and indexed and, if not, what the problem might be.

Most likely, after checking these two points, you will have found and solved the problem, and after some time, when Google re-crawls your website, the pages will be added to its index and start showing in the search results.
However, if the problem remains, seek professional help and request an SEO audit to investigate the matter in more detail.

For more information, bespoke strategies and efficient digital marketing solutions, contact the Clarity girls at info@yourdigitalclarity.com or visit our website at yourdigitalclarity.com.
