Argh… you’ve spent weeks finessing your SEO and built a killer bank of content, yet your site is missing from the Google rankings like it ran out on you at the last minute.
First up, don’t panic.
There are many reasons Google might be snubbing your site.
It’s usually pretty easy to fix, as long as you’ve worked through a troubleshooter and identified the spanner in the works.
Here we’ll run through some of the common causes to help you discover why Google isn’t crawling your site and how to correct it. Sharpening your technical SEO skills and having a proper website structure can be vital to getting on Google’s good side.
Check if You’ve Blocked Google Bots
Perhaps the most common problem is that your robots.txt file contains a directive that prevents bots from crawling your pages.
Another similar issue is a ‘noindex’ meta tag.
The easy solution? Remove the offending code.
It’s also worth checking for any crawl block in the same file; you can use the URL Inspection tool in Google Search Console if you’re unsure whether that’s the problem.
If Google says that its crawlers are being blocked, the problem, unfortunately, is you, not them.
Newer websites typically need at least a week to be crawled, so it’s also worth opening a Search Console account and giving Google a sneaky little signpost so it catches on to your site faster.
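If you’d like to spot-check this from your own machine as well, here’s a minimal sketch using only Python’s standard library. The domain and page path are placeholders for your own URLs, and the noindex check is deliberately rough; the URL Inspection tool remains the authoritative answer.

```python
# Minimal sketch: check robots.txt rules for Googlebot and look for a noindex meta tag.
# "example.com" and the page path are placeholders; swap in your own URLs.
import re
import urllib.request
import urllib.robotparser

SITE = "https://example.com"
PAGE = SITE + "/some-important-page/"

# 1. Does robots.txt allow Googlebot to fetch the page?
robots = urllib.robotparser.RobotFileParser()
robots.set_url(SITE + "/robots.txt")
robots.read()
print("Googlebot allowed by robots.txt:", robots.can_fetch("Googlebot", PAGE))

# 2. Does the page itself carry a noindex directive in a robots meta tag?
request = urllib.request.Request(PAGE, headers={"User-Agent": "Mozilla/5.0"})
html = urllib.request.urlopen(request).read().decode("utf-8", errors="ignore")
noindex = re.search(r'<meta[^>]+name=["\']robots["\'][^>]*noindex', html, re.IGNORECASE)
print("noindex meta tag present:", bool(noindex))
```

If either check comes back negative, remove the offending robots.txt rule or meta tag, then ask Google to re-inspect the URL in Search Console.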
Take a Look at Your URL Structure
The second common reason Google isn’t crawling your site is that it doesn’t like your URL structure. That could be for any of the following reasons:
- The URLs are too long, complex, and confusing.
- There are too many URLs pointing to duplicate site content.
- Your URLs have more parameters than Google wants to deal with.
If your URLs are poorly structured, the bots have to spend more of their crawl budget making sense of the website, and they’ll often skip a page entirely.
Having a clear, consistent URL structure makes your website easier for a bot to navigate, so keep your URLs short and precise.
This sort of problem is often due to relying on a CMS like WordPress to auto-generate your URLs.
Auto-generated URLs are often random or stuffed with information that won’t help anyone, like the post ID.
Clean up your URLs, and Google bots will find them much easier to digest.
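To make “short and precise” a little more concrete, here’s a small, hedged sketch that flags URLs which are long, deeply nested, or heavy on query parameters. The thresholds are illustrative guesses, not official Google limits.

```python
# Rough URL sanity check; the thresholds below are illustrative, not Google-published limits.
from urllib.parse import urlparse, parse_qs

def url_warnings(url, max_length=75, max_depth=4, max_params=2):
    parsed = urlparse(url)
    warnings = []
    if len(url) > max_length:
        warnings.append(f"long URL ({len(url)} chars)")
    depth = len([segment for segment in parsed.path.split("/") if segment])
    if depth > max_depth:
        warnings.append(f"deeply nested path ({depth} levels)")
    params = parse_qs(parsed.query)
    if len(params) > max_params:
        warnings.append(f"many query parameters ({len(params)})")
    return warnings or ["looks clean"]

# Example: a CMS-generated URL versus a tidy, descriptive one.
print(url_warnings("https://example.com/index.php?p=123&cat=7&ref=home&sort=asc"))
print(url_warnings("https://example.com/blog/fix-google-crawling/"))
```

Run it over a sample of your own URLs; anything that trips several warnings is a good candidate for cleanup or a redirect to a tidier slug.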
Make Sure Your Web Content is Optimized for Bots
Let’s be clear – search engine optimization and optimizing for Google bots are slightly different things.
Bots crawl your site, look for meta content, and analyze things like keyword density and the relevance of your information. That same information can feed the rich snippets that help boost your site in the results.
So focus on the technical ranking factors:
- Quality content that is 100% original, filler-free, and without duplicates.
- A site that’s easy to get around, with a navigation bar and internal links to all the essential pages.
- A high text-to-HTML ratio, preferably between 25% and 70%, with minimized JavaScript, since the bots rely on signals from the text within your HTML code (there’s a quick way to check this after the list).
- Reasonably fast loading speeds.
- Good structured data. Look at Schema.org markup to make sure you’re following best practices.
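The text-to-HTML ratio is easy to approximate yourself. The sketch below strips out script and style blocks, keeps the visible text, and divides its length by the length of the full page source. Different SEO tools calculate this slightly differently, so treat the result as a rough guide; the URL is a placeholder.

```python
# Rough text-to-HTML ratio: visible text length divided by total page source length.
from html.parser import HTMLParser
import urllib.request

class TextExtractor(HTMLParser):
    """Collects visible text, skipping script and style blocks."""
    def __init__(self):
        super().__init__()
        self.in_skip = False
        self.chunks = []
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.in_skip = True
    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self.in_skip = False
    def handle_data(self, data):
        if not self.in_skip:
            self.chunks.append(data.strip())

def text_to_html_ratio(url):
    request = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    html = urllib.request.urlopen(request).read().decode("utf-8", errors="ignore")
    extractor = TextExtractor()
    extractor.feed(html)
    text = " ".join(chunk for chunk in extractor.chunks if chunk)
    return len(text) / len(html) * 100

# Placeholder URL; aim for a result roughly in the 25-70% range mentioned above.
print(f"{text_to_html_ratio('https://example.com/'):.1f}% text-to-HTML")
```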
Google bots are busy and also a bit lazy.
Make their job easier, and they’ll be more likely to visit and come back more often.
Investigate Whether You’ve Been Sandboxed
Now and again, Google will remove a website from its index, usually temporarily.
This is a punishment, usually handed out because Google believes you’ve breached a quality guideline or done something a bit dodgy.
The punishments can be:
- Being banned or de-indexed – you’re entirely removed from the Google search pages.
- Penalties, usually enforced by the algorithm to prevent your page from being found in the search results.
- Sandboxing – Google Sandbox is a rumored filter that stops new websites from crashing into the top rankings. If you suddenly see a drop in traffic and haven’t been penalized or de-indexed, you could have been sandboxed.
None of these should come as a total surprise; for manual actions at least, Google will notify the site owner in Search Console that a guideline has been violated.
All is not lost. If you have breached a rule, you can modify the site and then request Google take another look to lift the restrictions.
Count Up Your Internal Links
Finally, it’s worth revisiting your internal links.
Google relies on those links to identify, make sense of, and index your pages, and they’re essential to:
- Create an information hierarchy.
- Spread link equity across your pages.
- Help users navigate the website.
Your most authoritative pages need to include internal links so that every time Google crawls the main pages, it follows those links and indexes the next page down.
If you’ve added a new page and it’s floating out there in space, tie it into your primary pages with a solid internal linking profile, and it’ll come in from the cold to join the indexing party.
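If you want a quick audit, the sketch below lists the internal links found on a given page, so you can confirm that your authoritative pages really do point at the new one. The URL is a placeholder for your own homepage or another key page.

```python
# List internal links on a page, a quick way to check a new page is tied into your site.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
import urllib.request

class LinkCollector(HTMLParser):
    """Collects href values from every <a> tag."""
    def __init__(self):
        super().__init__()
        self.hrefs = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

def internal_links(page_url):
    request = urllib.request.Request(page_url, headers={"User-Agent": "Mozilla/5.0"})
    html = urllib.request.urlopen(request).read().decode("utf-8", errors="ignore")
    collector = LinkCollector()
    collector.feed(html)
    site = urlparse(page_url).netloc
    # Resolve relative links and keep only those pointing at the same domain.
    absolute = {urljoin(page_url, href) for href in collector.hrefs}
    return sorted(link for link in absolute if urlparse(link).netloc == site)

# Placeholder URL: run this against your homepage or another authoritative page.
for link in internal_links("https://example.com/"):
    print(link)
```

If the new page never shows up in the output for any of your main pages, that’s your orphan; add a link from a relevant, authoritative page and let the bots follow it in.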