How often your site gets crawled by Google varies, and it mainly depends on how often you post content and on the site's age.
Let me explain it this way…
1) A site that is over 6 months old and posts content daily will likely get crawled at least once a day, if not more often.
2) A site that is under 6 months old and posts content every few days, once a week, or less frequently will probably get crawled once every few days.
3) And a site that is under 6 months old but posts content daily will also get crawled once every few days.
Now, most people who start a website are going to be in position 2 or 3 (for at least 6 months), and the ideal state to be in is position 1, where crawling happens daily.
But this takes time, so I'm going to show you ways to increase how often your pages get crawled if you're in position 2 or 3. There are 3 main methods I'll be discussing.
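If you're not sure which position you're in, you can measure your own crawl frequency by counting Googlebot requests in your server access logs. Here's a minimal sketch in Python; the log lines, IPs, and paths are made up for illustration, and it assumes the common combined log format:

```python
from collections import Counter
import re

# Hypothetical sample lines in combined log format (IPs, dates, and
# paths are placeholders for illustration only).
SAMPLE_LOG = """\
66.249.66.1 - - [10/May/2020:06:12:01 +0000] "GET /post-1 HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
203.0.113.9 - - [10/May/2020:07:30:44 +0000] "GET /post-1 HTTP/1.1" 200 5120 "-" "Mozilla/5.0"
66.249.66.1 - - [11/May/2020:05:02:10 +0000] "GET /post-2 HTTP/1.1" 200 4096 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
"""

# Pulls the day portion (e.g. 10/May/2020) out of the timestamp.
DATE_RE = re.compile(r"\[(\d{2}/\w{3}/\d{4})")

def googlebot_hits_per_day(log_text):
    """Count requests per day whose user-agent string mentions Googlebot."""
    counts = Counter()
    for line in log_text.splitlines():
        if "Googlebot" not in line:
            continue
        m = DATE_RE.search(line)
        if m:
            counts[m.group(1)] += 1
    return counts

if __name__ == "__main__":
    print(dict(googlebot_hits_per_day(SAMPLE_LOG)))
```

One caveat: anyone can put "Googlebot" in a user-agent string, so a user-agent match is only a rough estimate; Google's own documentation recommends a reverse DNS lookup to verify real Googlebot traffic.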
Why this topic is a big deal for SEO…
The general belief among people doing SEO is that the more often Google visits and crawls a page, the better its chance to rank. There is truth to this.
However, there's more to ranking than crawl frequency alone (other ranking factors matter too). In any case, as I said before, I'll show you how to control how often Google visits your page and how to play this game intelligently so you get the best SEO results your page can get.
Enter the 3 options…
The 3 ways explained:
1) The first way is the fastest: the URL Inspection tool in Google Search Console (formerly Webmaster Tools), which you can learn more about here.
It can get your page crawled (also referred to as spiders visiting your page) within an hour, even for existing content.
So with this, you can literally control, manually, how often your page gets crawled, with up to 500 fetches a month allowed. I highly recommend (and personally use) the URL Inspection tool whenever you publish new content.
And just as importantly, you should use this SEO tool on old content you update.
2) The second is submitting a sitemap through Search Console. This is the second-fastest option, and you can leave it alone after it's set up. Instructions.
3) And the third is simply posting content regularly. Your site can then get crawled very often, even a few times a day. As I said above, publishing frequency affects crawl frequency, and publishing at least once a day is excellent. Updating and republishing old content counts just as well.
This is the most hands-on approach of the 3 and it certainly requires the most work, but without it, the other 2 options won't have as much of an impact.
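Regarding option 2: a sitemap is just an XML file listing your URLs (and optionally when each was last modified), following the sitemaps.org protocol. Here's a minimal sketch that builds one with Python's standard library; the domain and paths are placeholders you'd swap for your own pages:

```python
import xml.etree.ElementTree as ET

# Namespace required by the sitemaps.org protocol.
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return a sitemap XML string for (loc, lastmod) pairs."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

if __name__ == "__main__":
    # Placeholder URLs; replace with your own site's pages.
    print(build_sitemap([
        ("https://example.com/", "2020-05-10"),
        ("https://example.com/first-post", "2020-05-11"),
    ]))
```

You'd save the output as sitemap.xml at your site's root and then submit its URL once in Search Console; most CMS platforms and SEO plugins generate and update this file for you automatically.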
The ideal way to use these methods:
The ideal solution is to publish new content daily and/or update and republish old content daily (these are SEO ranking tips I advise), and to run a URL inspection on it daily through Search Console, especially if your website is under 6 months old, a number you've seen me mention numerous times above.
The reason I keep saying 6 months is that there's a sandbox period which usually lasts about that long, and once it's passed, the site will generally get crawled more often. Until then, you should be using these 3 options manually to influence your crawl frequency.
The crawling process explained (in case you’re new to this term):
First, let me say that Google has its own explanation of how it crawls pages, which you can read about here.
That page focuses on the general start-to-finish process, though, and doesn't really cover frequency (how often Google sends spiders to your page).
So I'd like to share my own experience with this process. I've spent years watching it from beginning to end and noting the general frequencies, from the very start when you're just launching a page to the later periods when that page is huge, getting traffic, and more…
When you start your website…
With its first post, even if you don't use any of the 3 methods above, you may find it takes up to a few weeks for that first post to be crawled, after which the index usually happens. It's annoying, but it's the regular SEO process all new sites experience.
However, if you're the sort of person who posts content very frequently (good for you!), you'll find that Google sends its spiders back to your page more and more often, usually in proportion to your posting rate.
And to be honest, this is what you want if rankings and SEO improvement are what you're after. It's just not an easy thing to do, take it from me.
So generally speaking, posting once a day will likely get your site crawled at least once a day as well, as I said before. Other factors play a part too: interlinking and backlinking, social shares, and more.
2 of the 3 methods above (URL inspection and sitemaps) are truly a speedier way to get that first crawl for fresh content, and I would absolutely advise using them while your page is still new.
The Google dance also plays a role in how often crawling happens…
I have a post that, in my opinion, "brilliantly" (I don't like bragging, but it's really a good article) describes the Google dance process and breaks it down into 3 stages (tiers).
You should read that article to understand why it happens (why rankings go up and down) and why, after you hit tier 3 status with your site, you'll get crawled, indexed, and ranked far faster than in the first 2 tiers.
Let me put it this way:
Tier 1 lasts about 1-2 months. During this time, the spiders rarely visit new pages, probably about as rarely as I mentioned above (a few weeks sometimes). But you should still be posting new content and using the URL inspection option for every new, fresh piece of content you post, to get those spiders coming to your page ASAP. It'll save you a ton of time (it can add up to months saved, literally!).
Tier 2 is where your page sits during months 3 and 4, and you'll get crawled more often (if you keep the pedal down on content). Here, spiders may come to the page within a few days of each new post you put up. Again, I advise using the inspection tool here to cut down on the waiting time.
Tier 3 is where your page is out of the sandbox, the "timeout" period new sites go through when they first start their SEO process and try to rank on Google. Your page WILL be in the sandbox during tiers 1 and 2, keep that in mind, but again, URL inspection will really help out A LOT.
Now, when tier 3 hits, you can expect spiders to visit the page very often, including DAILY and even a few times a day. During this period I'd recommend not using the inspection tool unless it's for new content you just made. For old content, the increased frequency of spider visits will handle it on its own.
In fact, if you have a sitemap set up (option 2), it will regulate things by itself and help your old content rise. In other words, let old content get visited by spiders automatically, and use the first option (URL inspection) for newly created content.
When you want old content to get crawled faster.
Certain old posts of yours (in fact, many) will likely never reach high rankings over time, no matter how many times Google's spiders visit them.
But if you're doing SEO, you'll want ALL your posts to rank as high as they can. If that's not happening for the old ones, here's what to do:
Optimize old posts using the 15 tips I listed. Then either wait (essentially option 3) for spiders to find the updated content, OR run another URL inspection on the newly updated article/post and see if rankings improve.
You'll WANT old content crawled by the spiders ONLY when you've made improvements to it, such as more comments, more content, or should I say, more GREAT content. That's when you want Google to take another look at it.
Note: Google will do that on its own, but again, URL inspecting expedites the process; just use it wisely, when it's truly worth using.
And only do this for truly old posts (we're talking posts that are several months old) that aren't seeing ANY improvement in SEO rankings.
For posts that are only a few weeks old, re-inspecting doesn't really help, since they're already going through the ranking process (if you inspect those, you may actually restart that process and, ironically, slow down the results). It's the really old posts that have stalled in SEO that you want to ping Google to look at again.
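The decision rule above can be sketched as a tiny function. The 90-day threshold is my rough reading of "several months," not a number from Google:

```python
def should_reinspect(age_days, rankings_improving, content_updated):
    """Decide whether to manually request a re-crawl of an old post.

    Rule of thumb from the advice above: only re-inspect posts that are
    several months old, have stalled in rankings, and have actually been
    improved since publication.
    """
    SEVERAL_MONTHS = 90  # assumed threshold, roughly "a few months"
    if age_days < SEVERAL_MONTHS:
        # Still in the normal ranking process; re-inspecting may restart it.
        return False
    return (not rankings_improving) and content_updated
```

For example, a stalled 6-month-old post you've just optimized qualifies, while a 3-week-old post should be left alone no matter what.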
Overall, here’s the summarized info:
In the end, crawling should be manually controlled on your end when content is freshly made and published. Have a sitemap set up in Search Console and use URL inspection for newly published posts.
As for old posts, leave them alone and let Google send the spiders to them on its own; it will do so regularly IF you're growing your page regularly.
Only manually trigger a crawl (i.e., use URL inspection) if an old post isn't gaining rankings in Google, and only after a few months have passed for said post(s) and you've applied the optimization tips I linked above.