How Often Does Google Crawl my Site? 3 Ways to Make it Faster.

How often Google crawls your site varies, and it mainly depends on how often you post content and how old the site is.


Let me explain it this way…

1) Any site that is over 6 months old and posts content daily will likely get crawled at least once a day, if not more often.

2) Any site that is under 6 months old and posts content every few days, once a week, or less frequently will probably get crawled once every few days.

3) And any site that is under 6 months old but posts content daily will also get crawled once every few days.

Now, most people who start a website are likely going to be in position 2 or 3 (for at least 6 months), and the ideal state to be in is position 1, where crawling happens daily.

But this takes time, so I'm going to show you how to increase how often your pages get crawled if you're in position 2 or 3. There are 3 main methods I'll be discussing.

Why this topic is a big deal for SEO…

The general belief among people doing SEO is that the more often Google visits and crawls a page, the better its chances of ranking well. There is truth to this.

However, there's more to ranking than crawl frequency (other ranking factors matter too). In any case, like I said before, I will show you how to control how often Google visits your page and how to play this game intelligently so you get the best SEO results your page can get.

Enter the 3 options…

The 3 ways explained:

1) The first way is the fastest, and that is the URL Inspection tool in Webmaster Tools (now called Google Search Console), which you can learn more about here.

It'll allow you to get crawled (this is also referred to as spiders visiting your page) within an hour, even for existing content.

So with this, you can literally MANUALLY control how often your page gets crawled, with up to 500 fetches a month allowed. I highly recommend (and personally use) the URL inspection tool whenever you publish new content; see the first sketch after this list for a way to check a page's crawl status programmatically.

And just as much, you should also be using this SEO tool to inspect old content you update.

2) The second is submitting a sitemap through Webmaster Tools. This is the second fastest option, and you can leave it alone after it's set up (a minimal example follows after this list). Instructions.

3) And the third is simply posting content regularly. Your page can then get crawled very often, even a few times a day. Like I said above, publishing frequency affects crawling frequency, and publishing at least once a day is excellent. Updating and republishing old content counts just as well.

This is the most hands-on approach of the 3 and it certainly requires the most work, but without it, the other 2 options above won't have as much of an impact.
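To make option 1 concrete, here is a rough sketch of checking a page's crawl status programmatically through the Search Console API (the same data the URL Inspection tool shows). This assumes you have the google-api-python-client and google-auth packages installed and OAuth credentials for a verified property; the credentials file and example URLs are placeholders, and note that requesting a recrawl itself is still done by hand in the Search Console interface.

```python
# A minimal sketch of checking a page's last crawl time via the
# Search Console API. "credentials.json" and the example.com URLs
# are placeholders for your own property.
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

creds = Credentials.from_authorized_user_file(
    "credentials.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

result = service.urlInspection().index().inspect(body={
    "inspectionUrl": "https://example.com/my-new-post/",  # page to check
    "siteUrl": "https://example.com/",                    # verified property
}).execute()

status = result["inspectionResult"]["indexStatusResult"]
print("Last crawled:", status.get("lastCrawlTime", "not crawled yet"))
print("Coverage:", status.get("coverageState"))
```

Running this daily on your newest posts gives you a quick read on whether the spiders have actually shown up yet.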
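And for option 2, below is a minimal sketch of generating a bare-bones sitemap.xml with Python's standard library. The URLs are hypothetical; a real sitemap would list every post and page. Once the file is uploaded to your site's root, you only need to submit it once in the Sitemaps report.

```python
# A minimal sketch that generates a bare-bones sitemap.xml for a
# handful of pages. The example.com URLs are hypothetical.
from datetime import date
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)

for loc in [
    "https://example.com/",
    "https://example.com/my-new-post/",
]:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    # lastmod hints to crawlers when the page last changed
    ET.SubElement(url, "lastmod").text = date.today().isoformat()

ET.ElementTree(urlset).write(
    "sitemap.xml", encoding="utf-8", xml_declaration=True
)
```

In practice most CMS platforms (WordPress included) generate this file for you, so hand-rolling one is only needed on custom-built sites.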

The ideal way to use these methods:

The ideal approach is to publish new content daily and/or update and republish old content daily (these are SEO ranking tips I advise), then URL inspect it through Webmaster Tools, especially if your website is under 6 months old, a number you have seen me mention numerous times above.

The reason I keep saying 6 months is that there's a sandbox period which usually lasts about this long, and after it passes, the site will generally get crawled more often. Until then, you should be manually using these 3 options to affect your crawl frequency.

The crawling process explained (in case you’re new to this term):

First, let me say that Google has its own explanation of how it applies this process to pages, which you can read about here.

However, that page focuses on the general start-to-finish process and doesn't really cover the frequency (how often) at which Google sends spiders to your page.

So I'd like to share my own experience with this process, since I've spent years watching it from beginning to end and noting the general frequencies, from the very start when you're just launching a page, to the later periods when that page is huge, getting traffic and more…

When you start your website…

And publish its first post: even if you do not use any of the above 3 methods, you may find it takes up to a few weeks for that first post to be crawled, after which the indexing usually happens. It's annoying, but it's the regular SEO process all new sites experience.

However, if you are the sort of person who posts content very frequently (good for you!), you will find that Google will send its spiders back to your page more and more often, usually in proportion to your posting rate.

And to be honest, this is what you want if rankings and SEO improvement are what you're after. It's just not an easy thing to do, take it from me.

So generally speaking, posting once a day will likely get your site crawled at least once a day as well, like I said before. Other factors play a part too, such as interlinking, backlinking, social shares and more. If you want to verify your own crawl frequency, your server's access logs will show it; see the sketch below.
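Here's a rough sketch that counts Googlebot hits per day from a raw access log. It assumes the common Apache/Nginx combined log format; the log path is a placeholder, and a stricter check would also verify visitor IPs via reverse DNS, since the Googlebot user-agent string can be faked.

```python
# A rough sketch: count Googlebot visits per day from a web server
# access log in the combined log format. The path is a placeholder.
import re
from collections import Counter
from datetime import datetime

hits_per_day = Counter()

with open("/var/log/nginx/access.log") as log:  # placeholder path
    for line in log:
        # Cheap filter on the user-agent; a strict check would also
        # verify the IP with a reverse DNS lookup, since the UA
        # string alone can be spoofed.
        if "Googlebot" not in line:
            continue
        # Extract the date from the [10/Oct/2024:13:55:36 +0000] field
        match = re.search(r"\[(\d{2}/\w{3}/\d{4})", line)
        if match:
            day = datetime.strptime(match.group(1), "%d/%b/%Y").date()
            hits_per_day[day] += 1

for day in sorted(hits_per_day):
    print(f"{day}: {hits_per_day[day]} Googlebot visits")
```

A few days of this output will tell you which of the 3 positions from the top of this article your site is actually in.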

2 of the 3 methods above (fetching and using sitemaps) are a truly speedier way to get that first crawl going for fresh content, and I would absolutely advise using them while your page is still new.

The Google dance also plays a role in how often crawling happens…

I have a post that in my opinion “brilliantly” (I don’t like bragging, but it’s really a good article) describes the Google dance process and breaks it down into 3 (tier) stages.

You should read that article to understand why it goes on (why rankings go up and down) and why, after you hit tier 3 status with your site, you'll get crawled, indexed and ranked way faster than in the first 2 tiers.

Let me put it this way:

Tier 1 lasts for about 1-2 months. During this time, the spiders rarely visit new pages, probably about as rarely as I mentioned above (sometimes a few weeks between visits). You should still be posting new content and using the fetch option for every new, fresh piece of content posted, to get those spiders coming to your page ASAP. It'll save you a ton of time (it can add up to months saved, literally!).

Tier 2 is where your page will be during months 3 and 4, and you will get crawled more often (if you keep the pedal down on content). Here, you may get spiders coming to the page within a few days of each new post you put up. Again, I advise using the fetch tool here to cut down on the waiting time.

Tier 3 is where your page is out of the sandbox, which is a “timeout” period new sites go through when they first start their SEO process and try to rank on Google. Your page WILL be in the sandbox during tiers 1 and 2, keep that in mind, but again, fetching will really help out A LOT.

Now when tier 3 hits, you can expect spiders visiting the page very often, including DAILY and even a few times a day. During this period I would recommend not using the fetch tool unless you are using it for brand-new content. For old content, the increased frequency of spiders visiting the page will handle that on its own.

In fact, if you have a sitemap set up (option 2), that will itself regulate things and help your old content rise. In other words, let the old content get visited by spiders automatically, but use the first option (fetching) for new content created.

When you want old content to get crawled faster.

Certain old posts of yours (in fact, many) will likely not get high rankings over time, no matter how many times the Google spiders come in.

If you are doing SEO, you'll likely want ALL your posts ranked as high as they can be. If that's not happening with the old ones, here's what to do:

Optimize old posts using these 15 tips I listed. Then either wait (pretty much option 3) for that updated piece of content to get those spiders, OR do another URL inspection for the newly updated article/post and see if that helps rankings improve.

You will WANT old content crawled by the spiders ONLY when improvements have been made to it, such as more comments added or more content added (or should I say, more GREAT content added). That's when you will WANT Google to take another look at it.

Note: Google will do that on its own, but again, URL inspecting expedites the process; just use it wisely, when it's truly worth using.

And only do this for truly old posts (we’re talking posts that are several months old) that are not seeing ANY improvement in SEO rankings.

For posts that are only a few weeks old, re-fetching doesn't really help, as they are already going through the ranking process (if you fetch those, you may actually restart that process and, ironically, slow down the results). It's the really old posts that have stalled in SEO that you want to ping Google to look at again.

Overall, here’s the summarized info:

In the end, crawling content should be manually controlled on your end when it’s freshly made and published. Have a sitemap on Webmaster Tools and use fetch for newly published posts.

As for old posts, leave them alone and let Google send the spiders to them on its own; it will do so regularly IF you are growing your page regularly.

Only manually make spiders visit your page (aka use fetch) if an old post is not getting better rankings in Google, and only do that after a few months have passed for the said post(s) and after you've applied the 15 optimization tips I linked above.

8 thoughts on “How Often Does Google Crawl my Site? 3 Ways to Make it Faster.”

  1. I’m very interested in the whole sandbox period aspect of SEO. Where did you find the data points, besides your own, to explain the consistency of the theory?

    How consistent has the sandbox period been throughout all your sites? (Granted that you have more than one active site.)

    Thanks for the informative and thought-provoking post, by the way.

    • Pretty much every serious SEO person and site talks about it, Angel, but from a more anecdotal point of view, it's happened to every single site I've started over the past few years.

      Perhaps 10 years ago, I could get a new site to hit top rankings in a day, but after about 2009, most people noticed a BIG delay in their rankings improving. This came to be attributed to and named the Google sandbox, but you can also call it website maturity or domain authority.

      I can also tell you that every other successful SEO person I know, including those who do better than me, also believes in it and expects it on any new site.

  2. Thanks Vitaliy for taking the time to create this informative and educative post. I have gained quite a lot of value from it. Every site owner, especially internet marketers, wants their site to rank high on Google, especially via the free (organic) methods, and that is exactly what you are offering.

    So in essence, I should manually control crawling on my end for new content but let Google handle it for old content. What about old posts that I update? Should these be left for Google to find?

    • Hi Tolu, I have a simple policy for this:

      1) Fetch any NEW content you create and publish to speed up ranking.

      2) For old posts, use the fetch tool only when you make a major update to the post. You usually want to do this if the post isn’t ranking well and it’s been months and it hasn’t shown improvement. Doing this is a win-win because it gives Google the chance to reexamine the post and possibly make it rank higher.

      3) Even if you do NOT use options 1 or 2, Google will usually find your new and old content. If you see that it doesn’t, use the fetch tool on the things it hasn’t found.

      4) Make this whole process easier by going through your old posts and optimizing them, and especially interlinking them.

  3. Hi, thanks for the great information and for laying out the Google process in such an easy to understand way. So I'm wondering: do you get out of the sandbox after a certain amount of time, like four months, or is it more complicated than that (probably)? Someone once gave me the advice to submit my new URLs to be crawled and to submit my new sitemap each time I created a new post. From your information it seems like doing both might be a little overkill, but I'm hoping doing both hasn't adversely affected how often Google crawls my site.

    • Hi Lynne, you don't need to resubmit your sitemap every time you post something new, but if you log into webmaster tools and see that your site is not getting all of its posts and pages indexed, then it makes sense to resubmit the sitemap.

      Sometimes Google will not be able to fully crawl your site, and you will want them to recheck it if that occurs. Webmaster Tools WILL tell you how many of your pages and posts were crawled out of the total.

      Now for your sandbox question: in my opinion, it's not just a matter of time before a site “graduates” from it. If that were the case, then ANY site with little content could just wait a few months and then get high rankings for new articles it puts out, but that is NOT the case.

      From what I’ve learned, it’s a combination of time, and actually building up the site which determines how much authority it’ll have AFTER the sandbox period ends.

      You can think of it as a report card where the more you do during the semester (sandbox period), the better the grade Google will give it and that will determine how well it’ll rank for the content you had during that period and the new content you’ll be putting up after it, which will rank higher IF the grade is higher. 

  4. Hi Vitaliy – thanks for sharing this useful article. I am fairly new to this process, and you have explained it in a way that I can understand. I was aware of the importance of asking Google to fetch new content but didn’t know the reasoning behind it. I have bookmarked your site to read later as I noticed an article on Google Sandbox which I have not heard of before. 

    All the best, 

    Diane

    • Hi Diane, telling Google to fetch new content simply speeds up the process, but it will do that on its own, without a fetch. Just wanted to let you know.

