5 Ways To Prepare For Site Closure


Are you planning to shut down your website for a day or more? With advice from John Mueller, Google’s Search Advocate, here are five ways to prepare.

Mueller shared these tips in a series of tweets, linking to the relevant Google help pages.

Spoiler alert — there is no good way to temporarily shut down a website. You should avoid doing this if possible.


However, there are things you can do to minimize the negative impact.

Mueller’s recommendations include:

  • Use HTTP status code 503
  • Keep the 503 status code in place for no more than a day
  • Make sure robots.txt returns a 200 or 404 status code, not a 503
  • Prepare for the consequences if the site is down for more than a day
  • Expect a reduction in Googlebot crawling

More details on these recommendations and how to manage the negative impact of taking a site offline are explained in the following sections.

1. HTTP 503 status code

When you take a website offline, make sure it provides an HTTP 503 status code to crawlers.

When crawlers like Googlebot encounter a 503 status code, they understand that the site is unavailable and may become available later.

With a 503 code, crawlers know to check the site again rather than removing it from Google’s search index.
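How you return a 503 depends on your server or CDN; the snippet below is only an illustrative sketch using Python’s standard library, standing in for whatever your real hosting stack provides. It answers every request with a 503 and an optional Retry-After header:

```python
# Hypothetical maintenance-mode server: every request gets a 503 so
# crawlers know the outage is temporary.
from http.server import BaseHTTPRequestHandler, HTTPServer

class MaintenanceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(503)                  # Service Unavailable
        # Optional hint to retry in an hour; Mueller says not to worry
        # about this setting (see section 2).
        self.send_header("Retry-After", "3600")
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.end_headers()
        self.wfile.write(b"<h1>Down for maintenance</h1>")

if __name__ == "__main__":
    HTTPServer(("", 8000), MaintenanceHandler).serve_forever()
```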

Mueller also explains how to check for a 503 status code using Chrome’s developer tools.
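If you’d rather verify from a script than from the browser, here is a minimal sketch using Python’s standard library; the URL is a placeholder for your own site:

```python
import urllib.request
import urllib.error

def status_code(url: str) -> int:
    """Return the HTTP status code for a URL."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code  # 4xx/5xx responses, including 503, land here

# Expect 503 while the site is closed for maintenance.
print(status_code("https://example.com/"))
```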

2. Keep the 503 status code for no more than a day

Googlebot will return to a site after initially encountering a 503, but it won’t return forever.

If Googlebot sees a 503 code day after day, it will eventually start removing pages from the index.

Mueller says, ideally, you should keep the 503 status code for no more than one day.

“Keep the 503 status – ideally – for a maximum of one day. I know, not everything is limited to one day. A ‘permanent’ 503 can cause pages to be removed from search. Be frugal with 503 times. Don’t worry about the ‘retry after’ setting.”

3. Robots.txt – Status Code 200 Or 404

While the pages of a closed website should return a 503 code, the robots.txt file itself should return a 200 or 404 status code.

The robots.txt file shouldn’t serve a 503, says Mueller, because Googlebot would then assume the entire site is blocked from crawling.

Additionally, Mueller recommends using Chrome DevTools to examine your website’s robots.txt file.
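Continuing the hypothetical maintenance-server sketch from section 1, the routing change is small: serve robots.txt normally with a 200 while everything else returns a 503. The rules in ROBOTS_TXT below are placeholders for your real file:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

ROBOTS_TXT = b"User-agent: *\nDisallow:\n"  # placeholder rules

class MaintenanceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/robots.txt":
            # robots.txt keeps returning 200 so Googlebot doesn't treat
            # the whole site as blocked from crawling.
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(ROBOTS_TXT)
        else:
            self.send_response(503)  # the rest of the site is closed
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8000), MaintenanceHandler).serve_forever()
```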

4. Prepare for negative effects

As we mentioned at the beginning of this article, there is no way to take a website offline and avoid all negative consequences.

If your website is offline for more than a day, prepare accordingly.

Mueller says the pages will likely be removed from search results regardless of the 503 status code:

“‘Hmm… what if a site wants to shut down for >1 day?’ There will be negative effects no matter which option you choose (503, blocked, noindex, 404, 403) – pages are likely to disappear from search results.”

When you reopen your website, check whether the critical pages are still indexed. If not, submit them for indexing.
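One way to make that check systematic, at least for availability: loop over a hand-picked list of critical URLs and confirm they respond with 200 again. The list below is hypothetical, and note this only confirms the pages are serving; whether they are still indexed is something you check in Google Search Console.

```python
import urllib.request
import urllib.error

# Hypothetical list of the pages you care about most.
CRITICAL_URLS = [
    "https://example.com/",
    "https://example.com/products",
]

for url in CRITICAL_URLS:
    try:
        with urllib.request.urlopen(url) as resp:
            print(url, "->", resp.status)
    except urllib.error.HTTPError as err:
        print(url, "->", err.code, "(still not serving normally)")
```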

5. Expect a reduction in crawling

An unavoidable side effect of serving a 503 code, even briefly, is reduced crawling, no matter how well you prepare.

Mueller said on Twitter:

“A side effect of even one day of 503s is that Googlebot (note: this is all through a Google lens, I don’t know about other search engines) will slow down crawling. Is it a small site? It doesn’t matter. Is it giant? The keyword is ‘crawl budget.’”

Crawl reduction can affect a site in several ways. The main things to consider are that new pages may take longer to index, and updates to existing pages may take longer to show up in search results.

Once Googlebot sees that your site is back online and you are actively updating it, your crawl rate will likely return to normal.
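If you want to watch the recovery rather than take it on faith, one rough approach is counting Googlebot requests per day in your server access logs. The sketch below assumes a combined-format log at a hypothetical path; since anyone can spoof the Googlebot user agent, treat the counts as approximate.

```python
from collections import Counter

hits = Counter()
with open("access.log") as log:  # hypothetical path to your access log
    for line in log:
        if "Googlebot" not in line:
            continue
        # Combined log format: ... [10/Oct/2022:13:55:36 +0000] ...
        day = line.split("[", 1)[1].split(":", 1)[0]
        hits[day] += 1

for day, count in hits.items():  # log lines are already chronological
    print(day, count)
```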


Source: @JohnMu on Twitter

Featured Image: BUNDITINAY/Shutterstock


