The ONE THING You Need to Check Before Launching Your New Website
OK, so there are actually a few things that you need to check before launching a new website. But you don’t have to worry about them because you paid an agency good money to handle these details for you, right? Well…I have some bad news for you. Not every agency pays attention to all the details necessary to ensure a successful launch or relaunch. While I could argue that 301 redirects are the most important (and they definitely are critical to a successful relaunch), we’ll save that blog post for another day.
So here is the tip – the minute your website is launched, you should check your robots.txt file. If it says “Disallow: /” then you should reach out to your development agency immediately and tell them to unblock your website. Then you should find a new agency to work with, one that would not have made such a sloppy error.
Question #1 – How do I check this?
Just type your domain into your browser’s address bar followed by “/robots.txt”. If your website is www.GoRedSox.com (sorry, Yankees fans), then type in www.GoRedSox.com/robots.txt.
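If you would rather script the check than open a browser, the same URL can be built and fetched with Python’s standard library. This is a minimal sketch; the function names are mine, not part of any standard tool:

```python
from urllib.parse import urlsplit
from urllib.request import urlopen

def robots_url(site: str) -> str:
    # Accepts "www.GoRedSox.com" or a full URL and returns the robots.txt URL.
    if "://" not in site:
        site = "https://" + site
    parts = urlsplit(site)
    return f"{parts.scheme}://{parts.netloc}/robots.txt"

def fetch_robots(site: str, timeout: float = 10.0) -> str:
    # Downloads and returns the robots.txt body as text.
    with urlopen(robots_url(site), timeout=timeout) as resp:
        return resp.read().decode("utf-8", errors="replace")
```

Calling `fetch_robots("www.GoRedSox.com")` would print the same file you see in the browser, so you can grep it for “Disallow: /” as part of a launch checklist.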
Question #2 – What do I look for?
Let’s keep this one simple. There are several things an SEO expert would check here, but you just need to make sure your website is not blocked from the search engines. You do not want to see this:
User-agent: *
Disallow: /
If that is what you see, then your website is telling search engines not to visit any page on your website, and that’s not good. Another way to tell that your website has been blocked is if your pages show up in search results with no description, such as Google’s “No information is available for this page” message.
Question #3 – Why is this Bad?
If you had a website prior to launching your new one, then you’ve built up some history that is beneficial to your SEO. Your website may be ranking great and producing a ton of leads, or it could be doing very little. Either way, there is history and value there that should be retained. The moment a robots.txt goes up that disallows your entire website, you begin to erase that value: search engines can no longer crawl your pages, and over time those pages drop out of the index. Once that happens, Google is not just declining to rank your website; it no longer sees those pages at all.
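You can see the effect of those two lines for yourself with Python’s standard-library urllib.robotparser, which answers the same question a crawler asks before fetching a page:

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
# Feed the parser the blocking rules directly instead of fetching them.
rp.parse(["User-agent: *", "Disallow: /"])

# Googlebot (and every other crawler) is refused every URL on the site.
print(rp.can_fetch("Googlebot", "https://www.GoRedSox.com/"))         # False
print(rp.can_fetch("Googlebot", "https://www.GoRedSox.com/tickets"))  # False
```

Every `can_fetch` call returns False, which is exactly how a well-behaved crawler decides to skip your entire site.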
Question #4 – Why did this happen?
The most common reason this occurs is a careless mistake. Developers will block your new website while it is being developed so that it is not indexed by search engines prior to launch. The mistake happens once it is time to push the new website live. If the developer forgets to unblock the website at that point, the new site goes live with all search engines still blocked from crawling it.
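One way development teams guard against this mistake is to generate robots.txt from the deployment environment rather than committing a hand-edited blocking file. Here is a minimal sketch in Python; the DEPLOY_ENV variable and the environment names are my assumptions, not a standard:

```python
import os

# The blocking file used on staging, and the open file for production.
BLOCK_ALL = "User-agent: *\nDisallow: /\n"
ALLOW_ALL = "User-agent: *\nDisallow:\n"

def robots_txt_body(env=None):
    # DEPLOY_ENV is a hypothetical variable name; adapt it to your own setup.
    env = env or os.environ.get("DEPLOY_ENV", "production")
    # Only non-production environments get the crawler-blocking file, so
    # pushing the site live cannot accidentally ship with crawlers shut out.
    return BLOCK_ALL if env != "production" else ALLOW_ALL
```

With this in place, forgetting to “flip the switch” at launch is impossible, because the blocking rules never exist in the production build.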
Don’t let this happen to you! Always check to make sure that your website is being indexed properly after a relaunch. If you have any questions about this issue, don’t hesitate to contact us!