Every webmaster worries about the same thing when launching a new site: what if it undoes my SEO work so far? There are some things you should be aware of before you launch, so here are a few ways you can prevent a drop in rankings after a new site launch.
301s
If your existing site carries a lot of page equity then implementing 301 redirects is essential. There are two ways you can do this: manual mapping or a generated script. The first step is to grab all your old URLs and assess the size of the site. We recommend xml-sitemaps.com for quickly gathering a list of all URLs, but the free version only lets you crawl 500 pages, so for larger sites you could use Xenu Link Sleuth, which is a more in-depth crawler without that limit. Either way, it’s best to cross-reference with both tools to make sure you have a definitive list of all your old pages.
For manual mapping, collate a list of your old URLs and supply their new destination URLs in a .csv. If you have a big site this is a time-consuming process, because you want to be sure you are mapping each old URL to its most relevant page on the new site. Avoid 301’ing all old URLs straight to the homepage, as this only builds up the equity of the root domain and doesn’t pass any value through to your new inner pages. Plus, if you have been building backlinks to a number of internal pages, you want those links to carry across to the closest relevant page on the new site.
If you’d prefer to automate your 301s as opposed to manually mapping, you can generate a script to map them for you. This is useful if you have a large ecommerce site with thousands of pages, for example.
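As a rough illustration of the scripted approach, here’s a minimal Python sketch. It assumes a two-column CSV of old paths and new destination URLs (the column layout and example.com addresses are hypothetical) and emits Apache-style `Redirect 301` lines you could drop into an .htaccess file:

```python
import csv
import io

def redirects_from_csv(csv_text):
    """Turn a two-column CSV (old path, new URL) into Apache
    'Redirect 301' lines for an .htaccess file."""
    rules = []
    for old, new in csv.reader(io.StringIO(csv_text)):
        # Each rule sends the old path permanently to its new URL
        rules.append(f"Redirect 301 {old.strip()} {new.strip()}")
    return "\n".join(rules)

# Hypothetical mapping - in practice you'd read this from your .csv file
mapping = """/old-page,https://example.com/new-page
/products/widget,https://example.com/shop/widget"""

print(redirects_from_csv(mapping))
```

The same idea scales to thousands of rows: the hard work is still deciding the old-to-new mapping itself, but generating the rules becomes trivial.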
For those that use WordPress, there are some handy plugins available which make implementing 301s quick and easy.
noindex
When you’re developing the new site, chances are it will sit on a staging server. This means you can work on it like a live site, but the content and pages should be kept out of Google’s index to prevent any duplicate site issues. If you’re working on a preview site, be sure to add the ‘noindex’ tag to all pages to prevent them being picked up in the SERPs. Simply blocking the site in your robots.txt won’t necessarily keep it out of the index, as Google can still index blocked URLs it discovers through links (and it does so frequently).
When you’re nearing launch, remember to remove the ‘noindex’ when you put the site live, because leaving it in could harm your rankings. It’s a common mistake, but you’ll soon find your site wiped from the index if you do. By all means keep the noindex tag on those new pages which you genuinely don’t want picked up by the search engines (your /thank-you-for-your-purchase pages, for example), but check it’s not applied sitewide when going live.
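The tag itself is a one-liner in the `<head>` of each staging page:

```html
<!-- Add to the <head> of every staging page; remove before launch -->
<meta name="robots" content="noindex" />
```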
At The Floating Frog, we love using WordPress, so if you have a WP site which has lots of tags and categories generated, it’s a good idea to noindex your tag and category pages too, as these can contain duplicate content otherwise.
PageRank Sculpting
There are a lot of reasons why you shouldn’t pay too much attention to Google PageRank anymore, but let’s also remember this is a Google-owned metric and we shouldn’t dismiss it completely. A few years ago, it was important for webmasters to consider PageRank sculpting when launching a new site so that root domain equity wasn’t passed to ‘unnecessary’ pages. You would do this using the ‘nofollow’ attribute. For example, you may have a Privacy Policy on your site, but you don’t want this page gaining a lot of weight in the search engines over other important pages. By nofollowing some of these less important pages, more root domain value can be filtered through to your product pages, which could be three or four levels deep. In our view, PageRank sculpting is still worth considering if you want to help your more important pages get the equity over your Ts and Cs, Privacy Policy pages etc.
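In practice this just means adding rel="nofollow" to the internal links pointing at those low-priority pages (the URLs below are hypothetical):

```html
<!-- Footer links to low-priority pages: nofollow discourages
     link equity from flowing to them -->
<a href="/privacy-policy" rel="nofollow">Privacy Policy</a>
<a href="/terms-and-conditions" rel="nofollow">Terms &amp; Conditions</a>
```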
Duplicate Content Canonicalisation
If you’re launching a large ecommerce site, chances are you’ll have a few products that sit in more than one category, meaning you could be generating duplicate content, and this in turn could affect your rankings. It’s worth deciding which category is the main one and canonicalising to it, telling Google this is the preferred page and so exempting you from duplicate content penalisation. The “rel=canonical” tag is commonly used where there are duplicate homepage issues too. Check your site doesn’t also load the homepage at /index or /home; if it does, the canonical tag can help prevent Google picking up duplicate versions of your homepage.
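On each duplicate version of a page (say, the /home and /index copies of a homepage — the example.com URL here is a placeholder), the tag points back to the preferred URL:

```html
<!-- Placed in the <head> of each duplicate page,
     pointing at the preferred version -->
<link rel="canonical" href="https://example.com/" />
```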
Pages with Thin Content
Following the Google Panda update, webmasters have become much more aware of pages which might cause them problems in the SERPs. For example, if you have a lot of pages with some light copy on them but nothing really adding much value, you might need to rethink your site and page structure. Putting a site live with ‘thin’ pages is just one of the many things Panda will penalise you for. Think about where these pages are: are they close to the root? Can any more useful content fill these pages? If not, you’d do better removing them altogether. If Google decides to exclude any pages from its index without you specifically telling it to, you know you need to go through the site and find where you can add value. As Google moves closer to being a knowledge engine rather than a search engine, considering how your site is going to add value is key. This will also help to prevent a drop in rankings.
Site Audits
This is one of the most basic elements required when launching a new site and trying to prevent a drop in rankings. Having said that, it’s still something that can be overlooked when the pressures of launch deadlines are looming. Always find time to do a thorough audit on the site just before it goes live and again as soon as it is live (see our Pre Launch Checklist and Post Launch Checklist post). This isn’t just from an SEO perspective; it’s also for the health of your site overall. Checking broken links and URL structures will be key if you’re looking to retain rankings.
Webmaster Tools
If Google Webmaster Tools isn’t your best friend by now, it’s time to grab a coffee and get to know it better. With all the data you can get from here, you should place paramount importance on anything it flags. Why? Because this is data Google itself is supplying, so if Google thinks it’s important, you should too. The first port of call upon a new site launch is to upload your new XML sitemap to Webmaster Tools and ensure it updates automatically when new pages are added (you can set up a cron job through xml-sitemaps.com).
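For reference, a minimal XML sitemap entry looks like this (the example.com URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/new-page</loc>
    <lastmod>2014-06-01</lastmod>
  </url>
</urlset>
```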
Other data that Google Webmaster Tools provides which can be useful in helping to prevent a drop in search rankings includes:
- Index status – tells you the number of pages indexed since you first added your site to the account, so you can compare indexation of your new site against your old one.
- Crawl errors – it’s not uncommon for a new site to have some crawl errors if something has been missed and a few pages are 404’ing, but it’s useful to keep on top of what Webmaster Tools is showing you so you can sort the errors out early on.
- Internal links – with this tool, you can check how many internal links there are to your various pages. It’s important that every page on your site is linked to, and your really important pages should be linked to from your homepage.
- HTML improvements – check there are no issues with any HTML elements, such as missing title tags, duplicate tags etc.
There’s a lot more to uncover in Google Webmaster Tools, and when launching a new site it’s a good indicator of what Google likes and what it doesn’t. One thing to remember, though: even if you correct some elements and clean up the site, you might not see the changes reflected in your Webmaster Tools account for some time, as it’s notoriously slow to recognise that you have made changes and that the errors found before are no longer there.
Under the Radar?
There’s a lot to be said for not tinkering too much with an old site that hasn’t been updated in a while but still holds good, solid, high rankings. Despite the Google algorithm updates, there are still some sites which have ‘slipped through the net’, shall we say, and don’t have a particularly great site structure or valuable content, yet they have an old authoritative domain which has held the weight for them all this time. So how do you prepare to launch a new site when you’ve got something really important to hang onto?
By not doing much with your old site for so long and then launching a new one, you’re actually waving a big fat flag in front of Google’s spider, letting it know you’ve taken the time to update it with fresh content, a great design and a much cleaner, tighter site all round. You’d think that would automatically give you a boost in the SERPs, right? Not exactly. Some sites have flown under the radar for so long that turning everything on its head with a full site re-launch will, more likely than not, drop your rankings. That’s not to say they wouldn’t drop anyway with a new site launch: pretty much all sites see rankings bounce around for a bit straight after launch, and things generally come back within a few weeks and settle at a strong position again, providing you have followed the steps above. But you’re still opening yourself up to a more significant hit if the new site is vastly different from your existing, solid site that Google already likes.
In our view, if you have an old site which is doing well and you’ve had good rankings for some time, don’t prioritise a re-launch. Does it really need one, or can you tighten what’s already there? Check your site traffic and see which landing pages and top content are being viewed. If you’ve got some strong, authoritative internal pages that you haven’t touched in a while but still rank at the top of the SERPs, it makes sense to hang onto them if they’re delivering valuable, converting traffic.