Web traffic FUBAR? Common new site fails (and fixes)
After some late nights, endless rewrites and changes, your website launched on Friday. It was down all weekend for some reason and of course you couldn’t get hold of your web dev. Well, you’ve learned lesson 1. Don’t launch at 5PM on Friday. Launch on Monday. Then you can shout at the developers all week.
But now you've got another problem.
It's a potentially career-limiting problem: the new site is live, and in Analytics the traffic has dropped off a cliff.
Don't start scouring the jobs websites just yet... It might be a simple fix.
Here’s what you should check and in what order.
1. Google Analytics Fails
Is Google Analytics code snippet installed?
This is a fairly common one. Your web developers may have forgotten to add the Google Analytics code snippet when they launched the site.
Paste the address of your homepage into http://builtwith.com/. The “Analytics and Tracking” section will confirm whether the analytics code is in there.
Builtwith also has a plug-in for Chrome. Worth installing if you want to do more snooping on your site (and competitor sites).
Is the correct Google Analytics code installed?
It may be that the code snippet exists on your website but it’s not tied to the Analytics account you've used historically. To check, go into your Analytics account, find the UA number for the site you want to check, and then cross-reference it with the one that appears on your page (go to Step 4 in those instructions).
If your web dev has arbitrarily decided to give you a new account, tell him to undo that shit and give your old stuff back… else year on year / month on month comparisons are going to be really difficult, and without that sort of context, what is the point of all those graphs?
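If you'd rather not eyeball the source code, the cross-reference can be sketched in a few lines of Python. This is a rough illustration, not a polished tool: the function name `find_tracking_ids` is mine, and it assumes the classic UA-style tracking ID (UA-XXXXXXX-X) rather than the newer formats.

```python
import re

def find_tracking_ids(page_source: str) -> list[str]:
    """Return any UA-style Google Analytics tracking IDs found in the page source."""
    return re.findall(r"UA-\d{4,10}-\d{1,4}", page_source)

# Example: compare what's on the page with the ID from your Analytics account
html = '<script>ga("create", "UA-1234567-1", "auto");</script>'
ids_on_page = find_tracking_ids(html)
expected = "UA-1234567-1"
print(expected in ids_on_page)  # True when the page uses the account you expect
```

If the ID on the page isn't the one in your Analytics account, you've found your mismatch.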
Is the correct Google Analytics code installed on every page?
You can check that Google Analytics is on every page of your website by using GA Checker’s automated crawler. Our techies don’t like the load it puts on web servers, so maybe only use this if there is other evidence (a drastic bounce rate increase, perhaps) that points to a potential problem with a page-by-page install.
2. Google Was Blocked From Crawling The Dev Site. You Forgot To Unblock Google On Launch.
When developing a site, web developers don’t want Google seeing your work in progress. So they normally put up a sort of no entry sign (but made of code), aimed specifically at Google. They often forget to take it down. Here’s how to check you’re welcoming Google, not pushing it away.
Site Wide Check:
Check your robots.txt file. In the address bar of your browser, type this: www.<yourdomain>/robots.txt and hit return.
Make sure that file doesn't have a line that says:

Disallow: /

(especially underneath User-agent: *, which applies the rule to every crawler, Google included).
If it does, get your web designer to change it IMMEDIATELY.
It might include a line like this:

Disallow: /admin/

Which is entirely correct – you don’t want Google rifling through your private / admin stuff, so don’t worry about that… Just check that you’re not accidentally disallowing anything that should be public. For more info on the little file that you've just discovered, go here: http://tools.seobook.com/robots-txt/
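If you want to test a robots.txt file without squinting at it, Python's standard library ships a robots.txt parser. A minimal sketch (the function name `google_can_crawl` is my own):

```python
from urllib import robotparser

def google_can_crawl(robots_txt: str, path: str = "/") -> bool:
    """Check whether Googlebot is allowed to fetch a given path,
    using robots.txt rules passed in as a string."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch("Googlebot", path)

blocked = "User-agent: *\nDisallow: /"
fine = "User-agent: *\nDisallow: /admin/"
print(google_can_crawl(blocked))           # False - the whole site is blocked
print(google_can_crawl(fine))              # True - public pages are crawlable
print(google_can_crawl(fine, "/admin/x"))  # False - admin stays private, as it should
```

Paste in the contents of your own robots.txt and check your key pages come back True.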
Page by page check
Go into your web page, view source (learn how to do that here) and look at a bit near the top of the source code called <head>.

Check that you can’t see this:

<META NAME="ROBOTS" CONTENT="NOINDEX">
If you can, you found your problem (or one of them). Get your web developer to remove this at once from any page that should be publicly available.
The time it takes for Google to re-crawl and re-index the content that was previously blocked is normally proportional to the length of time the block was in place. That is to say, if you only blocked Google for a few hours, then you should see a recovery of your position in the results in a matter of days. If you blocked it for weeks (or even months - I've seen it happen) then it will take weeks or even months to recover the position you previously enjoyed.
AGGGRHHHH! I've been blocking Google for weeks!
Then you have a problem. You can expedite re-crawl / re-index by getting new authoritative and trustworthy links to the pages you blocked. Think links from your local press, trade magazines, the BBC - not directories. This means your website needs something worthy of inclusion on those sorts of sites. Proper newsy / human interest stuff... Give some time to a local school, help out a local cause, get plenty of pictures, blog about it on your site and then sell that story to the press! Yes, it's hard. But THAT'S SEO. Buying 400 links from PageRank 6 and above websites from some geezer who just emailed you is not SEO. That is SPAM. I'm ranting... Noisy Little Monkey SEOs have a tendency to. See Ste's post: WTF is 'ethical' link building?
3. You Left No Forwarding Address, So All That Old Content Returns Error Pages
All the pages on your shiny new site probably each have a nice shiny new URL (a unique address, which Google never forgets) and your backward-looking jerk off of a web dev hasn't bothered to set up redirects to help visitors / Google find the new, fresh stuff.
What you need is a set of 301 redirects to point from the old URLs to the new URLs. There’s more information on why 301 redirects are important and how to set them up in this post I wrote waaaay back in 2008.
If your web developers try to fob you off with 302 redirects instead of 301 redirects, call me. I’ll pop round theirs and set their office on fire.
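To make the job concrete for your developer, the redirects boil down to a simple old-URL to new-URL mapping. As an illustration only – assuming an Apache server using the mod_alias Redirect directive, with example.co.uk standing in for your domain and `apache_redirects` being my own name for the helper – you could generate the rules like this:

```python
def apache_redirects(url_map: dict[str, str]) -> str:
    """Turn an old-URL -> new-URL mapping into Apache 'Redirect 301' lines
    (mod_alias syntax) suitable for a .htaccess file."""
    return "\n".join(
        f"Redirect 301 {old} {new}" for old, new in sorted(url_map.items())
    )

mapping = {
    "/old-about.html": "https://www.example.co.uk/about/",
    "/old-contact.html": "https://www.example.co.uk/contact/",
}
print(apache_redirects(mapping))
# Redirect 301 /old-about.html https://www.example.co.uk/about/
# Redirect 301 /old-contact.html https://www.example.co.uk/contact/
```

Nginx, IIS and most CMSs have their own equivalents – the point is the mapping, and the 301 status code, not the syntax.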
4. Links To Resources On The Dev Server
Often the site will go live without anyone checking that all the content successfully moved from your dev server (e.g. dev.<yoursite>.co.uk) to the live site (e.g. www.<yoursite>.co.uk) so images, style sheets and even whole pages will be over on that old address… That may give you a suspiciously high bounce rate in the short term but will cause a major issue in the long term when the developer deletes all the stuff on the dev server.
The best way to check is to crawl your own site with an awesome tool called Screaming Frog. You need to get a bit technical to do this but the tool is free and if you've checked everything else, this is the final port of call before you start spending huge great wads of cash with a reputable digital marketing firm or completing your letter of resignation.
Click the ‘External’ tab and sort by the various filetypes and make sure the only ‘external’ links you see are to other sites, not the dev version of your own. Screaming Frog will allow you to do this for 500 URLs for free… if you have more than this, or this is all too geeky for you, give us a shout. We run a launch support process (from £450 +VAT) to cover lots of this off.
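The core of what Screaming Frog is doing here can also be sketched in a few lines, if you fancy a spot-check on a single page. This is a simplified illustration – the names `DevLinkFinder` / `find_dev_links` are mine, example.co.uk stands in for your domain, and it only looks at href / src attributes:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class DevLinkFinder(HTMLParser):
    """Collects any href/src URLs that still point at the dev hostname."""
    def __init__(self, dev_host: str):
        super().__init__()
        self.dev_host = dev_host
        self.leaks = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value:
                if urlparse(value).netloc == self.dev_host:
                    self.leaks.append(value)

def find_dev_links(html: str, dev_host: str) -> list[str]:
    finder = DevLinkFinder(dev_host)
    finder.feed(html)
    return finder.leaks

page = ('<img src="http://dev.example.co.uk/logo.png">'
        '<a href="https://www.example.co.uk/about/">About</a>')
print(find_dev_links(page, "dev.example.co.uk"))
# ['http://dev.example.co.uk/logo.png']
```

Any URL it returns is content that will break the day the dev server gets wiped.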
Checked all these and still can't figure it out?
Give us a shout. Why? Because we don't want your web developer to do it to you or anyone else again.