Bobby has a fair amount of SEO knowledge, and is capable of doing an SEO site audit. Suzie, a friend of Bobby’s, has recently seen her Google rankings decimated, and she begs Bobby for help. Unfortunately, money is extremely tight for both Suzie and Bobby. He’d like to help, but he can’t afford the tools he needs to do an effective audit for Suzie. She can’t afford to pay for them either. Bobby promises to do what he can, but he’s stressing over the limited access to data that comes with a non-existent budget.
Sam, on the other hand, has been doing his own less-than-stellar site optimization for the last few years. He knows he’s no expert, and he’s barely been able to keep his head above water. He realizes his site has been struck down by both Panda and Penguin, and he’s willing to tackle the chore of digging out from the penalties. But with very little budget, what can he do?
Whether you are a site owner who needs to analyze your site on a next-to-nothing budget, or you are someone with enough knowledge, but not enough money, to help solve a friend’s site problems, you need the best tools available for as few pennies as possible. The tools below are the ones I’d recommend to people who need the most data for the fewest dollars.
I won’t lie to you. This collection of tools won’t be as useful as the ones that cost significantly more money. Some return a pretty limited set of data. Still, desperate times call for desperate access to data. You’ll be able to do a much better job with these tools than you could without them.
So if you have no other choice – or you just love being a cheapskate – then you’ll want to use the tools below.
Sticking with the above examples, let’s say that both Suzie and Sam run sites that have between 2,000 and 3,000 pages. These sites are large enough to need tools to help with the audit process, but not so large that a cheap analysis would be completely out of the question. (You wouldn’t want to use this set of tools for a million-page site.) For a 2,000-page site, you could do a decent audit for as little as $9, with ever-better analysis as your budget gets closer to $100.
URLs, Meta Data, and Crawl Analysis
Although the go-to site crawling tool is Screaming Frog SEO Spider, the cost to crawl a 2,000-page site is too high for our penny-pinchers (anything over 500 URLs costs 99 pounds, which is somewhere in the 160-dollar range). Luckily, there’s another crawl tool that can handle up to 15,000 URLs for just $9.00. It’s not quite as data-rich as Screaming Frog, but it’s a really great set of data for just 9 bucks. The tool is called SEO Crawler. A screenshot of the kinds of data you can get from SEO Crawler is shown below (URL lists, titles, keywords, descriptions, H1 tags, robots data, status codes, etc.).
With a simple list of URLs and associated meta data, you can very quickly sort things to spot big problems (duplicate titles, duplicate pages, missing elements, 404s, 301s, etc.). Even if this were the only tool at your disposal, there’s a good chance you could solve a lot of SEO problems. So do yourself a favor, and start with SEO Crawler. It’s free for up to 500 URLs, only $9 for up to 15,000 URLs, a mere $19 for up to 50,000 URLs, and just $49 for up to 200,000 URLs. In addition, it’s pay-as-you-go, with no monthly fees. If you don’t need to analyze more URLs, don’t pay for more. Then, if you suddenly need to analyze more six months later, pay for them then.
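If you’d rather not do all that sorting by hand, a few lines of Python can flag the big-ticket problems in a crawler’s CSV export. This is only a sketch: the column names ("url", "title", "meta_description", "status_code") are my assumptions, so rename them to match whatever your crawler actually exports.

```python
# Hypothetical sketch: spot common on-site problems in a crawl export.
# Column names are assumptions -- match them to your crawler's CSV headers.
import pandas as pd

def audit_crawl(df: pd.DataFrame) -> dict:
    """Return lists of URLs exhibiting common big-ticket problems."""
    return {
        "duplicate_titles": df[df.duplicated("title", keep=False)]["url"].tolist(),
        "missing_descriptions": df[
            df["meta_description"].isna() | (df["meta_description"] == "")
        ]["url"].tolist(),
        "404s": df[df["status_code"] == 404]["url"].tolist(),
        "301s": df[df["status_code"] == 301]["url"].tolist(),
    }

# Tiny in-memory "crawl" standing in for a real CSV (pd.read_csv("crawl.csv")):
crawl = pd.DataFrame({
    "url": ["/a", "/b", "/c", "/old"],
    "title": ["Widgets", "Widgets", "About", "Old page"],
    "meta_description": ["Buy widgets", "", "About us", "Moved"],
    "status_code": [200, 200, 404, 301],
})
report = audit_crawl(crawl)
print(report["duplicate_titles"])  # "/a" and "/b" share a title
```

Sorting the same export in Excel gets you to the same place; the script just saves a few clicks on a 2,000-page site.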
Rankings Data

Sam’s site uses both Google Analytics (GA) and Google Webmaster Tools (WMT). That gives him a fair amount of rankings information, but he should probably set up Bing’s webmaster tools as well. All Sam has to do is use his GA/WMT data to determine what was ranking well and when the rankings dropped, and he can probably use that information to determine why. Suzie never set up GA or WMT, nor did she ever track her rankings with any other tool. Bobby, therefore, doesn’t have much to work with, so he needs to scrounge up as much (or as little) rankings data as he can. His best bet is to set Suzie’s site up with GA, WMT, and Bing’s webmaster tools now. Next, he should run Suzie’s URL through the SEMrush tool, which will give him a bit of instant data on what her site has historically ranked for. The free version gives him a brief glance; $69 gets him a lot more data (10,000 results / 500 keywords). That $69 is a monthly fee, but Bobby can always pay for one month, do his audit, and decide not to renew for future months. (Sam may as well run his site’s URL through SEMrush too. Why not? More data is good.) The screenshot below gives you an idea of the kind of data SEMrush provides, and you can look at a sample report here to really get a feel for all the data returned.
Backlink Data

Let’s face it: these days, a Google penalty often involves bad backlinks. Dealing with a manual link penalty, or the scary Penguin algo slap, requires looking long and hard at all of a site’s backlinks. Gathering that data is difficult. Sure, Google’s WMT and Bing’s webmaster tools are the first places to start, but in most cases you’re only getting a relatively small sampling there. There are several excellent tools out there that provide a ton of great backlink data, but they come at a pretty hefty cost. Sam, Bobby, and Suzie need more for less.

The best bet in this situation is to combine the free data from Open Site Explorer with the $10 plan from LinkDetective. If you’ve ever used the free version of Open Site Explorer, you’re probably wondering how the small amount of data returned is of any use here. What you might not realize is that the CSV export gives you 10,000 links, not just the 20 shown on the site. LinkDetective requires the upload of an Open Site Explorer CSV file, so for the $10 they charge, you get a 10,000-backlink report. Combine and merge that with the link data you have from Google and Bing webmaster tools, and you have a fair amount of data to work with. What does the ten-buck plan get you? 10 projects, 25 domains, and 250,000 links, which will definitely handle the 10,000 links obtained from Open Site Explorer. Below is an example screenshot of the backlink data you’ll get from LinkDetective.
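The “combine and merge” step is easy to script. Here’s a hedged sketch of stacking several backlink exports and de-duplicating the linking URLs; the column name "source_url" is my assumption, so rename it to match the headers each tool actually exports.

```python
# Hypothetical sketch: merge backlink exports from Open Site Explorer,
# Google WMT, and Bing into one de-duplicated list.
# The "source_url" column name is an assumption -- adjust to your CSVs.
import pandas as pd

def merge_backlinks(frames: list) -> pd.DataFrame:
    """Stack several backlink exports and drop duplicate linking URLs."""
    combined = pd.concat(frames, ignore_index=True)
    return combined.drop_duplicates(subset="source_url").reset_index(drop=True)

# Tiny in-memory stand-ins for real exports (pd.read_csv("ose.csv"), etc.):
ose = pd.DataFrame({"source_url": ["http://spam.example/1", "http://blog.example/post"]})
wmt = pd.DataFrame({"source_url": ["http://blog.example/post", "http://news.example/story"]})
all_links = merge_backlinks([ose, wmt])
print(len(all_links))  # 3 unique linking URLs (the overlap is dropped)
```

Merging first means every later step of the audit (anchor text ratios, disavow lists) works from one master list instead of three overlapping ones.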
Anchor Text Ratios
Now that you have some good backlink data, you really need to analyze the anchor text data. You can manually work some magic by manipulating the link data in Excel, but you can also get a quick overview by using Removem’s free anchor text optimization tool. As you can see below, you get a nice overview of your site’s anchor text ratios in just a couple of seconds.
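The “Excel magic” amounts to counting each anchor text’s share of the total link profile. A minimal sketch of that calculation, assuming you’ve already pulled the anchor text column out of your merged backlink export:

```python
# Hypothetical sketch: compute anchor text ratios from a list of anchor
# texts pulled out of a backlink export. Case and whitespace are normalized
# so "Acme Widgets" and "acme widgets" count as one anchor.
from collections import Counter

def anchor_ratios(anchors: list) -> dict:
    """Return each anchor text's share of the total link profile."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return {anchor: count / total for anchor, count in counts.most_common()}

anchors = ["Acme Widgets", "acme widgets", "cheap widgets", "click here"]
for anchor, share in anchor_ratios(anchors).items():
    print(f"{anchor}: {share:.0%}")  # e.g. "acme widgets: 50%"
```

A profile dominated by one exact-match commercial anchor is exactly the kind of pattern Penguin punishes, which is why the ratio view matters more than raw counts.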
OnPage Optimization Overview
I’ll admit, I’m not a fan of the data that any tool provides about onsite optimization, no matter how expensive the tool is. Honestly, the best tool for this job is a combination of your brain, a crawler like SEO Crawler, and some manual perusing of the site’s pages. But just because I’ve never found a tool I liked for this task doesn’t mean you shouldn’t make use of one or two that at least provide a bit of useful information. For that purpose, plug a URL into Nibbler, CrawlerFX, or both. You can see screenshots of the kinds of data you’ll get from each below.
Server Neighbors

Curious about what other sites are sharing space on the site’s server? If the site isn’t on a dedicated server, then there’s a good chance that there are bad “neighbors” sharing the server, possibly lots of them. A quick way to check is to use the site’s IP address with Bing’s reverse IP lookup. Just enter this into your browser’s address bar: https://www.bing.com/search?q=ip:192.0.2.1 (substituting your site’s actual IP for 192.0.2.1). Bing will list all the sites being served from the same IP. If your site is living side by side with many obviously spammy or “bad” sites, you’ll want to make a note of that in your audit, and consider moving the site.
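If you don’t know the site’s IP offhand, a couple of lines of Python will resolve it and build the Bing query URL for you. A minimal sketch (the IP below is a reserved documentation address, not a real server):

```python
# Hypothetical sketch: build the Bing reverse-IP search URL for an IP.
import socket

def bing_reverse_ip_url(ip: str) -> str:
    """Return the Bing ip: search URL for the given IP address."""
    return f"https://www.bing.com/search?q=ip:{ip}"

# If you only know the domain, resolve it first with something like
#   ip = socket.gethostbyname("yoursite.example")
print(bing_reverse_ip_url("192.0.2.1"))
```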
Page Speed

Luckily, there are lots of page speed tools available for free. My favorites include Google’s PageSpeed tool, GTmetrix, and a cool tool from WebPageTest that lets you watch a video of sites loading side by side, which is great for comparing competitors with your own site. Site speed matters, so make sure you include these tests in your audit.
Webmaster Tools Messages

Jump into WMT and look for any messages from Google. Also check each area of WMT for other issues that may be important but didn’t merit a message from Google. Common problems include crawl errors, blocked URLs, malware notices, and more.
Duplicate Content

If a site was hit by Panda, one possible reason may be too much duplicate content. Copyscape is the standard tool for checking duplicate content. Although you can use the tool for free as often as needed, you have to manually enter each URL one by one to avoid spending money. Our sample 2,000-page site would make this a very tedious chore, but it can be done. You could choose the premium/bulk plan instead, but that comes at a price of 5 cents per page. For our sample site, that would cost a hefty $100. Instead, I’d recommend running a premium batch on a smaller sampling of the site’s URLs, such as the most important top 10%. In this example, you could run the top 200 pages for just $10. That would very likely be a large enough sample to determine whether the site has a major duplicate content problem.
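The sampling math is worth sanity-checking before you pay. A tiny sketch of the cost calculation at 5 cents per page (the function name and the flat per-page price are my assumptions about how you’d model it; check the tool’s current pricing):

```python
# Hypothetical sketch: size a paid duplicate-content batch and estimate cost.
def sample_cost(total_pages: int, fraction: float, cents_per_page: int = 5):
    """Return (pages to check, cost in dollars) for a sampled batch."""
    pages = int(total_pages * fraction)
    return pages, pages * cents_per_page / 100

pages, dollars = sample_cost(2000, 0.10)
print(pages, dollars)  # 200 pages for $10.00, vs. $100.00 for the full site
```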
Local SEO Overview
If the site is a local brick-and-mortar business, then it’s imperative to do a Local SEO audit. This could be an entire post in itself, so I’ll have to keep it simple at this point. Local SEO relies heavily on citations, so run the domain through the GetListed tool. You’ll get a good overview of the main listings that can be claimed or edited from places like Yelp, Foursquare, Yellow Pages, CitySearch, etc.
SEO Audit Tools From $9 to $98
For as little as $9, or as much as $98, you can amass a large bucket of data for your site audit. Of course, you’ll put in some sweat equity, but you’d have to do that no matter how much you spend on tools (unless, of course, you spend that money to hire someone else to do all the work). I hope these tools help the Suzies, the Bobbys, and the Sams of the online world. Best of luck!
P.S. “Cheapskate” is a “baity” headline word. I don’t mean anything negative by it. Substitute “frugal” if you prefer. 🙂