Wednesday 14 August 2013

A Step-By-Step Website SEO Audit Guide




At some point, SEOs need to audit a site to find out what is going wrong and what needs to be fixed. There might be a number of things preventing a website from reaching its full potential, and finding those problems can be difficult. The outline below will show you what to look at and how to fix it.
If you are performing SEO on behalf of clients, especially new clients, you need to examine their sites thoroughly for technical issues. Whether a site has crawling issues, indexing problems, or other issues that are inhibiting its ability to rank, this process will find them.
It should be used when new clients sign on, but it can also work as a sales tool. Free site audits are a compelling way to show your leads what is wrong with their sites, and they show the route you would take to deal with those issues.
Most websites that you come across are going to have something wrong with them. Having a process in place to efficiently identify these issues is essential to maintaining site health and rankings.
Let’s get started.
Screaming Frog Spider
Your first task is to use a tool called Screaming Frog, a free application that crawls entire sites. Once the client’s site has been crawled, you can export the data to Excel and analyze it. The tool looks at every page of the site and reports back on the following:
  • Duplicate Pages – Identifies any pages whose content is the same as, or similar to, another page on the site.
  • Errors – Reports any client or server issues, such as 404 pages.
  • External Links – Shows all of the sites that you link out to.
  • Title Tags – Shows any missing, duplicate, short or long titles.
  • Description Tags – Shows any missing, duplicate, short or long descriptions.
  • URL Issues – Shows URLs with uppercase characters, dynamic URLs, and URLs that are too long or contain underscores.
  • Redirects – Shows any permanent or temporary redirects.
  • Headings – Shows information on any h1, h2 or h3 tags used on the site.
  • Meta Robots – Lets you know what you are allowing to be indexed, and what you aren’t.
  • Anchor Text – Identifies the anchor text used for links to any of the site’s images or pages.
  • Internal Links – Shows where you are linking to other pages on the same domain.
  • Follow & Nofollow – Shows you which links are follow links and which are nofollow. This can be useful to quickly find links that need the nofollow tag added to them, which will minimize link juice being passed to other sites.
  • Bot Crawling – Crawls the site as the Google, Bing or Yahoo bot, allowing you to see what the search engines see.
  • Images – Gives you information on the site images, the alt tags used, and where alt tags are missing.
  • Page Depth Level – Finds how many levels deep the search engines have to crawl to find all your content.
  • File Size – Reports the file size of each page; smaller files load faster, so this identifies where you need to slim things down.
When the site has been crawled, which will take a few minutes, you can export the data and refer back to it as needed.
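If you want to spot-check a few pages by hand, or sanity-check the export, the same basic fields are easy to pull with a short script. This is a minimal sketch in Python, assuming the requests and beautifulsoup4 packages are installed; the example.com URLs are placeholders:

import requests
from bs4 import BeautifulSoup

pages = [
    "http://www.example.com/",
    "http://www.example.com/about/",
]

for url in pages:
    response = requests.get(url, timeout=10)
    soup = BeautifulSoup(response.text, "html.parser")

    title = soup.title.string.strip() if soup.title and soup.title.string else "MISSING"
    meta = soup.find("meta", attrs={"name": "description"})
    description = meta["content"].strip() if meta and meta.get("content") else "MISSING"

    # Report the same basic fields a crawler export would give you
    print(url)
    print(f"  status: {response.status_code}")
    print(f"  title ({len(title)} chars): {title}")
    print(f"  description ({len(description)} chars): {description}")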
Google Webmaster Tools and Analytics
Make sure the site is registered in Google Webmaster Tools and Google Analytics, and that you can access them both.
Through Webmaster Tools, you can see any crawl issues that Google is encountering, the general health of the site, and loads more. It is the easiest way to discover any problems with a site.
Webmaster Tools should be checked at least once a month for every client, to ensure that any problems that crop up are dealt with quickly.
Keyword Analysis
The information on title tags and description tags that you acquired from the Screaming Frog tool can be used to understand what the site is currently trying to rank for.
You can then combine that information with Google Analytics to see whether the site is actually getting traffic for the keywords you are targeting, and decide where keywords need to be changed so you target terms that will bring more quality, relevant traffic to the site.
URLs
The Screaming Frog report will also give you information on the URLs that are used across the site. Ideally, the URLs need to obey the following rules:
  • Must be static – Static URLs contain only numbers, letters and dashes. Dynamic URLs full of query-string parameters are harder for search engines to crawl and can create duplicate-content problems.
  • Easy to remember – Make sure all URLs are user-friendly. Simpler URLs are ideal.
  • Under 100 characters – As a rule of thumb, make sure URLs aren’t longer than 100 characters.
If any of the URLs fail to meet these criteria, consider changing them. If you do, make sure you redirect the old URL to the new one to preserve any link juice flowing to it.
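If you want to flag offending URLs in bulk from the Screaming Frog export, a rough checker is easy to write. This is a sketch only; the sample URLs are placeholders:

import re

def url_issues(url):
    """Flag URLs that break the static / memorable / under-100-characters rules."""
    issues = []
    path = url.split("//", 1)[-1]  # ignore the scheme when checking
    if re.search(r"[?&=_]", path):
        issues.append("dynamic parameters or underscores")
    if path != path.lower():
        issues.append("uppercase characters")
    if len(url) > 100:
        issues.append("longer than 100 characters")
    return issues

for url in ["http://www.example.com/blue-widgets/",
            "http://www.example.com/Product_Page?id=42"]:
    problems = url_issues(url)
    print(url, "->", ", ".join(problems) if problems else "OK")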
Title Tags
The data from the Screaming Frog tool will give you information on all of the title tags used across the site. Title tags are probably the single most important place to insert your target keywords. Make sure all of the title tags follow these guidelines:
  • Must be 50-70 characters in length.
  • Must be unique.
  • If possible, and if it makes sense, use the target keyword for that page twice.
  • Use the name of the city for the business if it makes sense to do so.
Description Tags
Descriptions don’t actually help with rankings, but they do substantially alter your click-through rates. Description tags need to be compelling, and give good reason for searchers to want to click through and visit the site. Follow these guidelines:
  • Make sure that every description is unique and relevant to that page.
  • Include a call to action.
  • Between 51 and 160 characters in length. Within that limit, longer descriptions give you more room to sell the click.
  • Use the target keyword for that page where appropriate. It will appear bolded in the search results which will draw searchers’ eyes to your listing and improve click-through rate as a result.
  • Use the name of the city for the business, if appropriate.
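A quick way to audit both the title rules above and these description rules is to script the length checks. A minimal sketch, assuming requests and beautifulsoup4 are installed and example.com stands in for the real site:

import requests
from bs4 import BeautifulSoup

# Length rules from the two sections above
TITLE_RANGE = (50, 70)
DESCRIPTION_RANGE = (51, 160)

def check_lengths(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

    title = (soup.title.string or "").strip() if soup.title else ""
    meta = soup.find("meta", attrs={"name": "description"})
    description = (meta.get("content") or "").strip() if meta else ""

    for label, text, (low, high) in [("title", title, TITLE_RANGE),
                                     ("description", description, DESCRIPTION_RANGE)]:
        if not text:
            print(f"{url}: {label} is missing")
        elif not low <= len(text) <= high:
            print(f"{url}: {label} is {len(text)} chars (want {low}-{high})")

check_lengths("http://www.example.com/")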
Headings
H1, H2 and H3 tags need to contain the target keywords for that page because they help a lot with ranking. Follow these guidelines:
  • Search engines generally treat keywords that appear in large, prominent headings as more important, and give them more weight as a result. Therefore, make the headings big, prominent, and ideally the first thing that people see on the page.
  • They should only contain text – no images or logos.
  • Use an H1 tag at the top of the page and break up paragraphs of text with H2 or H3 tags. They work well from a usability standpoint and help with rankings.
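To spot-check a page’s heading structure against these guidelines, something like the following rough sketch works; the URL and keyword are placeholders:

import requests
from bs4 import BeautifulSoup

def audit_headings(url, keyword):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

    h1s = soup.find_all("h1")
    if len(h1s) != 1:
        print(f"{url}: expected one H1, found {len(h1s)}")

    for tag in soup.find_all(["h1", "h2", "h3"]):
        text = tag.get_text(" ", strip=True)
        if tag.find("img"):
            print(f"{url}: <{tag.name}> contains an image instead of plain text")
        if keyword.lower() not in text.lower():
            print(f"{url}: <{tag.name}> '{text}' does not mention '{keyword}'")

audit_headings("http://www.example.com/", "blue widgets")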
Content
You need to make sure there is enough content across the pages of the site. None of the tools above will measure this for you, but it is easy enough to scan the site and spot where more content needs to be added.
Each page should have a minimum of 300 words of content. If any of the pages don’t, add some more. But don’t add content for the sake of adding it; only add useful, engaging content.
It is more important to fulfill the needs of your actual visitors than those of the search engines. Large chunks of spammy, unnecessary content will just ruin the credibility of the entire site.
Make sure content is split up into bite-sized chunks; testing has shown that visitors skip over content if it is laid out in one massive paragraph.
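A rough word-count check can flag thin pages for you, though it counts navigation and footer text too, so treat the numbers as approximate. A sketch, assuming requests and beautifulsoup4:

import requests
from bs4 import BeautifulSoup

MIN_WORDS = 300  # the minimum suggested above

for url in ["http://www.example.com/", "http://www.example.com/services/"]:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    # Strip scripts and styles so only visible copy is counted
    for tag in soup(["script", "style"]):
        tag.decompose()
    words = len(soup.get_text(" ", strip=True).split())
    if words < MIN_WORDS:
        print(f"{url}: only {words} words - needs more content")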
Internal Linking
If you link to inner pages of the site where appropriate, search engines find it much easier to crawl the entire site. You can refer to both the Screaming Frog data and Webmaster tools for information on internal linking.
Where you can, keep the total number of links on any one page under 100, and include around two or three internal links within the page’s content.
Anchor text that is overly rich in your target keywords can actually hurt rankings, so only use target keywords in around 10%-30% of your internal links. Use “click here” or similar generic text links for the rest.
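To estimate how keyword-rich the internal anchor text on a page currently is, you can script a rough ratio check; the URL and keyword below are placeholders:

import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

def keyword_anchor_ratio(url, keyword):
    """Rough percentage of internal links whose anchor text contains the keyword."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    site = urlparse(url).netloc
    internal, keyword_hits = 0, 0

    for link in soup.find_all("a", href=True):
        target = urljoin(url, link["href"])
        if urlparse(target).netloc != site:
            continue  # external link - ignore
        internal += 1
        if keyword.lower() in link.get_text(" ", strip=True).lower():
            keyword_hits += 1

    return 100.0 * keyword_hits / internal if internal else 0.0

ratio = keyword_anchor_ratio("http://www.example.com/", "blue widgets")
print(f"keyword anchors: {ratio:.0f}% (aim for roughly 10-30%)")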
Image Text
Every image should have an alt tag, for the benefit of both the search engines and visually impaired users. The alt tag needs to describe the image, and should contain a keyword if that keyword is relevant to the image. Alt tags have a slight impact on rankings, so try to work the keyword in where possible.
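Screaming Frog reports missing alt tags, but a quick script can do the same for a single page. A minimal sketch; the URL is a placeholder:

import requests
from bs4 import BeautifulSoup

soup = BeautifulSoup(requests.get("http://www.example.com/", timeout=10).text,
                     "html.parser")

for img in soup.find_all("img"):
    alt = (img.get("alt") or "").strip()
    if not alt:
        print("missing alt text:", img.get("src", "(no src)"))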
Nofollow
If one page links to another, and Google sees the linked-to page as relevant, some of the link juice is passed through. When the nofollow tag is used, Google passes anywhere from half to none of that link juice through.
Use the nofollow tag on sitewide external links, blog comments, and anywhere else where you don’t want to lose link juice.
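To list external links that are still being followed, something like this rough sketch will do; example.com stands in for the audited site:

import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

page = "http://www.example.com/"
soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")
site = urlparse(page).netloc

for link in soup.find_all("a", href=True):
    target = urljoin(page, link["href"])
    if urlparse(target).netloc == site:
        continue  # internal links should stay followed
    rels = [r.lower() for r in (link.get("rel") or [])]
    if "nofollow" not in rels:
        print("external link without nofollow:", target)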
Excluding Pages
Search engines don’t like it when you have loads of pages with barely any content on them. Where you can’t add more content, you may wish to stop those pages from being indexed altogether.
Simply add <meta name="robots" content="noindex"> to the head section of any pages that you don’t want to be indexed.
Sitemap
Sitemaps make it easier for search engines to index all of the pages on a site. Use an online sitemap generator to create one, then submit it through Webmaster Tools, which will tell you how many of the submitted URLs are actually getting indexed.
Make sure you check Webmaster Tools regularly to ensure the sitemap is still working as it should.
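If you would rather build a basic sitemap by hand, the XML format is simple. A minimal sketch, assuming you already have a list of the site’s URLs:

from xml.sax.saxutils import escape

urls = [
    "http://www.example.com/",
    "http://www.example.com/about/",
    "http://www.example.com/services/",
]

lines = ['<?xml version="1.0" encoding="UTF-8"?>',
         '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
for url in urls:
    lines.append("  <url><loc>%s</loc></url>" % escape(url))
lines.append("</urlset>")

with open("sitemap.xml", "w") as f:
    f.write("\n".join(lines))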
Redirects
There are two types of redirects: a 301 redirect (permanent), and a 302 redirect (temporary). 302 redirects are a dead end for SEO, and don’t pass any link juice.
Unless the reason for a redirect is truly a temporary solution, such as a timed promotion, a 301 redirect should be used.
Take a look through the data from the Screaming Frog tool and make sure deleted pages or URLs that have changed are using a 301 redirect.
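You can verify redirect types directly by requesting the old URLs without following redirects. A sketch; the old URL is a placeholder:

import requests

# Old URLs that have been deleted or moved - placeholders
old_urls = ["http://www.example.com/old-page/"]

for url in old_urls:
    r = requests.get(url, allow_redirects=False, timeout=10)
    if r.status_code == 301:
        print(url, "-> 301 to", r.headers.get("Location"))
    elif r.status_code == 302:
        print(url, "-> 302 (temporary) - should probably be a 301")
    else:
        print(url, "-> status", r.status_code, "- no redirect in place")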
Duplicate Content
The search engines really don’t like duplicate content. Since the Panda update to Google’s algorithm, sites with duplicate content have been identified and penalized.
There are three main ways to deal with duplicate content:
  • Rewrite the content to make it unique.
  • Use 301 redirects to point duplicate URLs at a single version.
  • Use a rel canonical tag to specify the original page to the search engines. Place it within the head section of the duplicate page, pointing at the canonical web page, for example: <link rel="canonical" href="http://www.example.com/original-page/" />
Broken Links
Broken links waste crawl budget and dead-end the spider’s path through the site. If there are too many broken links on a site, it will be seen as providing a bad user experience, and rankings will suffer as a result.
You should use Webmaster Tools to check for broken links; the Xenu Link Sleuth tool works too. When you find broken links, fix them or 301-redirect the dead URL to the most relevant live page.
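A rough broken-link check for a single page can also be scripted; it only inspects one page’s outgoing links, so tools like Xenu remain better for full crawls. A sketch, with example.com as a placeholder:

import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

page = "http://www.example.com/"
soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")

seen = set()
for link in soup.find_all("a", href=True):
    target = urljoin(page, link["href"])
    if target in seen or not target.startswith("http"):
        continue
    seen.add(target)
    try:
        status = requests.head(target, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = "unreachable"
    if status == "unreachable" or status >= 400:
        print("broken:", target, status)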
Page Load Speed
Google recommends that sites load in 1.4 seconds or less. Any longer and rankings are not going to be as good as they could be.
Use Pingdom’s speed tool to test the speed of the site, then use Google’s PageSpeed tool to identify solutions. If the load time is longer than 1.4 seconds, there are a number of actions you can take; a rough way to measure response time yourself follows the list.
  • Browser caching
  • CSS sprites for images
  • Reduce image file size
  • Combine CSS or JavaScript into fewer files
  • Install the W3 Total Cache plugin, if the site is using WordPress.
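For a crude baseline, you can time the HTML response yourself. Note this measures only the HTML document, not images, CSS or scripts, so Pingdom and PageSpeed remain the proper tools; the URL is a placeholder:

import requests

url = "http://www.example.com/"
response = requests.get(url, timeout=30)

# elapsed covers the HTML response only, not images, CSS or scripts,
# so treat this as a floor on the real page load time
seconds = response.elapsed.total_seconds()
print(f"{url} responded in {seconds:.2f}s")
if seconds > 1.4:
    print("slower than the 1.4s target - work through the list above")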
Incoming Links
Use Open Site Explorer to analyze inbound links. You will get a good idea of the type of link building activity that has been performed on the site in the past.
Thoroughly look through the link profiles and search for spammy or low quality links. It may be that certain links are having a negative effect on the site’s ability to rank, and you may need to use the disavow tool to discount certain links.
Domain Authority
Use Moz’s Domain Authority score, which can be found in Open Site Explorer, to judge the overall authority of the site.
It rates the site from 0-100, and is currently the best way of working out how much authority the site has, which is going to seriously affect its ability to rank.
Compare the domain authority to that of competitors. If the site’s authority doesn’t stack up well against theirs, you’ll want to avoid targeting the same keywords as them.
As a rule of thumb, if the domain authority is lower than 30, you’ll want to adjust the competitiveness of the keywords that you target.
Once you build a site’s domain authority to a decent level, ideally above 40, you can begin to target more competitive keywords.
Social Signals
Social signals definitely have an impact on rankings, and it is important to assess the number of mentions the site has. Again, you can find this data within Open Site Explorer.
There are two simple ways to modify a site to increase the chances of acquiring social signals.
  • Integrate sharing buttons clearly across the site and on any blog posts.
  • Create content that is worthy of sharing, and reach out to people to ask for feedback.
That’s it. Hopefully you learned a few things about auditing a site. Don’t forget to bookmark this guide so that you can come back whenever you need to. If you have any comments, concerns, or queries, leave them below and I’ll get back to you.
resource: http://www.sitepronews.com/2013/08/14/a-step-by-step-website-seo-audit-guide/
