A solid website architecture is one of the most important factors in enabling search engine bots to crawl your content correctly. Site architecture covers how the different pages on a site are structured and linked together, and what content those pages contain. At the most basic level, your website should have a structure that allows the search engines to effectively access the different webpages on the site in order to consume the content within them.
I have talked a lot about the necessity of effective on-page optimisation recently, and ineffective website architecture is a problem that I come across all too often. With this in mind, I have decided to share some of my favourite tools for identifying site architecture problems. Some of the tools are free and some are paid for, so you can decide which ones are relevant to you and make the most of them.
Xenu Link Sleuth
This free crawler tool is one of my personal favourites. Xenu Link Sleuth allows you to crawl your website and find a whole host of issues. The software runs on your desktop, so you need to install it on your computer, and it is really quick at carrying out its tasks. Here’s what you can do with Xenu Link Sleuth:
Find Broken Links
Like most crawler tools, you can use Xenu Link Sleuth to identify broken links on your site. To do this, click the ‘check URL’ button, enter your website URL and untick ‘check external links’. Once Xenu has scanned the site, press CTRL+B and it will display a list of broken links on the site. Make sure you remove or update these links to ensure that the site will be crawled correctly by the search spiders.
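If you ever want to script the same kind of check yourself, the link-gathering half of it can be sketched in Python using only the standard library. This is just a sketch of the idea (the example URLs are made up); actually requesting each link and checking its status code is left out so the snippet stays self-contained:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag, resolved against the base URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

def internal_links(html, base_url):
    """Return only the links that stay on the same host as base_url."""
    parser = LinkExtractor(base_url)
    parser.feed(html)
    host = urlparse(base_url).netloc
    return [link for link in parser.links if urlparse(link).netloc == host]
```

From here, a crawler would fetch each internal link in turn and flag any that return a 404 — exactly what Xenu does behind the scenes.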
Duplicate Page Titles
Another feature you can utilise is the ability to quickly find duplicate content, such as page titles, within the site. To do this, simply sort the ‘Titles’ field in ascending order and you will be able to quickly spot any duplicates.
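The same duplicate check is easy to script once you have a crawl’s worth of titles. A minimal sketch, assuming a hypothetical mapping of URL to page title pulled from a crawl:

```python
from collections import defaultdict

def duplicate_titles(page_titles):
    """page_titles maps URL -> title; returns titles shared by 2+ pages."""
    by_title = defaultdict(list)
    for url, title in page_titles.items():
        by_title[title].append(url)
    return {title: urls for title, urls in by_title.items() if len(urls) > 1}

# Hypothetical crawl data, purely for illustration:
titles = {"/widgets-a": "Widgets", "/widgets-b": "Widgets", "/about": "About Us"}
```

Grouping by title rather than sorting also tells you exactly which URLs share each duplicate, which saves a second pass through the list.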
Pages With Low Internal Links
The internal linking of your webpages is really important for letting the search engines know which pages you want prioritised for crawling. Pages with very few internal links pointing to them signal to the search bots that they are not particularly important. To make sure that none of the pages you want crawled regularly are being overlooked, you can sort the site data by the number of ‘in-links’. You can also get further information on an individual page by right-clicking it and selecting ‘URL properties’. This will display the pages that link to that URL and the outbound links going from the page.
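The in-link count itself is simple to derive from a crawl. A sketch, assuming a hypothetical link graph mapping each URL to the URLs it links out to:

```python
from collections import Counter

def inlink_counts(link_graph):
    """link_graph maps each URL to the list of URLs it links out to.
    Returns (url, in-link count) pairs, fewest in-links first."""
    counts = Counter({url: 0 for url in link_graph})  # start everything at zero
    for targets in link_graph.values():
        for target in targets:
            counts[target] += 1
    return sorted(counts.items(), key=lambda item: item[1])

# Hypothetical three-page site, purely for illustration:
graph = {"/": ["/a", "/b"], "/a": ["/b"], "/b": []}
```

The pages at the top of the sorted list are the candidates for extra internal links.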
Deep Level Pages
My favourite feature of this tool is the ability to see what level in the site structure any given URL sits at. For example, if a page is on the 5th level of the website, the search engine spiders need to crawl through 5 levels of pages to reach it. Pages at the top levels of the website structure are given a higher priority by the search engines; therefore your most important pages shouldn’t be hidden in the nether regions of your site. To find pages that are deep in your website’s structure, sort the ‘Level’ column in descending order:
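The ‘level’ here is simply the minimum number of clicks from the homepage, which you can reproduce with a breadth-first search over a link graph. A sketch, again assuming a hypothetical URL-to-outgoing-links mapping from a crawl:

```python
from collections import deque

def page_levels(link_graph, homepage):
    """Breadth-first search: a page's level is the minimum number of
    clicks needed to reach it from the homepage."""
    levels = {homepage: 0}
    queue = deque([homepage])
    while queue:
        url = queue.popleft()
        for target in link_graph.get(url, []):
            if target not in levels:  # first visit = shortest path
                levels[target] = levels[url] + 1
                queue.append(target)
    return levels

# Hypothetical site, purely for illustration:
graph = {"/": ["/category"], "/category": ["/category/product"]}
```

Sorting the result by level, deepest first, gives you the same view as Xenu’s ‘Level’ column.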
Screaming Frog SEO Spider
This tool has become extremely popular amongst SEO agencies and is available in both a free and a paid-for version. The paid version costs £99 + VAT per year, and for this price the 500 URI limit is removed and you gain access to extra configuration options and a few other benefits. In my opinion, it is well worth the £99. Screaming Frog SEO Spider is fairly similar to Xenu Link Sleuth but comes with some extra handy features. Here is a little overview of some of my favourites:
Find Missing and Overly Long Webpage Titles
A nice little touch from Screaming Frog’s software is the ability to quickly see the length (in characters) of each page title for the URLs within your website. This way, if any of your titles are in excess of around 70 characters, you can identify and quickly change them. You will also be able to easily identify any pages that are missing a title.
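Both checks boil down to a couple of list comprehensions once you have the titles. A sketch, assuming a hypothetical URL-to-title mapping (the 70-character cut-off is the rough guideline mentioned above, not a hard limit):

```python
def title_issues(page_titles, max_length=70):
    """Flag pages whose title is missing/empty or longer than max_length."""
    missing = [url for url, title in page_titles.items()
               if not title or not title.strip()]
    too_long = [url for url, title in page_titles.items()
                if title and len(title) > max_length]
    return missing, too_long

# Hypothetical crawl data, purely for illustration:
titles = {"/": "Home", "/contact": "", "/blog": "x" * 80}
```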
Multiple H1 Tags
As with the page title length column, there is also a column that shows how many H1 tags appear on each page. All you have to do is sort the column in descending order and you can identify pages with more than one H1 tag. To be completely SEO friendly, each of your pages should contain only one H1 tag.
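Counting H1 tags yourself is a one-class job with Python’s standard library HTML parser — a sketch of the idea:

```python
from html.parser import HTMLParser

class H1Counter(HTMLParser):
    """Counts opening <h1> tags in a page's source."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.count += 1

def count_h1_tags(html):
    counter = H1Counter()
    counter.feed(html)
    return counter.count
```

Run it over each crawled page and flag anything returning more than 1.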
Check for Canonical Tags on Webpages
One task that I have always found to be particularly laborious is canonicalising each of the pages within a site to prevent duplicate content issues. Screaming Frog SEO Spider makes light work of this by simply letting you know whether a canonical tag has been placed on the page and, if so, where the canonical link points. Great feature!
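Under the hood this is just pulling the href out of a `<link rel="canonical">` tag in the page head, which you can sketch with the standard library parser:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Records the href of the first <link rel="canonical"> tag found."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        if tag == "link" and self.canonical is None:
            attr_map = dict(attrs)
            if attr_map.get("rel") == "canonical":
                self.canonical = attr_map.get("href")

def canonical_url(html):
    finder = CanonicalFinder()
    finder.feed(html)
    return finder.canonical  # None means no canonical tag on the page
```

Pages returning `None` are the ones still needing a canonical tag.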
Website Source Code Search (Paid Only) & Sitemap Generator
The paid version of the software allows you to run a site-wide search within the source code of the site. This can be extremely useful if you’re looking for duplicate content issues, checking for a specific script within the site, or have simply forgotten which page a bit of code is on!
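If you have already crawled and saved the page sources yourself, the same search is a one-liner — a sketch, assuming a hypothetical URL-to-source mapping:

```python
def pages_containing(page_sources, snippet):
    """page_sources maps URL -> raw HTML source; returns every URL whose
    source contains the snippet (a tracking script, a stray tag, etc.)."""
    return [url for url, source in page_sources.items() if snippet in source]

# Hypothetical saved crawl, purely for illustration:
pages = {"/": "<script src='analytics.js'></script>", "/about": "<p>Plain page</p>"}
```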
In both the free and paid versions of the software you can generate a full sitemap of your website, which is something you should really be doing on a regular basis, so this can take away the pain of either writing it yourself or using some crappy online tool.
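For reference, the XML a sitemap generator produces is very simple — a minimal sketch that emits just the required `<loc>` entries for a list of URLs (real generators typically add optional fields like `lastmod` and `priority` too):

```python
from xml.etree.ElementTree import Element, SubElement, tostring

def build_sitemap(urls):
    """Build a minimal XML sitemap (just <loc> entries) for a list of URLs."""
    urlset = Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in urls:
        entry = SubElement(urlset, "url")
        SubElement(entry, "loc").text = url
    return tostring(urlset, encoding="unicode")
```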
PowerMapper
This has to be one of the best tools I have ever used within an SEO project. PowerMapper creates visual sitemaps of a website so that you can have a complete visual view of its structure. I recently worked on an SEO project where we had to migrate the client’s bespoke-built website to a content management system. I used PowerMapper to generate a visual sitemap, which we then used both to understand the existing structure of the site and to plan out how to develop a more effective architecture.
The software is a paid-for tool and comes in three different packages, starting at £99 and going up to £349. You can get a 30-day free trial but you’re limited to scanning 1000 URLs on a website. I would recommend at least giving the free trial a go because it really is a fantastic tool. You can view the sitemap in a range of different styles to suit your own preference and to be honest, some of them look amazing. Here are a couple of them:
There you have it: my favourite tools for helping build a solid architecture for your website. The tools discussed have loads of other features that I haven’t even touched upon, so check them out and see what you think. If you use any particularly useful tools then make sure you drop me a message on Google+, Twitter or in the comments of this article.