
Technical SEO
Whilst we will not delve deep into technical SEO, to understand it fully you at least need to know what the main technical elements are. This section provides a high-level overview of the main technical considerations; however, it is important to know where your strengths lie and to appoint an agency to assist you if you do not have deep knowledge or resource in this area.
The tags that matter
The two most significant ‘tags’ that warrant attention are:
- the title tag and
- the meta description.
Neither are as scary or inaccessible as they might sound.
Title tags
The title tag is a brief description of the page content and is contained within the HTML. It is visible in search results and is used by search engines to interpret site pages. Title tags should be unique to each page, ideally less than 75 characters, and have important keywords close to the front. In addition, it is generally good usability practice to include your brand at the front. As the title tag is highly visible to potential site visitors, it is of course crucial to make it as compelling as possible.
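To illustrate, a title tag sits in the HTML head of the page and might look like this (the brand and product names here are invented for the example):

```html
<head>
  <!-- Unique per page, under 75 characters, brand at the front
       and important keywords close behind it -->
  <title>MySite | Economy A3 Paper - Stationery</title>
</head>
```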
Meta description
The meta description is a longer description of the page content and is also displayed within the search engine results if Google feels it is relevant. No more than 160 characters is recommended – and make sure each one is unique, relevant (and therefore hopefully includes some of your focus keywords) and, most importantly, readable. While the meta description does not appear to influence rankings directly, a well-written one can improve click-through rate, which in turn might help drive a ranking improvement.
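Like the title tag, the meta description lives in the HTML head. A sketch, with invented copy for illustration:

```html
<head>
  <!-- Under 160 characters, unique to this page, readable,
       and ideally containing a focus keyword or two -->
  <meta name="description"
        content="Buy economy A3 paper from MySite's stationery range.
                 Great value, with fast delivery on all paper orders.">
</head>
```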
Site structure
As we are in danger of getting too technical, we will not spend too long on this section, but hopefully we can provide enough ammunition to ask some questions of your chosen SEO expert:
● Hierarchy: your navigation flow should be logical. By this we mean that each level of your structure should sit logically below the previous level. For example, a page on your website that promotes ballpoint pens should sit underneath a page on pens, which should in turn sit under a page on stationery so that there is a logical path for a user to follow to effectively filter their way down to their destination.
In other words, make sure your hierarchy uses some common sense.
● URL structure: search engines use ‘robots’ to interpret sites. If your URL looks something like this – www.mysite.com/categorypage.asp?prodId=1274234 – then you are not helping the robot to work out what your site offers. The ideal is: www.mysite.com/Stationery/Paper/A3_Paper/Economy_A3_paper.html.
● Site maps: create two – one for users and one for search engines. The search-engine-friendly one should be an XML site map file, which can also be submitted through Google’s Webmaster tools.
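A minimal search-engine-friendly site map follows the XML format defined by the sitemaps.org protocol. A sketch, reusing the invented stationery URL from above:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- One <url> entry per page you want search engines to find -->
    <loc>http://www.mysite.com/Stationery/Paper/A3_Paper/Economy_A3_paper.html</loc>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```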
● Provide alt text for images. In the HTML you can assign ‘alt attributes’ to help search engines interpret visual content.
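An alt attribute is simply added to the image tag in the HTML; for example (file name and wording invented):

```html
<!-- The alt text describes the image for search engines
     (and for screen readers and users whose images fail to load) -->
<img src="economy-a3-paper.jpg" alt="Ream of 500 sheets of economy A3 paper">
```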
● Avoid overusing Flash as it cannot always be read and therefore hinders the discovery process.
● Avoid duplicate content. Note that this means not reusing copy on other pages of your site, as well as ‘borrowing’ others’ content. Google will penalize a site that uses a lot of duplicate content, so it is best avoided. If you cannot avoid duplicate content, do some research on ‘301 redirects’ and the rel="canonical" attribute.
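As an illustration of the canonical approach: if the same product page is reachable at more than one URL, a link element in the head of each duplicate tells search engines which version is the preferred one to index (the URL here is invented):

```html
<head>
  <!-- Points duplicate or near-duplicate pages at the preferred version -->
  <link rel="canonical"
        href="http://www.mysite.com/Stationery/Paper/A3_Paper/Economy_A3_paper.html">
</head>
```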
● Webmaster Tools: make sure someone in your organization knows how to navigate Webmaster Tools (a Google-provided platform). Webmaster Tools helps in a number of areas and is quite powerful. It is therefore also quite dangerous in the wrong hands, so proceed with caution. A summary of the main functionality available within Webmaster Tools is as follows:
- Shows crawl errors: this is useful, as a site with a lot of errors is unlikely to rank highly.
- Allows submission of an XML site map.
- Allows modification of the robots.txt file (which can be used to remove URLs already crawled by Googlebot).
- Identifies issues with title and description meta tags.
- Provides a high-level view of the top searches used to reach your site.
- Removes unwanted site links that Google may use in results.
- Notifies you of quality guideline violations.
- Allows you to request a site reconsideration following a penalty.
The danger mentioned above arises from the fact that these functions could enable someone to cause serious SEO damage to your site, such as deleting your robots.txt file or uploading an incorrect site map – or simply to remain unaware of serious SEO issues with the site.
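For reference, the robots.txt file mentioned above is a plain-text file placed at the root of the site. A minimal sketch (the blocked folder and site map location are invented for illustration) that keeps crawlers out of one area while pointing them at your XML site map:

```text
User-agent: *
Disallow: /internal-search/
Sitemap: http://www.mysite.com/sitemap.xml
```

A single stray `Disallow: /` line here would block the whole site from being crawled – which is exactly the kind of damage a careless edit can cause.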