Damaging SEO Mistakes in 2014

The practice of SEO (Search Engine Optimisation) has been in a state of flux in recent years. The main catalysts were a number of Google’s algorithmic updates, including Penguin and Panda. Both of these updates penalised websites for attempting to manipulate Google’s search results through spammy techniques.


Penguin, which was first rolled out in 2012, penalised sites for over-optimisation and attacked a number of web-spam tactics such as keyword stuffing and the overuse of keyword-rich text backlinks. The first Penguin roll-out was estimated to have affected 3.1% of English queries and had a considerable effect on many business websites, transforming many page-one results overnight and no doubt crippling businesses that had relied too heavily on Google's traffic.

Panda targeted sites with thin content, low-quality content, duplicate content and a number of other low-quality factors. Top Heavy, another algorithm update, had a major effect on sites with little content above the fold.

Alongside these algorithmic updates, Google also started to hit out at websites with manual actions. If your site has been unlucky enough to receive an "unnatural link" warning message in Webmaster Tools, it means that you, or at least someone acting on your behalf, has knowingly (or unknowingly) built a great deal of spammy links to your website.

These two types of penalties sent shockwaves through the search and digital industry, forcing many to rethink their tactics and pushing site owners to invest in meaningful content strategies as part of their website marketing. Aside from these Google penalties, many webmasters continue to make basic SEO mistakes that hinder their site's potential, such as improper use of title tags and robots.txt files, while site owners remain unaware of the dangers of many of the decisions they make.

Now that we're well into 2014, I still see many outdated SEO tactics on a daily basis: risky practices that could end up getting websites kicked out of the SERPs (Search Engine Results Pages), and websites updated by designers without any thought given to the consequences.

If you want to ensure your site continues to rank well and turn over a profit for your business, please be wary of these damaging SEO mistakes.

Redesigning a website and ripping out the foundations

Hands down, this is the biggest and most costly mistake I see, and I see it all the time. Over the last couple of years I've worked on a handful of sites where the new design/development team ripped up the whole site and moved to a different site structure or content management system with little thought for the existing site. There was no migration plan at all, and in many cases the website owners were unaware of the damage until the number of conversions started dropping dramatically.

Changing the URL structure, navigation, page titles, headers, internal links and copy without some degree of thought isn't recommended. All of these elements will have aged in Google and gained some degree of trust.

Having no site migration plan can lead to a serious drop in rankings.

If you have a relatively small business site, then a site migration might be a simple matter of mapping old URLs to new ones: open up an Excel spreadsheet, map the new against the old, and then put your redirects in place. Depending on the software you're using, there may also be plugins or extensions that handle this for you.
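
If your site runs on an Apache server, a minimal sketch of what those redirects might look like in an .htaccess file is shown below; the file names and paths are made up, so swap in your own old and new URLs:

# Hypothetical one-to-one 301 redirects from old URLs to their new equivalents
Redirect 301 /old-services.html /services/
Redirect 301 /about-us.php /about/
Redirect 301 /old-contact-page.htm /contact/

Each line permanently redirects one old URL to its new equivalent, so any trust and links the old pages have earned are passed on rather than lost.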

I know of one site owner who recently contemplated ripping up a site that had very good traffic and rankings for a local niche market and replacing it with a cheap off-the-shelf package from MR Site. The problem with the site wasn't its rankings; it was the goal funnel and the difficulty of setting up an account and ordering, coupled with little tracking. Luckily he took good advice and just made changes to the design.

Other sites I have worked on have changed all the URLs and the content management system, heavily updated the homepage, replaced keyword text with non-specific text, ripped out internal links and replaced text with images alone, and in most cases they have paid the consequences.

Un-optimised title tags

Optimising your title tags is basic SEO. You'd be surprised at how many webpages I audit where the homepage's title tag is simply the brand name or, equally uselessly, "home page", with no indication of what the brand actually is or does.

The title tag continues to be the most important on-page element. Not only does it help you rank and tell Google what your web pages are about, it's also what potential users and customers see in the SERPs when your listing appears. It's effectively a visitor's first impression of your website, so treat title tags with care.

A good pattern to follow for page title elements is to make sure each one is front-loaded with the keywords that best describe the theme of the page. The number of characters displayed has changed recently; one guide suggests that "Google typically displays the first 50-60 characters of a title tag, or as many characters as will fit into a 512-pixel display", although in a test I just ran, 62 characters were displayed.
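
As a simple illustration (the business name and keywords here are made up), a front-loaded title tag sitting in the page's head might look something like this:

<head>
<!-- Hypothetical example: descriptive keywords first, brand name last, under 60 characters -->
<title>Dog Grooming Belfast | Mobile Dog Groomer - Pawfect Paws</title>
</head>

The searcher sees what the page is about before the brand name, and nothing important gets cut off in the SERPs.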

Make sure you aren’t stopping search engines from indexing files

This happens to every developer at some stage! The trick is to catch it, at worst, within the first hour or so. A robots.txt file is the first file a search engine spider looks at when it visits a website.


It's a simple text file that webmasters upload to the root folder of their web server, for example www.yourwebsitename.com/robots.txt.

It's a powerful file for webmasters as it works like a set of instructions, telling search engine spiders which files and folders you do not want them to crawl.

Not having a robots.txt file isn't necessarily a problem, but an incorrectly written or implemented robots.txt file will cause you problems. For instance, I recently came across a potential client's website that had a robots.txt file that looked like this:

User-agent: *
Disallow: /

These two short lines tell search engine spiders not to crawl your website at all. This was a massive error, as the client most definitely wanted their site crawled so that searchers could find their business! The effect in this case was that only a bare reference to the homepage was left in the index, and all the other pages weren't indexed. Obviously, this text had been left over from when the developers were working on the site and didn't want spiders crawling it until it was finished; they simply forgot to update it when the site went live.

There are a few simple ways to check whether your robots.txt file is blocking search engine crawlers.
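
The quickest is simply to open www.yourdomain.com/robots.txt in your browser and read it. If you'd rather test it programmatically, here's a minimal Python 3 sketch (the domain and URLs are hypothetical) that uses the standard library's robots.txt parser to see whether Googlebot is allowed to fetch a couple of key pages:

from urllib.robotparser import RobotFileParser

# Hypothetical domain - swap in your own site
robots = RobotFileParser()
robots.set_url("http://www.yourdomain.com/robots.txt")
robots.read()

# Report whether Googlebot may crawl the homepage and a key landing page
for url in ["http://www.yourdomain.com/", "http://www.yourdomain.com/services/"]:
    allowed = robots.can_fetch("Googlebot", url)
    print(url, "allowed" if allowed else "BLOCKED")

If your homepage comes back as blocked, you've found your problem.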

Although it isn’t essential to have a robots.txt file, I recommend that you use the file to block certain pages in your website that do not deliver any real value by being indexed by Google, such as a login page or the subsequent “thank you” page after something has been downloaded.
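
As a rough sketch (the paths here are hypothetical), a robots.txt file that blocks those low-value pages while leaving the rest of the site crawlable might look like this:

User-agent: *
Disallow: /login/
Disallow: /thank-you/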

If you don’t currently have a robots.txt file, you’ll need to upload one to your server: this will usually be in the same place as where you put your website’s main “index.html” welcome page. Google provides an easy to follow resource on how to create a robots.txt file and shows you how to test it.

Too many links from your home page

internal links on a website

An internal link is where a link on your own website points to another webpage within your website. You should make use of internal links to pass authority through targeted anchor text links and their usage is also helpful for search engines and users who are trying to navigate around your website.

Many webmasters are tempted to go overboard with their internal links, particularly to product or service pages, because they see it as a good opportunity to push traffic through to their key sales pages. However, too many over-optimised anchor text links to certain webpages will look spammy, and repeating this throughout a single webpage will dilute the flow of PageRank, leaving each link to receive only a tiny share.

PageRank is an important part of Google's algorithm. It works by counting the number and quality of links to a page to determine a rough estimate of how important that page is. So the more links you have on a page, the less authority you'll be passing on through each one. As a good rule of thumb, decide which links you don't need to pass link juice through and eliminate them. If your CMS is generating too many internal links, you might want to consider changing to another CMS platform.
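
As a rough illustration of the dilution effect (a simplification of how PageRank actually flows): if a homepage has a fixed amount of link equity to pass on and carries 200 internal links, each link receives roughly 1/200th of that value; trim the page down to 50 meaningful links and each one passes roughly four times as much.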

Duplicate content

Duplicate content issues tend to be of a technical nature. In layman’s terms, duplicate content occurs when your webpages’ content can be accessed via multiple routes (URLs). It’s a widespread problem which generally affects large websites, like ecommerce sites, whereby a particular product can be accessed via multiple URLs due to sorting options that generate unique URLs in each instance.

For example, if you have a page with 10 items and a different URL is generated when those items are sorted alphabetically as opposed to price, then you’ll end up with two pages with the same content but with different URLs, and Google is likely to index them both.
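
For instance (the URLs below are hypothetical), the same category page might end up indexed at both of these addresses:

http://www.yourdomain.com/widgets?sort=price
http://www.yourdomain.com/widgets?sort=name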

This is probably one of the most common SEO mistakes and affects many websites. Duplicate content can cause indexing problems and be detrimental to your rankings because Google will not know which page to rank and can choose the wrong one to index.

Recently I worked on an ecommerce site where the number of URLs that Google reported in the 'crawl index' in Webmaster Tools went through the roof.

The quick fix is to use the canonical tag. This will ensure that these pages are no longer picked up as duplicate content and that the link juice is directed to the right location, resulting in a stronger landing page.
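
As a minimal sketch, using the hypothetical sorted URLs from above, each variation would carry a canonical link element in its head pointing at the preferred version of the page:

<head>
<!-- Placed on /widgets?sort=price and /widgets?sort=name alike -->
<link rel="canonical" href="http://www.yourdomain.com/widgets" />
</head>

Google will then generally treat the sorted variations as duplicates of the canonical URL and consolidate the signals there.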

Not selecting www or non-www for your domain

Similar to the duplicate content issue mentioned in the previous section, if content on your site can be found via both www and non-www URLs, such as:

http://www.yourdomain.com, or
http://yourdomain.com

then you effectively have one webpage accessible through duplicate URLs. Do a site operator check, e.g. site:yourdomain.com, and see whether your listings appear as a mixture, with and without the www. Best practice is to consolidate all the link juice into one URL.

Making sure that you select the correct www or non-www version for your site is one of the first steps any webmaster or online business owner should take, but you’ll be surprised how often this stage is overlooked.

The best way to fix this is to select which version you want to associate with your domain and 301 redirect the other versions of the URL. This will help to consolidate your link popularity into one URL and maintain better rankings. There are a number of ways to do this depending on the coding or server of your site.
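
On an Apache server, for example, a minimal .htaccess sketch that 301-redirects the non-www version to the www version (hypothetical domain again) might look like this:

RewriteEngine On
# Permanently redirect yourdomain.com requests to www.yourdomain.com
RewriteCond %{HTTP_HOST} ^yourdomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.yourdomain.com/$1 [R=301,L]

You can also tell Google which version you prefer via the preferred domain setting in Webmaster Tools, but it's the redirect that consolidates the link juice.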

On the same principle, if you have links in your navigation or throughout your site pointing to /index.asp, /index.php or /index.htm, it might be an idea to link simply to the domain without the additional filename.

Using the same anchor text for every single link

If you’re building links, but using the EXACT same anchor text for every single link, you could find a Google Penguin penalty snapping at your website.

As previously mentioned, Penguin was designed to target websites for over-optimisation, and this is how many businesses get their websites kicked off the search results pages. For example, if you're a dog groomer and you have a backlink profile of 1,000 links all using the same anchor text, e.g. "experienced dog groomer", it's highly likely that you'll receive a Penguin penalty.

When link building, it's important to create an even and natural distribution of anchor text across links from quality, relevant websites.

If you’re concerned about Penguin, here is a decent keyword anchor text guide which cites that:

40% brand name anchors
25% naked links – this means linking back with your full URL, e.g. http://www.yourdomain.com or www.yourdomain.com
20% generic – e.g. random anchor text like "go here" or "click here"
10% keyword variations – so instead of using “dog groomer” for example you could use “experienced dog groomer in Northern Ireland”
5% exact anchor matches – this is where you use the phrase you want to rank for e.g. “experienced dog groomer”.
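
To put those percentages into context with a made-up example: across a profile of 1,000 links, that would work out at roughly 400 brand-name anchors, 250 naked URLs, 200 generic anchors, 100 keyword variations and 50 exact-match anchors, a far cry from 1,000 links all reading "experienced dog groomer".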

Poor site structure

Having a clear and logical site structure and an intuitive internal linking structure makes it much easier for search engines to crawl your site. But these factors aren't only important for Google; the navigation and structure of your site are also important for creating a positive user experience.


A tree-like or hierarchical information structure is the best way to organise your site's information. Think of your website's structure like this: websites are usually organised around a single homepage, which then links to subpages, rather like a tree with many smaller branches, and sub-branches attached to those.

Image courtesy: http://blog.hubspot.com/blog/tabid/6307/bid/34195/How-to-Design-a-Site-Structure-Visitors-AND-Search-Engines-Love.aspx

This type of information architecture remains the most successful for optimisation and indexation by Google because it is simple to follow. If you don't organise your content and information into relevant categories or sub-folders, people will be put off browsing your site and Google will struggle to crawl it, leading to lower sales and poorer indexation.

It’s recommended that all of your navigational menus are methodically structured and that no page should ever be more than three clicks away from any other page on your website. This creates a positive, user-friendly website whilst at the same time appeasing Google, which is increasingly adding “usability” elements to its algorithm.

Non-friendly URL structures

Non-friendly URLs are the types of URLs that are either randomly generated by a CMS or contain special characters that aren’t readable, like this:

http://www.yourdomain.com/13525qw&rfr4433-432rf

It's important to make sure that your site's URLs contain keywords, not only for ranking purposes, but also so that users trust your website. In some cases and with some setups the effort may not justify the cost of doing this, but most popular content management systems have the option.

A URL that contains ampersands, question marks and too many numbers looks spammy and can even make it difficult for a search engine spider to determine what your webpage is about, which means it might not be indexed correctly.

When first designing a website, you can make sure that your URLs are friendly by following a clear navigational structure. But if you already have widespread erratic URL structures, you might have a big task on your hands trying to create dynamic URL rewrites.
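
As a minimal sketch of what such a rewrite might look like in an Apache .htaccess file (the product URL and ID are made up):

RewriteEngine On
# Serve the friendly URL /red-leather-dog-collar/ from the underlying dynamic script
RewriteRule ^red-leather-dog-collar/?$ product.php?id=1352 [L]

The old dynamic URL should then be 301-redirected to the new friendly one, in the same way as the migration redirects covered earlier, so that any existing links keep their value.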

No Text

Having no text for Google to index, just one lovely big image with the only text sitting in the menu, isn't great. For the many sites without a decent backlink profile, how is Google supposed to know which keywords you're targeting? This makes it very difficult to do well for a broad range of keywords.

Markup

Poor HTML markup is another mistake. If it's a header, use a header tag; if it's a paragraph, use a p element. Let the search engines know the priority and importance attached to your text.
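
A rough sketch of the difference, with made-up content:

<!-- Poor: a heading styled with a div, which tells search engines nothing -->
<div class="big-bold-text">Dog Grooming Services</div>

<!-- Better: semantic markup that signals priority and structure -->
<h1>Dog Grooming Services</h1>
<p>We offer a full grooming service for all breeds across Belfast.</p>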

Lack of Updates

If you're in a market where a site is expected to have constant updates, for instance an employment site that should have new jobs listed every day, then it's best to keep the site fresh with the latest jobs. Just look at current job sites to see this in action. I believe I have seen sites perform better in the last year or two simply on the basis that they have been updating information, even if it's only weekly or monthly.

Linking to Bad Neighbourhoods

Don't link out to sites with a poor reputation. If you search Google for 'bad neighbourhood' checkers you'll find the odd tool that will run a check for you. A quick piece of research is to type site:thewebsite.com into Google and see whether the site is indexed. If it isn't, remove the link.

Keyword Choice

Poor keyword targeting can stop your site before it even gets any traffic. Targeting the wrong keywords, competing against sites with a much bigger budget, or going after trophy keywords rather than a wider number of less competitive terms can all lead to poor rankings and little traffic.

I remember once having a client who was adamant that they wanted to target just one keyword: despite offering numerous services, they wanted to rank for a single trophy keyword term. Although it did work, this is a very poor and risky strategy.

Analytics & Goals

Always install an analytics package such as Google Analytics and set up Goals. Even if you aren't constantly looking at your stats, six months or a year down the line the next person to work on the site can get some feedback on how it performs and what its strengths and weaknesses are. When I really didn't know what I was doing, I'd make decisions that weren't based on what Analytics or the other packages available at the time told me.

Wrong Platforms

If a website company has its own internally developed platform, I'd give it considerable thought before going with them. I've seen a number of custom platforms and they never tend to be as SEO-friendly as open source or other popular content management systems. Updating a custom system costs more and invariably takes more development time.

SEO Training

Educating the company's marketing team, the web designer, the business owner or whoever looks after the website is increasingly important. Many of the mistakes above wouldn't have been committed if those responsible for the sites had had some training and could take a bigger, more informed role in development and decisions.

There's a lot of stuff that didn't make it in, so I'll add more and beef this out as I go along. If you've any of your own, please add them.


