At Cloud Fulfilment our aim is to help ecommerce store owners sell more efficiently. Allied to that primary objective is the additional desire to help you sell more – a lot more. As part of that process our resident SEO expert has pulled together a number of tips to help you perform better in the search engines, starting today with perhaps one of the most crucial aspects of a profitable search engine optimisation campaign: duplication.
Over the past decade and a half of helping companies sell more online I’ve noticed a steady improvement in the SEO capabilities of ‘out of the box’ ecommerce solutions. However, many still fall short – and worryingly so. If the problems they create are left unchecked, they can damage your performance to a staggering degree. During my time I’ve worked with companies that replaced their previous website with new ecommerce software, added their products and then wondered why sales fell – it’s because they never really got to grips with their content.
So here is a list of the most common causes of duplication – and how you can fix them:
Internal duplicated content – since Google’s Panda algorithm updates (bear with me), having content duplicated within your own website can have very serious consequences. Be aware, however, that there are various levels of duplicated content – some are more problematic than others, but all need to be resolved.
- Product duplication – in the past Shopify, for example, allowed your products to be viewed at different URLs (some installations may still have this issue), depending on how many categories a product was added to. You might find the same product at mysite.com/motoring/blue-widget/ and at mysite.com/gifts-for-men/blue-widget/, for example – in normal circumstances Google will not like this. To find out if this is an issue, take a line from one of your product descriptions, place it into the Google search box with ” marks around it, then add (with a space between these two elements) site:yourdomain.com [whatever your domain is] and click ‘search’. This will show you how many times your product description is duplicated, which will indicate the level of duplication (if any) you have. You can change the ‘site:yourdomain.com’ part to show just the pages in your product folder, i.e. site:yourdomain.com/products/ – amend that part of your search to filter just on your product pages. Solution: make sure you choose ONE URL for your product. If you need products to be found at different URLs because that’s how your product listings work, then use a canonical tag to tell the search engines which is the definitive location.
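The canonical tag mentioned above is a single line in the `<head>` of each duplicate page. A minimal sketch, using the hypothetical blue-widget URLs from the example (your actual URLs will differ):

```html
<!-- Placed in the <head> of mysite.com/gifts-for-men/blue-widget/ (hypothetical URL) -->
<!-- Tells search engines that the /motoring/ version is the definitive one -->
<link rel="canonical" href="https://mysite.com/motoring/blue-widget/" />
```

Most ecommerce platforms can emit this tag automatically once you designate a primary category for each product – check your platform’s settings before hand-editing templates.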
- Category/tag/search duplication – the same sort of search as above will often highlight a very major problem: duplication across category, tag or search results pages. This occurs when your products are listed on various archive-type pages. Max had allowed this issue to occur when his Magento software created several thousand archive pages based on canned ‘search results’ pages; the end result was that the same products could appear on several dozen different archive pages. You can gauge the scale of the problem using Screaming Frog, which crawls your site and summarises the information in an easily digestible fashion. If you let it run and then sort the data in the ‘URI’ tab alphabetically, you will often see a large list of URLs in the same folder that shouldn’t be there. You can also use the ‘duplicate’ filter to show the pages it considers duplicates. If you also run a blog, have a look for /tag/, /author/ and date-based archives. Solution: generally speaking these search results pages shouldn’t be available to the search engines. Google states here that you should “use robots.txt to prevent crawling of search results pages or other auto-generated pages that don’t add much value for users coming from search engines.” Robots.txt is preferable as you don’t want to waste ‘crawl budget’, but in most circumstances just adding a ‘noindex, follow’ tag to them will do just fine – and getting robots.txt wrong can have major consequences.
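As a rough sketch, a robots.txt blocking auto-generated archive pages might look like this. The paths below are illustrative only – they must be adjusted to your own URL structure, and (as noted above) a mistake here can accidentally block pages you do want indexed:

```
# robots.txt at the root of your domain
# Paths are hypothetical examples, not defaults for any platform
User-agent: *
Disallow: /search/
Disallow: /tag/
Disallow: /author/
```

The alternative is a meta tag in the `<head>` of each archive page – `<meta name="robots" content="noindex, follow">` – which lets search engines follow the links on the page while keeping the page itself out of the index.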
- Test domain duplication – I have seen sites with their entire contents duplicated on a test domain. Make sure your web developers keep their test sites to themselves – and don’t let the search engines in.
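The simplest way to keep search engines out of a test site is to password-protect it. A minimal sketch for an Apache-hosted staging site, assuming you (or your developers) create the referenced .htpasswd file – the paths and realm name are hypothetical:

```
# .htaccess on the test domain only – never on the live site
AuthType Basic
AuthName "Staging"
AuthUserFile /path/to/.htpasswd
Require valid-user
```

Unlike a robots.txt block, HTTP authentication stops crawlers from fetching the pages at all, so the duplicate content can never be indexed in the first place.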
External duplication – this is another major issue that needs investigating.
- Affiliate feeds – many ecommerce stores with affiliate programmes create a problem by offering their affiliates product feeds. If affiliates republish your data on more powerful websites (ones with more good-quality links), Google can have trouble determining which site is the best one to show searchers. Solution: this is a difficult one, but consider offering partial feeds, perhaps with just the product name, price and meta description, plus other data such as colour, availability and size. Looking at data from a large number of ‘product feed’ or ‘price comparison’ sites, they have largely been punished severely by Google for duplicating content and adding little value to their customers.
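The partial-feed idea above can be sketched in a few lines: strip the full product description (the content you want search engines to associate with your own site) and hand affiliates only the non-unique fields. The field names and product record below are illustrative, not from any real feed specification:

```python
# Hypothetical product records - field names are illustrative examples
products = [
    {
        "name": "Blue Widget",
        "price": 9.99,
        "description": "Our hand-made blue widget is the finest available...",
        "meta_description": "Blue widget - free UK delivery",
        "colour": "blue",
        "availability": "in stock",
    },
]

# Fields safe to share: they carry no unique editorial content
PARTIAL_FIELDS = ("name", "price", "meta_description", "colour", "availability")

def partial_feed(items):
    """Return feed records containing only the whitelisted fields."""
    return [{k: item[k] for k in PARTIAL_FIELDS} for item in items]

feed = partial_feed(products)
print(feed[0]["name"])           # Blue Widget
print("description" in feed[0])  # False - the unique copy stays on your site
```

In practice you would serialise `feed` to whatever format (CSV, XML) your affiliate network expects; the point is simply that the full description never leaves your site.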
- Plagiarism – this is another difficult problem to fix. Unlike the affiliate case, where you give partners your content legitimately, here your competitors may lazily ‘borrow’ your product descriptions. As the search engines want to reward useful AND unique sites, this can cause issues. You can use the method at the beginning of this post to find duplicates, but a more robust approach is to use PlagSpotter, where you can add your XML sitemap and it will check for duplicates on other sites. It returns a list of ‘problem’ sites and highlights which text is duplicated and where it appears. You can then either contact those sites to have your content removed, or rework your own. You could take this as an opportunity to improve your own content, or you might just want to stand your ground.
If you can find and fix duplicated content issues then you should be well on your way to improving your website’s search engine performance.
Image © Stéfan