How To Avoid The Duplicate Content Trap

Unique, quality content is essential for both search and social. At best, duplicate content dents your chances of ranking high in the SERPs; at worst, it gets your site penalised. Search engines do test copy for uniqueness, and if your content closely resembles someone else's, your site can be hit with a duplicate content penalty. Read on to find out what counts as duplicate content and why you should avoid duplication at all times.

Seeing Double

Search engines don't like duplicate content, and they strongly dislike repetition and replication. Copying content from other websites has long been common practice amongst website owners, but it is a bad one, and search engines detect it easily: the original publisher gets all the credit, while duplicate pages and sites get penalised.

Duplicate content is also generated accidentally, or carelessly, by website owners, often as a result of poor content management or a simple lack of knowledge. Copying and pasting content to create new pages, with only small changes in keyword focus, does not make that content unique. Inattention during DNS or domain mapping can also lead to duplication. Google, for example, can detect exact and near duplicates at every stage, from the crawl through indexing to rank scoring.

Typical scenarios of duplicate content:

  1. Near-identical pages within your site that differ only in minor changes such as keyword replacements
  2. Localised pages within your international sites that could be considered identical
  3. Campaign landing pages with only minor keyword changes from the original pages
  4. Pages identical to pages published on other websites; if the other site has better PageRank, your site's content might be considered the duplicate and may be removed from the index
  5. Press releases or other articles with similar content syndicated across different sites

Make your pages unique
The best way to let search engines know that your content is unique is to make all copy and all the page mark-up factors unique (see the sketch after this list):

  • Document/file name
  • Meta title
  • Meta description
  • Meta keywords
  • First paragraph copy
  • Heading 1
  • Heading 2
  • Heading 3
  • Alt text
  • Link text to other pages
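
As an illustrative sketch (the product, copy, and file names here are entirely hypothetical), a page saved as, say, handmade-oak-bookshelf.html might make each of these factors unique like so:

```html
<!-- Hypothetical product page: every element below carries page-specific copy -->
<head>
  <title>Handmade Oak Bookshelf | Acme Furniture</title>
  <meta name="description" content="A five-shelf bookshelf hand-built from solid oak, with free UK delivery.">
  <meta name="keywords" content="oak bookshelf, handmade shelving">
</head>
<body>
  <h1>Handmade Oak Bookshelf</h1>
  <h2>Built from sustainably sourced oak</h2>
  <h3>Delivery and returns</h3>
  <p>An opening paragraph written specifically for this page, not copied from a sibling page.</p>
  <img src="oak-bookshelf.jpg" alt="Five-shelf handmade oak bookshelf in a living room">
  <a href="/care-guide/">Read our oak care guide</a>
</body>
```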

If you can’t avoid duplicate content altogether (often the case for ecommerce sites), exclude the duplicate pages in your robots.txt file or mark them noindex. Note that the two don’t combine well: if robots.txt blocks a page, crawlers never see its noindex tag, so pick one approach per page.
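
As a minimal sketch (the paths below are hypothetical examples, not a real site structure), blocking duplicates via robots.txt looks like this:

```
# robots.txt — stop crawlers from fetching the duplicate pages
# (hypothetical paths for illustration)
User-agent: *
Disallow: /print-versions/
Disallow: /duplicate-landing-page.html
```

Or, to let crawlers fetch a page but keep it out of the index, add a robots meta tag to each duplicate page's <head>:

```html
<!-- Keeps this page out of search indexes while still crawlable -->
<meta name="robots" content="noindex">
```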

Give us a shout should you need some help with your content optimisation.
