Blogging

Minimize Your Blog’s Duplicate Content

August 20, 2007

Duplicate content on blogs is usually a big issue if left unaddressed. By properly using your robots.txt file, you can significantly cut down on your blog’s duplicate content.

For example, if you make a blog post that is relevant to multiple categories, that very same content will appear in multiple places on your blog. Let’s say you make a post titled “SamplePost”. In WordPress, before you publish that post, let’s say you determine that it fits into 3 different categories on your blog: Category1, Category2 and Category3. That means, when the SE robot crawls your blog, it will find the exact same post with the exact same content in multiple places:

  • www.yourblog.com/category/category1/
  • www.yourblog.com/category/category2/
  • www.yourblog.com/category/category3/
  • www.yourblog.com/index.php
  • www.yourblog.com/feed
  • and the archives section

So, your best bet is to use your trusty robots.txt file and block the robots’ access to your blog’s category sections. Here’s how you do it (the Disallow lines need to sit under a User-agent line, and the paths should match your category URLs):

User-agent: *
Disallow: /category/category1/
Disallow: /category/category2/
Disallow: /category/category3/

Add those lines to your robots.txt file and upload it to the root of your server. That way, each new post you create will only show up on the home page, the RSS feed and in the Archives. Then, once that post falls off your blog’s home page and your RSS feed, it will only appear in one place on your entire blog – the archives.
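If you want to double-check the rules before uploading, here’s a quick sketch using Python’s standard-library robots.txt parser, which evaluates the file the same way a well-behaved crawler would. The domain and category paths are the hypothetical ones from the examples above:

```python
from urllib.robotparser import RobotFileParser

# The robots.txt rules from this post (hypothetical example paths).
rules = """\
User-agent: *
Disallow: /category/category1/
Disallow: /category/category2/
Disallow: /category/category3/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The duplicate category archives are blocked...
print(parser.can_fetch("*", "http://www.yourblog.com/category/category1/"))  # False
# ...but the home page and feed stay crawlable.
print(parser.can_fetch("*", "http://www.yourblog.com/index.php"))  # True
print(parser.can_fetch("*", "http://www.yourblog.com/feed"))  # True
```

If you add categories often, a single `Disallow: /category/` line would block the whole category section in one rule instead of listing each one.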

The more times the same content appears on the same site, the more the search engines devalue it. By controlling where the SE robot goes and what it sees, you will be making your site more valuable.

Kudos to Ross at stepforth.com for this great tip 🙂
