The question of whether duplicate content can get you penalized in the search engines, and how to avoid it if so, is a topic of hot debate these days.
Many webmasters live in fear, agonizing over questions like:
- What is duplicate content?
- Why do search engines care about duplicate content?
- Should I worry about duplicate content?
- What can I do to avoid a duplicate content penalty, if there is one?
First, understand that there is no clear definition of duplicate content, and it is not even certain that a duplicate content penalty exists. The general belief, however, is that repeating large blocks of the same content (such as an article or syndicated blog post), whether within one domain or across many different domains, could trigger a penalty in search engines such as Google.
So why would search engines care about duplicate content?
If you think about how search engines work and how they make their income, the whole topic from their perspective becomes clear. Search engines live or die based on two things:
1. Do search engine users feel like they get the results they are looking for when they search?
2. Do the advertisers (and thus the search engine) make good revenue?
Suppose you publish an article on "How to Write a Good Article." You submit it to all of the popular article directories, and within a couple of weeks it is published on 300 different websites.
If someone then searched Google for "how to write a good article," would they feel Google was producing good results if the first 300 results were all links to the same article, just hosted on 300 different sites?
Of course not. In fact, people would probably either complain or stop using Google. And as a side effect, advertisers would start making less revenue and would stop advertising. Then Google would not make money and before long they would have some serious problems.
So it becomes clear why Google, or any search engine, would not want to display many copies of essentially the same content for any given keyword search.
The logical conclusion is that search engines absolutely need to filter out as much duplicate content in the results as possible. If they filter out your website and display someone else's, even though the content is the same, it's going to look as if you were penalized for duplicate content. But actually, they are just trying to deliver the best results.
Many people believe that search engines should give precedence to the original source of the content. But how can Google, or any search engine, accurately determine which copy is the original and which is the duplicate? Algorithms can make some reasonable inferences, but the truth is that it is not possible to determine this reliably every time.
Instead, search engines have to go by things like where the content seems to have first been published, which site is the biggest authority, etc.
This means that if you publish the same article on your own site and on 100 sites that are all older and more established, it may well appear that your site is the one republishing duplicate content, not the other way around.
There is an easy way to avoid this problem: don't publish on your own website the same articles you submit to the article directories. You can publish variations, but make sure they are different enough that they are unlikely to get your site filtered or penalized for duplicate content.
Interestingly, this also works for you in another way. If your articles are published on other sites that all link back to yours, and you don't duplicate any of the content, your site actually looks more like an authority site than the sites all sharing the duplicated article.
And if your site is on the same theme and the article directories link to your website with keyword-rich anchor text, the incoming links will also seem to indicate to the search engines that your site is the authority on that theme. The sites publishing the duplicate content seem less authoritative.
Similarly, if you syndicate your content, syndicate only the title and a description paragraph, not the whole article. That way, your site always holds the fullest version of the content on the topic, again making it look like the authority.
This strategy can be far more effective than publishing the articles on your site. In fact, if you have been submitting articles for some time, you may even have seen this happening already.
As you can see, it's not really a question of whether there is a duplicate content penalty or not. Search engines must filter out duplicate content in order to present the best results to their users and maintain advertising revenue.
You can use this to your advantage by being sure that any duplicate content is on other websites that link to yours. Some of those sites will be filtered out of the search engine results and some won't. But they will all point to your site as the actual authority on your keyword-based theme. Do this, and any duplicate content will be your friend rather than your foe.
By Charles Hopkins