Duplicate Content Penalty?

Yes, there are lots of webmasters who posted duplicate content on their websites and got banned. Once you post duplicate content that you took from another site, Google and other search engines might notify you that your site is now "no-index". No-index means your site is not "crawlable".
 
This is actually not the case. There are many news sites out there that post articles from wire services as well as press releases, and they will still be included in the search results, depending on a number of factors.
 
Of course it's slanted. Google doesn't want you to know how to game their system; they want you to have to pay for Google AdWords.

Case in point: HPBL networks have been getting deindexed left and right recently...
 
In terms of the HPBL nets, though, that's a known pay-to-play thing that Google goes after.

They don't want you buying homepage links to rank higher organically instead of paying them for clicks.
 
As long as the content is at least 30% unique, it's safe, and the penalty is a myth.

Here's a simple logic test to explain why the penalty is a myth.

Say there were a penalty: how does Google determine which content was there first?

The spider doesn't visit every page on the internet every second. It can't; there isn't enough bandwidth.
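
A rough back-of-envelope calc makes the point. Both figures below are assumptions for scale, not real Google numbers:

```python
# Back-of-envelope: bandwidth needed to re-crawl the whole web every second.
# Both figures are rough assumptions for scale, not real Google numbers.
pages = 50e9            # assume ~50 billion indexed pages
avg_page_bytes = 100e3  # assume ~100 KB of HTML per page

bytes_per_second = pages * avg_page_bytes          # fetching everything once per second
terabits_per_second = bytes_per_second * 8 / 1e12  # convert bytes/s to Tbit/s

print(f"~{terabits_per_second:,.0f} Tbit/s sustained")  # ~40,000 Tbit/s
```

No crawler sustains that, so there is always a window where a copy can be crawled before the original.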

To tank a competitor's blog, you could put up a "duplicate" blog, then use a crontab/curl job to scrape his blog, dupe every post the second it went up, and ping Google.

There'd be a 50/50 shot you'd get credit for his original content.
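
To make the coin flip concrete, here's a minimal sketch of the naive "first crawled wins" rule that attribution collapses to if crawl time is the only signal. Purely illustrative, not Google's actual logic; the URLs and hash are made up:

```python
from datetime import datetime

# Naive "first crawled wins" attribution: whichever URL the spider
# happens to reach first gets credited as the original.
first_seen: dict[str, tuple[str, datetime]] = {}  # content hash -> (url, crawled_at)

def credited_original(content_hash: str, url: str, crawled_at: datetime) -> str:
    """Record this crawl and return the URL currently credited as the original."""
    best = first_seen.get(content_hash)
    if best is None or crawled_at < best[1]:
        first_seen[content_hash] = (url, crawled_at)
    return first_seen[content_hash][0]

# If the scraper's copy gets crawled before the real blog, the scraper "wins".
credited_original("abc123", "http://scraper.example/post", datetime(2012, 5, 1, 9, 0))
print(credited_original("abc123", "http://realblog.example/post", datetime(2012, 5, 1, 9, 5)))
# -> http://scraper.example/post
```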

Stick that on an offshore server, with private registration, in a country that doesn't honor reciprocal court agreements, and you could keep doing it until he reported the site to Google, then do it again with a new IP, and another new IP, into infinity.

Why don't people do that now? Is everyone marketing on the internet playing nice?

It's because there is no duplicate content penalty. Multiple people, including myself, have tested it over and over.

The way Google determines part of their punishment for Panda is by flagging sites and paying people to manually look at them from home, trying to find out whether they're using scraped content. But that only happens after an automated trigger flags the site, and it takes WAY more than 50% pure duplicate content.

You can also be reported to Google by the person who originally wrote the content, and they will investigate, but only if reported. And 30% unique passes Copyscape.
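
Copyscape doesn't publish its algorithm, but the standard way duplicate checkers approximate "how unique is this text" is word n-gram shingling plus Jaccard overlap. A minimal sketch; the ~70% overlap cutoff is just the 30%-unique rule of thumb from above, not a documented Copyscape value:

```python
def shingles(text: str, n: int = 5) -> set:
    """Overlapping word n-grams ('shingles') of the text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap(a: str, b: str) -> float:
    """Jaccard similarity of the two texts' shingle sets, 0.0 to 1.0."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa and sb else 0.0

original = "the quick brown fox jumps over the lazy dog near the old barn"
rewrite = "the quick brown fox leaps over a sleepy dog near the old barn"

# Rule of thumb from the post: at least 30% unique, i.e. under ~70% overlap.
print(f"overlap = {overlap(original, rewrite):.0%}")
print("passes" if overlap(original, rewrite) < 0.70 else "fails")
```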

Again, the idea of a duplicate content penalty is something that was floated by Matt Cutts, and you can do the opposite of what he says 100% of the time and see better results.

He says things to spread misinformation about the flaws and holes in Google's system; a ton of what he says is pure lies.

The comedy is that a lot of people repeat it, as if his goal at the company he works for were to help you achieve organic search positions rather than to get you buying PPC from the company that pays his salary.
- - - - - - - - - - - - - - - - - -
The reason I know about them paying people to manually check is that I know a guy who was in the program, working from his house doing it.

They've been trying to use the results of the manual program to develop criteria for doing it automatically, but the problem is that no matter what, the higher they try to push the accuracy rate, the more likely they are to punish the innocent.
 