5 Content Syndication Myths | Hard To Believe, But True!

Duplicate content? NO WAY!


That’s the reaction we get from a lot of content creators and marketers, and it’s perfectly right!

You shouldn’t duplicate content because it is plagiarism (and can land you in legal trouble). You should, however, consider content syndication, which is legitimately republishing your content on third-party sites.

There is a huge difference between duplicated content and syndicated content. Owing to the confusion between the two, a number of myths surround content syndication as a content marketing strategy.

The biggest content syndication myth: Syndicated content is duplicated content

That is one of the most common content syndication myths floating around the marketing world. But it is not completely true.

You need to double-check your duplicate content facts. Let’s take an example. The Huffington Post sometimes features already-published content, but it always credits the original creator. In contrast, there are sites which only scrape content from others: they have no original material, nor do they credit the source.

If you fall in the second category, you have successfully proved the myth right. You will get a penalty if you haven’t already.


But if you fall in the first category, you have nothing to worry about.


However, there is one thing you should worry about: ranking.

As per Google: “If you syndicate your content on other sites, Google will always show the version we think is most appropriate for users in each given search, which may or may not be the version you’d prefer.”

This happens when you syndicate on a website which has a higher domain authority than yours. One way to tackle this is to include a link back to your original post and claim Google authorship.

You should ask your syndication partner to add a rel=”canonical” tag to the re-published posts; it tells Google where the original content lives. Alternatively, ask them to add a “noindex” tag to the content. According to Neil Patel, the former works much better.
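As a minimal sketch, here is what those two options look like in the HTML head of the syndicated copy (the URLs below are hypothetical placeholders; point the canonical link at your actual original post):

```html
<!-- On the syndication partner's copy of the post, inside <head> -->

<!-- Option 1 (preferred): tell Google where the original lives -->
<link rel="canonical" href="https://yourblog.example.com/original-post/" />

<!-- Option 2: keep the partner's copy out of the search index entirely -->
<meta name="robots" content="noindex" />
```

Only one of the two is needed; the canonical link lets the partner’s page stay in the index while passing ranking credit back to your original.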

A lot of content creators also believe that blocking crawler access would help. That brings us to our next myth.

You should block crawler access

If you have ever read Google’s policies, you will know the truth about duplicate content.

According to Google Support: “Google does not recommend blocking crawler access to duplicate content on your website, whether with a robots.txt file or other methods. If search engines can’t crawl pages with duplicate content, they can’t automatically detect that these URLs point to the same content and will therefore effectively have to treat them as separate, unique pages.

A better solution is to allow search engines to crawl these URLs, but mark them as duplicates by using the rel=”canonical” link element, the URL parameter handling tool, or 301 redirects.

However, you can also ask those who use your syndicated material to use the noindex meta tag to prevent search engines from indexing their version of the content.”
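The 301-redirect option applies when you control both URLs, for example after moving a post to a new address on your own site. A minimal sketch for Apache’s .htaccess (the paths and domain are hypothetical):

```apache
# Hypothetical example: permanently redirect an old duplicate URL
# to the canonical version of the post (requires mod_alias).
Redirect 301 /old-duplicate-post/ https://yourblog.example.com/original-post/
```

A 301 is a permanent redirect, so search engines consolidate ranking signals onto the destination URL rather than treating the two addresses as separate pages.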

You will lose out on traffic

Isn’t that a logical thought at first? Why would you want to syndicate your original content and lose out on the organic traffic?


But, this is a flawed thought. Think about it.

If you are syndicating content on a branded website like Huffington Post, Forbes or HubSpot, you will get referral traffic which (honestly!) could be more than what you would have received had you not syndicated.

After all, most of the time you are allowed to link back to your website (which is also good for SEO, considering you are getting a backlink from an authoritative website) or social media channel.


Instead of feasting on your traffic, content syndication helps you feast on the traffic of an established website and put your content in front of more visitors.

Scrapers (or republishing) will hurt your website

Scrapers, as discussed above, are generally penalized by Google.

I am sure you are already aware of Google’s Panda update. It was released to deal with this issue. Google’s Panda penalizes sites which are primarily scrapers (offering no value to readers) and ranks them low. There is no way they can hurt your website or outrank your original content.

You need not worry about this at all now.

That brings us to the next question: does it hurt to republish a guest post on your own website?

It could. But, there are ways to ensure it doesn’t happen.

Use the rel=”canonical” tag. It indicates that the original version appeared somewhere else. Just add the tag and publish without worry.

If you are still worried, make changes to the post. You can add more value or change the tone. For example, if you had a guest post on content syndication myths you could rewrite it as a guide to content syndication. You can also repurpose it in a different format.

Syndication hurts your brand reputation

This is plain ridiculous! Content syndication gives you an opportunity to build your brand and re-publish on authoritative sites. It also gives you a chance to be among the thought leaders of your industry.

There you go: that should clear up all the confusion surrounding content syndication myths and duplicate content.

With this awareness, you should no longer be raising eyebrows when your team suggests content syndication.


Author: Vikas Bhatt

Vikas is the co-founder of OnlyB2B ITES Pvt Ltd and a Demand Generation cum Data Cleansing Expert. He has 10+ years of experience in B2B Lead Generation, Data Mining, and Content Syndication.
Say hi on vikas.bhatt@only-b2b.com


