Like many other webmasters seeking greater attention from search engines and users alike, you may have considered using content curation to expand your reach on the web. Before you apply it as an SEO tactic, however, you need to understand how major search engines such as Google treat curated content.
Content curation can be genuinely helpful to visitors when it links them to the latest news or gathers resources on a specific topic into a list. If you have wondered how Google treats curated content, Matt Cutts, the visible face of Google's search team, has provided an answer in the form of a chart.
The chart displays a spectrum of publisher quality, with The New York Times at the high end, commanding very high rankings. Google rewards The New York Times for its strong brand: the site publishes fresh, high-quality content because the publication cannot risk diluting its image with poor or duplicate material.
The NYT website also carries high-quality outbound links, which makes it an authoritative source. Sites that link out to spam land at the opposite end of the spectrum, in contrast to the NYT's high standard for linked resources.
Two other signals Google considers when ranking are the rel=publisher and rel=author tags, and you should use them according to Google's guidelines so the company's search bots can pick them up. However, while Google clearly ranks The NYT at one end of the spectrum and sites with poor content curation at the other, it invests little effort in ranking sites that fall somewhere in the middle.
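As a rough illustration of how these two signals were typically marked up (the URLs and author name below are placeholders, not real profiles; note that Google has since retired its authorship program, so treat this as historical context for the article's advice):

```html
<!-- rel="publisher": usually placed in the <head>, linking the site
     to its brand profile page (placeholder URL). -->
<link rel="publisher" href="https://plus.google.com/your-brand-page-id">

<!-- rel="author": placed on an article page, linking the byline to the
     author's profile (placeholder URL and name). -->
<a rel="author" href="https://plus.google.com/your-profile-id">Jane Doe</a>
```

The publisher tag is site-wide, while the author tag is attached per article, which is why the latter appears in the byline rather than the page head.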
If your curated content leaves your site stuck in that middle ground, you will need to take a few more steps to move toward the high end of the spectrum. One way is to include analysis and opinions from recognized experts, attributed with the rel=author tag. Make sure such content, including expert reviews and analysis, is not published anywhere else. This provides a distinctive point of view that Google will recognize and probably reward.
You should also try to surface data from sources that Google cannot easily access on its own. If that data is relevant to users' queries, it has a high potential of getting noticed by Google. In addition, you can speed up your research and publishing process so that you post fresh content before competitors covering the same topic. Google rewards this tactic with higher rankings as well.
In short, Google will boost your webpages to the top only when its bots attach sufficient value to your content and your efforts. If you feel that Google will not attach value to certain pages, it may be a good idea to put noindex tags on them.
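A noindex directive is a standard robots meta tag; a minimal example for a low-value curated page might look like this:

```html
<!-- Placed in the <head> of a page you consider low-value:
     tells search engine crawlers not to include this page in results,
     while still allowing them to follow its links. -->
<meta name="robots" content="noindex">
```

The same directive can also be sent as an X-Robots-Tag HTTP header for non-HTML resources such as PDFs.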