In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites had copied content from one another and benefited in search engine rankings by doing so; Google's new system, however, punishes sites whose content is not unique. The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings. Although Penguin has been presented as an algorithm aimed at fighting web spam generally, it focuses on spammy links in particular, gauging the quality of the sites those links come from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing falls under the newly recognized term of 'conversational search', where the system pays attention to each word in a query in order to match pages to the meaning of the whole query rather than to a few isolated words. With regard to search engine optimization, for content publishers and writers Hummingbird is intended to resolve issues by filtering out irrelevant content and spam, allowing Google to surface high-quality content from 'trusted' authors.
This is truly amazing, and I'm going to share it with like-minded people. I loved the part about Flippa. What a great source of ideas. Building links tends to be the hardest part, but a few good-quality links is all you need nowadays to get ranked. I currently rank for a very high-volume keyword with only 5 links, all with PR 3 or 4 and good DA and PA. Good links are hard to get, but you only need a few, which is encouraging! Props for this post!
“In conclusion, this research illuminates how content characteristics shape whether it becomes viral. When attempting to generate word of mouth, marketers often try targeting “influentials,” or opinion leaders (i.e., some small set of special people who, whether through having more social ties or being more persuasive, theoretically have more influence than others). Although this approach is pervasive, recent research has cast doubt on its value (Bakshy et al. 2011; Watts 2007) and suggests that it is far from cost effective. Rather than targeting “special” people, the current research suggests that it may be more beneficial to focus on crafting contagious content. By considering how psychological processes shape social transmission, it is possible to gain deeper insight into collective outcomes, such as what becomes viral.”
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and instructs the robot as to which pages are not to be crawled. Because a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.
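The crawl rules described above can be sketched with Python's standard-library robots.txt parser. The rules and URLs below are hypothetical, chosen to mirror the examples in the paragraph (blocking a shopping cart and internal search results):

```python
from urllib import robotparser

# A hypothetical robots.txt blocking internal search results
# and cart pages for all crawlers.
robots_txt = """\
User-agent: *
Disallow: /search
Disallow: /cart
"""

# Parse the rules the way a well-behaved crawler would.
rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("*", "https://example.com/products"))        # True
print(rp.can_fetch("*", "https://example.com/search?q=shoes"))  # False
print(rp.can_fetch("*", "https://example.com/cart"))            # False
```

Note that robots.txt is advisory: compliant crawlers honor it, but it does not remove already-indexed pages, which is where the `noindex` robots meta tag comes in.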
Great article, Brian. I like that you're finally talking about Domain Authority (DA). It's essential to making the skyscraper technique work as well. Also, a great pointer on comments, as I have personally seen articles perform well because of comments. Do you recommend closing the comments a few days after the article is published? Kinda like Copyblogger does now.
I have been trying to produce more content because I believed the lack of traffic was due to the small amount of content, but after reading your blog post, I'm beginning to doubt whether or not this is quality content. I will definitely do more research on influencers in my niche; now I have to figure out how to get their attention with my kind of content.
Google Analytics is free to use, and the insights gleaned from it can help you to drive further traffic to your website. Use tracked links for your marketing campaigns and regularly check your website analytics. This will enable you to identify which strategies and types of content work, which ones need improvement, and which ones you should not waste your time on.
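A common way to build the tracked links mentioned above is to append Google Analytics' standard UTM campaign parameters to a URL. A minimal sketch, with hypothetical campaign values and a placeholder landing-page URL:

```python
from urllib.parse import urlencode

# Hypothetical campaign values; utm_source, utm_medium and
# utm_campaign are Google Analytics' standard campaign parameters.
params = {
    "utm_source": "newsletter",
    "utm_medium": "email",
    "utm_campaign": "spring_sale",
}
tracked_url = "https://example.com/landing?" + urlencode(params)
print(tracked_url)
```

Using a distinct `utm_campaign` value per campaign is what lets the analytics reports attribute traffic to a specific strategy rather than lumping it under generic referral traffic.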
It's an awesome post, one I like a lot, and I'm commenting here for the first time. I'm Abhishek, founder of CouponMaal, and I want to know more about the point you made above on relaunching your old posts. Is there any difference between changing the date, time and year when relaunching an old post, versus relaunching it with the previous date, time and year? I mean, does it matter or not?