
Looking Back to 2012: Major Changes in SEO

Posted: Mon Dec 09, 2024 9:27 am
by mstlucky8072
As we leave 2012 behind, we are seeing many articles from abroad summing up the past year in the SEO world. We had wanted to put together a short compilation of our own and share it with you, but since the New Year rush kept us from finishing that piece, we decided instead to translate the article "2012 Year In Review: Important Shifts In Google SEO", published in SearchEngineLand last month, and present it to you.

After 2012, a year in which spammers suffered a major blow, we hope that updates like Penguin and Panda will continue in 2013 and give us even better search quality. Throughout 2013 we will keep sharing white-hat SEO techniques with our valued readers. On this occasion, we wish all our readers a happy new year! Without making you wait any longer, we leave you with this fine summary prepared by Tom Schmitz.

Reputation & Trust
Two of the words I hear and see most often are reputation and trust. As an SEO consultant, though, I deal with a website's content trustworthiness, design, and external links rather than working as a reputation management expert. From the beginning, Google and similar search engines have tried to deliver consistently high-quality results while cracking down on cheaters. What has changed is that Google can now police unethical behavior more effectively and comprehensively than ever before. Google has started to show its teeth.

Google is making noise
Google used to avoid notifying domains that used malicious webspam techniques through Webmaster Tools. That changed in April, when the search engine expanded the types of messages and alerts it sends. Also see: Google Sent Over 700,000 Messages Through Webmaster Tools in the Past 2 Months.

Penguin
Google introduced Penguin on April 24. Penguin penalizes websites that show signs of artificial external links. When it comes to recovery, Google is quite strict: it requires websites to make a serious effort to remove all of their fake and low-quality links, no matter how old they are. The search engine rolled out its link disavow tool last month, but it treats submissions as a strong suggestion rather than an immediate, binding removal command.

Even with the disavow tool, Google is in no rush to restore a site's good standing. It waits until it has crawled and re-indexed the URLs you disavowed before taking action, and for previously unindexed or low-quality pages that recrawl can take weeks or months.

As a result, nothing has changed in Google's position that some domains simply cannot be recovered. It is worth noting that Google ignores links it cannot trust. This means a website can carry many unreliable links until it passes certain statistical thresholds, at which point Penguin steps in. Penguin is not a replacement for manual reviews: even with the Penguin algorithm in place, Google may still take manual action against a site because of unreliable links.
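
To make that threshold idea concrete, here is a minimal toy sketch in Python. It is my own illustration, not Google's actual formula: the 60% cutoff, the function name and the sample links are all invented. It only shows the idea that untrusted links might be discounted until they dominate a link profile.

# Toy illustration, not Google's actual algorithm: untrusted links are
# simply ignored until their share of the link profile crosses a
# statistical threshold, at which point a penalty kicks in.
def penguin_like_verdict(links, untrusted_ratio_threshold=0.60):
    """links: list of dicts like {"url": ..., "trusted": bool}."""
    untrusted = sum(1 for link in links if not link["trusted"])
    ratio = untrusted / len(links)
    if ratio < untrusted_ratio_threshold:
        # Below the threshold, bad links are discounted rather than punished.
        return f"untrusted links ignored ({ratio:.0%} of profile)"
    return f"algorithmic penalty triggered ({ratio:.0%} of profile untrusted)"

backlinks = [
    {"url": "http://example-blog.test/post", "trusted": True},
    {"url": "http://spammy-directory.test/page1", "trusted": False},
    {"url": "http://spammy-directory.test/page2", "trusted": False},
]
print(penguin_like_verdict(backlinks))  # 67% untrusted -> penalty in this toy model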

Panda
Google loves its Panda algorithm, which penalizes websites carrying too much low-quality content. Since November 18, 2011, Google has updated Panda 13 times. Panda works like a ratio-based penalty system. I have seen sites improve by replacing poor-quality content with well-written, useful content, and by consolidating or better differentiating content that is duplicated in whole or in part. A good example would be a company that uses exactly the same text, apart from the city, country, and address details, on the page for each of its offices in different locations.
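
As a rough sketch of that ratio-based idea (again my own toy model, not Google's real signals; the similarity cutoff, the 50% ratio and the sample pages are all invented), one could flag near-duplicate pages, such as office pages differing only in city and address, and look at what share of the site they make up:

# Invented thresholds, for illustration only: flag pages whose text is
# nearly identical, then check what fraction of the site they represent.
from difflib import SequenceMatcher

def near_duplicate(a: str, b: str, cutoff: float = 0.8) -> bool:
    """True when two pages' texts are almost identical."""
    return SequenceMatcher(None, a, b).ratio() >= cutoff

pages = {
    "/offices/london": "Acme Ltd provides accounting, payroll and tax services to small businesses across Europe. Contact our London office at 1 High Street.",
    "/offices/paris": "Acme Ltd provides accounting, payroll and tax services to small businesses across Europe. Contact our Paris office at 2 Rue Neuve.",
    "/blog/case-study": "A detailed case study describing how we rebuilt a client's information architecture and content over six months.",
}

urls = list(pages)
duplicate_pages = {
    u
    for u in urls
    for v in urls
    if u != v and near_duplicate(pages[u], pages[v])
}
ratio = len(duplicate_pages) / len(pages)
print(f"{ratio:.0%} of pages look duplicated ->",
      "site-wide demotion risk" if ratio > 0.5 else "probably fine")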

Rewarding Quality
With Google spending so much of its time hunting down low-quality work, it was nice to see the search engine also make changes to identify and reward high-quality work, as noted in its June and July status updates.

Webspam
In April, Matt Cutts reported: We're making a significant algorithm change in the coming days targeting webspam. This change will lower the rankings of sites that we believe violate Google's existing quality guidelines. Matt didn't explain much about how this algorithm, estimated to affect 3.1% of queries, actually works. In one example, obvious keyword stuffing was called out; in another, links pointing to manipulated content were highlighted. Matt also had this to say: The sites affected by this change may not be easily identified without deep analysis or expertise, but the consensus is that these sites are doing far more than good-faith SEO, and we believe they are using a number of webspam tactics to manipulate search engine rankings. I also think the update includes some form of language analysis.

Over Optimization
Last March, Matt Cutts announced an over-optimization update: Google is working to make GoogleBot smarter and more helpful to users, but it is also working to detect those who abuse this, for example by cramming too many keywords onto a page or exchanging far more links than would normally be expected.

What is the over-optimization penalty? We don't know, but the SEO community has some opinions. When he made the announcement, Matt pointed to a page he had previously identified as having too many keywords. This month, Matt drew attention to sitewide backlinks and compared how Google counts repeated sitewide backlinks with how it counts repeated keywords. I believe sitewide backlinks are part of the over-optimization algorithm.

I have provided a general example below. The first degree is good, the second is better, the third or fourth is very good, but each additional degree after that matters less and less until you cross the over-optimization threshold; past a certain point, your optimization may even start to look suspect. I chose the golden ratio and the number of degrees arbitrarily, purely to make the concept clear. It is not known how Google actually applies formulas to things like keyword frequency or repeated links, and of course the number of degrees may vary as well.
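
Here is a small numerical sketch of that arbitrary illustration. Both the golden-ratio decay and the cutoff of eight occurrences are made up, exactly as stated above; Google's real formulas are unknown.

# Toy model: each additional "degree" (one more keyword repetition or one
# more identical link) adds less value than the previous one, decaying by
# the golden ratio; past an arbitrary cutoff it starts to look suspect.
PHI = (1 + 5 ** 0.5) / 2  # golden ratio, ~1.618

def incremental_value(n: int) -> float:
    """Value contributed by the n-th occurrence (1-indexed)."""
    return 1 / PHI ** (n - 1)

OVER_OPTIMIZATION_CUTOFF = 8  # arbitrary, for illustration only

total = 0.0
for n in range(1, 13):
    total += incremental_value(n)
    flag = "  <- likely over-optimized" if n > OVER_OPTIMIZATION_CUTOFF else ""
    print(f"occurrence {n:2d}: +{incremental_value(n):.3f}, cumulative {total:.3f}{flag}")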