SEO Algorithm Updates - 2019


In computing, an algorithm is a set of instructions executed to perform a particular task. The same definition applies to Google's algorithms. For any query you enter, the search engine returns millions of results. But how does it decide which results to show you, and in what order? The answer is its algorithms, which deliver accurate results in the shortest possible time.

What Is a Google Algorithm?

Google has several very complex algorithms for serving search results, and it updates them relatively frequently to keep the results accurate. Google announces major rollouts but does not make the exact algorithms public. Here are a few elements that inevitably have an impact on a page's ability to appear in the results for certain keywords:

  • The appearance of keywords in the page’s title, meta description, and header tags
  • The number of organic links to the page
  • Performance of website on mobile devices, such as smartphones and tablets

These are just a few of several characteristics the Google algorithm explores when determining how to deliver and rank pages. 

As Google "reads" a webpage, several algorithms run in the background and assign a numerical value to each trait they look for on the page. These values are combined into an overall score, and the pages with the most desirable traits rise to the top of the rankings because the algorithm assigns them more importance.
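To make the weighted-scoring idea concrete, here is a purely illustrative Python sketch. The trait names and weights are invented for the example; they are not Google's actual signals or values.

```python
# Purely illustrative: a toy weighted-sum scorer, NOT Google's real formula.
# Trait names and weights below are made up for demonstration.
TRAIT_WEIGHTS = {
    "keyword_in_title": 3.0,
    "keyword_in_h1": 2.0,
    "organic_backlinks": 1.5,
    "mobile_friendly": 2.5,
}

def score_page(traits: dict) -> float:
    """Sum each observed trait value multiplied by its weight."""
    return sum(TRAIT_WEIGHTS.get(name, 0.0) * value for name, value in traits.items())

pages = {
    "page-a": {"keyword_in_title": 1, "organic_backlinks": 12, "mobile_friendly": 1},
    "page-b": {"keyword_in_h1": 1, "organic_backlinks": 40, "mobile_friendly": 0},
}

# Rank pages by their total score, highest first.
for url, traits in sorted(pages.items(), key=lambda kv: score_page(kv[1]), reverse=True):
    print(url, round(score_page(traits), 1))
```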

Google performs these calculations very quickly, and rankings can fluctuate as web developers keep adjusting the attributes that contribute to page rankings, on a single page or across a whole website.

Rankings assigned by the Google algorithms are fluid: a page that ranks second for a keyword may rise to first or fall to eighth as the content on it and on competing pages changes. The top rankings are usually held by businesses that keep doing search engine optimization, or SEO, on their sites.

Even a slight change in word order or spelling can change the search results, because the algorithm is adjusted for each search.

Google updates its algorithms once or twice every month, but not all of these updates have an equally strong impact on the SERPs. To help you better understand the algorithms and their updates, here is a detailed list of the most important updates and penalties rolled out in recent years, along with dos and don'ts for each.

Contents

1. Panda
2. Penguin
3. Pirate
4. Hummingbird
5. Pigeon
6. Mobile-Friendly Update
7. RankBrain
8. Possum
9. Fred

1. Panda  

Panda is the name of a Google algorithm update developed to reduce the prevalence of low-quality, thin content in the search results and to reward unique, compelling content. It assigns a content quality score to webpages and down-ranks sites with low-quality, spammy, or thin content. Initially, Panda was a filter rather than part of Google's core algorithm, but in January 2016 it was officially incorporated into the core ranking algorithm. Panda occasionally identifies new signals that can be used to separate documents into "low quality" and "high quality" groups. Within Google's primary indexing and scoring system, Panda runs continuously, essentially on autopilot.

In its pursuit of delivering high-quality sites and pages at the top of the organic search results, Google tweaks the algorithm with Panda updates. Panda works continuously to lower or penalize the rank of lower-quality or "thin" websites and pages, particularly sites that display a large amount of advertising without much high-quality content.

Launched: Feb 24, 2011
Rollouts: ~monthly
Goal: De-rank sites with low-quality content 

Panda Algorithm Checks For

  • Duplicate content
  • User-generated spam
  • Plagiarism
  • Thin content
  • Keyword stuffing
  • Poor user experience

How to Optimise for Panda

1. Check for duplicate content across your site. The most common Panda trigger is internal duplicate content, so run regular site audits to make sure no duplication issues creep in. You can use SEO PowerSuite's WebSite Auditor to do so.

If you can't take down the duplicate pages for some reason, block them from indexing with robots.txt or a noindex meta tag, or point search engines to the preferred version with a 301 redirect or canonical tag. A quick scripted check is sketched below.
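As a rough illustration, here is a minimal Python sketch (using the third-party requests and beautifulsoup4 packages) that checks whether a set of known-duplicate URLs carries a noindex directive or a canonical tag. The URLs are placeholders for your own pages.

```python
# Minimal sketch: check duplicate pages for a noindex robots meta tag
# and a canonical link tag. Requires: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

urls = [
    "https://example.com/product?sort=price",
    "https://example.com/product?sort=name",
]

for url in urls:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    robots_meta = soup.find("meta", attrs={"name": "robots"})
    noindex = bool(robots_meta and "noindex" in robots_meta.get("content", "").lower())

    canonical_tag = soup.find("link", rel="canonical")
    canonical = canonical_tag.get("href") if canonical_tag else None

    print(f"{url}\n  noindex: {noindex}\n  canonical: {canonical}")
```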

2. Check for plagiarism. Another major Panda trigger is plagiarism, i.e. external duplication. If some or all of your pages may be duplicated externally, check them with Copyscape, Grammarly, or Duplichecker. All of these tools provide some data for free, but for a comprehensive check you may need a paid account.

Many e-commerce sites that have thousands of product pages cannot always have 100% unique content. So, on an e-commerce site, try to use original images where you can, and utilize user reviews to make your product descriptions stand out from the crowd. 

3. Identify thin content. Thin pages are those with an inadequate amount of unique content. Often, such pages have a low word count, are filled with ads, affiliate links, etc., and provide little original value. It's a good idea to measure both the word count and the number of outgoing links on each page.

Use WebSite Auditor to check for thin content: navigate to the Pages module in your project and locate the Word count column.

Also check external links: switch to the Links tab and examine the External links column, which shows the number of outgoing external links on each page.

Google's recommendation is to keep the total number of outgoing links on a page under 100, and a word count under 250 is a pretty solid indicator of a thin content page. The sketch below shows a quick way to check both.
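For a quick sanity check outside any particular tool, a small Python sketch like this (again assuming requests and beautifulsoup4, with a placeholder URL and the thresholds mentioned above) can count words and outgoing external links on a page.

```python
# Rough thin-content check: word count of visible text plus outgoing external links.
from urllib.parse import urlparse

import requests
from bs4 import BeautifulSoup

def audit_page(url: str, min_words: int = 250, max_external_links: int = 100) -> None:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    # Drop script/style blocks so they don't inflate the word count.
    for tag in soup(["script", "style"]):
        tag.decompose()

    word_count = len(soup.get_text(separator=" ").split())

    site = urlparse(url).netloc
    external_links = [
        a["href"] for a in soup.find_all("a", href=True)
        if urlparse(a["href"]).netloc not in ("", site)
    ]

    print(f"Words: {word_count} (thin if < {min_words})")
    print(f"External links: {len(external_links)} (high if > {max_external_links})")

audit_page("https://example.com/some-page")  # placeholder URL
```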

4. Audit your site for keyword stuffing. Keywords play a major role in SEO, and pages are optimized for certain keywords, but when you overdo it, that's keyword stuffing: the over-optimization of a given page element for a keyword.

Use your WebSite Auditor project to check for keyword stuffing. Go to Content Analysis and add the page you'd like to check. Enter the keywords you're targeting and let the tool run a quick audit. When the audit is done, pay attention to Keywords in title, Keywords in body, Keywords in meta description, and Keywords in H1. Click through them one by one and look at the Keyword stuffing column: if any keyword is overused in any of these page elements, you'll see a Yes value there. Switch to the Competitors tab to see how your top competitors are using keywords.
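If you prefer a scripted spot-check, here is a hedged Python sketch that computes a rough keyword density for the title, meta description, H1, and body. The 5% threshold and the URL and keyword are arbitrary illustrations, not figures published by Google.

```python
# Rough keyword-density check for the main on-page elements.
# Requires: pip install requests beautifulsoup4
import re

import requests
from bs4 import BeautifulSoup

def keyword_density(text: str, keyword: str) -> float:
    words = re.findall(r"\w+", text.lower())
    if not words:
        return 0.0
    hits = text.lower().count(keyword.lower())
    return hits / len(words)

def check_stuffing(url: str, keyword: str, threshold: float = 0.05) -> None:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    meta = soup.find("meta", attrs={"name": "description"})
    elements = {
        "title": soup.title.get_text() if soup.title else "",
        "meta description": meta.get("content", "") if meta else "",
        "h1": " ".join(h.get_text() for h in soup.find_all("h1")),
        "body": soup.get_text(separator=" "),
    }
    for name, text in elements.items():
        density = keyword_density(text, keyword)
        flag = "POSSIBLE STUFFING" if density > threshold else "ok"
        print(f"{name:<17} density: {density:.1%}  {flag}")

check_stuffing("https://example.com/page", "running shoes")  # placeholders
```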

5. Fix the problems you find. Now that you know what the Panda algorithm checks for, fix Panda-prone vulnerabilities as soon as you can, before the next Panda iteration, or to recover quickly if you've been penalized. You can edit your pages in WebSite Auditor under Content Analysis > Content Editor. Once the necessary changes are done, use the Save button to save the upload-ready HTML file to your hard drive.

2. Penguin 

The Google Penguin algorithm can be seen as an extension of Panda. Both were designed to tackle low-quality content; Panda came first, but a lot of spam remained, and Penguin was introduced to tackle it. The algorithm's objective was to reduce black hat link-building techniques.

It aims to identify natural, authoritative, and relevant links and to down-rank sites with manipulative, spammy, or unnatural link profiles. Penguin has been part of Google's core ranking algorithm since late 2016 and operates in real time, which means that penalties are now applied faster and recovery also takes less time.

The Penguin algorithm deals only with a site's incoming links; Google does not look at the site's outgoing links at all, only at the links pointing to the site in question. Penguin is, in a sense, Google putting a "trust factor" on your links.

Launched: April 24, 2012
Rollouts: May 25, 2012; Oct 5, 2012; May 22, 2013; Oct 4, 2013; Oct 17, 2014; Sept 27, 2016; Oct 6, 2016; real-time since
Goal: De-rank sites with spammy, manipulative link profiles 

Penguin Algorithm Checks For 

  • Links from poor-quality, "spammy" sites
  • Links from irrelevant sites
  • Paid links
  • Links from sites created purely for SEO link building (PBNs)
  • Links with overly optimized anchor text

How to Optimise for Penguin

1. Monitor link profile growth. 

Google isn't likely to penalize a site for one or two spammy links, but a sudden influx of toxic backlinks is a problem. Watch for any unusual spikes in your link profile, and keep an eye on the new links you acquire. Using SEO PowerSuite's SEO SpyGlass, you can see progress graphs for both the number of links in your profile and the number of referring domains.

2. Check for penalty risks.

Penalty risks of the kind imposed by the Penguin algorithm can be estimated with SEO SpyGlass and its Penalty Risk formula. Rather than weighing each individual factor separately, it weighs them as a whole, much like Google does.

In your SEO SpyGlass project, go to the Linking Domains dashboard and navigate to the Link Penalty Risks tab. Select all domains on the list and click Update Link Penalty Risk; the tool will evaluate a range of quality stats for each domain. When the check is done, examine the Penalty Risk column and manually look into every domain with a Penalty Risk value over 50%.

With the SEO SpyGlass’ free version, you’ll get to analyze up to 1,000 links; if you want to audit more links, you’ll need a Professional or Enterprise license. 

3. Get rid of harmful links.

Spammy links are a big problem, and you should try to remove them by contacting the webmasters of the linking sites. If you don't hear back from the webmasters, or you have too many harmful links to remove manually, disavow the links using Google's Disavow tool. By disavowing, you tell Google to ignore those links when assessing your site. Disavow files can be tricky in terms of syntax and encoding, and SEO SpyGlass can automatically generate them for you in the right format.
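If you assemble the file yourself, the sketch below shows the plain-text format the Disavow tool expects: optional "#" comments, "domain:" lines for whole domains, and full URLs for individual pages. The domains and URLs here are placeholders.

```python
# Minimal sketch: write a disavow.txt in the format Google's Disavow tool expects.
spammy_domains = ["spammy-links.example", "paid-links.example"]  # placeholders
spammy_urls = ["https://blog.example.org/comment-spam-page.html"]  # placeholder

lines = ["# Disavow file generated after a manual link audit"]
lines += [f"domain:{d}" for d in spammy_domains]
lines += spammy_urls

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")
```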

3. Pirate 

As the name suggests, Google's Pirate algorithm keeps a check on piracy. Google introduced this filter as a way to demote sites that have accumulated a large number of copyright violation notices. The filter is updated from time to time; when this happens, sites that previously escaped may be impacted, new pirate sites may be caught, and sites that were falsely flagged may be released.

Google's Pirate update was designed to prevent sites that have received multiple copyright infringement reports, filed through Google's DMCA system, from ranking well in Google's search listings. The majority of affected sites were relatively big, well-known websites that made pirated content (such as movies, music, or books) available to visitors for free, particularly torrent sites. That said, it still isn't in Google's power to keep up with the numerous new sites with pirated content that emerge literally every day.

Launched: Aug 2012
Rollouts: Oct 2014
Goal: De-rank sites with copyright infringement reports 

Pirate Algorithm Checks For

  • Pirated content
  • A high volume of copyright infringement reports

How to Optimise for Pirate

Don’t try to copy or distribute anyone’s content without the copyright owner’s permission. That’s it. 

4. Hummingbird 

The name of this algorithm is derived from the hummingbird, known for its speed and precision. It is the code name given to a significant change to Google Search in 2013. The algorithm still considers every word in a query, but it takes the whole sentence, conversation, or meaning into account rather than particular words, in order to deliver better search results.

The goal of this algorithm is that pages matching the meaning do better, rather than pages matching just a few words. Google Hummingbird changed the way search queries are interpreted (particularly longer, conversational searches), aiming to provide results that match searcher intent rather than individual keywords within the query.

While keywords within the query continue to be important, Hummingbird gives more weight to the meaning behind the query as a whole. Instead of listing only results with exact keyword matches, Google makes better use of synonyms and shows more theme-related results in the SERPs, including pages that may not contain the specific keywords from the query. With Hummingbird, Google can also better answer longer-tail queries, even when a page is not explicitly optimized for them.

Launched: August 22, 2013
Rollouts:
Goal: Produce more relevant search results by better understanding the meaning behind queries

Hummingbird Algorithm Checks For  

  • Exact-match keyword targeting
  • Keyword stuffing

How to Optimise for Hummingbird

1. Expand your keyword research. Keywords still play a very important role in SEO, but with Hummingbird, focus on related searches, co-occurring terms, and synonyms to diversify your content instead of relying solely on the short-tail terms you'd get from Google AdWords. Google Related Searches, Google Autocomplete, and Google Trends are great sources of Hummingbird-friendly keyword ideas.
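If you want to pull Autocomplete ideas in bulk, the sketch below queries Google's unofficial suggestion endpoint. Note that this endpoint is not a documented, supported API; it may change or be rate-limited at any time, so treat the approach as an assumption to verify.

```python
# Sketch: fetch keyword ideas from Google's unofficial Autocomplete endpoint.
import requests

def autocomplete_ideas(seed: str) -> list[str]:
    resp = requests.get(
        "https://suggestqueries.google.com/complete/search",
        params={"client": "firefox", "q": seed},
        timeout=10,
    )
    # Response shape: [query, [suggestion, suggestion, ...]]
    return resp.json()[1]

for idea in autocomplete_ideas("best running shoes"):  # placeholder seed term
    print(idea)
```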

2. Discover the language your audience uses. 


Hummingbird uses semantic search and natural language processing that take into account how humans think and how they use search engines to find what they're looking for. Your website's copy should be written in the same language your audience uses, and Hummingbird is yet another reason to step up the semantic game. Use a social media listening tool (like Awario) to explore mentions of your keywords (your brand name, competitors, industry terms, etc.) and see how your audience talks about those things across the web and social media.

3. Ditch exact-match, think concepts. With search engines' growing ability to process natural language, unnatural phrasing, especially in titles and meta descriptions, can become a problem. The long-held practice of stuffing a page with keywords and fake business names is no longer favored, although many still use it (often successfully) to this day. Stop using robot-like language on your pages.

Including keywords in your title and description is important, but sound like a human when doing so. Improving your title and meta description helps to increase the clicks your Google listing gets. 

To experiment with your titles and meta descriptions, you can use SEO PowerSuite's WebSite Auditor.

5. Pigeon 

Google Pigeon is an algorithm rolled out with the purpose of “providing a more useful and relevant experience for searchers seeking local results,” according to Google.

This algorithm uses distance and location as key ranking parameters. Google Pigeon currently impacts only the organic local listings in Google Maps search and Google web search in U.S. English. According to Google, the same SEO factors are now used to rank local and non-local results, creating closer ties between the local algorithm and the core algorithm.

Pigeon led to a significant (at least 50%) decline in the number of queries for which local packs are returned, gave a ranking boost to local directory sites, and connected Google Maps search and Google web search in a more cohesive way.

Launched: July 24, 2014 (US)
Rollouts: December 22, 2014 (UK, Canada, Australia)
Goal: Provide high quality, relevant local search results 

Pigeon Algorithm Checks For

  • Poorly optimized pages
  • Improper setup of a Google My Business page
  • NAP inconsistency
  • Lack of citations in local directories (if relevant)

How to Optimise for Pigeon 

1. Optimize your pages properly. With Pigeon, the same SEO criteria apply to local listings as to all other Google search results, so local businesses now need to invest real effort in on-page optimization.

You can use SEO PowerSuite's WebSite Auditor to run an on-page analysis. The Content Analysis dashboard will give you a good idea of which aspects of on-page optimization to focus on; look for factors with a Warning or Error status. You can also switch to the Competitors tab to see how your competitors handle any given part of on-page SEO.

2. Set up a Google My Business page. To appear in local search results, creating a Google My Business page is a must: it is how your business gets included in Google's local index. The second step is to verify your ownership of the listing; this involves receiving a letter from Google with a PIN that you must enter to complete verification.

Categorizing your business correctly is a crucial step as you set up the page; get it wrong and your listing will not be displayed for relevant queries. Use your local area code in the phone number; the area code should match the one associated with your location. Encourage happy customers to review your business, as the number of positive reviews can also influence local search rankings.

3. Make sure your name, address, and phone number are consistent across your local listings. The name, address, and phone number (NAP) of your business and of the website linked to your Google My Business page should all match, because Google cross-references the linked website.

If your business is also featured in local directories of any kind, make sure the business name, address, and phone number are consistent there too. Different addresses listed for your business on different websites can sink your local rankings. A quick scripted consistency check is sketched below.
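As a rough illustration, here is a small Python sketch that normalises the name, address, and phone pulled from each listing and flags mismatches. The listing data is hand-typed for the example; in practice you would collect it from your own directories.

```python
# Rough NAP consistency check across local listings.
import re

def normalise(record: dict) -> tuple:
    name = " ".join(record["name"].lower().split())
    address = " ".join(record["address"].lower().replace(",", " ").split())
    phone = re.sub(r"\D", "", record["phone"])  # keep digits only
    return name, address, phone

listings = {  # placeholder data for illustration
    "Google My Business": {"name": "Acme Plumbing", "address": "12 Main St, Springfield", "phone": "(555) 010-2345"},
    "Yelp": {"name": "Acme Plumbing", "address": "12 Main Street, Springfield", "phone": "555-010-2345"},
}

reference = normalise(next(iter(listings.values())))
for source, record in listings.items():
    status = "consistent" if normalise(record) == reference else "MISMATCH"
    print(f"{source:<20} {status}")
```

Note that a simple string comparison like this will flag "Main St" vs "Main Street" as a mismatch, which is often exactly the kind of inconsistency you want to standardise.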

4. Get featured in relevant local directories. Local directories saw a major ranking boost after Pigeon, so it may be harder for your own site to reach the top results now; it's a good idea to be featured in the business directories that are likely to rank high.

You can use SEO PowerSuite's link building tool, LinkAssistant, to easily find quality directories and reach out to webmasters to request a feature.

6. Mobile-Friendly Update 

Mobile-Friendly Update, or Mobilegeddon, is the name for Google's search engine algorithm update of April 21, 2015. The update improves the experience of mobile users by favoring mobile-friendly sites, since the desktop version of a site can be difficult to view and use on a mobile device.

Google's Mobile-Friendly Update (aka Mobilegeddon) ensures that pages optimized for mobile devices rank higher in mobile search and, consequently, down-ranks pages that are not mobile-friendly. Desktop searches were not affected by the update.

The Mobile-Friendly Update works at the page level, meaning that one page of your site can be deemed mobile-friendly and up-ranked while others fail the test.

Launched: April 21, 2015
Rollouts: May 12, 2016

Goal: Give mobile-friendly pages a ranking boost in mobile SERPs, and de-rank pages that aren’t optimized for mobile 

Mobile-Friendly Update Checks for

  • Lack of a mobile version of the page
  • Improper viewport configuration
  • Illegible content
  • Plugin use

How to optimize for Mobile-Friendly Update

1. Go mobile.

With the widespread use of mobile devices for search, making your website work in a mobile configuration is a must. Google recommends responsive design and supports a few other mobile website configurations to choose from, and it provides specific mobile how-tos for various website platforms to make going mobile easier for webmasters.

2. Take the mobile-friendly test.

After making the necessary changes to the website, you also need to pass Google's mobile-friendliness criteria to get up-ranked in mobile SERPs.

You can use SEO PowerSuite's WebSite Auditor, which has Google's mobile test integrated, to check your pages' mobile-friendliness quickly.

WebSite Auditor -> Your Project -> Content Analysis -> Add page (the page to be analyzed).

 Enter your target keywords and run a quick page audit. When the audit is done, switch to Technical factors on the list of SEO factors on the left, and scroll down to the Page usability (Mobile) section. 

The Mobile-friendly factor will tell you whether your page is mobile-friendly overall, and you also get a mobile preview of the page. If you see an Error or Warning status, click on the factor for specific how-to-fix recommendations.
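Outside of WebSite Auditor, you can also run Google's Mobile-Friendly Test programmatically through the Search Console URL Testing Tools API. The sketch below assumes you have your own API key from Google Cloud Console; the endpoint and response field names reflect the public API documentation, but verify them against the current docs before relying on this.

```python
# Sketch: run Google's Mobile-Friendly Test via the URL Testing Tools API.
import requests

API_KEY = "YOUR_API_KEY"  # placeholder: obtain a key in Google Cloud Console
ENDPOINT = "https://searchconsole.googleapis.com/v1/urlTestingTools/mobileFriendlyTest:run"

def mobile_friendly(url: str) -> str:
    resp = requests.post(ENDPOINT, params={"key": API_KEY}, json={"url": url}, timeout=30)
    resp.raise_for_status()
    return resp.json().get("mobileFriendliness", "UNKNOWN")

# Expected values include MOBILE_FRIENDLY and NOT_MOBILE_FRIENDLY.
print(mobile_friendly("https://example.com/"))  # placeholder URL
```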

7. RankBrain 

The RankBrain algorithm was launched in early 2015 and is said to be the third most important ranking factor. RankBrain is an artificial intelligence system that helps Google better decipher the meaning behind queries and serve the best-matching search results in response to them.

There is a query processing component in RankBrain and also a ranking component. It is believed that RankBrain can analyze what a page is about, evaluate the relevance of search results, and get better at it over time.

The key purpose of RankBrain is to act as a query processor, converting written and spoken search strings into mathematical vectors (embeddings) that machines can work with. Google engineers periodically review the system and retrain the models it uses; in this sense, the machine learning element is carried out offline rather than on real-time data.

RankBrain relies on traditional SEO factors (links, on-page optimization, etc.), but also looks at other factors that are query-specific. It then identifies the relevance features of the pages in the index and arranges the results in the SERPs accordingly.

Launched: October 26, 2015 (possibly earlier)
Rollouts:
Goal: Use machine learning to deliver better search results based on relevance

RankBrain Checks For 

  • Lack of query-specific relevance features 
  • Poor user experience

How to optimize for RankBrain 

1. Create irresistible snippets

Improve organic click-through rates to increase your probability of success, as it's suspected that RankBrain's "relevance score" works much like AdWords' Quality Score. If your click-through rates are less than impressive, focus on improving your SERP snippets. Page titles and meta descriptions should echo the user's need, stand out on the search results page, and entice the user to click for more. Simplify URLs so they reinforce value to Google and to users. Your snippet should be irresistible.

2. Mimic the pros

Taking ideas from, and modeling your site and content after, domains that Google recognizes as authoritative can help. For instance, in the e-commerce industry, Google knows that a site like amazon.com is reputable and near the top of its searchable index. Anything structured like Amazon's site is more likely to be associated with the "good" camp; similarly, any site structured like a known spammy e-commerce site will be associated with the "bad" camp.

3. Rethink how content uses keywords

Keywords have always been an important part of SEO, so this is not new SEO advice — and RankBrain might be the last nail in the coffin of the old way of thinking about SEO keywords. 

Stop focusing on a single keyword and creating pages or content tailored to only one keyword or phrase. For maximum effect, incorporate your targeted keywords, their variations and related keywords, and the additional words that most commonly appear in the same context as your targeted keywords.

Another way to think of this process is that we’re grouping keywords into concepts, and then converting each concept back into a representative keyword/phrase: Keyword –> Concept –> Keyword. Gather keywords, group keywords into clusters, and generate exemplars. The result is a specific search phrase to target, but that phrase represents potentially dozens or hundreds of similar keywords.

4. Write conversationally

While keywords are still something you need to take into account when optimizing, it's important that you optimize for people, not for RankBrain.

Focus your attention on providing a better user experience, analyzing your visitors' behavior, and making changes accordingly; don't try to please search algorithms and systems. If your content is relevant and people find it valuable, the algorithms will naturally start treating it the same way.

5. Decode ranking factor priorities for your industry

RankBrain seems to weigh various ranking factors differently based on the industry, user intent, and so on implied by the query. Hence, there is no longer one fixed set of ranking factors.

Earlier, one fixed set of inputs governed every single query. Now, some queries demand signals in different proportions from others.

Sometimes there is a need for fresh content. Sometimes there is a need for very in-depth content. Sometimes you need high engagement. Sometimes you don’t. Sometimes you will need tons of links with anchor text and sometimes you need high authority to rank for something. This means that we can’t actually optimize for RankBrain, but we can optimize with RankBrain in mind. 

You need to spend more time on the SERPs doing good SEO research. These factors seem to be important:

  • Freshness. Check the top-ranking pages for freshness: are they recent? RankBrain might be prioritizing newer content.
  • Links. Check whether the best-performing pages have a lot of links or linking domains (turn on your MozBar and see). RankBrain may be prioritizing authority and backlinks.
  • Keywords. An important factor: are certain terms or phrases appearing in every title and/or meta description? RankBrain knows that users are looking for those.

6. Maximize user experience.

User experience is the overarching factor. RankBrain shouldn't be the only reason you serve your visitors better, but it is one more reason to optimize for user experience.

Use Google Analytics to keep an eye on your pages' user experience metrics, particularly Bounce Rate and Session Duration. While there are no universally right values, averages across various industries have been reported by KissMetrics.

If your bounces for some of the pages are high, consider A/B testing different versions of these pages to see which changes drive better results. 

A typical visitor reads at roughly 200-250 words per minute. Use this as guidance for the session duration you can expect, and see if you can improve it by diversifying your content, for example by including more images and videos. Also, examine the pages with the best engagement rates and use what you learn when crafting your next piece of content.
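As a rough benchmark, you can translate a page's word count into an expected reading time. The words-per-minute figure in the sketch below is an assumption you should tune for your own audience.

```python
# Estimate expected reading time from word count (words-per-minute is an assumption).
def expected_reading_minutes(word_count: int, words_per_minute: int = 230) -> float:
    return word_count / words_per_minute

for words in (400, 1200, 2500):
    print(f"{words} words -> ~{expected_reading_minutes(words):.1f} min expected on page")
```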

8. Possum 

Google released a new algorithm on September 1, 2016, affecting the results shown in Google Maps and local search. This update impacted local search results across Google Places, Google Maps, and the Map Pack, the "3-Pack" of three listings as the SEO community calls it.

Possum, Google's local search ranking algorithm, now decides which businesses appear in local search results and when. It sometimes filters business listings out of the results, which means a given local business may or may not appear for the same search.

Using Possum, Google now returns more varied results depending on:

  • The physical location of the searcher (the closer you physically are to a business, the more likely it is to appear in your local results)
  • The phrasing of the query (even close variations in wording now produce different results)

Somewhat counterintuitively, Possum also gave a boost to businesses located just outside the physical city limits. (Earlier, if your business wasn't physically located in the city you targeted, it was hardly ever included in the local pack; this is no longer the case.) At the same time, businesses that share an address with another business of a similar kind are de-ranked in the results.

Launched: September 1, 2016
Rollouts:
Goal: Deliver diverse and better results based on the searcher’s location and the business’ address 

Possum Algorithm Checks For 

  • Sharing a physical address with a similar business 
  • Competitors whose business address is the same or closer to the searcher’s location

How to Optimise for Possum Algorithm

1. Do geo-specific rank tracking. 

Location has always been a big factor in search queries, but after Possum it plays an even more crucial role in the rankings and results you get.

To check positions, set up a custom location for your business in SEO PowerSuite's Rank Tracker:

Open the tool -> Create Project -> Add Search Engines -> Add Custom -> Specify Preferred Location

Since Possum made the searcher's location so important, specify something as precise as a street address or zip code as the preferred location.

You can also modify the list of local search engines used for rank checking under Preferences > Preferred Search Engines.

2. Expand your list of local keywords. 

The phrasing of the query, the exact words used, and their order matter a lot after the Possum update, so it's important to track your positions for every variation separately. With Possum, there is great variety among the results for similar-looking queries.

To discover the variations, open SEO PowerSuite's Rank Tracker and create or open a project. Go to the Keyword Research module and click Suggest keywords. Enter the localized terms you are already tracking and hit Next, then select Google Autocomplete as the research method.

This will give you an ample list of terms that are related to the original queries you specified.
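If you'd rather generate location variations in bulk before feeding them into a rank tracker, a simple sketch like the one below can combine seed keywords with the locations you care about. The seeds, locations, and phrasing patterns are placeholders for your own lists.

```python
# Expand seed keywords into location-specific variations for separate rank tracking.
from itertools import product

seeds = ["emergency plumber", "boiler repair"]          # placeholder seed terms
locations = ["Brooklyn", "Brooklyn NY", "11201"]         # placeholder locations
patterns = ["{kw} {loc}", "{kw} in {loc}", "{loc} {kw}"]  # placeholder phrasings

variations = sorted(
    {p.format(kw=kw, loc=loc) for kw, loc in product(seeds, locations) for p in patterns}
)
for v in variations:
    print(v)
```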

9. Fred 

Black hat SEO tactics are unethical techniques used against search engine guidelines to obtain better rankings and higher positions in search results. Black hat techniques include keyword stuffing, excessive ads and images, cloaking, and the use of private link networks.

Google Fred is an algorithm designed by Google to target black-hat tactics connected to overly aggressive monetization. Fred specifically looks for low-value content, excessive ads, and websites that generally offer very little user benefit.

Google confirmed the update took place but refused to discuss its specifics. Because of the update, there was a huge shift in rankings and traffic for sites deploying black-hat and heavy ad monetization tactics. The majority of affected websites were content sites carrying a large number of ads that seem to have been created with no intention of solving users' problems, only of generating revenue. Studies of affected sites show that the vast majority were content sites (mostly blogs) with low-quality articles on a variety of topics, created just to generate ad or affiliate revenue.

What Websites Were Affected By Fred:

  • An extremely large presence of ads
  • Content in blog form on all sorts of topics, created purely for ranking purposes
  • Content peppered with ads or affiliate links, with quality far below that of industry-specific sites
  • Deceptive ads that look like a download or play button to trick users into clicking
  • Thin content
  • UX barriers
  • Mobile problems
  • Aggressive affiliate setups
  • Aggressive monetization

Launched: March 8, 2017
Rollouts:
Goal: Filter out low-quality search results and websites that use black hat tactics and exist solely to generate ad and affiliate revenue

Fred Algorithm Checks For 

  • Low-value, ad-centered content
  • Thin, affiliate-heavy content

How to Optimise for Fred Algorithm

1. Review Google’s guidelines. Reviewing the Google Search Quality Guidelines and Google Webmaster guidelines is a good start in keeping your site safe from Fred. 

2. Watch out for thin content. All publishing sites use ads, so it's not the ads that Fred targets; it's the content. Thin content means low word counts and low-quality copy; audit your site for it and replace it with relevant, useful information.

You can use SEO PowerSuite's WebSite Auditor and look at the Word count column. Sort the pages by word count by clicking on the column's header to instantly spot pages with too little content.

It's not that short pages can't do well; they can do perfectly fine for certain queries. To see whether your content length is within a reasonable range for your target keywords, go to Content Analysis and select the page you'd like to analyze. Enter the keyword and let the tool examine your page and your top-ranking competitors' pages. When the analysis is done, look at Word count in body and see how long the competitors' pages are.

3. For a full recovery from Fred, the following points matter:

  • Scale down the number of ads on your site
  • Review the placement of ads on your site: do they contribute to a poor user experience?
  • Review the user experience of your site, make a schedule to do this periodically, and keep upping the ante of your content
  • Review the content to be sure it serves a purpose, and that this purpose is reflected in its metadata and tags

Conclusion

So, these are the nine main Google updates to date; a deep understanding of them can result in higher rankings, and the quick auditing and prevention tips above should help your site stay afloat in Google search. Following Google's guidelines and keeping yourself informed about new rollouts will help you in the long run.
