Penguin 4 Update And The Months To Come

Google has replaced the Penguin 3 algorithm with Penguin 4, which is currently being rolled out. Comparing Penguin 4 with Penguin 3 reveals several interesting insights, outlined below:

1. Penguin 4 places less emphasis on penalties and can be considered more lenient than its predecessor.

2. Google has built the metrics previously used by its manual spam team into Penguin 4.

3. It can be inferred that Google now considers disavow files redundant.

4. The new granular formula that Google is implementing will take time to compute, and its final release may be months away.

5. On 2nd September, Google terminated and removed Penguin 3 to pave the way for Penguin 4, which was ushered in on 23rd September.


What do the Changes Mean?

The changes in the core code of Google's algorithm mean that the spam detection process will change. Previously – even before the Penguin 3 algorithm was implemented – detection of spam links was based on an index of suspicion computed by the search engine. This old format identified spam by detecting suspicious activity, including one IP address feeding multiple links to another IP address, a domain carrying many spammy anchor texts, the rapid addition of large numbers of links, and competitors filing spam reports. Such activity would raise Google's suspicion, and its manual spam team was then tasked with verifying the suspicions by reviewing the implicated website. If the suspicions were confirmed, the links were severed from the linking-out domain, and the penalized target domain was notified of the penalty. To the public, the penalty itself was invisible, but the resulting changes in page ranking could be spotted. Likewise, a penalty score could be added and set to expire after a specific duration. Even so, this format of spam detection has changed with the implementation of Penguin 4.
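As a rough illustration of that older, heuristic style of detection, the Python sketch below combines the suspicion signals mentioned above into a single index. The signal names, weights and threshold are invented for illustration; Google's actual formula has never been published.

```python
# Hypothetical suspicion-index sketch. Signal names, weights and the
# threshold are illustrative assumptions, not Google's actual values.

def suspicion_index(signals: dict) -> float:
    """Combine weighted heuristic spam signals into a single score."""
    weights = {
        "links_from_single_ip": 0.30,  # one IP feeding many links to another
        "spammy_anchor_ratio": 0.35,   # share of anchor texts that look spammy
        "link_velocity": 0.20,         # how quickly new links were added
        "spam_reports": 0.15,          # competitor-filed spam reports
    }
    # Each signal is assumed to be pre-normalised to the 0..1 range.
    return sum(weights[name] * signals.get(name, 0.0) for name in weights)

domain_signals = {
    "links_from_single_ip": 0.8,
    "spammy_anchor_ratio": 0.6,
    "link_velocity": 0.9,
    "spam_reports": 0.2,
}

REVIEW_THRESHOLD = 0.5  # illustrative cut-off
if suspicion_index(domain_signals) > REVIEW_THRESHOLD:
    print("Flag domain for review by the manual spam team")
```

A score above the threshold would have queued the domain for the manual review described above.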

Penguin 4 Algorithm

Penguin 4 uses a spam detection system that improves on the old format while introducing new and intuitive changes. Chief among them is the comparison of each new link added to a domain against the whole link graph of that domain, along with the link graphs of the domains linking to the domain of interest, in order to detect and identify unnatural patterns. As expected, iterating over and comparing the links is both resource-intensive and time-consuming. However, this ranks as the most intelligent scheme yet for detecting unnatural (or spam) links, because spam domains normally work in concert when sending out spam links. It is only by comparing the unnatural links with the link graphs of all related domains that one can identify and map out all the spam domains.
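To make the link-graph comparison concrete, here is a toy Python sketch under heavy simplifying assumptions: each domain's link graph is reduced to the set of domains it links out to, and a new inbound link is flagged when its source's outbound graph overlaps heavily with the graphs of domains already linking to the target, the in-concert pattern described above. Domain names, the Jaccard overlap test and the threshold are all illustrative.

```python
# Toy link-graph comparison. Each domain's "link graph" is reduced to the
# set of domains it links out to; all names and thresholds are invented.
from collections import defaultdict

link_graph = defaultdict(set)  # domain -> domains it links out to
link_graph["example.com"].update({"partner-a.com", "partner-b.com"})
link_graph["spam-hub.net"].update({"example.com", "victim-1.com", "victim-2.com"})
link_graph["spam-mirror.net"].update({"example.com", "victim-1.com", "victim-2.com"})

def inbound_domains(target: str) -> set:
    """All domains whose outbound links include `target`."""
    return {d for d, out in link_graph.items() if target in out}

def looks_unnatural(source: str, target: str, threshold: float = 0.8) -> bool:
    """Flag a new link from `source` to `target` if the source's outbound
    graph overlaps heavily (Jaccard similarity) with the graphs of other
    domains already linking to `target` - spam domains acting in concert."""
    for other in inbound_domains(target) - {source}:
        shared = link_graph[source] & link_graph[other]
        union = link_graph[source] | link_graph[other]
        if union and len(shared) / len(union) >= threshold:
            return True
    return False

print(looks_unnatural("spam-mirror.net", "example.com"))  # True: mirrors spam-hub.net
print(looks_unnatural("partner-a.com", "example.com"))    # False: no shared graph
```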

Another feature of Penguin 4 is that it can classify all the domains owned by a single entity. This allows it to group domains by ownership, with the domains owned by a single entity forming a single network. Domains within a network can then be valued appropriately based on the nature of their links, and natural and unnatural links within the network can be identified and compared. The granular component of Penguin 4 will analyze the commonalities of the links among the domains in a network, how each domain links to the other domains in the same network, and the outbound links emanating from the network. Based on the ratio of natural to unnatural links, an accumulative index of suspicion can then be computed.

Accordingly, the penalty score will be computed as the sum of the individual penalties of all the domains within the network. The granular component of Penguin also maps the unnatural links across the domains in a network, which allows spam patterns to be identified. Even so, emphasis will be on the target domains that rank highest in search engine results, allowing the algorithm to pin down the domain that is manipulating its ranking. Thereafter, the suspect links will be purged, while legitimate domains that appear to have inadvertently linked to the suspect domains will not necessarily be penalized.
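A minimal sketch of that accumulative scoring, assuming ownership data is available and that a domain's penalty simply scales with its unnatural-link ratio (both assumptions are mine; the real scoring is not public):

```python
# Sketch of network-level penalty aggregation. Ownership data and the
# assumption that penalty scales with the unnatural-link ratio are both
# illustrative; the real scoring is not public.
from collections import defaultdict

domain_owner = {
    "site-a.com": "owner-1",
    "site-b.com": "owner-1",
    "site-c.com": "owner-2",
}

# Fraction of each domain's links judged unnatural (made-up values).
unnatural_ratio = {"site-a.com": 0.7, "site-b.com": 0.5, "site-c.com": 0.1}

networks = defaultdict(list)  # owner -> domains forming one network
for domain, owner in domain_owner.items():
    networks[owner].append(domain)

for owner, domains in networks.items():
    # Network penalty = sum of the individual domain penalties.
    network_penalty = sum(unnatural_ratio.get(d, 0.0) for d in domains)
    print(owner, round(network_penalty, 2))
# owner-1 accumulates 1.2 across its two domains; owner-2 only 0.1
```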

How Aggressive and Accurate is Penguin 4?

Penguin 4 uses the most intelligent algorithm to date to eliminate manipulation of search engine results through spam links. Its granular nature means it will be much more effective than earlier algorithms in detecting spam links and relating them to their domains of origin. Thus, the system is both aggressive and accurate. The precision of its spam detection is evident in the confidence Google has shown in Penguin 4, to the extent that it has abandoned the Penguin 3 algorithm in its entirety, including retiring the manual spam team that used Penguin 3 for spam detection. This also means that the cost of the additional computing resources will be offset by the savings Google makes after retiring the old manual spam team. Likewise, the effects on SERPs (rankings in Search Engine Result Pages) will be quite noticeable, with the percentage of affected SERPs estimated as follows, in comparison to Penguin 3:

Penguin 3: 6% of SERPs are affected

Penguin 4: 18% of SERPs are affected

Nonetheless, as mentioned earlier, the full effect of Penguin 4 will not be felt instantaneously, as the existing link graphs need to be re-computed and reclassified as appropriate. Thus, the effects will be neither immediate nor real-time, but will be felt in the months to come.

The Penguin 4 Update and What It Means for You

It’s that time again. The Google Penguin 4 update that went live this September has probably shaken up webmasters, SEO agencies and businesses that rely on these professionals to keep them on Google’s front page. So, what should you know about the Penguin 4 algorithm update?


1. The update went live before it was announced.

As we have seen with earlier Google algorithm updates, Penguin 4 went live a few weeks before it was announced. The announcement came on September 23rd, but the launch had already taken place on September 2nd. This makes it easy for Google to withdraw an algorithm quietly if things go wrong, without having to make announcements.

2. The Penguin 4 update is a major change to the core algorithm.

The Penguin 3 update took place in October 2014 and threw the SEO community into chaos. Google admitted that it was a less-than-ideal penalizing solution for black hat SEO. Penguin has always looked for unethical manipulation of keywords, and penalties were often applied only to specific terms, which made it almost impossible for SEO professionals to help penalized websites recover lost rankings. Once a website was penalized by Penguin 3, it was very difficult to improve or recover projects that were in Google's black books.

The reason Penguin 4 took such a long time to launch (over two years) is that it involves large-scale changes to the existing core algorithm. A lot of the code has been rewritten. There have been efforts to make Penguin more granular, meaning that the parts of a website that have seen aggressive SEO will be caught by the algorithm while other parts may not be penalized. At the same time, Penguin 4 will be more rigorous in its search for link manipulation, without the old limitations or the need to hack around the core algorithm as was the case in the past.

3. What does “granular” mean?

In the context of Penguin 4, the efforts to make it more “granular” can be demonstrated through the new spam-blocking method. In the past, Google would rely on spam reports or manual reviews for detecting paid links or link networks. It would also use manual IP blocks and hand-compiled data to hunt for spam. This was a time-consuming and inefficient process, though the Hummingbird update did make some improvements.

But with Penguin 4, and the observation that Google has not been carrying out manual spam detection since November 2015, it seems that things have changed. Penguin 4 is now likely using analytical tools that examine link graphs for unnatural link patterns. Google has realized that when one domain in a single-owner group of domains has been caught black hat link-building, looking at the entire group of domains under the same ownership will reveal a lot more spam activity. This also enables Google to target specific manipulated keywords across multiple domains.

4. Google has not refreshed the link graph since 2015.

Google used to refresh the link graph one to four times a year before it stopped the refreshes in November 2015. A link graph refresh meant re-crawling pages with links after they had already been analysed. Any changes to the links or pages since the last refresh would show up only after the next one, and could affect SERPs significantly.

It is not clear why Google has not done this for nearly a year; one can only speculate. Google may have skipped these refreshes before Penguin 4 because the team was too busy with the update, or the refreshes may already have been factored into the new Penguin upgrade.

5. Panda and Penguin-like Features

Google began rolling out changes in June 2016, and those who have been closely watching Google's activities since then will have noticed that the new updates are being launched in waves, rather than as a single Panda or Penguin update. Since this new outburst of activity, some results have vanished from link graphs. This is a sign that Google is going to crawl through and reanalyse pages as it goes, without affecting others. In other words, instead of major refreshes, we will very likely see a rolling effect and granular refactoring of link data according to Penguin and Panda rules.

6. Expect effects to roll in slowly.

If you have not yet seen the effects of Penguin 4, be patient. Since there have been no link graph refreshes since the new Penguin implementation, it is likely that, at the moment, only link edits made before December 2015 are being factored in. If you have made changes to your website since then, you will need to wait for Google to refactor pages again before you see any change in your rankings.

SEO: How to Solve the Not Provided Dilemma

The Not Provided dilemma is affecting countless SEO specialists and webmasters. As part of Google's recent updates, nearly 80% of search keywords are now reported as "not provided". This is an attempt by the popular search engine to reduce the ability of SEO experts to track their keyword performance, taking away the ability to concentrate on specific keywords. This ongoing problem has impacted the ability of users to extract vital data from analytics to pinpoint keyword and SEO performance. To combat this issue, the folks at SEOMonitor.com have devised a tool that accurately recovers keyword data and offers previously unattainable information. Utilizing a blend of data from analytics and webmaster tools, the tool is heralded for its real-time results and precision.

 

80% of keywords are not provided by Google

The SEO Monitor

SEO Monitor utilizes innovative tools that accurately analyze and assess keyword data that would otherwise fall under the not-provided category, making it a strong solution for SEO professionals looking for better insight and performance. The tool also evaluates not-provided SEO traffic using a combination of data from analytics, webmaster tools, your site data and a proprietary algorithm. With this impressive tool, SEO experts are able to control and manage their entries from one convenient and comprehensive source. With plans ranging from $24 to $199 per month, this is the ultimate solution for those feeling the brunt of Google's new search algorithms and parameters. All plans come with a 14-day free trial for your convenience.

Additional Benefits

SEO Monitor unlocks hidden data from Google Analytics. This allows users to see which branded and non-branded keywords are really driving productivity and results. Having this information is a key factor in securing greater visibility, while making sure your SEO strategies are aligned and working. Whether you want to recover lost data or formulate new entries, you are guaranteed optimal accuracy rather than guesswork. The rank tracker is another benefit built into the system, which precisely measures the position of your keywords on search engine results pages. Whether it's generating leads or revenue, why rely on estimation when you can get clear and concise returns that foster greater online growth? This includes automated keyword research, along with reputation, competitor, and rank tracking.
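Conceptually, a rank tracker does something like the Python sketch below: given an ordered list of result URLs for a keyword, it reports the position of your domain. The result list here is hard-coded for illustration; a real tracker such as SEO Monitor's would source SERP data through an API or crawl.

```python
# Conceptual rank tracker: find the 1-based position of a domain in an
# ordered list of SERP URLs. The result list is hard-coded for illustration;
# a real tracker would obtain SERP data via an API or crawl.
from urllib.parse import urlparse

def rank_of(domain: str, serp_urls: list) -> int | None:
    """Return the 1-based rank of `domain` in the results, or None."""
    for position, url in enumerate(serp_urls, start=1):
        if urlparse(url).netloc.endswith(domain):
            return position
    return None

results = [
    "https://competitor-one.com/guide",
    "https://example.com/landing-page",
    "https://competitor-two.com/blog",
]
print(rank_of("example.com", results))  # 2
```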

For more information on reliable and effective keyword management, simply visit SEOMonitor.com today. Their services have received stellar reviews from SEO critics and clients alike. If you are struggling to attract and engage new and potential visitors, their tools are guaranteed to meet your needs on time and within budget.

Google Hummingbird Update – How Will it Affect You?

Google Hummingbird is a brand new algorithm powering Google's search engine. It provides faster query results by focusing on the reason behind every query rather than on individual keywords. The meaning of the whole question is what matters, as opposed to the earlier single-word search, so the results relate to the entire question rather than just the keywords it contains.
Unlike the Panda and Penguin updates, Hummingbird is not just an update but a complete replacement. It is the first time Google has completely rewritten its algorithm.
Google Hummingbird is the new algorithm update affecting all searches

According to David Amerland, author of Google Semantic Search and an expert on search engines:
· Google has improved its ability to deal with complicated search queries.
· It has also become much better at linking documents and search queries relationally.
It is becoming more and more important to identify the actual intent behind a query for SEO to remain successful. It is therefore necessary to identify the USP of any business and provide answers related to it.
Google has realized that search engines should be more about actual customer needs and less about keywords. Via Hummingbird, it is looking for the purpose behind the query and offering a solution. This algorithm, smarter than Penguin or Panda, answers questions, filters answers and gives comparison data at a glance.
Hummingbird anticipates what information you will need. It simplifies complex search retrievals and improves the data available via the Knowledge Graph. This keeps users on the search page longer, exposing them to sponsored ads for longer, which improves ad sales and revenue for Google as well. It fulfills the dual purpose of increasing time on site and revenue by optimizing the algorithm for artificial intelligence, semantic analysis and an underlying understanding of language. It is focused on natural language processing and understands concepts rather than mere words, and relationships rather than isolated concepts.
But SEO specialists will have to watch how successful Google proves to be with local searches on mobile devices. The primary goal now is to weed out useless, irrelevant data from the search engine results page (SERP) and present only pages related to the core of the question asked. Businesses will have to get used to semantic search and the Knowledge Graph. Hummingbird will have to position itself as the correct source of the answers users are seeking; it will have to identify intent, needs and problems, and then provide the answers and solutions.
Google will also have to redefine its mobile search results page. According to Google, "It is cleaner, simpler, optimized for touch with results clustered on cards so you can focus on the answers you are looking for". There has been a massive shift towards mobile search, and the algorithm has to evolve to understand the longer, more complex queries people ask on their mobiles.
Hummingbird, therefore, looks at the meaning behind the entire query placed before the search engine. It has shifted the focus from individual keywords to the core meaning of the question asked by the user, which will prove a challenge to any SEO agency not prepared for the change.
Hummingbird focuses on data retrieval, artificial intelligence, and how data is accessed and presented to users, to make their search a happier experience; this should be taken into account by all companies promoting a website.

Recent Search Engine Changes

It seems that every week these days we receive a major new announcement from Google regarding either its algorithm or the way it shares search data with users and companies. Of course, Bing doesn't want to be outdone, so the number of announcements coming out of Washington seems to be increasing as well. In reality, both companies continue to fight for their respective shares of a search market which, according to a recent study by ComScore, is simply not growing. That lack of growth is a major problem for both companies: they are publicly traded, and as we all know, the stock market and investors want to see continued and sustainable growth over all time periods. That means both tech giants are likely to keep looking for ways to increase revenue from other sources over time.

Google Keyword Data is Gone:

As Rand said over at Moz, we're officially living in a world where your Google organic search traffic is going to come over 100% as "keyword not provided". While Google itself says the reason is to restrict NSA and corporate snooping on its end users, I do think there are some real-world reasons why Google might have made the change anyway. For a small business owner, this makes it exceedingly difficult to tell which keywords are driving sales. Although it isn't as crucial for those of us in verticals small enough to get some idea of what's truly going on by checking by hand, for huge corporations it makes it virtually impossible to track what's working in regard to SEO. All that is to say, simply, that if you want quality answers on which keywords are providing value, you are going to have to go back to paying for Pay Per Click advertisements to get identifiable data on the conversion rate of individual keywords.

Google keeps changing and improving the search algorithm

Google Continues to Identify Link Networks:

Over the weekend Google reportedly targeted and then took down another major link network, this one out of Russia. For those of us spending time creating quality sites with good content, I can't stress enough how important this continued action is. It simply isn't fair for people to be playing by different sets of rules in regard to the algorithms and how links are created. Nice job here, guys.

Bing Offers Page Zero:

Bing is in many ways fighting Google the traditional way, much like the old Microsoft vs Google battles of a decade ago. One thing I greatly appreciate about Bing these days is that it is beginning to offer auto-suggestions in its search box, in much the same way Google has for the past year or two. While the technology isn't necessarily new, given the shrinking amount of information coming from Google in regard to keyword volume, any type of additional information from Bing is incredibly useful and valuable.

Major changes as usual, right? Of course, it's just another week in search!

Mark Aselstine is the owner of Uncorked Ventures, a wine club focused on delivering the best wine and value in the industry. Competing against major media-backed wine clubs means Mark needs to keep improving his search engine marketing knowledge, since he can't outmarket competitors like the New York Times!

Optimize Your WordPress Site By Adding It To Google Webmaster Tools

Adding your WordPress site to Google Webmaster Tools is one of the best and most important ways to optimize your site for search engines. In this article we will show you how to add a WordPress site to Google Webmaster Tools.

Google reporting server response time

Google Webmaster Tools

Google Webmaster Tools is a set of tools provided by Google that shows how your website is seen by search engines. With this set of tools you get various data and reports with which you can understand how each page of your site appears in search results. The Search Queries section in Google Webmaster Tools shows the performance of each page of your site: how many times each page is clicked, where your site ranks in search results for each keyword, and so on. It also lets you submit an XML sitemap of your site, exclude URLs, and help search engines display the most important content of your site. It also informs you if Google has stopped crawling and indexing your site's pages.

Step by Step Process of adding your WordPress Site to Google Webmaster Tools

If you don’t know the exact way of adding a WordPress site to Google Webmaster Tools, follow the step-by-step process given below:

1. First, open the Google Webmaster Tools website and sign in with your Google account. If you don’t have a Google account, sign up for one. After signing in, add your website and click the “add new site” button.

2. On the next screen you will be asked to verify ownership of the domain you are adding. This can be done either by uploading an HTML file to the root directory of your site via FTP, or by adding a Google-provided meta tag to your website’s homepage (a sketch of what this verification checks appears after these steps).

  • If you want to add the meta tag to your WordPress site, make use of the Insert Headers and Footers plugin. Go to Settings, then Insert Headers and Footers, and add the provided meta tag to the header field. The meta tag looks something like this:

<meta name="google-site-verification" content="VerificationKeyCode" />

Now save the changes, return to the Google Webmaster Tools site and click the “Verify site” button.

  • On the other hand, if you want to use a WordPress SEO plugin, the simplest way is to copy the verification key from the meta tag and paste it into the plugin. Save the changes, return to the Google Webmaster Tools site and click the “Verify site” button.

3. After your ownership of the domain has been verified, the next step is to add the XML sitemap you have created in WordPress. If you have not created an XML sitemap for your site, it is a good idea to do so, because it helps search engines find and display the important content of your site. After adding the XML sitemap, it may take a while for your site data to show up.
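As promised above, here is a rough Python sketch of the kind of check the meta-tag verification performs: fetch the homepage and confirm the google-site-verification tag is present. The URL and the key are placeholders, and this is only an illustration, not Google's actual verifier.

```python
# Rough sketch of what the meta-tag verification checks: fetch the homepage
# and confirm the google-site-verification tag is present. The URL and the
# key are placeholders; this is not Google's actual verifier.
import urllib.request
from html.parser import HTMLParser

class VerificationTagFinder(HTMLParser):
    def __init__(self, expected_key: str):
        super().__init__()
        self.expected_key = expected_key
        self.found = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if (tag == "meta"
                and attrs.get("name") == "google-site-verification"
                and attrs.get("content") == self.expected_key):
            self.found = True

html = urllib.request.urlopen("https://example.com/").read().decode("utf-8", "replace")
finder = VerificationTagFinder("VerificationKeyCode")
finder.feed(html)
print("Verified" if finder.found else "Tag not found")
```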

So, that is the step-by-step process for adding a WordPress site to Google Webmaster Tools. We hope this article is helpful if you are thinking of doing the same and are a little confused about where to start. You can find more help at: http://en.support.wordpress.com/webmaster-tools/

Matthew Anton is an expert in strategies and tips on online marketing, SEO and WordPress site optimization.

Why is everyone using Google?

Google: More than Just a Search Engine

Today, Google’s search engine is unquestionably the dominant tool in the market. Initially built only for queries and data research, Google nowadays offers many other features and power tools that allow its users to obtain and analyze targeted information quickly and easily.

There are several major reasons behind Google’s immense popularity.


High Quality Search Results

The Google search engine provides timely, relevant results in a split second. Using its secret ranking algorithm and PageRank™ technology, the company’s developers have managed to almost completely eliminate spam and shallow-content pages from the top of the search results. Pages are ranked by relevancy, domain age, inbound and outbound links, and their trustworthiness.
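The production algorithm is secret, but the original PageRank computation was published and is easy to sketch. Below is a minimal power-iteration version in Python; the tiny three-page link graph and the iteration count are illustrative, and the damping factor of 0.85 comes from the original PageRank paper.

```python
# Minimal power-iteration PageRank over a three-page toy graph. The damping
# factor 0.85 is the value from the original PageRank paper; the graph and
# iteration count are illustrative.

def pagerank(links: dict, damping: float = 0.85, iterations: int = 50) -> dict:
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {}
        for p in pages:
            # Rank flowing into p from every page q that links to it.
            incoming = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
            new_rank[p] = (1 - damping) / len(pages) + damping * incoming
        rank = new_rank
    return rank

links = {"a": {"b", "c"}, "b": {"c"}, "c": {"a"}}
for page, score in sorted(pagerank(links).items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))
```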

Google is also very effective with misspellings: if a user makes a typo in a common word or phrase, Google will offer to replace the term with the correct one while still displaying search results for the original query. This option is essential when the query is difficult to spell, or when it contains a person’s name that might be spelled in more than one way.

The Google ads that appear alongside the results are also related to the query topic and are often as useful as the organic results. Ads and sponsored links are explicitly marked as such and do not clutter the search results page.

Google Search Services: Different Results Formats

To deliver faster and more accurate results, Google offers other search services through tabs that appear at the top of the Google homepage. The “Web” tab is the default option, and if a user is looking for an image, a book, a map or a video, all he has to do is click on the relevant tab and type the query in the search box. The search results will appear in the relevant format: images, a map, a book list, videos and so on.

Another helpful search service is Google Translate. Sometimes the most relevant search result appears in a foreign language, for example when a user is looking for the corporate web page of a company located in a different country, or when the query itself is in a foreign language. By clicking “translate this page” next to the URL in the search result list, Google will display the landing page in English with reasonably good translation quality.

Google Advanced Search and Search Operators

Google search operators are special words followed by a colon and then the query term. The operators restrict the search to specific areas of the indexed pages. For example, if a user starts a query with the allintitle: operator, Google will look for the query terms in the titles of its indexed webpages, and the displayed results will only contain pages that have the desired terms in their title.

Search operators are extremely useful when looking for specific file types, definitions or information from a specific source. This option lets users skip several steps and get a precise result on the first search attempt.
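As a simple illustration, the short Python helper below builds an operator-prefixed query string and the corresponding search URL. The operator names (allintitle:, filetype:) match Google's documented syntax; the helper itself and the example terms are just illustrative.

```python
# Build an operator-prefixed query and the corresponding search URL. The
# operator syntax (allintitle:, filetype:) matches Google's documented
# operators; the helper and example terms are illustrative.
from urllib.parse import quote_plus

def with_operator(operator: str, terms: str) -> str:
    """Prefix a query with a search operator such as allintitle:."""
    return f"{operator}:{terms}"

query = with_operator("allintitle", "semantic search")
print(query)                                         # allintitle:semantic search
print("https://www.google.com/search?q=" + quote_plus(query))
```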

Google Advanced Search is another search-restricting option that narrows the results by criteria set by the user: certain words or phrases, geographic location, language and even usage rights. The advanced search tool does not require any knowledge of operators.

***

Google is a mighty instrument that has become much more than a basic search engine. Its many functions, services and tools encompass practically all internet activity: searching, rating, analyzing, translating and even predicting future trends. Google’s tools enable end users and webmasters to work with different types of information, build websites, run blogs and analyze keywords to improve traffic and sales.