Post Panda Search Optimization Strategies

Matt Cutts Dodges Danny's Questions on SERP Bounce at SMX Q&A Session

You may have read how, at SMX, Matt Cutts reiterated that Google does not look at bounce rate as part of Google's algorithm. Not so fast! Matt actually dodged Danny's question ... and it seemed to slip by most SEO reports of the conference.

Danny Sullivan certainly does not shy away from asking Matt pointed questions at SMX. I enjoy the questions, and despite Matt's sometimes vague answers, I feel there is a lot to learn from what Matt answers, how he answers and what he avoids.

I am specifically interested in the user metrics collected on the SERP (as I discussed at SMX) as part of Google's algorithm, particularly Panda. Danny actually drilled in on this ... and Matt danced around it by answering a different question. I have read every account of the session I could find, and not a single post understood the difference between Danny's question and Matt's answer (except for Danny). This is very significant. If I hadn't been there and had only read about it, I would have a totally different view of the conversation.
To roughly paraphrase the conversation:

Danny: "I know you have said that Google does not use Google Analytics bounce rate. What about bounce back to the SERP and behavior on the return to the SERP?"
Matt: "GA bounce rate is a very noisy signal. We never use it." (He spends about five minutes not answering the original question - the question was about bounce to the SERP, not Google Analytics bounce rate.)
Danny: "What about bounce to the SERP?"
Matt: "We do not use GA data; it is a very noisy signal."
Danny: "No, I am talking about the SERP and bounce to the SERP."
Matt: "Well, ummmm ... we don't like to rule certain things out for the future ... ummm ... ummm."

Matt simply refused to answer this question until Danny gave up. This is extremely revealing! There is no reason not to answer if they are not using this factor. However, if they are using it, they might want to hide it if they believe that knowledge can be leveraged by SEOs. So why not lie outright? It has been pretty clear to me from watching these exchanges that Matt likes to be as forthright as possible. So this was a flat-out yes to me. Since I haven't seen any other blog or account of the Q&A pick up on this, it appears he successfully dodged the question and confused the majority of the SEO community.

In fact, we know absolutely, 100%, that Google does look at bounce to SERP as part of Panda! The "Block Site" link only shows up on a bounce back to the SERP. So there is no question about this. What exactly they are looking at besides blocks is not clear. I have speculated that they look at the alternate click (which tells Google that the user's quest for an answer is not over) as well as very quick bounces. Previously I had assumed that the bounce back to the SERP was not a valuable metric in itself (rather, it is the other actions that matter). Now I am not so sure.

Search User Experience

Tomorrow at SMX, I'll be speaking on the New Periodic Table of SEO. I'll be talking about my theory (and others') that behavioral metrics are now part of Google's search algorithm. I'll be discussing the potential metrics they are examining as well as ones I don't believe they would use (such as Google Analytics bounce rate).

As I have discussed previously, I believe the user metrics they collect on the SERP give them the answer as to whether or not the user had a satisfactory experience with a specific search result. This whole experience I refer to as the Search User Experience (SearchUX or SUX).

Traditional UX and traditional User Experience Engineering start and end on a website. The SearchUX starts on the SERP, transitions to a website and, if unsatisfactory, ends back on the SERP.

Whether explicit or implicit, all searches are questions - a quest for information in which the user has a very specific goal in his mind.  Satisfy that quest and you terminate the search. Terminate the search and Google cannot collect any negative metrics about the site. In fact, the terminated search is likely a positive metric.

We have no actual knowledge of the metrics that Google collects on the SERP, and herein lies one of the big difficulties. I'll address those difficulties in my talk and in my next blog post as well.

Google - Be Afraid of the Threshold! Defensive SEO

Many of the recent changes in Google's algorithm that demote and promote certain results are based on thresholds. Recent major algorithm changes which seem to be threshold-based include Panda (user metrics), the Page Layout/Above the Fold Penalty (page content analysis) and the Over Optimization Penalty (link analysis).

Google, with their typical complete lack of openness regarding their organic algorithms, does little to let you know where you stand relative to these thresholds. Although some sites have been getting notices about links that look like paid links, Google still doesn't tell you where you are relative to the threshold. And there has been no communication at all on Panda or Page Layout.

Think about this ... you could be knocking on the door of Panda, the Page Layout algo or link over-optimization, meaning you could be one content initiative, page redesign, link building initiative or other site change away from triggering a loss of 30%-80% of your organic revenue (which of course is the best type of revenue). How many businesses can survive that? Say the passing score for Panda is 65. You have Site A with a score of 66: fat, dumb and happy as a clam, even though they are teetering on the verge of a Panda hit.

 Google Algorithmic Thresholds

And then you have Site B; maybe they failed with a score of 64, lost 50% of their traffic, and now they are firing people or shutting their doors. Yet they might be hanging right below the threshold, not much different from Site A. Doesn't seem fair, does it? Site C might be well beyond repair and a truly crappy site, but the impact on Sites B and C might be indistinguishable.

So what is the poor SEO to do? We need to practice defensive SEO. We don't wait to get hit. SEO is no longer about spewing out reams of content, at least not without thought to site relevance, content quality and usability. In fact, good SEOs are going to be learning quickly to integrate search usability into their methodology. Bad SEOs are still going to be pushing out content and outsourcing link building with no regard to the quality of links or the spamminess of the techniques. As I mentioned when I first started this blog, good SEO has changed and should no longer be at odds with usability; rather, it needs to be tightly integrated. We rarely get rewarded for saving traffic - but we need to do it anyway.

SMX Advanced Seattle: Periodic Table of SEO Ranking Factors

I am going to be speaking in Seattle at SMX on this panel, and I am pretty psyched about this one. I was just reviewing the table from last year - pretty cool, yet missing what I consider to be the most significant change in Google's algorithm: the use of behavioral metrics.

If you look at the chart from last year, it certainly represents most of the current thought about SEO and how most SEOs approach their job. And yes, this is how most SEOs approached fixing their Panda issue or avoiding it: they think of on-page and off-page factors. It is easy to see why so few sites have recovered from Panda.

Periodic Table

What's missing is the concept of understanding their users' needs and satisfying those needs in response to the keywords the user searched for.

Without that, sites are shooting blindly at their problems. Why is this such a difficult problem? You can't tell by reading the content on a page, or by examining reams of data on a link profile, or even from bounce rates, which are a poor metric (by themselves and as absolute numbers). It comes down to knowing your user.

I am in the midst of working on a methodology for what I am calling the Search Usability Experience. More on this in the weeks leading up to SMX ...

Google's Above the Fold/ Page Layout Algorithm

Google's above the fold algorithm (a.k.a. the page layout algorithm) went live last week, and while it has not caused Panda-like repercussions, it has still had a significant impact on many sites, and it is certainly important to any site that has been hit (I know of at least one). Some interesting things to note:

  • This really confirms that Panda did not satisfactorily do its job based on user metrics. If Panda were 100% effective, this update would not have been necessary. Perhaps some sites were slipping through Panda because of brand protection. More likely, some sites have figured out how to avoid Panda (even though they deserve it). Or perhaps there were too many false positives, and Google is looking to this update to catch certain sites that would slip through if Google relaxed Panda a bit. Interestingly enough, Google ran a Panda update yesterday (1/24), and I have been hearing of Pandalized sites getting boosts. I think a Panda update just a few days after the Page Layout update is not an accident.
  • Like Panda (and really like most Google updates), this is geared to improving the searcher's user experience. The user searches, goes to a site and gets what they want visibly on the page (as opposed to a page where the content is obscured by ads).
  • Unlike Panda, this does not appear to be based on user metrics but on the actual page. According to Google, once enough pages that address the issue have been crawled, the site penalty or dampening will be reversed. This makes addressing this issue, if you get hit, much easier than Panda. With Panda, you make your changes and wait for Google to collect enough positive signals over time to overtake the negative signals. And those signals (which I have theorized on) have never been stated by Google. In this case, Google has stated exactly what the issue is.
  • Why would Google be so open about what this change is about, as opposed to the mysterious Panda? Simple: if Google thinks a signal can be manipulated, they will be vague; if they think a change is beyond manipulation, they will be more open.

Future Implications

Notice that Google is calling this the "Page Layout Algorithm". The ability to reliably analyze page layout will likely be reflected in other updates in the future, perhaps degrading content and links below the fold. (Of course, the position of content has always had an impact, but the Page Layout Algorithm could make it even more significant and more accurate.) And link position has been discussed in regards to Google's Reasonable Surfer algorithm (for which they applied for a patent last year). The Reasonable Surfer algo basically values links by their likelihood of getting clicked. To implement that, you need to understand page layout, so this very well could be a step in that direction (which could shake up the SEO industry again).
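
To make the idea concrete, here is a toy TypeScript sketch of Reasonable Surfer-style link weighting. The features and multipliers are invented for illustration; the patent describes click-likelihood weighting as a concept, not these values.

```typescript
// Toy illustration of Reasonable Surfer-style weighting: instead of
// splitting a page's PageRank evenly across its links, weight each
// link by an estimated probability of being clicked. The features
// and multipliers below are invented for the example.
interface PageLink { href: string; aboveFold: boolean; inFooter: boolean; }

function clickWeight(link: PageLink): number {
  let weight = 1.0;
  if (link.aboveFold) weight *= 3.0; // prominent links get clicked more
  if (link.inFooter) weight *= 0.2;  // boilerplate links get clicked less
  return weight;
}

// Distribute a page's rank across its outbound links by click weight.
function passedRank(pageRank: number, links: PageLink[]): Map<string, number> {
  const total = links.reduce((sum, l) => sum + clickWeight(l), 0);
  return new Map(
    links.map(l => [l.href, pageRank * clickWeight(l) / total] as [string, number])
  );
}
```

The point of the sketch: two pages with identical link counts pass very different amounts of rank once layout enters the equation.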

Never dull with Google these days!

Google's Freshness Update - Impact and Implications

Google's Freshness update hit the search engine result pages (SERPs) a couple of weeks ago and, like any major update (especially in the wake of Panda), created stress and anxiety for websites everywhere as fear of another search apocalypse started flooding the SEO forums. The particular angst-inducing statement from Google was:

“today we’re making a significant improvement to our ranking algorithm that impacts roughly 35 percent of searches and better determines when to give you more up-to-date relevant results for these varying degrees of freshness”

Nothing makes website owners more nervous than when Google announces improvements!

35% certainly sounded like this would result in Panda-like tremors - but as the traffic reports and discussion started hitting the Internet, it seemed more like a ripple than another quake. So what is going on here?

When analyzing Google algorithm changes (and predicting what Google will do in the future), I like to put on my product management hat and try to understand exactly what problem they are trying to solve as it pertains to their product: the organic search results.

Problem #1: When users search for trending/hot topics, content relating to that hot topic would get very little or no coverage in the standard organic results on the SERP. Most likely it would show up in the Universal Search section of the SERP in the form of news, but the other 10 results would most likely have no content related to the search - essentially giving irrelevant results. Strong, high-PageRank pages would dominate over the newer, fresher, more relevant content which had yet to garner any links or reputation. So, for instance, if you searched for "NBA Talks" you would see organic results that looked like this:

  • The NBA's own site would probably head the list because of its high PageRank and strength for the keyword "NBA". Very poor relevance.
  • Articles talking about the pending NBA lockout from many months ago - these would seem relevant based on the keyword search and likely had strong PageRank and inbound links since they had been around a while. However, this is not what the user is looking for, and a poor result for Google.
  • Very little new content relevant to the breaking story.

So this part of the update seems to be all about news and trending search terms. By looking at new content being indexed, news content and dramatic jumps in search volume for a specific phrase, Google can conclude that a certain phrase needs fresh content. In this case, I believe fresh means newly indexed pages relevant to the hot topic. Now if you search for "NBA Talks", you see this:


Wow – forget about the old, high-PageRank authoritative content. The results are dominated by new content. In fact, 100% of the first-page results are fresh and, in this case, news content. This is a far superior result compared to the old algorithm, which mostly showed older content.

However, if you search for "NBA", you see none of that fresh content in the results. Clearly, this part of the algorithm impacts very specific searches related to trending news, but only for search terms clearly relevant to the trending topic.
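
As a purely illustrative aside, here is a minimal TypeScript sketch of the kind of query-volume spike detection described above. The window and threshold are invented for the example; we have no visibility into Google's actual signals.

```typescript
// Flag a query as "trending" when today's search volume spikes well
// above its recent baseline. The 28-day window and 5x factor are
// invented for illustration only.
function isTrending(dailyVolumes: number[], spikeFactor = 5, windowDays = 28): boolean {
  if (dailyVolumes.length < windowDays + 1) return false;
  const today = dailyVolumes[dailyVolumes.length - 1];
  const history = dailyVolumes.slice(-windowDays - 1, -1);
  const baseline = history.reduce((sum, v) => sum + v, 0) / history.length;
  // A sudden multiple of normal volume suggests a hot topic that
  // deserves fresh pages rather than old, high-PageRank ones.
  return baseline > 0 && today > baseline * spikeFactor;
}

// Example: a flat month of volume for a phrase, then a 10x jump.
console.log(isTrending([...Array(28).fill(100), 1000])); // true
```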

Problem #2: For certain types of queries, a good result is dependent on freshness. Google specifically gave the example of reviews. If you are searching for reviews, you want to see recent reviews which pertain to the latest version of a product or represent the current state of a business. However, Google's algorithm returned results based on the authority of the pages, which would surface the most relevant content from the strongest pages. The content was relevant to the explicit search term (for example, 'iPad Review'). However, most users probably want to see iPad 2 reviews, even though they didn't explicitly type that in the search box.

In this case, fresh content can certainly mean long-standing pages that reflect the latest information via updates, as opposed to the first case I discussed, which appears to be news-specific. In analyzing the search results, it still seems that strong, authoritative content dominates, but perhaps with a tweak based on freshness.

With a little help from SEMRush, I am able to compare the changes in the search results for 'iPad Reviews', pre- and post-Freshness update.

  • Before the change – the top ranking page was an Engadget review for the original iPad 1 from nearly 2 years ago. Probably not the review most users were looking for.
  • In fact, the entire first page of results was dominated by old reviews.

Now, after the change, things look quite a bit different.

  • The number 1 slot has been replaced with an iPad 2 review, also from Engadget - indeed a much better result.
  • The number 2 slot has been replaced with an iPad summary page from CNN with a link to an iPad 2 review (way down below the fold). This is not really a good result (it would have been better to list an iPad 2 review first). It shows freshness in action, but also that high-PageRank, authoritative pages can still outrank the more relevant page (with a pinch of freshness mixed in).
  • Number 3 is still an iPad 1 review (very authoritative, with 396 linking root domains, so authority is trumping freshness here).
  • Number 4 is actually a Kindle Fire review which mentions the iPad - very fresh (48 minutes old as I write this) - and is an example of how trending stories can make it into the results even if the search term is not the hot topic.
  • Slot 6 is another iPad 2 review.
  • However, slots 5 and 7 are really poor results - search-result and tag pages; yes, the type of pages everybody has been told are bad and should be de-indexed in the wake of Panda.

Overall, these are better results for the user looking for up-to-date information, with a couple of iffy pages mixed in where freshness is trumping quality. Google still has work to do on this.

So what are the implications, and how do they affect our SEO strategies going forward? These are my initial thoughts; however, since these are the early days of this update, it will take some more time to reach final conclusions:

  1. News and timely content will get more visibility. Now, besides showing up in Universal Search and on the News tab, the first page will be dominated by news content. Keep in mind, though, that this will not apply to general searches but to searches very specific to trending and hot topics. Therefore, a shift to more of a news focus should result in additional traffic due to this increased visibility. It still remains to be seen whether new, fresh content will be able to retain strong rankings over time or whether it will fall back.
  2. Fresh content is getting more visibility for general searches too - this includes updated content as opposed to just new pages. The impact is more subtle and does not seem to dominate the SERP like the first case.

    How much content needs to change before it is viewed as fresh? Too soon to tell, but you can bet that Google has put in checks to try to stop the obvious gaming that is probably already in motion (such as randomizing sidebar text snippets or swapping out lists of links). I am not saying these techniques won't work or that you shouldn't try them ... just that if they do work, they might not be sustainable, as opposed to legitimately fresh content.

  3. Google is likely categorizing searches to determine whether freshness is important to the quality of the results. A website strategy should focus on freshening content on pages where fresh content improves the page (such as review pages).
  4. I am sure Google will be tweaking this. The PageRank Algorithm did a decent job of making sure that worthy pages made it to the top, even if not timely. I have seen some poor results making it into the search results strictly on freshness.
  5. In general, the impact of this update does not seem nearly as severe as Panda - however, this one might whittle away at your traffic over time if fresh content slowly takes up positions.
  6. None of this means that site authority and linking programs are any less important than before. In most cases, the sites showing up still have strong linking profiles.

In all, a very interesting update from Google, and one I am sure they are not finished with. I'll continue to analyze this over time to see the longer-term impact.

Panda Metrics: It Isn't Time on Site or Bounce Rate

Wow - AfterPanda comes alive again! With the Panda iterations continuing to come, recoveries few (although I have seen some) and dramatic new changes coming to Google (the Freshness update), it is time for me to start blogging again.
Before getting on to new stuff, I wanted to pick up where I left off. I have played around on a few sites with a package called Clicky. I like Clicky because they have redefined bounce rate as a visit of under 30 seconds, which is a definite improvement over the single-page-visit metric usually seen in packages like Google Analytics.

Most likely, if you are decreasing your bounce rate, you are improving the user experience and satisfying your users more often. Most likely, but not necessarily. And even if you are getting a bit better, it does not tell you if you are good. Remember, Google is not looking at your bounce rate or time on site. They are looking at the G-Bounce (the return to the search results and the user's subsequent actions there).

None of the numbers in your analytics is as important as understanding why your user came to your site, what question they asked and what you need to do to answer that question. I am not saying you shouldn't try to improve those numbers ... you should. But it is not enough. You need to know what Google knows. More on how you do that in my upcoming posts ...

Metrics You Need to Recover or Prevent a Google Panda Hit

In my last post I pointed out what poor and misunderstood metrics bounce rate and time-on-site are, as traditionally reported by Google Analytics, Omniture and other web analytics tools. Post-Panda, here is the information we need (either to prevent a Panda hit or as part of a Panda recovery strategy):

  • True time on site (time spent on all pages, including bounces and last page).
  • Bounce, as I mentioned previously and as currently defined, is simply a single-page visit. While a useful metric, its value will vary by site, and it can also be skewed by page and site design and technology (for instance, Ajax).
  • Exit rates - These are far better than bounce. I want to know what percentage of people exit:
    • 15 second exits
    • 30 second exits
    • 45 second exits
    • 60 second exits
      My assumption is that the longer users stay on the site, the more positive the search experience.   
  • I'd love to know when the back button exits users from the site. A back-button return to Google is exactly what I am trying to prevent.

To count exits, you need to set a timer that triggers a ping event every 15 seconds - so if you get a visit with no pings, you know the user left in less than 15 seconds. This would also let you calculate true time on site. Of course, the last pageview may never end if the user decides to take a lunch break, so you would need to set some maximum time to count for any one page (perhaps 120 seconds).
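
Here is a minimal TypeScript sketch of that heartbeat, assuming a hypothetical /ping logging endpoint on your own server:

```typescript
// Heartbeat ping every 15 seconds, capped at 120 seconds per page.
// Zero pings recorded for a pageview means the user exited in under
// 15 seconds; summing pings per visit approximates true time on site,
// bounces and last page included.
const PING_INTERVAL_MS = 15000;
const MAX_PAGE_TIME_MS = 120000; // lunch-break cap for the last page
const pageLoadedAt = Date.now();

const heartbeat = window.setInterval(() => {
  const elapsed = Date.now() - pageLoadedAt;
  if (elapsed > MAX_PAGE_TIME_MS) {
    window.clearInterval(heartbeat); // stop counting after the cap
    return;
  }
  // Classic tracking-pixel request to the (hypothetical) logging endpoint.
  new Image().src = '/ping?page=' + encodeURIComponent(location.pathname) +
                    '&elapsed=' + elapsed;
}, PING_INTERVAL_MS);
```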

This is what we need in order to understand and track, over time, our site's performance in response to search. Let me add one very important thing when analyzing your metrics: you must create a segment that looks at these statistics only for visitors from search, and specifically a Google-only segment. What matters is your SEO visits; users coming to your site via other means may have very different metrics.
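
A quick sketch of how that segment could be captured at tracking time, using the standard document.referrer property (the seg parameter name is made up for the example):

```typescript
// Tag the visit by traffic source so the heartbeat pings above can be
// segmented when reporting.
const referrerHost = document.referrer ? new URL(document.referrer).hostname : '';
const segment = /(^|\.)google\./.test(referrerHost) ? 'google-search' : 'other';
// Append '&seg=' + segment to each ping, then report the exit-rate
// buckets for the google-search segment separately.
```
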
Once you get good metrics like these in place, you will have a much better understanding of what pages or parts of your site might be causing problems - as well as a better way to measure improvements to your site.

I am going to spend some time looking for off-the-shelf solutions to help capture these metrics. I'll let you know what I find in my next post! :)

Bad Metrics - Bounce Rate, Time on Site and Google Panda Recovery Metrics


Recovering from Panda is all about improving the metrics that Google uses to measure satisfaction with your site. So, as you set forth on your aggressive Panda recovery plan (or Panda inoculation plan if you have not been hit), you are naturally trying to figure out what to measure. The metrics everyone is obsessing over are 'Time on Site' and 'Bounce Rate' as reported by Google Analytics. However, these metrics are extremely flawed, and if you are looking at them, it is important to understand the flaws.

Time on Site - The real time a user spends on your site is an obvious and clear measure of the success of your website. However, there is a fatal flaw in the way Google reports Time on Site: Google does not include the last page of a visit (since it has no way of telling when that page view ended). This, of course, excludes bounces as well - so Google does not include bounces when calculating average time on site.

I was talking to somebody at a Pandalized site that had an average time on site of 4:30 - how could they possibly have been hit, he wondered? I explained that the 4:30 only includes the happier users, those who liked what they saw on their first page view and decided to click on something.

Bounce Rates - Bounce rate is yet another flawed metric. A bounce, as measured by Google Analytics, is simply a single-page-view visit. Perhaps a user landed on your site, did not like what they saw and bounced right back to Google in 15 seconds - or perhaps they stayed 90 seconds reading the great content on your page. Both of those visits would be counted as a bounce. To make matters worse, on-page JavaScript or Ajax interaction would not negate a bounce either, since it doesn't fire another page view. So again, we have a highly flawed metric that is being obsessed over by SEOs, web developers and site owners.
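
As an aside, sites with heavy Ajax interaction can correct for this in classic Google Analytics by firing an event when the user actually engages; any event hit negates the bounce for that visit. A sketch, assuming the async ga.js snippet is installed (the element id and event names are made up for the example):

```typescript
// Fire a GA event on a meaningful Ajax interaction so an engaged
// single-page visit is no longer counted as a bounce.
declare var _gaq: any[]; // provided by the classic async ga.js snippet

document.getElementById('load-more')?.addEventListener('click', () => {
  // Any _trackEvent hit negates the bounce for this visit in classic GA.
  _gaq.push(['_trackEvent', 'engagement', 'ajax-load-more']);
});
```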

Better Metrics - In my next post, I'll look at alternatives to the standard way of measuring these two critical metrics.

Outbound Links as Part of a Google Panda Recovery Strategy: Panda Recovery Tip

The Panda update is about giving the user what they want. A Google search is always a question in the user's mind, even if not expressed in words. The search might be for a product name, or perhaps a business, such as the name of a hotel: "Hilton New York". If you have that Hilton page, you need to know all the questions implied by that search!


'New York Hilton' is what the user typed. However, there may be a slew of questions in the user's mind:

  • Is the neighborhood safe? 
  • How are the reviews? 
  • How much does it cost?
  • Are there any rooms available for my dates?
Remember, your goal now is to stop the user from a G-bounce back to Google. So, yes, understand your users, but more specifically, understand your users coming from Google. If your users want to see reviews and you don't have reviews for a specific hotel, link them off to Tripadvisor or find a partner who can fill that need.

Also realize that your page is not the be-all and end-all of the user's search, as much as you would like it to be. That being the case, send the user places to get more information. It is far, far better to send them someplace else to get what they need than to let them go back to Google. A return to Google to select another result tells Google the searcher's question wasn't answered!

Let's face it, both SEO strategy and website strategy have long been to limit offsite links. SEOs did not want to bleed PageRank, and websites were designed to keep users on their site at all costs. It is better not to hoard your PageRank anymore; focus instead on a complete user experience.