You&A with Matt Cutts at SMX Advanced 2014 (& Where is the Penguin Update?)

Every year at SMX Advanced Danny does a You&A with Matt Cutts where audience members submit questions, Danny asks them (the ones he feels are most relevant to a large audience, anyway) and Matt answers them. I live-tweeted this year’s session and this is the roundup of those tweets, along with anything I personally wanted to add. ;-)

Danny and Matt pre-chat with an awesome photobomb courtesy of Michelle Robbins. Ha.

The chat began with Matt throwing stuffed hummingbirds and other swag from the stage. Danny asked if Matt had any announcements (he said he had a lot of them) and said we’d go in a “question” then an “announcement” format for the chat.

I’d used my own made-up hashtag for my live tweeting (#SMXYA for SMX You&A), so you can see the full list here (I’ve pulled selected tweets on the topics covered for this post and added elaboration below each).

PAYDAY LOAN 2.0 – “PART B”


The first announcement Matt made was that the latest Payday Loan update (referred to as Payday Loan 2.0) was actually a 2 part update and that only the first part had launched earlier this month. He said he expected “Part B” to launch sometime soon – “probably later this week” but possibly “as soon as tomorrow”.

Danny then asked how Part B of Payday Loan 2.0 would differ from Part A. Matt said that part B would focus more on “spammy queries” vs. “spammy sites”.

Now, supposedly the Payday Loan algorithm has always focused on “spammy queries” but it’s possible Google was handing down punishment to sites on a “site level” whereas now maybe they’ll be handing down punishment on a “query level”, but that’s all speculation on my part. Matt didn’t go any further in depth than the above on the topic.

UPDATED ON 6/16/14 TO ADD: Apparently Matt tweeted out on 6/12 that Payday Loan 2.0 “Part B” began rolling out:


However, if it has, Barry Schwartz says the webmaster community doesn’t appear to be noticing. Barry’s coverage on this is here.

ON METAFILTER

Danny then asked Matt what happened with MetaFilter. Matt unequivocally stated that MetaFilter was not hit by Panda. Matt said that MetaFilter is a typical high quality site, though he did note that it was a typical high quality site with an outdated design/UI.

He then reiterated that not only was MetaFilter not affected by Panda, but that it was also not affected by Penguin. He added “there’s a lot of different algorithms we launch”. He mentioned that when MetaFilter did their post about their traffic loss, one of the things they suspected was that Google may have viewed them as spam, based on an email in which Google had supposedly cited a link from MetaFilter to a webmaster as an example of a “bad link”.

Matt said they “checked their records” and that in fact, they’d never cited MetaFilter as a spam link to anyone – someone had taken the Google template and inserted the MetaFilter link on their own.

Matt seemed to imply that MetaFilter was not getting any manual help with their traffic hit, but that Google was instead looking at what went wrong that caused them to hit a quality site in the first place and planned to fix that algorithmically.

I took away two things from this discussion. The first was that they were able to “check their records” on whether or not they’d ever cited MetaFilter as a bad link.

The second was that, according to the post made by MetaFilter, their traffic losses coincided with Panda updates (the graph MetaFilter shared makes it too hard to see the exact date of their mega hit, and they never gave the date in the post to confirm Panda from the outside looking in – but there were two Panda refreshes launched in November 2012). Yet Matt stated they were in fact never hit by Panda, but instead were hit by a different algorithm.

So, is Google launching covert algorithms or updates at the same time it does Panda refreshes? If so, does this mean some sites that think they’re affected by Panda actually may not be? We know Panda is one of the hardest algo hits to recover from – if there are other updates being tossed in at the same time, it could mean we’re barking up the wrong tree with “fixing” some sites. Sigh.

BETTER COMMUNICATION ON RECONSIDERATION REQUESTS


Matt then went into how Google is trying to handle reconsideration requests for manual penalties a bit better. On a first-time reconsideration request, it would appear that the process was that a site either received a denial or a removal on the manual – the reviewer apparently couldn’t add any notes. It wasn’t until multiple reconsideration requests were made for the same site that reviewers had the ability to start communicating with the webmaster.

Matt said they’ve now added the ability for the reviewer to add a note on every reconsideration request – even the first one – should they choose to do so.

Later in the session, Matt also admitted they know they need to do a better job of reaching out to and communicating with the small business owner community in the way they have with the webmaster community.

ON OVERLAPPING UPDATES


Danny asked about what we all get pissed about – why does Google overlap updates? Are they trying to confuse us? Matt made it seem like Google tried not to overlap updates. In the specific case Danny was asking about, Matt said that Payday Loan 2.0 (Part A) was originally slated to go early on the weekend, while Panda 4.0 was scheduled to go later in the week. Matt implied a weird series of events occurred (he was *very* vague on what those were exactly) that caused them to launch closer together than Google originally planned. He said their goal wasn’t to confuse webmasters.

WHY DON’T WE GET NOTIFICATION IN GWT FOR ALGORITHMIC HITS?


Danny asked why we don’t receive “you’ve been hit by Panda” type notices. I took Danny’s question to mean why didn’t we get notices for being hit algorithmically by an update the same way we do manual actions in GWT. Matt answered that they do try to let webmasters know about large (emphasis on large) algorithm updates when they make them by making a public announcement – which didn’t answer the question. Either I mistook what Danny was asking or Matt mistook what Danny was asking or Matt understood what Danny was asking and evaded answering by giving that response. ;-)

GWT IMPROVEMENTS


Matt made a couple of announcements about new features in GWT as well as “coming soon” features in GWT.

The first was that GWT recently added a “fetch and render as Googlebot” feature. He said they can fetch Ajax and JavaScript now. He said that now that Googlebot can understand more code, we should stop blocking JS and CSS files from being crawled. He said that “more help” was coming to GWT in regards to robots.txt files and that more help was also coming for hreflang. Also on the “upcoming” list was more help in regards to errors from app indexing.
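For anyone who has been blocking JS and CSS, the fix is usually just a robots.txt change. A minimal sketch (the paths here are hypothetical – adjust to wherever your site actually keeps its scripts and stylesheets):

    User-agent: Googlebot
    # Remove old rules like "Disallow: /js/" or "Disallow: /css/" and,
    # if a broader Disallow still applies, explicitly allow the assets:
    Allow: /*.js$
    Allow: /*.css$

Googlebot supports the * and $ wildcards in robots.txt, so those two Allow lines cover script and style files anywhere on the site.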

Matt said they’ve made improvements in GWT as far as site move documentation and planned to continue improving that process. He made it sound like the improvements for site moves would be in the form of both documentation and features, but that was my take and not explicitly stated. No exact (or vague) timelines for the “coming soon” features were given.

WHEN WILL THE NEXT PENGUIN UPDATE BE?


Danny asked if there had been a Penguin update since the last announced update (which was October 4, 2013 for those keeping track). Matt said he didn’t believe so. Danny got flustered there for a second – asking Matt how he could “not know” if an update had occurred, LOL (thanks Danny, cause we were all thinking that). Matt implied that they’d been focused on the Panda 4.0 release. He then said – no lie – that an engineer came up to him and said it was probably time for a Penguin refresh and Matt had agreed with him… and the topic changed.

Side note: We (I was on the panel) were asked about the upcoming Penguin update in the Ask the SEOs session the next day – and Greg Boser said that he believed it was coming soon and that it would essentially be the biggest update yet. I’d added that I tended to agree – with all the information we’ve been feeding Google over the last eight months about which sites are shitty in the form of disavows, this one should be big.

WHY ARE THERE SO MANY HOOPS TO JUMP THROUGH TO FIX A PENGUIN PENALTY?


This was the best way I could title this part of the discussion, LOL. Danny asked about why Google makes it so hard to recover from a Penguin penalty. Why the need for link removal – the “link walk of shame” versus simply disavowing things and being done with it. Matt called it a “fair question” but was then very vague on answering it.

He made a comment about how spammers could build tons of spam links today and then disavow tomorrow – I think the implication was that spammers would be able to get a domain penalized and then unpenalized too easily, thus the “link walk of shame” – and the length of time it takes to recover from Penguin – but I could be wrong.

Danny then offered an alternate solution. Make it easier to bounce back from the FIRST hit but take the tough stance on any subsequent hits for spammy link activity. Matt didn’t seem to like that idea – but didn’t give any specific reasons why, LOL.

INTERNET EXPLORER 8 REFERRERS ARE BACK


Matt said IE8 referrers were back and Danny replied, “you mean the referrers that don’t tell us anything?” and Matt laughed. He said they are now showing you IE8 referrers again, though yes, those are simply lumped into the referrers from Google. I think his reason for mentioning this was so that any sites with a significant IE8 user base would know why they might suddenly be seeing a bump in Google referrals. He didn’t give a date as to when this occurred.

ON GWT KEYWORD DATA


Matt had said a long time ago that Google Webmaster Tools was working on showing you / storing a year’s worth of keyword data (right now they show 90 days’ worth). Danny asked when that promise would be fulfilled. Matt said he’d seen someone tweet during an earlier session to download the data every ninety days (it was a tweet of a comment I’d made in the Keyword Research on ‘Roids session). He said he knew this wasn’t ideal, but gave an “it is what it is” kind of response on it. No timeline was given on when – or if – the “year’s worth of data” would happen.

IS LINK BUILDING DEAD?


Danny said at this point, Google should start telling us what IS allowed because it seems like that would be a much smaller list to keep track of than what isn’t allowed. Danny asked Matt if link building was dead. Matt said, “No, link building is not dead”.

So Danny clarified that he wasn’t asking if links in regards to Google’s algorithm were dead – he was asking if actually going out with the goal to “build links” was dead. Matt then referenced a blog post that Duane Forrester had written that stated:

“You want links to surprise you. You should never know in advance a link is coming, or where it’s coming from. If you do, that’s the wrong path.”

Matt agreed with the sentiment of Duane’s remarks on building links in that post – however, he said the part where Duane said you should never know in advance that a link was coming was “going a little too far” as far as Google is concerned. He drove home that it was ok to create amazing content knowing it will help drive you links – providing the content is actually amazing and that people are linking to it because it’s amazing.

And that about summed it up. Matt implied that what Google takes issue with isn’t a website developing awesome content that is created with the hopes of attracting links – he implied the issue was with “building links” through all the ways Google had explicitly stated were against their guidelines – and through bare minimum efforts where the content wasn’t spectacular and the purpose was solely to obtain a link (versus a link and users and conversions and publicity).


Danny then asked if Google could really have an indication of a page’s value without links (in reference to the video featured in this article). Matt said yes, it was possible. Danny asked if Google could turn off links cold turkey then. Matt did his famous “uhhhh” and laughed. I assumed that meant the answer was no. ;-)

WHAT RUMORED OUTSIDE FACTORS ARE REALLY FOLDED INTO THE ALGORITHM?

The next few questions focused on things that have been rumored to be a factor in Google’s algorithm.


Danny asked about Author Rank and said it wasn’t a hard question to answer – is it being used in regular web search, yes or no. Matt said Author Rank was being used for In-Depth Articles (something Google had already confirmed). Beyond that, Matt definitely didn’t want to – and didn’t – give an answer. He didn’t say yes, but he also didn’t say no. Then Matt mentioned he was a fan of Author Rank – and the topic changed.


Next up was user engagement as a ranking factor. The BUT was that while Matt said they were open to signals, he is “extremely skeptical” of using site engagement factors at face value and at scale in the algorithm because they are very subject to manipulation.


Matt said there’s “currently” no boost for a site because they use SSL. Matt once again mentioned though that he’s a fan of SSL – he said that anything that makes the web more secure is better for all of us. Danny asked if that meant Google would default to showing the https version of a site over the http version if Google knew about and could access both versions. Matt said that at one point there was actually favoritism built in for the http version, but he believed that has since been removed.


Later in the session, Danny asked if Google+ was dead. Matt responded no, with a hurt voice LOL. Matt said G+ data was not used in the general rankings. He was quick to add though that if you’re searching Google logged in, you’ll possibly see effects on your personal rankings as a result of your activity within Google+ (I concur).


Danny tried to get some admission – positive or negative – on whether Google looked at social signals coming from networks *other* than Google+. Matt responded by joking that this is why search engineers don’t want to come to search shows. In other words, we got no answer to that question.


Later in the session, Danny asked if site speed mattered in regards to your rankings. Matt said that only extremely slow sites needed to worry about site speed (he used the example of “like 20 seconds”). AKA, he seemed to imply that a site that loaded in 2.4 seconds had no advantage over a site that loaded in 4 seconds.

WHAT IS THE DEAL WITH TIMED PENALTIES?


The day before the You&A, I’d submitted a question for Matt to Danny via Twitter. My question was:

“Matt has spoken a lot about timed penalties, saying the length often is determined by the infraction. Is there any instance where a site might get a timed penalty that does not show in GWT manual actions tab? And if all penalties, even those with ‘timers’ would show in GWT, how do you know if yours carries a ‘timer’?”

Danny asked my question using his own wording. Matt laughed and said he knew where that question had come from and called me out in the audience. I waved.

Matt said if you get a manual, it will be visible in GWT. He seemed to imply that all manuals had some kind of timer attached to them. The smaller the infraction, the “shorter the timer” so to speak. He definitely didn’t answer the part about how do we know how long the “timer” was set for whatever manual penalty we’d incurred.

He once again said that all manuals eventually expired (which is not new info) – but I’d like to note here that Matt has said in the past that if you don’t fix the reason for a manual and it expires, he’s confident Google will find out and hit you with another manual fairly quickly.

So, here’s my takeaway – Google will never tell us how long a penalty “timer” will last based on which rules were broken because that would mean we might be willing to take the risk if we know the “timer” would be short if we got caught.

The other thing I took away from this is that it looks like (thinking aloud here) there may be two aspects to removing a manual penalty. One is fixing it and having Google remove the manual action in GWT. That’s Google confirming you’ve fixed it and are no longer violating their guidelines. The second aspect is waiting for the “timer” to expire – and how long that takes is up for debate, determined by the “level” of your infraction on a “bad to really bad” scale we have no insight into.

There have been multiple, multiple reports of people fixing manuals yet seeing no recovery despite Google removing the manual action notice after a successful reconsideration request in GWT. My guess is your timer has yet to run out, even if Google acknowledged you fixed the root cause by removing the manual action notice in GWT.

Matt was clear that manual penalties came with timers attached to them. He was also clear that all manual penalties eventually expire regardless of whether or not the offense that caused said penalty is fixed. What Matt was NOT CLEAR on was whether the “timer” remained (however many months, based on however bad what you did was) even AFTER you had the manual action notice removed in GWT.

Additionally, Matt was clear on stating all manual penalties show in GWT. I will however say that I don’t necessarily believe that is the case 100% of the time personally, but, according to Matt, that’s how manual penalties roll.

CAN YOU “REAVOW” A LINK?


So, you’ve disavowed a link (or someone you hired did in a crazy bid to save you from Penguin) that wasn’t actually a bad link. Danny jokingly asked, is there a way to “reavow” it? Matt said yes, that you could essentially “reavow” a link by removing it from your current disavow file and then reuploading the file with the link you’d like to “reavow” removed.
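For context, the disavow file is just a plain text file, so a “reavow” is nothing more than uploading a new version with the line in question deleted. A quick sketch with hypothetical example domains:

    # disavow.txt as originally uploaded
    domain:spammy-directory.example
    http://decent-blog.example/post-linking-to-you/

    # the "reavowed" version simply omits the second entry:
    domain:spammy-directory.example

Whether – and how fast – the link regains its value after that is anyone’s guess.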

However, Matt seemed like he totally didn’t like the “taste” of those words in his mouth. I couldn’t tell if this was more because he pictured people “reavowing” some of their shitty links after getting a manual penalty removed with this knowledge or if it was because a link was somehow “damaged” with a disavow and he didn’t want people accidentally shooting themselves in the foot. Or it could have been neither of those. But, he definitely seemed like something about discussing “reavowing” links made him uneasy.

ON THE IMPORTANCE OF MOBILE


Matt spoke for several minutes on the topic of mobile. He kept saying how important it was for us to be mobile ready. He asked the audience how many people had auto-fill markup on their mobile site forms. Hardly anyone raised their hands. Danny said “that’s not mobile” and Matt said “yes it is”. He said that the mobile-dominant Internet is “coming faster than most people in this room realize”.
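If you’re wondering what “auto-fill markup” looks like in practice, the standard HTML autocomplete attribute is the usual way to hint it to mobile browsers. A minimal sketch with hypothetical field names:

    <form method="post" action="/contact">
      <!-- autocomplete tokens tell the browser which saved values to offer -->
      <input type="text"  name="full-name" autocomplete="name">
      <input type="email" name="email"     autocomplete="email">
      <input type="tel"   name="phone"     autocomplete="tel">
      <button type="submit">Send</button>
    </form>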

ON NEGATIVE SEO


This is where typing quickly while trying to live tweet will sometimes get you haha. In the tweet I said Part A when I meant Part B.

Danny asked Matt what was going on with negative SEO. Matt said Google was very aware of negative SEO – but then sort of clarified they are very aware about how many people are worried about negative SEO. He implied that Payday Loan 2.0 Part *B* would be closing some of the loopholes people are using for negative SEO.

SO ABOUT HUMMINGBIRD

Danny asked how search would be different for wearables.

Matt replied, “so about Hummingbird” and pulled out his phone, which has Google Now. He asked his phone “where is the Space Needle?” and his phone responded with the address. He then asked, “I want to see pictures” and his phone showed him pictures of it. Matt asked, “who built it?” and his phone answered. Matt asked, “how tall is it?” and the phone answered. Matt said, “show me restaurants near there” and his phone showed him a map listing of them. Matt said, “how about Italian?” and his phone showed him a listing of Italian restaurants. Matt said, “navigate to the closest one” and his phone pulled up the map with the directions. The room clapped.

Matt said he thought this showed how wearable search would be different. He said Hummingbird was about connecting these dots. Matt also admitted this worked better with mobile than with desktop, because people tend to use more natural language with mobile. He said he expected desktop to improve as they learned more from the mobile use of it.

DO JAVASCRIPT LINKS PASS VALUE?


Matt said “mostly, yes” in response to being asked if JavaScript links are handled like regular links. Matt also pointed out that you could add the nofollow attribute to JavaScript links and Google would see it.
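To illustrate, a link added via JavaScript can carry a nofollow the same way a hard-coded one does. A minimal sketch with a hypothetical URL:

    <script>
      // Build a link in the DOM; per Matt, Google "mostly" treats it like a regular link
      var link = document.createElement('a');
      link.href = 'http://example.com/some-page/';
      link.rel = 'nofollow'; // Matt said Google will see nofollow on JS links too
      link.textContent = 'Example link';
      document.body.appendChild(link);
    </script>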

ON CONTENT / QUIZ MILLS

Danny asked Matt how he felt about BuzzFeed and sites like them that essentially produce shallow content that people eat up on social media. Matt said BuzzFeed has contacted them asking why they don’t rank better. Matt said everyone thinks their own website is above average in quality, even when they’re average or below average. It was obvious he thought BuzzFeed was overestimating their quality in regards to how they should rank.

ABOUT ALL THAT YOUTUBE SPAM

Danny asked what action Google was taking against known spam tactics – and then used YouTube spam as an example. Matt said they keep their ears open – he seemed to imply they know about spam tactics well before they implement something to take action on them algorithmically. He said targeting these tactics algorithmically can sometimes take some time.

SHOULD YOU HIRE A LINK BUILDING SERVICE?

Matt started to point out the differences between “link building” (of the “PR4, with this anchor text, in content” variety) and building links by “being excellent”.

He didn’t seem to have an issue with hiring someone to help you be excellent, help come up with creative ideas and help with the execution of those ideas. His issue was with hiring a “link building firm” in the 2009 sense, so to speak. AKA, hiring a promotions / publicity / true marketing company is ok, hiring a “link building” company is bad. My take based on his comments – to be clear.

IS THERE A GOOGLE “WHITELIST”?


The last question Danny asked was whether or not Google had a “whitelist” of sites immune to penalties. Matt was really dodgy on this one, simply saying it “could happen”. So Danny asked for an example – a specific site that had been whitelisted. Matt said he didn’t know a specific one to give.

Matt then made it clear that a whitelist would only exist for something that was a known false positive. I wish I could remember the exact wording for y’all to dissect to death, but I don’t. :) It essentially amounted to only extreme circumstances where a site exhibited a false positive they couldn’t fix algorithmically for whatever reason – again, my take. Matt then said, unequivocally, that there was no whitelist – at all – for Panda (I can’t remember if he included Penguin here as well or not).

EDITED TO ADD

On 6/20/14 SMX released the video of Matt’s full talk. Check it out below.

There ya go folks. Expanded coverage on the live tweets mixed with a few of my own opinions. Until next time…

About Rae Hoffman

Rae Hoffman aka "Sugarrae" is an affiliate marketing veteran and the CEO of PushFire, a search marketing agency specializing in SEO audits and link building strategies. She is also the author of the often controversial Sugarrae blog. You can connect with Rae via Twitter, Google+ and Facebook.


Comments

  1. Ryan Biddulph says:

    Thanks for the updates! Always fun to see what MC’s up to, and the Pay Day loan developments are good news.

  2. Ashley says:

    I’m worried sick about the Penguin update. I disavowed but can’t remove any links. I was hit by a negative SEO attack – there are approximately 500-600 ridiculously spammy links I found. I disavowed all of them, but nobody will get back to me about deleting them, as most of them are on unmanaged websites with guest commenting. If Google had any sense, it would be plain obvious I never built the links. I hope I will recover on the next update despite being unsuccessful in deleting these links.

    • Ashley – you can add notes in the disavow file to let them know that you tried to remove the links to no avail. We’re all waiting on Penguin – hopefully the update will hit soon.

    • rob woods says:

      Hi Ashley,

      Unfortunately Google has to build systems aimed at true spammers. If they knew you were legit and would never build those kinds of links, “it would be plain obvious [you] never built the links” – but those are likely the very kind of links that a spammer might try to build to see if they work. Google has to treat all sites the same, so they don’t know why these links were built or by whom. It stinks, but unfortunately legit sites have to go through all the painful steps that Google puts in place to stop spammers and black hats. Google could make it easier, but in this case you mostly have to blame the black hatters who ruin things for the rest of us.

  3. rob woods says:

    Hi Rae,

    Thanks for posting this (I didn’t take enough notes during the session…)

    On, “Additionally, Matt was clear on stating all manual penalties show in GWT. I will however say that I don’t necessarily believe that is the case 100% of the time personally, but, according to Matt, that’s how manual penalties roll.” I agree with you that not all manual penalties show up. An obvious example is if WMT is implemented for a site after the penalty occurred, or if a particular user was added to an account after the manual action was created. It’s always been an issue for me – when I get WMT access after a message was originally sent, I never see the messages unless I get access to an account that existed from before the action. That’s only one example of messages not showing in WMT, so I am sure there are others.

    • Yeah, I have two instances of sites that recovered 20 months to the day from their biggest hit (and Google claims they did nothing on that day in regards to updates to the algo regarding spam)… and neither of them ever received a notification of a manual (100% fact).

  4. Ed Yates says:

    I absorbed more in the last 15 mins of reading this (I’m a slow reader) than I have all month! Very nourishing info, the penalty timer makes sense. @Rob Woods, @Rae – Have you seen many GWMT manual action messages simply vanish without a trace for unknown reasons?

    Matt may be letting slip some really interesting grey lines within this speech that need exploring – can’t wait to test out these theories further. Matt’s cat getting his tongue (statement?) re. mixmash referring signals from some of the other ‘largest social networking platforms in the world’ is a booya for me

    • Hey Ed – depends on what you mean by vanish – if you’re talking the unnatural link notices email, etc. I haven’t seen the email notifications from them vanish. If you’re talking a manual action being listed under the manual actions tab one day and gone the next – that I’ve seen. Sometimes after a successful reconsideration request, sometimes straight out of the blue on manuals that have been there a while – so I’ve always assumed that the penalty expired in those cases. :)

    • rob woods says:

      I have to agree with Rae. I have definitely seen messages just vanish. Rae called out Matt Cutts in a question about recent disappearances at SMX Advanced. I asked the same question but Rae got hers in first and Danny quite rightly chose hers to ask Matt ;) We both recently noticed some messages disappearing at the same time even though Google claims nothing happened at that time. I had a client with a manual link penalty… the first reconsideration request was rejected, so we did another cleanup, and when I went to do another request GWT rejected it saying there was no manual action on the site. Sure enough, when I checked the Manual Actions portion of the site, the message about the action was gone. So, yes, messages do disappear, and I do believe that messages sometimes don’t show up in WMT, especially if they happened before the account was created or if the account you are using to look at WMT was added to the site’s profile after the message was sent. Google isn’t saying, IMO because it would be embarrassing for them, but my suspicion is that there were a bunch of penalties that “should” have expired and did not, and that Google recently caught this and expired them all at or near the same time. Matt was very clear in saying that all manual penalties eventually expire and that when that happens the message would disappear from the “Manual Actions” portion of the site.

  5. Doc Sheldon says:

    Thanks for taking the time to share the You & A with us, Rae. You’re ‘preciated!
    I’m still undecided about the timers. To me, it doesn’t make a lot of sense to revoke a manual after a recon, but essentially leave the penalty in force for some period of time.
    Not that Google would necessarily refrain from doing something that doesn’t make sense, of course. :P

  6. Vanessa says:

    Do you think that, as a whole, Google tends to favor blogs based in the States and in the U.K., giving them higher PageRank?

  7. I would imagine Matt hates going to these events, really. Can you imagine being in his position, having effectively cost so many people their businesses and livelihoods? I mean, sure, he’s got a bit of a cult following, and the people wanting to gain an insight no doubt outweigh the blackhats, but still. It must be a little bit daunting for the poor guy!
