My “Meet the Search Engines” SMX West 2014 Takeaways

When I walked into the “Meet the Search Engines” session at SMX, I definitely wasn’t planning to “live tweet” it. Typically, these sessions consist of Duane Forrester offering up clear directives for ranking well in Bing while Matt Cutts gives very vague answers about ranking in Google.

I have to say, though, that Matt – while still vague at times – was more forthcoming in this session than I’ve seen him in recent years, and as a result I ended up tweeting quite a bit.

Meet the Search Engines

*Photo (used with permission) credit goes to Virginia Nussey from the Bruce Clay blog

While live tweeting, I was merely parroting the answers given rather than adding any commentary of my own, due to time constraints between questions. So, I figured I’d compile the list of tweets below and add my own thoughts where I have them. :)

The first question was from Danny Sullivan and was a general “do you have anything to announce” question. Matt mentioned several things.

A “more soft” version of Panda is coming soon

The first was that Google would be releasing a new, “more soft” version of Panda. He then clarified that it would appear more soft to “most of the people in this room”. I took that to mean searchers wouldn’t notice, but SEOs would.

You need to get your site “mobile ready” and you’d better do so soon

He then offered up that serving a great mobile experience was quickly increasing in importance – noting that he personally believed mobile searches would overtake desktop searches on Google within a year. More on why that matters for “you” below.

A well-known guest blog network will be “hit” next week

Pretty self-explanatory. Tons of chatter in the SEO world right now with everyone taking guesses as to who that might be.

EDITED TO ADD:

Matt released a subsequent tweet on 3/19/2014 confirming that the network he mentioned at SMX had been officially hit. Because he gave us some “advance warning”, I’d been watching the branded terms for several networks that connect blogs with guest post authors. MyBlogGuest is no longer ranking for its branded terms since Matt’s tweet, so the suspicion is they were the unlucky “winner”.

EDITED ONCE AGAIN TO ADD:

On 3/19/14, the owner of MBG confirmed her site had received a manual action from Google. Barry Schwartz published a post on 3/20/14 citing a tweet from Matt Cutts.

But it shouldn’t come as any surprise. Finding the users of the service is as easy as can be. Add in some accompanying IFTTT variables and I’m expecting service users will definitely see some sort of fallout. The real question is how much.

Are we getting organic keyword data “back”?

While I was heading to SMX, Larry Kim posted on Google+ that Google was working on a solution to the “not provided” issue. I took that to mean that Google was looking to bring back organic keyword data in some fashion.

During the opening keynote, Amit Singhal had made the following statement:

“Over time, we have moved to secure searches. Referrers are not passed to webmasters, but they are passed to advertisers. But webmasters get a lot of information in Webmaster Central.

But over a period of time, we’ve been looking at this issue. We’ve heard from our users that they do want their searches secure — this is really important to users. We like how things have gone on with the organic side of search.

So, in the coming weeks and months, we’re looking at better solutions for this. We have nothing to announce, but we have discussed with the ads side about how we should handle this in the future.”

After I was able to see the quote myself, I think that statement actually reflected that paid advertisers may have cause to worry about their future keyword data. But, to be clear, that was my interpretation based on the quote above and a slight comment Matt made in the MTSE session (I don’t remember the exact wording, and Matt doesn’t run the paid side, so he was clear he wasn’t “in the know” regarding what was going on over there).

What Matt did say was that he interpreted Amit’s comments to mean that Google was happy with the results of taking away organic keyword data in regards to creating a better experience for their users [insert slight eye roll from me here] and he didn’t see that decision being reversed.

Could a Penguin penalty follow you with a URL change even without a redirect?

The first audience question asked during the session was actually one I’d submitted. After seeing that John Mueller had commented that a penalty could follow you if you changed domains, even without using a 301 (if not much changed about the site other than the URL), there was some debate among several SEO-themed private groups about whether that was in reference to Panda or if it also applied to Penguin.

Now, if you’re being hit for duplicate content (Panda) and change domains but not the duplicate content, it makes sense it would “follow you” without a redirect. The part I wanted clarification on was whether or not that was also true in regards to Penguin.

Since I was of the belief that the comment also indicated a Penguin penalty could follow you, I asked my question fairly deliberately. Rather than ask if it applied to Penguin, I instead asked why a Penguin penalty would follow you – because if you don’t use a redirect, you’re essentially disavowing all of your links – so why the hell would Google then “hunt you down”, so to speak, when you’ve already said “Uncle” and resigned yourself to starting over?

I purposely phrased my question to present the potential for a Penguin penalty to follow you after a domain change, even without a 301, as a “fact” so to speak. The reason for that was to either have Matt say that “fact” was wrong, or have him elaborate on why Google would do so (vs. just asking if it was a fact and getting a yes or no).

Matt’s response was that if you have a Penguin penalty, Google doesn’t want you to “change domains” as some sort of mass disavow. They want you to actually disavow the links for the current domain. And he definitely implied through his (longish) answer that Penguin could indeed follow you, and gave a few reasons as to “why” Google would do so (sorry, I don’t remember the exact wording, but it was essentially that they want you to fix your issues and not “run away” from them, so to speak). But the sentiment was clear – a Penguin penalty potentially following you without a 301 is an actuality.
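For anyone who hasn’t used it, the disavow file Matt is referencing is just a plain text file you upload through Google’s Disavow Links tool. A minimal sketch of the documented format (the domains here are made up for illustration):

```
# Lines starting with "#" are comments and are ignored.
# A bare URL disavows links pointing at you from that single page:
http://spammyblog.example.com/guest-post-roundup/
# A "domain:" line disavows links from an entire site:
domain:paid-links-directory.example.net
```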

Keeping your parameters clean is a best practice in both engines

Duane commented here that webmasters shouldn’t simply rely on canonical tags, but rather try to fix the problem at the source and clean up the URLs. Duane said the canonical tag was meant for instances where you couldn’t do so, but that you shouldn’t be using tons of them.

Matt quickly chimed in, saying that you could use the canonical tag as much as you wanted without issue. But he was also quick to mention that it is indeed best to fix the problem at the source if possible, to prevent “split link popularity” issues.

The mention of the potential for split link popularity left me wondering whether the canonical tag doesn’t fully transfer link popularity to the canonical URL. I’ve always felt like Google has given the impression that a canonical tag essentially merges all aspects of a duplicate page into the correct source page. But the statement above might imply otherwise.

Now, I hate when people dissect every word Matt says as if they have a beeline to his brain. His comment could have meant multiple things or nothing; that was simply the question mark that entered my head after hearing his comments on the issue. That said, I’ve always been a fan of only using the canonical tag when fixing the issue at the source isn’t possible, so either way, it doesn’t change much for me.
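To make “fix it at the source” concrete, here’s a minimal sketch in Python (my own illustration, not anything Matt or Duane outlined, and the parameter list is a made-up example) of stripping the tracking and session parameters that commonly spawn duplicate URLs – so the clean URL is the one you actually link to, and the canonical tag becomes the fallback rather than the fix:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Hypothetical parameters that create duplicate URLs for the same content.
DUPLICATING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid", "ref"}

def clean_url(url: str) -> str:
    """Drop tracking/session parameters so one URL serves the content."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in DUPLICATING_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(clean_url("https://example.com/product?id=42&utm_source=news&sessionid=abc"))
# -> https://example.com/product?id=42
```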

Widgets are okay for some companies, but probably not yours

OK, so my heading on this question is purposely me being a smartass. This topic was spurred by Danny asking if Getty was breaking Google’s guidelines by linking back to their site in their image embeds. Matt said no, because Getty’s “intent” wasn’t to manipulate Google. He cited several more examples of large brands doing this without the “intent” of manipulating Google. He also noted that keyword-based links from widgets were really bad, regardless of intent.

The problem here for me is that you can’t scale determining intent. So if you’re not big enough for Google to look at and weigh in on your “intent”, then you’d probably be at risk of getting smacked for using widgets, even if your “intent” was legitimate.

Seriously, get “mobile ready” ASAP

Matt had already mentioned that mobile may soon overtake desktop when it comes to people searching on Google. He mentioned that sites not delivering a good mobile experience might not rank as well for users searching from a mobile device.

To be very clear – he did not say penalty or anything even remotely close to it.

He used Flash not rendering for an iPhone user as an example. If your site is Flash based, then serving it up as a result to iPhone users wouldn’t be a “good experience” for those users, and Google may choose to adjust its results for that searcher accordingly.

What this means for “us” is that getting sites mobile ready can no longer sit on the back burner (if you’re reading my non-mobile site from a mobile device, then yes, I am a pot calling a kettle black). If being mobile responsive has been an item on your to-do list that you haven’t yet found time to address (like I haven’t here on Sugarrae), then you’d better make the time – and soon.
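If you want a quick-and-dirty read on where a site stands, here’s a tiny Python sketch that just checks for a responsive viewport meta tag. It’s a crude heuristic of my own – certainly not how Google evaluates mobile experience – but a missing viewport tag is usually a tell:

```python
import urllib.request

def has_viewport_tag(url: str) -> bool:
    """Crude mobile-readiness smoke test: look for a viewport meta tag."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        html = resp.read(200_000).decode("utf-8", errors="replace").lower()
    return 'name="viewport"' in html

# example.com is a placeholder; point this at your own pages.
print(has_viewport_tag("https://example.com/"))
```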

How high is the “risk factor” for a site being penalized?

Duane was very clear that Bing doesn’t levy a lot of penalties. He said you’d have to do something really, really bad to get a penalty from Bing. He implied Bing was smart enough to discount vs. needing to penalize.

Matt, on the other hand, let us know that Google is policing spam in 40 different languages when it comes to manual review (I didn’t even realize there WERE 40 different languages). But Matt was also quick to note that Google mainly relies on its algorithms to handle spam, implying the large majority of penalties are algorithmic and not handed down manually.

How long do penalties last?

Matt then noted that the time frame of your penalty can be affected by the severity of your infraction. According to John Mueller, the average site is probably looking at 6-12 months to recover from a penalty even after cleaning it up. (*cough unless you’re a big brand?)

I’ve often noted when discussing Penguin recovery that if your penalty is algorithmic, then after cleaning everything up you’ll have to wait for the next refresh of the Penguin filter to “be released” from said filter (same goes for Panda). Matt has confirmed this to be true before, and did so (blatantly) once again during the session.

Since Google says they’re no longer “announcing” the filter refreshes, Danny asked Matt to give some time frames for both Panda and Penguin data refreshes. Matt was reluctant to give an answer, so Danny started “suggesting some”.

After some back and forth, Matt said that saying the Panda filter updated somewhat monthly was a fair statement. He also pointed out that Panda refreshes are now done over a stretch of days vs. being a stark “hit” in one day as they’d been in the past.

When it came to Penguin, Matt was much more “dodgy” on giving a time frame than he was on Panda. Danny asked if there had been a Penguin data refresh since the last announced one (October 4th, 2013 for those not keeping track).

Matt said that to his knowledge, no, there had not been a Penguin data refresh since that date. Danny pressed on, looking for a time frame for Penguin refreshes, and asked if 6 months was a fair estimate.

Matt really “ahhhh, uhhhh”‘ed this one, but then told Danny six months was a “somewhat fair” timeframe to expect Penguin data refreshes – and pointed out that the Penguin data refreshes are more complicated to implement so to speak than the Panda ones.

As the local carousel becomes more prevalent, how do we better rank in it?

Matt completely clammed up here and offered no advice at all. He cited spammers exploiting it as the reason for needing to stay completely mum on the topic.

What should you be focusing on to rank in today’s algorithms?

Duane cited focusing on content, usability, social signals and link building – and he made it clear he stated them in order.

Matt said he agreed with most of what Duane had said. He then emphasized that you need “great content” while telling us he knows we’re tired of hearing him say that. He started to give a few examples of sites he felt were producing great content and “doing it right”. The one he seemed to mention most was Android Police.

Matt joked that as a Googler, he hated that they were somehow able to get leaks on Android, but was quick to point out that despite that, they were doing it right, so to speak, as an example of “great content” under his definition.

Does Google have and use “Author Rank”?

Danny asked Matt if Google was using “Author Rank” to influence rankings. To be clear, Matt did not call it “Author Rank”, but he also didn’t correct Danny on using the term. He simply said that yes, those signals were being used in regards to the In Depth articles appearing in Google. He did not elaborate on whether they were or weren’t being used to determine rankings outside of In Depth articles, and he was clear that he was only commenting on their use in regards to In Depth articles.

Google and JavaScript / iFrames

I didn’t live tweet about this one, but it’s worth noting that Matt mentioned Google is now much more able to read and execute JavaScript. When Danny asked about Getty image embeds, he specifically asked if the fact that they were in an iframe meant they had no impact in regards to Google. Matt coyly indicated that Google was getting better with iframes as well.

What do we need to know about Hummingbird?

Matt answered this pretty quickly and succinctly, as if there wasn’t much that needed to be said or done on the topic. He definitely implied via his statements that it was more about how Google treats search queries and less about “our websites” – which is something Ammon Johns (in my opinion, correctly) declared a long time ago.

How concerned do I need to be about Negative SEO?

Matt seemed to feel that most people claiming negative SEO weren’t actually “victims” of negative SEO, but rather people who once employed shady practices (or once employed firms that used shady practices).

I tend to nod my head at this one, after seeing multiple clients come to us for Penguin recovery telling us that “their old SEO firm said they got hit by negative SEO” when, in fact, everything I’m looking at says that isn’t the case – negative SEO is simply the scapegoat for old work coming back to bite them in the ass.

He, as usual, implied negative SEO isn’t something the average webmaster needs to worry about. On that, I don’t necessarily agree. But I also believe there’s not much you can do about it if it hasn’t happened to you except worry – and that’s not productive for us as marketers.

I have some bad links, but haven’t been penalized – what do I do?

Matt said that if you’re aware you have bad links, he’d probably disavow them. Someone tweeted at me that Google has always recommended removal over disavowal and was a little confused by Matt’s immediate jump to disavowing. To be fair, the topic of discussion centered heavily on when to disavow or not disavow, and I think that was the reasoning behind Matt saying “disavow” vs. saying “remove, then disavow” – AKA, don’t read too much into that.

After the session, Matt apparently read my live tweets of it.

He tweeted back, essentially clarifying that he wasn’t trying to incite paranoia. I took his response to mean that if you have a few bad links you didn’t obtain yourself, you probably don’t need to worry. But if you have a shitload of comment spam you created during your 2006 link building campaign and haven’t been “hit” for it yet, you may want to be proactive in getting those links removed or disavowed to avoid being hit in future Penguin updates.

About Rae Hoffman

Rae Hoffman aka "Sugarrae" is an affiliate marketing veteran and the CEO of PushFire, a search marketing agency specializing in SEO audits and link building strategies. She is also the author of the often controversial Sugarrae blog. You can connect with Rae via Twitter, Google+ and Facebook.


Comments

  1. Awesome post Rae. Excellent live tweet coverage with some great follow-up elaboration. I especially like your (*cough unless you’re a big brand?) – say JCPenney, Rap Genius, Teleflora, etc.

    Your use of technical “jargon” (e.g., “But if you have a shitload…”) is excellent.

    When Duane says that Bing penalties are not common, no kidding. If you want to see the ghosts of the first page of Google’s penalized sites for a head term, just search the same term on Bing. I have seen posts dedicated to how to salvage ROI on penalized sites by using more link spam to rank on Bing :)

    Happy St. Patty’s!

  2. Great post as always Rae!

    Speaking of how we’d better adapt to mobile: if Google wants to rank me according to how mobile-ready I am, I might as well ban Googlebot now.

    Scenario: I have a classic car authority site built with FrontPage and other 1999 WYSIWYG software that is visited by old guys who still have old phones, if they have a mobile phone at all. I am lucky they know how to turn on a computer, but they do, and they visit my authority site multiple times a day and have it “Saved as their Favorites” in Netscape lol. I was smart and built an email list on day 1 for the site. 80% return visitors.

    I don’t need Google until my audience dies off. Do I spend hundreds of hours (and dollars) to convert it because Google thinks everything should be mobile? I don’t think so.

    Once again TY for the live tweets at the show. It was good knowing someone would be there to post the “Read Between the lines”. :)

    • Vinny – well, in my eyes, even if you “don’t need Google”, you still might need to do it. You can easily check your analytics to see what percentage of YOUR specific visitors are coming in via mobile. If it’s a large percentage of them and if your site gives a less than desirable user experience on mobile, then, yeah, I’d take the time to flip it. Both Sugarrae and PushFire are in the midst of redesigns and one of the big “spurs” for that was to make them mobile responsive.

      And you’re welcome re the post. :)

  3. Rae
    You are always ahead of the curve with your insight into the SEO and marketing process. Thank you, as always, for being a breath-of-fresh-air reporter and just a real cool person.

  4. Thanks for the notes, Rae – I was at SMX, but couldn’t stay long enough for this one.

    Love what you said about Duane vs. Matt – clear vs. vague. Too bad no one seriously works on Bing rankings…

  5. Rae, thanks a ton for the coverage. I am shocked nobody else was talking about the US guest blogging network that was in line to get hit. I’m amazed at how fast it went from “guest blogging is OK”, to “guest blogging is getting abused”, to “stop guest blogging”, to “we are going to hit a network of sites that have been guest blogging”. It’s usually taken Google a lot longer to come full circle on a specific tactic. They must be getting a little more impatient, or have inside knowledge of the end of the world and want to have a clean web before then. ;-)

    • You’re welcome Mike :)

      Re the blog network… I think there’s a lot of “expected that” behind it. When they first took the stance on paid links, they subsequently went after several “buy PR5 links” networks. Re the time frame, I think guest blogging has been harder for them to detect vs. paid links, and that means you hit the people who have been doing it most lazily and “in multiples” first – at least, that’s what I’d do if I were them. :)

  6. Awesome recap – very thorough and insightful. Love it!

  7. Thanks Rae, good recap — appreciate the nuance. Any opinions on the slaps yesterday?

  8. Roy Coan says:

    Rae, thank you for your excellent summary of the SMX “Meet the Search Engines” session.

  9. I know you ‘called yourself out’ for not having a mobile site… But I am reading this post and typing this comment from my Nexus 5 and I GREATLY prefer the full desktop site experience on my phone versus mobile, and even responsive, sites.

    I think it is very foolish of Google to assume that everyone wants a mobile site on a mobile phone. Ugh.

  10. Akash Agarwal says:

    It’s really a great post – lots of useful insights here. Thanks for sharing.
