When and How to Listen to Google’s Public Statements About SEO – Whiteboard Friday
Posted by randfish
When Google says jump, it’s hard not to jump. Often we take the words of Google representatives as edict and law, but it’s important to understand subtleties and to allow for clarification with time. In today’s Whiteboard Friday, Rand discusses some angles to consider that will help you stay grounded when the “Big G” makes a statement about SEO.
Howdy, Moz fans. Welcome to another edition of Whiteboard Friday. This week we’re going to chat about the public statements that Googlers make and how we, as marketers, as SEOs, should be interpreting and understanding those statements.
So I actually wrote down a few things that Googlers have said. These are quotes taken from websites that have quoted them, so they may not be perfect. For example, if you are someone from Google who actually made these statements, you might say, “That’s not exactly how I phrased that.” Well, it’s how the websites quoted you. Search Engine Roundtable, The SEM Post, Search Engine Land, places like that, are where I got these quotes.
When something is missing
So for example, someone from Google says, “301, 302, 307, don’t worry about it. Use whatever makes sense for you. They all pass PageRank.” So you might have seen over the last few weeks there have been a lot of tweets, stories, and blog posts written about how we, as SEOs, no longer have to worry about the type of 30x redirect that we put in place. If there are 302s, that’s fine; Google seems to be passing PageRank through them.
Well, there’s actually been a bunch of discussion about this, because the evidence points the other way: if you have a website with a bunch of redirects that are not 301s, say 302s or 307s, and you change them to 301s, the permanent redirect status code, it sure looks like Google organic search sends more visits to the targets of those redirects. Why would that be if the redirect type didn’t matter in the first place? Is it just correlation rather than causation? The results look too consistent for that. Or is there something else going on here?
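To make the distinction concrete, here’s a minimal sketch of the wire-level difference between the two redirect types. A tiny local HTTP server (the paths and page content are invented for illustration) answers one path with a 301 and another with a 302, and a standard client follows both to the same target; a 307 behaves like a 302 on the wire, except the request method must be preserved.

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical paths for illustration: /old-permanent issues a 301,
# /old-temporary issues a 302; both point at the same /target page.
class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/old-permanent":
            self.send_response(301)  # permanent: "this page has moved for good"
            self.send_header("Location", "/target")
            self.end_headers()
        elif self.path == "/old-temporary":
            self.send_response(302)  # temporary: "keep using the old URL"
            self.send_header("Location", "/target")
            self.end_headers()
        elif self.path == "/target":
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"destination page")
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # keep the demo output quiet

server = HTTPServer(("127.0.0.1", 0), RedirectHandler)  # port 0 = any free port
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# A browser-style client follows both redirect types to the same content;
# the difference a crawler sees is purely the status code along the way.
results = {}
for path in ("/old-permanent", "/old-temporary"):
    with urllib.request.urlopen(f"http://127.0.0.1:{port}{path}") as resp:
        results[path] = (resp.status, resp.read().decode())

server.shutdown()
print(results)
```

Both requests land on the same page, which is exactly why the question of whether the engine treats the status codes differently has to be answered by the engine’s behavior, not by the page a visitor ends up on.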
Many folks, for example, pointed to the fact that the word “PageRank” might be the operative thing here. In fact, this is one of the things I would say personally: when Google says they all pass PageRank, or they all pass the same amount of PageRank, remember that PageRank is Google’s original ranking formula from 1997, which Larry and Sergey developed at Stanford. It is not a comprehensive, holistic representation of every signal in Google’s ranking algorithm, whether that’s 200 or 500+ of them, nor everything that a machine learning system could possibly interpret. Maybe the machine learning systems in place at Google, for spam or relevancy or importance or trust, whatever they are, have determined that 301s are in fact the better one to use, or should be interpreted as a stronger signal. So you’ve got to be careful when reading a statement like this. It did generate a lot of discussion in our field, but it’s not the only case. This has happened for a decade and a half now in the SEO world, where people from Google say things publicly.
When they don’t get it right
For example, you might remember a couple years ago, “The mobile-friendly update will be bigger than Panda and Penguin combined.” Then, of course, the mobile update rolled out, in April of 2015, and we all scratched our heads and went, “Gosh, that was not much of an update at all. It seems like things didn’t shake up very much.” Then Google sort of explained, “Well, a lot of websites did end up updating. Oh, I guess we had a more staggered rollout than we were expecting, and so maybe you didn’t see a lot of change.” Well, certainly that seems awkward in comparison to that statement.
When we get clear-cut(ts) answers
Another statement, this statement I actually love. I love statements like this from Google. So this is when Eric Enge, from Stone Temple, was interviewing Matt Cutts and he asked Matt about whether a 301 redirect would lose some amount of relevancy or ranking ability when it was being moved over, whether there was any risk to moving a page. Matt replied, “I am not 100% sure whether the crawling and indexing team has implemented that sort of natural PageRank decay, so I’ll have to go and check.” Then there was a note in the text that said, “Note in a follow-up email Matt confirmed there is some loss of PageRank through a 301.” Well, PageRank or link ranking factors, whatever you want to call it.
That’s great. This is, “I don’t know, but I’ll go check with the team that does know,” followed by, “Yes, the thing you assumed is in fact the case, and I can confirm it.” That’s awesome. I love, love, love statements like this. I sort of wish we could nudge Google into doing more of that: we ask a question and they go, “Well, I think it’s this, but I’m going to go check with exactly the team that’s responsible for writing the code that implements that piece, so that we can give you an honest and complete answer.” That’s terrific.
When they’re saying there’s a chance
But then you might get statements like this one, which are real tough. “External links to other sites isn’t specifically a ranking factor, but it can bring value to your content, and that in turn can be relevant for us in search. Whether or not they are followed doesn’t really matter.” That is a hard, hard statement to interpret. The first sentence says, “External links. We don’t use them. They’re not a ranking factor.” The second sentence says, “But those links might bring value to your content, and that in turn can be relevant for us in search,” which almost seems to contradict the first sentence. Those two things don’t go together.
I think this statement was not from Gary; I believe John Mueller said this one. “Whether or not they are followed doesn’t really matter.” Okay, so if you are using them, followed or not followed doesn’t matter. Tough statement to interpret. I’m not sure what to take away from that. The only thing I think I might be able to do is to say, “I should probably test it. I should figure it out for myself.”
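If you do test it, it helps to be precise about what’s on the page: the only thing separating a followed external link from a nofollowed one is the rel attribute in the markup. Here’s a short sketch of how a crawler might classify them; the sample HTML, URLs, and class name are invented for illustration.

```python
from html.parser import HTMLParser

# Hypothetical classifier: sort anchor tags into followed vs. nofollowed
# buckets based on whether rel contains the "nofollow" token.
class LinkClassifier(HTMLParser):
    def __init__(self):
        super().__init__()
        self.followed = []
        self.nofollowed = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        # rel can carry several space-separated tokens, e.g. "nofollow sponsored"
        rel_tokens = (attrs.get("rel") or "").split()
        if "nofollow" in rel_tokens:
            self.nofollowed.append(href)
        else:
            self.followed.append(href)

# Invented sample markup with one link of each kind.
html = """
<p>See <a href="https://example.com/source">a followed citation</a>
and <a href="https://example.com/ad" rel="nofollow sponsored">a nofollowed ad</a>.</p>
"""

parser = LinkClassifier()
parser.feed(html)
print("followed:", parser.followed)
print("nofollowed:", parser.nofollowed)
```

Running this kind of audit over your own test pages at least tells you unambiguously which bucket each external link falls into before you try to measure whether the distinction affects rankings.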
Recommendations for analyzing and interpreting Google’s words
In fact, I’ve got some recommendations for you when you are analyzing these words from Google, because it can be really tough to know which statements you can trust. Which one is like the external links statement, hard to parse? Which one is the “I’ll go check and get back to you”? Which one is flat-out wrong? And which one is the “well, maybe this is right, but it’s not telling me the whole story”?
(A) Consider all the ways that the statement could be true while the surface-level reading is technically wrong. So, for example, on the external links one, maybe the statement is true that they’re not specifically used as a ranking factor, or not separately used, but maybe they’re used in concert with other signals. That may be what John was trying to say, and it just came out in a way whose surface reading is easily misinterpreted. So if someone from Google says, “A does not equal C,” you might say, “Aha, so that means B or D could equal C.” There you go.
(B) Give statements some time to be amended or modified, at least a few weeks. For example, you’ll remember the statement about 301s, 302s, and 307s was made by Gary from Google. Just a couple weeks later, he amended it to say, “Oh, right, there are also canonicalization issues, which are separate, maybe, from ranking issues, but probably you don’t care, because canonicalization will affect your rankings. 301s do help with canonicalization in Google, whereas 302s and 307s might not help as much.” That’s sort of saying, “Wait, so they are interpreted differently, and there could be some reasons why, when I change 302s to 301s, rankings and traffic go up. Aha.” That statement took a little while to come out, but it did kind of correct the record.
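Gary’s amended point, that a 301 is a stronger canonicalization hint than a 302 or 307, can be illustrated with a toy model. To be clear, the decision rule and URLs below are invented for illustration, not Google’s actual logic: the idea is simply that a permanent redirect suggests consolidating on the target URL, while a temporary one suggests the source URL may stay canonical.

```python
# Invented redirect records: (source URL, target URL, HTTP status code).
REDIRECTS = [
    ("http://example.com/old-page", "http://example.com/new-page", 301),
    ("http://example.com/campaign", "http://example.com/landing",  302),
]

def pick_canonical(source, target, status):
    """Toy rule: which URL might a crawler index as canonical?"""
    # 301 = permanent move, so consolidate signals on the target.
    # 302/307 = temporary, so the source URL may remain canonical.
    return target if status == 301 else source

canonical = {src: pick_canonical(src, dst, code) for src, dst, code in REDIRECTS}
for src, chosen in canonical.items():
    print(src, "->", chosen)
```

Under this toy rule the 301’d page consolidates on its new URL while the 302’d page does not, which lines up with the observation that switching 302s to 301s can move rankings and traffic even if “they all pass PageRank.”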
(C) I like data and experiments over opinions and public statements. So for example, a few months ago the folks at Reboot Online did a great study about external links. They created some fake words and built up a bunch of web pages; some of the pages had external links on them, and some didn’t. They saw that Google was extremely consistent in ranking the pages with followed external links above the ones with nofollowed external links, no external links, or internal links only. I think their results were pretty conclusive.
There are all sorts of reasons why this statement might have been wrong. Maybe when John said it, it was correct. Or maybe his second sentence is really the truth here and the first sentence is more, “Well, it’s not its own separate, specific thing,” and so the interpretation is what matters. In either case, that data, that experimentation, hugely valuable and important for us as an industry and I really like paying attention to those things and then trying to verify and replicate and apply on our own sites.
(D) The last thing I’ll say is, look, we need to be empathetic and forgiving. Googlers are working in a giant, giant corporation, with tens of thousands of employees and hundreds of different teams that potentially contribute. Just among the ones we know of, there are Core Ranking folks, Web Spam folks, Crawling and Indexing folks, Search Quality folks, Webmaster Tools folks, Webmaster Trends Analysts, and many other departments. It’s not always the case that a Gary or a John or an Andrey, or any of the other representatives, can go and talk to the engineers who wrote the code and have them pull it right up and say, “Aha, yes, this exactly is what’s going on here, and here’s why, and here’s how we wrote it.” You just don’t get that level of clarity and sophistication.
So they have to operate with the knowledge that they have and with the information that they are being told. We, likewise, need to give them some room to amend their statements. We need to follow up ourselves with our own data, and we need to be careful about how we interpret and parse the sentences and phrasing that they give us.
All right, everyone, look forward to your comments and your thoughts about things Google has said over the years, how they’ve been helpful to you, potentially harmful to you, and hopefully we’ll see you again next week for another edition of Whiteboard Friday. Take care.
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!