With my recaps of the Local SEO sessions at SMX West last month, we had a bit of a break from Greg’s Soapbox. Never fear, it’s back in full force this month!
I’ve been lucky enough to attend several large conferences over the last few months, and I’ve been a part of many discussions about what really works for local SEO. It seems that most people fall into one of two camps, and there’s a growing debate between the two.
On one side, we have people who hold the annual Local Search Ranking Factors (LSRF) survey, now run by Darren Shaw at Whitespark, as gospel. On the other, you have the anti-LSRF group, who think that the LSRF study is opinion-based poppycock (yes, someone actually called it “poppycock”). This side favors the insights gleaned from Andrew Shotland and Dan Leibson’s massive study of local ranking factors, in which they attempted to reverse-engineer Google’s local algorithm.
In many cases, but not all, the results of the study align with those of the survey — but in some cases, there’s a huge difference.
As I sat through these many conversations and debates over the last few months, I noticed something unsettling. Nearly every person I talked to on either “side” of the question seemed to fall into that camp by blind faith. They believed one way or the other because that’s the side of the fence they were “raised on,” so to speak.
Forget what anyone says — test it for yourself!
Maybe I’m just wearing my (officially licensed and available for sale) Greg’s Soapbox Tinfoil Hat, but in my entire career as an SEO, I’ve never simply accepted anything as the truth. I’ve always thought of myself as a bit of a mad scientist, conducting crazy experiments to see what really worked… and I’m incredibly surprised that so many people don’t look at things the same way!
It’s insane to read a blog post or two, or see a dynamic speaker at a conference, or even listen to your boss and trust that you’re hearing the absolute truth. We all know there are hundreds of factors that influence the relevancy of a site, and as local SEOs, we know that Google treats different business types and even different search queries in vastly different ways.
Don’t get me wrong — I’m not knocking the Local Search Ranking Factors study. I’ve been a participant for years, and I firmly believe it’s an amazing tool for anyone in the industry. But I also think that Shotland and Leibson have the right idea: you simply must test things for yourself to be sure they really work the way you expect them to.
To geo-optimize or not to geo-optimize?
The perfect example is geo-optimization. Most old-school local SEOs will tell you exactly how to optimize a page for a geo term, inserting it in the title tag, H1, content, alt text, URL and so on. On the flip side, the correlations in Shotland and Leibson’s study show that geo-optimization doesn’t really do anything. So who’s right?
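If you want to check those on-page elements quickly, the audit is easy to script. Here’s a minimal, hypothetical sketch (not part of the studies mentioned above) that checks whether a geo term appears in a page’s URL, title tag, H1 and image alt text, using only Python’s standard library. The function name, sample HTML and example URL are my own inventions for illustration.

```python
from html.parser import HTMLParser


class GeoAudit(HTMLParser):
    """Collect the title text, H1 text and image alt attributes from a page."""

    def __init__(self):
        super().__init__()
        self._stack = []   # open tags we're currently inside
        self.title = ""
        self.h1 = ""
        self.alts = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            # img is a void element; grab its alt text but don't push it
            self.alts.append(dict(attrs).get("alt", ""))
        else:
            self._stack.append(tag)

    def handle_endtag(self, tag):
        if self._stack and self._stack[-1] == tag:
            self._stack.pop()

    def handle_data(self, data):
        if self._stack and self._stack[-1] == "title":
            self.title += data
        elif self._stack and self._stack[-1] == "h1":
            self.h1 += data


def geo_report(url, html, geo_term):
    """Report which common on-page elements mention the geo term."""
    audit = GeoAudit()
    audit.feed(html)
    term = geo_term.lower()
    return {
        "url": term in url.lower(),
        "title": term in audit.title.lower(),
        "h1": term in audit.h1.lower(),
        "alt": any(term in a.lower() for a in audit.alts),
    }


sample = """<html><head><title>Plumber in Springfield | Acme</title></head>
<body><h1>Springfield Plumbing Services</h1>
<img src="van.jpg" alt="Acme service van in Springfield"></body></html>"""

print(geo_report("https://example.com/springfield-plumber", sample, "Springfield"))
# prints {'url': True, 'title': True, 'h1': True, 'alt': True}
```

A real audit would also look at the URL slug structure, meta description and body copy, but even a toy check like this gives you a baseline to test against rather than an opinion to take on faith.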
I’m on Greg’s Soapbox, so I’m right. Here’s the answer: none of us is right.
In some cases, geo-optimization might not do squat for a website. If it’s a competitive vertical, and every site has geo-optimized out the wazoo, then of course it won’t work. It’s exactly the same issue I discussed in my post last fall about unique content no longer being important because everyone is unique.
In other verticals that might be a bit behind or a bit less competitive, geo-optimization can be a huge game-changer. If you’re working on a site, and it’s the only one in the local market that’s well-optimized for that city, then boom — you win!
The issue is this: neither the LSRF results nor Shotland and Leibson’s test will tell you what’s right for your own site or your clients’ sites. You’re going to have to test things for yourself to find out what really matters.
The Local Search Ranking Factors study is incredibly valuable because it points you in what’s probably a good direction. The 40 or so participants in the study are at the absolute top of the local SEO game, and I know for a fact that every single one of them is always testing. It’s a good bet that if the LSRF study points you in a direction, it’s a smart choice to follow and test that factor for yourself.
Same thing with Shotland and Leibson’s test — there’s a good chance their data is pure gold as well, and it should give you a starting point for your own tests.
Regardless of which camp you fall into, don’t trust anything on blind faith. Become a mad scientist and test things for yourself — you’ll be a better SEO, and you’ll get much better results for your clients.