Everything You Need to Know About Hidden Text & SEO
Hidden text used to manipulate Google’s algorithm can get you penalized. However, there are a few valid reasons to hide content.
Hidden text is one of the oldest tricks in the SEO handbook. If you’re hoping hidden text will somehow boost your SEO efforts, you’ll quickly discover this outdated tactic is ineffective.
Back in the day, when search engines were much less sophisticated, you could hide text on webpages in an attempt to rank for certain keywords not visible on the page. You could also hide links on other websites pointing back to the page you wanted to rank.
Also known as “content cloaking,” this tactic used to work because, even though the text was hidden from users, search engines could still crawl it. But that’s no longer the case – search engines are much more sophisticated and better at detecting spammy tactics.
Why Hide Text?
The reasons for using hidden text, and how it is implemented, can vary.
Here are a few reasons why some SEO professionals use this tactic.
Including Keywords They Can’t Show to the Public
Competitor names are a common example. Some sites try to rank for competitor brand terms but can’t display those keywords on the page due to legal compliance, corporate marketing policies, or a desire to keep the tactic quiet.
There are also SEOs who cloak misspelled keywords because displaying them in a post would look unprofessional – to put it bluntly, just wrong.
Keyword Spamming the Page
Some SEO professionals believe that increasing the keyword count on a page can help that keyword rank. This may have been an effective strategy in the ancient ages of SEO (late ’90s to early 2000s), but not today.
Hiding Links
Links are still strong ranking factors, and many sites used to obtain hidden links from other sites. These links were hidden because they were often unrelated to the content of the site where they were posted.
Sometimes the links are added to sites owned by the same company, or by a partner with a predefined relationship. Other times, sites are hacked to insert the links; this is not only bad for SEO, it is also illegal.
Google doesn’t like these methods of optimization because they aren’t focused on improving ranking through quality content; instead, SEOs are merely trying to get around the search engine’s algorithm.
Over the years, Google has improved its capability of determining if and where hidden content exists.
If, for some reason, your hidden content gets past Google’s sophisticated crawler undetected, the hidden content or links are often of such low quality that the page will still rank poorly. Additionally, Google has manual reviewers whose sole task is to check websites for these kinds of things and penalize offending sites accordingly.
Valid Reasons to Use Hidden Text
Google uses various methods to determine whether hidden content exists on a site, but they also allow other forms of hidden content. Here are a few valid reasons to hide content:
Part of Navigational Elements
Too many links on the screen can be overwhelming, so drop-down menus, multi-level menus, accordion navigation, tabbed menus, slider menus, and the like are used to keep the page from appearing cluttered.
The main rule here is that it should be visually obvious to users how to reveal the hidden content. An arrow, a button, or a link that users can easily find to display the hidden content is valid to use without any negative SEO implications. The intent behind hiding the content relates to user experience and avoiding clutter on the page.
Paid Content Subscription
Google allows websites that offer paid subscriptions to hide content behind a paywall and has honored the First Click Free method of cloaking. Under First Click Free, a visitor arriving from a Google search result sees the content on that first click; on subsequent visits the content no longer appears, and you need to log in, and often pay, to view it. (Google has since replaced First Click Free with Flexible Sampling, but the principle of sampling paywalled content remains.)
The intent here is simply to give a preview of what the publisher’s paid subscription has to offer.
Elements of the Pages Designed for Mobile & Desktop
Responsive sites change and adjust based on the dimensions of the viewport. Once a certain width threshold is reached, some page elements disappear and others appear, even though all of them exist in the source code at the same time; some are simply hidden temporarily. This is done for usability, and Google is aware of these different viewport formats. It won’t penalize your site if the intent really appears to be a proper mobile and desktop user experience.
This also applies to cases in which features such as CSS or JavaScript are disabled in a browser, or when a full page simply can’t load due to bandwidth constraints. Search engines may see both pieces of content, but as long as the content in the degraded view is exactly the same as the content in the normal view, there shouldn’t be a problem.
The common thread in all of these situations: the intent to hide content was never about gaming the algorithm in an attempt to improve search engine rankings.
How Hidden Text is Created & Ways to Detect It
Same Colored Text and Background
White text on a white background is one of the oldest methods and the easiest to detect. Simply highlighting the page (CTRL+A) can expose this text, or you can always check the source code.
Disabling CSS can also expose this kind of text, but content hidden with the old-school color attribute of the HTML 4 <font> tag will stay hidden, since that method doesn’t use CSS.
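As a rough illustration, a script can flag elements whose text color matches their background color. This is a minimal Python sketch under stated assumptions: the regexes and the find_same_color_text name are ours, and it only inspects inline style attributes, not external stylesheets.

```python
import re

def find_same_color_text(html: str) -> list[str]:
    """Return inline style attributes whose text color equals the background color."""
    suspicious = []
    for style in re.findall(r'style="([^"]*)"', html):
        # The lookbehind keeps 'color:' from matching the tail of 'background-color:'
        color = re.search(r'(?<![-\w])color\s*:\s*([^;]+)', style)
        background = re.search(r'background(?:-color)?\s*:\s*([^;]+)', style)
        if color and background and \
                color.group(1).strip().lower() == background.group(1).strip().lower():
            suspicious.append(style)
    return suspicious
```

Running it over a saved page source lists every inline style that renders text invisible against its own background.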
CSS Hidden Text
CSS can hide content in numerous ways, such as the properties display:none, visibility:hidden, height:0, width:0, text-indent:-1000px, etc. These can easily be exposed by disabling CSS or simply viewing the source code.
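A simple scanner for these properties might look like the following Python sketch. The pattern list mirrors the techniques named above but is illustrative, not exhaustive, and the flag_hiding_css name is ours.

```python
import re

# Common CSS hiding techniques; this list is illustrative, not exhaustive.
HIDING_PATTERNS = {
    "display:none": r"display\s*:\s*none",
    "visibility:hidden": r"visibility\s*:\s*hidden",
    "zero height": r"height\s*:\s*0(?:px)?\s*[;\"}]",
    "zero width": r"width\s*:\s*0(?:px)?\s*[;\"}]",
    "negative text-indent": r"text-indent\s*:\s*-\d+",
}

def flag_hiding_css(source: str) -> list[str]:
    """Return the names of hiding techniques found anywhere in the page source."""
    return [name for name, pattern in HIDING_PATTERNS.items()
            if re.search(pattern, source, re.IGNORECASE)]
```

A hit isn’t proof of spam – legitimate navigation widgets use these same properties – but it tells you where to look.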
User Agent Detection
Server-side scripting languages (such as PHP, ASP.NET, JSP, ColdFusion, Perl, or Node.js) can detect user agents, which are normally used to determine which web browser a visitor is using. The same detection can identify search engine bots. When Googlebot, or another search engine crawler, is detected, a different version of the page is sent; viewing the source code of the normal page won’t reveal the hidden content at all. The only way to identify this type of cloaking is to change your browser’s user agent to mimic a search engine bot. Many browser plugins can change the user agent so your browser pretends to be a search engine.
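To make the mechanism concrete, here is a minimal Python sketch of the server-side decision such cloaking relies on. The token list and function names are hypothetical; real bot user-agent strings are more varied than this.

```python
# Hypothetical substrings a cloaking script might look for; real crawler
# user-agent strings vary and this list is deliberately simplistic.
BOT_TOKENS = ("googlebot", "bingbot", "slurp")

def is_search_bot(user_agent: str) -> bool:
    """Crude user-agent sniffing: is any bot token in the UA string?"""
    ua = user_agent.lower()
    return any(token in ua for token in BOT_TOKENS)

def page_for(user_agent: str) -> str:
    """A cloaking server sends the keyword-stuffed copy only to crawlers."""
    if is_search_bot(user_agent):
        return "page with hidden keyword-stuffed content"
    return "clean page shown to human visitors"
```

Switching your browser’s user agent to a bot string, as described above, makes a server like this take the first branch and expose the cloaked version.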
IP Address Detection
Similar to user agent detection, except the visitor’s IP address is checked instead. Every request to a web page comes from an IP address, and some search engine IP addresses are publicly known, so server-side scripting can determine whether the visitor is a search engine crawler. You can check for this type of cloaking by fetching the page through Google Translate or by looking at Google’s cache; the latter detection method won’t work if the cloaking page uses the meta noarchive tag. This method also becomes problematic for the developer cloaking the content, because it is hard to find a complete list of the IP addresses search engines use.
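A sketch of the IP-based check in Python follows. The single range below is one commonly published Googlebot block; treating a short hardcoded list as complete is exactly the maintenance problem described above.

```python
import ipaddress

# One published Googlebot range; a real deployment would need a full,
# regularly refreshed list, which is hard to maintain.
CRAWLER_RANGES = [ipaddress.ip_network("66.249.64.0/19")]

def looks_like_crawler_ip(addr: str) -> bool:
    """Return True if the request IP falls inside a known crawler range."""
    ip = ipaddress.ip_address(addr)
    return any(ip in network for network in CRAWLER_RANGES)
```

Any crawler arriving from an address outside the hardcoded ranges gets the human version of the page, which is why stale lists break this scheme in both directions.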
Reverse and Forward DNS Detection
IP addresses can be spoofed, so the most sophisticated way to detect crawlers for cloaking is reverse and forward DNS verification. Ironically, Google and Bing/Yahoo will tell you how to do this. Search engines publish this information because of the valid reasons to cloak content, such as implementing First Click Free for paid content subscriptions. Similarly, to check whether content is hidden this way, you can use Google Translate.
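The check the search engines describe boils down to three steps: reverse-resolve the visiting IP to a hostname, confirm the hostname is on the engine’s domain, then forward-resolve that hostname and confirm it round-trips to the same IP. A Python sketch, with the lookups passed in as functions so the logic stays testable (in production you would pass wrappers around socket.gethostbyaddr and socket.gethostbyname):

```python
def verify_crawler(addr, reverse_dns, forward_dns,
                   suffixes=(".googlebot.com", ".google.com")):
    """Reverse-then-forward DNS verification of a claimed crawler IP."""
    try:
        host = reverse_dns(addr)           # e.g. socket.gethostbyaddr(addr)[0]
    except OSError:
        return False
    if not host.endswith(suffixes):        # hostname must be on the engine's domain
        return False
    try:
        return forward_dns(host) == addr   # e.g. socket.gethostbyname(host)
    except OSError:
        return False
```

A spoofed IP fails the round trip: the attacker can fake the source address but cannot make the search engine’s DNS zone point back to it.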
What to Do When a Competitor Uses Hidden Text
Google is pretty good, but still not perfect. Once in a while you will see a high-ranking page outranking your site despite being full of hidden content.
What can you do about it? Google has a page for reporting this: the Google spam report page.
Just because you reported a page does not mean it will go down. The report will be reviewed by Google’s manual reviewers, and if they find the page is hiding content on purpose to gain a ranking advantage, the page can be penalized in Google. If they find the same issue occurring across many sites, it may lead to an algorithm update in the future.