Optimizing your adult website to boost its search engine rankings can be a time-consuming process. Because of this, some site operators may be tempted to use SEO tricks that worked in the past. However, most, if not all, of these tricks no longer work as a means of gaining long-term organic traffic. Worse, most of them will now get your site penalized by Google for abusive SEO tactics.
The following five old school SEO tricks worked fine until Google updates closed the loopholes they relied on. Nowadays they’ll only work to get you penalized. Knowing what they are can help you avoid them altogether, in favor of more legitimate techniques that will help you sleep at night knowing your targeted traffic won’t fall off a cliff anytime soon.
Keyword Stuffing
Current SEO best practices revolve around providing high-quality content with sensible use of relevant keywords. It has not always been this way. In the past, sites would stuff their pages with keywords, with poorly written content serving simply as a vessel for those keywords, in an attempt to stuff their way to the top of the SERPs.
This method took advantage of more primitive search algorithms, which at the time couldn’t differentiate between pages whose content existed merely to hold keywords and pages with truly valuable content. Typically, a site’s meta tags would also be stuffed full of keywords, on top of the exact-match keyword phrases spammed throughout the copy the site hoped to rank for.
This all changed in 2011 with the Google Panda update, which made keyword-stuffing ineffective as an on-page SEO method. Now, Google’s search algorithms can easily tell whether your content is of low quality, plagiarized, or simply used as a pretext to embed keywords, and penalize it in the SERPs as a result. To gain in the rankings your site needs to feature well-written and sourced content.
To thrive in this new environment, keywords should come second to the quality of your content, and they should never be spammed (use your primary keyword no more than two or perhaps three times for each 500-750 words). Additionally, try using a variety of keywords rather than focusing on just one, especially if you can find some long-tail keywords that can help convince web crawlers that your content is of high value.
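If you want a quick sanity check on a draft, a minimal Python sketch like the one below can flag probable over-use. The keyword, sample copy and thresholds here are illustrative assumptions, not hard rules – search engines score quality, not ratios.

```python
import re

def keyword_usage(text: str, keyword: str, per_500_words: int = 3) -> dict:
    """Rough heuristic: flag copy whose primary keyword appears more
    often than roughly three times per 500 words of text."""
    words = re.findall(r"[\w'-]+", text.lower())
    hits = len(re.findall(re.escape(keyword.lower()), text.lower()))
    allowed = max(1, round(len(words) / 500)) * per_500_words
    return {
        "word_count": len(words),
        "keyword_hits": hits,
        "allowed_hits": allowed,
        "over_optimized": hits > allowed,
    }

# Hypothetical draft copy -- substitute your own page text and keyword.
draft = "Our cam site reviews cover every major cam site in depth..."
print(keyword_usage(draft, "cam site"))
```

Treat the output as a prompt to reread the copy, not as a verdict.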
Spammy Tags, Footers
The footer of a website is supposed to be used primarily to help visitors navigate your site without having to scroll back to the top. However, in the past, footers were often stuffed full of links and tags by site operators trying to boost their position in the SERPs. Besides being unsightly, this made it a pain to navigate sites weighed down by redundant links and tags.
With usability being an emphasis of current search engine algorithms, you can imagine that this approach is no longer a winner when it comes to helping a site rise in the SERPs. Just the opposite, in fact, as Google’s Panda and Penguin updates set the stage for penalizing sites with spammy footer links and suboptimal website structure.
Current SEO best practice calls for a clean footer design that includes genuinely useful information, such as site usage terms and contact details. This information should be provided without spamming links or tags, which can cause your site’s rankings to tank.
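Auditing your own footers is easy to script. Here’s a rough sketch – it assumes the requests and beautifulsoup4 packages, and the URL and threshold are just placeholders – that counts the links in a page’s footer so link-stuffed templates stand out.

```python
import requests
from bs4 import BeautifulSoup

URL = "https://www.example.com/"  # hypothetical page to audit

html = requests.get(URL, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

# Most modern templates mark the footer up with a semantic <footer> tag.
footer = soup.find("footer")
links = footer.find_all("a") if footer else []

print(f"{len(links)} footer links found")
for a in links:
    print(a.get("href"), "-", a.get_text(strip=True))

# Illustrative threshold: dozens of keyword-rich footer links is a red flag.
if len(links) > 25:
    print("Warning: footer looks link-heavy -- consider trimming it.")
```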
Overusing Keyword-Heavy Anchor Text in Internal Links
When structuring the pages on your site, it’s important to include internal links so that search engine bots can crawl your site’s pages efficiently. However, be careful when doing so to avoid incurring an over-optimization penalty by constantly linking the inner pages of your site using anchor texts that are rich in keywords.
In the past, pointing redundant keyword-targeted anchor texts at a page from other pages on the site was a surefire way to improve its position in the SERPs. This was often done by linking to a single URL with a variety of keyword variations placed in different anchor texts.
Now, Google’s emphasis on value means that this approach can actively hurt your pages’ rankings. Links should be seen first in terms of how they bring value to the user. When you do link internally, be sure not to use the same anchor text over and over – vary your anchor texts to avoid being penalized by Google for redundancy.
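To see how repetitive your internal anchors actually are, a sketch along these lines can help. It again assumes the requests and beautifulsoup4 packages and a hypothetical URL, and it only inspects a single page – a real audit would crawl the whole site.

```python
from collections import defaultdict
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

PAGE = "https://www.example.com/"  # hypothetical page to inspect

html = requests.get(PAGE, timeout=10).text
soup = BeautifulSoup(html, "html.parser")
site = urlparse(PAGE).netloc

anchors = defaultdict(set)  # target URL -> distinct anchor texts seen
for a in soup.find_all("a", href=True):
    target = urljoin(PAGE, a["href"])
    if urlparse(target).netloc == site:  # internal links only
        text = a.get_text(strip=True)
        if text:
            anchors[target].add(text.lower())

# A target reached by one keyword-rich phrase from everywhere is the red flag.
for target, texts in sorted(anchors.items()):
    print(f"{len(texts)} distinct anchor(s) -> {target}")
```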
Setting Up Separate Pages for Each Target Keyword
In past years, keyword usage was critical to a page’s ranking in the SERPs. While Google’s recent updates to its search algorithms have lessened the importance of keywords, they still have a part to play in SEO. Instead of stuffing your pages with a variety of takes on your target keywords, newer approaches are more effective: select the keywords that best fit your content, use long-tail keywords to help improve click-through rates (CTRs), and adopt strategies that account for the context in which a keyword is used and the intent of web surfers when they search for it.
Updates such as Panda have made keyword over-optimization dangerous to your site’s ranking. This is directly contrary to old school keyword optimization, in which the practice was essential to securing a high search engine ranking. One method used in the past to maximize a site’s keyword usage was to create an individual page for each variation of a keyword.
While this type of approach was OK to use in past years, using it now puts your site at risk of being saddled with a penalty. To spare your site from that risk, avoid creating doorway pages for each target keyword.
Cloaking and Content Swapping
A common practice of so-called black hat SEO was to cloak a site, whereby one webpage would offer two different displays: one for visitors and another for web crawler bots. Utilizing this technique took a degree of sophistication, as it required server-side code that could differentiate between search bots and real organic traffic.
Cloaking was commonly used to fool search engine algorithms into granting higher rankings, while keeping the visitor-facing page free of keyword and link clutter so users could easily scan it. It was also used to spoof popular searches by serving content unrelated to the actual query.
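You can run a crude check for user-agent cloaking yourself. The sketch below – assuming the requests package, a hypothetical URL and an arbitrary size threshold – fetches a page twice with different User-Agent headers and compares the responses. Note that sophisticated cloakers verify crawler IP ranges rather than just user-agent strings, and dynamic pages differ a little on every request, so treat this as a first-pass signal only.

```python
import requests

URL = "https://www.example.com/"  # hypothetical page to test

BROWSER_UA = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
              "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0 Safari/537.36")
BOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def fetch(user_agent: str) -> str:
    resp = requests.get(URL, headers={"User-Agent": user_agent}, timeout=10)
    return resp.text

browser_html = fetch(BROWSER_UA)
bot_html = fetch(BOT_UA)

# Identical pages should be close in size; a large gap hints at cloaking.
diff = abs(len(browser_html) - len(bot_html))
print(f"browser: {len(browser_html)} bytes, bot: {len(bot_html)} bytes, diff: {diff}")
if diff > 2000:  # illustrative threshold
    print("Possible cloaking: responses differ substantially by user agent.")
```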
Another technique used effectively to boost a site’s rankings in the past took advantage of the fact that Google’s algorithms were often slow to catch up with sites that swapped their content after being ranked. This made bait-and-switch content swapping a viable tactic for ranking sites stuffed with affiliate links, or for directing high volumes of traffic to material that wouldn’t normally rank so highly.
In some cases, an entire site would be closed to indexation once the content was swapped. This technique is now a thing of the past when it comes to fooling the search engines: Google’s updated algorithms allow it to quickly reindex sites and penalize sites that don’t allow indexation.
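On the flip side, that same speed of reindexing means you want to know right away if a page on your own site changes unexpectedly. A simple hash-based monitor, sketched below with a hypothetical URL and state file and assuming the requests package, can be run on a schedule to flag content changes.

```python
import hashlib
import json
import pathlib

import requests

URL = "https://www.example.com/landing-page"  # hypothetical page to monitor
STATE = pathlib.Path("page_hashes.json")      # illustrative local state file

html = requests.get(URL, timeout=10).text
digest = hashlib.sha256(html.encode("utf-8")).hexdigest()

history = json.loads(STATE.read_text()) if STATE.exists() else {}
if history.get(URL) and history[URL] != digest:
    print("Content changed since the last check -- review the page.")
else:
    print("No change detected.")

history[URL] = digest
STATE.write_text(json.dumps(history, indent=2))
```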
Instead of using tricks such as content manipulation or cloaking to boost your ranking, you are better off focusing on creating genuinely quality content for your visitors.
Adrian “Yo Adrian” DeGus is a 15-year adult industry veteran and founder of AdultSEOPartners.com, a professional adult SEO agency catering to large established adult sites. DeGus, who has provided advanced consulting services to many leading sites in the adult industry, also operates Adult SEO Training, a popular service that helps webmasters, program operators and affiliate managers to learn in-house SEO.