Google Secret Bonus Video Dark Web Operators

Secret SEO bonus video for Google operators

This video cannot be searched for or viewed unless you know the secret address for it. This link was given out only to participants of the Power Searching With Google program, which is not open to public registration at this time.

Here they talk about the new Dark Web Google operators that can crawl normally un-indexable files within a website, also known as the Deep Web.

Is Page Speed Important for SEO?

Why PageSpeed is Important for SEO

The most important reason for optimizing your website’s page speed is simply that Google will reward you with higher rankings and visibility. If that’s not reason enough, or if you are not entirely convinced it will boost your rankings, there are a number of other benefits that directly impact the profitability of your company. In this article, I will explain why page speed has become such an important optimization factor for post-Panda SEO and beyond.

Google Page Speed Algorithms

As mentioned in Google’s performance best practices, “At Google, we’ve found that faster sites make for a better user experience.” In case you haven’t noticed, Google recently added new metrics to Google Analytics to help measure your site’s load time and performance. And if you are still in doubt, they made it clear when they announced that they do in fact use site speed as a new signal in Google search ranking algorithms, so unless you want to experience Google’s page speed penalization, heed these warnings. Test your site’s page speed here, or you can download and install the Firefox extension.

So why does Google care about page speed so much? A study conducted by Compuware found that two-thirds (67%) of web users say they come across slow websites at least weekly, over a third (37%) say it makes them less likely to return to the site, and 27% say it makes them more likely to visit a competitor’s site. Another study covered by MIT Technology Review shows that 49% of online visitors will bounce from your website or just check out your competition if they experience performance issues. Below is the infographic they produced, or you can download the full PDF version here.

page speed infographic

Other Benefits of Optimizing Page Speed

Not only can you get better rankings and more traffic, but faster sites are also cheaper to host because they put less load on your server(s). A more efficiently served site uses less bandwidth, thereby lowering your monthly hosting fees. This might not be a big deal for some of you, but most of the sites I work on spend anywhere from $7,000 to $25,000 per month on bandwidth alone. In addition, you may be able to save even more money by downgrading your server’s hardware requirements.

How can I Make My Site Faster?

This is a case-by-case type of issue because no two sites are exactly alike; even if they use the same CMS, they could have other factors, such as content, that require different optimization methods. That being said, a good place to start is your code. You want to use as light a codebase as possible, and even in this complicated world of ever-evolving languages and techniques, basic HTML and CSS are always going to be your best choice. Use simple, short markup, like <div> instead of <table>, and try to consolidate your CSS as much as possible by using site-wide rules instead of several different CSS files for different sections of your site. You can use CSS menus instead of JavaScript-based menus to minimize or even eliminate the use of JavaScript altogether. If you must use external scripts, try to limit them to 2 per page and keep the total size below 20k. You can also compress resources with gzip or deflate to reduce the number of bytes sent over the network. Other techniques include browser caching, deferring the parsing of JavaScript, optimizing images (combine them into CSS sprites), minifying JavaScript and CSS, removing query strings from static resources and reducing request serialization.
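
To see why gzip compression matters, here is a minimal sketch showing how much a repetitive HTML payload shrinks (the markup string is made up purely for illustration):

```python
import gzip

# A repetitive chunk of markup, typical of HTML/CSS payloads (made up for illustration).
html = ('<div class="post"><h2>Title</h2><p>Lorem ipsum dolor sit amet.</p></div>\n' * 200).encode('utf-8')

compressed = gzip.compress(html)
print(f'original: {len(html)} bytes, gzipped: {len(compressed)} bytes')
# Repetitive markup like this typically compresses to a small fraction of its size.
```

In practice, the server (e.g. Apache or nginx) applies this compression transparently; the point is simply that far fewer bytes cross the wire.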

Page Speed SEO Consultation

Please feel free to contact me if you would like a free consultation on your site and as always I look forward to answering your questions and concerns below in the comments.

Beware of The Penguin, The Google Penguin That is

Affected by the Penguin? You should definitely check out this Penguin Recovery Case Study, and this one features other Panda and Penguin recovery strategies.

On April 24th, 2012, Google unleashed the troublesome Penguin update and posted the unofficial Penguin announcement on Google’s Webmaster Central blog before they even named it Penguin. Matt Cutts then tweeted a link where you can report post-Penguin webspam, which directs to the official Penguin webspam report form located within Google Webmaster Tools. If you were negatively affected by the Panda or the Penguin update, definitely read my post on Penguin recovery.

If You Are Innocent

If you feel you were unjustly affected by the update, it just doesn’t matter, because Google will never hear your argument. As an alternative to pleading with Google, you can sign the “Kill the Penguin” petition. In addition, Google provided this Penguin feedback form where you can state your case, and you can also ask for a reconsideration request here, but I STRONGLY recommend that you do not contact Google in any way, as it could just put you on a flagged list along with spammers. Google has been accused of pulling similar phishing scams.

The FTC Gets Involved

The ironic thing is that this update which took out many innocent bystanders was released on the same day that the FTC escalated their case against Google and hired Beth A. Wilkinson to help prosecute the case. If you were one of those bystanders, I also suggest you file an official FTC complaint here. Former Federal Trade Commission official David Wales was quoted saying, “This shows Google that if it doesn’t give you the remedy you want, you’re going to litigate.”

What is the Penguin All About?

This update was originally called the Webspam Update until two days later, when Matt Cutts officially renamed it Penguin. This update strictly targets off-page attributes and unnatural link building as defined in this Webmaster Guideline on link schemes, including buying links. To save you some time, I’ll go ahead and list what types of links this covers.

  • Paid links
  • Reciprocal link exchanges
  • Links from spun articles
  • Links from low-quality articles
  • Links from comment spam
  • Links from networks
  • Links containing a large number of exact anchor text matches
  • Links from splogs (auto created spam blogs)
  • Links from irrelevant sites
  • Forum spam links

Please feel free to add to my list of bad links below in the comments.

Genetic Code, Binary Code and My Visions of the Future

DNA genetic code and the future

It’s been a while since I posted my last rant, so this one ought to be good since it’s been building up.

The Code of Life is Digital

It’s amazing to think about the similarities between the genetic code that DNA is made of and the ones and zeros that make up the binary code used in computers. Just the idea leaves me in absolute awe, because being able to look into this world and even manipulate it seems like the kind of power only a god should possess, and in glimmering moments of genius, I feel that it’s just within our reach. Before I get into the heart of this rant, I just want to explain exactly how they are so similar; then I’ll tell you what can be done with this knowledge.


Computers are able to store and process information by breaking it down into combinations of ones and zeros that represent a value. One binary string, aka a byte, consists of eight bits, and each byte can contain a small amount of information. For example, a W in binary code looks like this: 01010111. It takes tens of thousands of these bytes of information to represent text on your computer and millions to represent the colors and pixels that make up an image.
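
A quick sketch of that encoding in Python (the letter and the short word here are just examples):

```python
# Each character maps to a number, and that number to eight bits (one byte).
w = format(ord('W'), '08b')
print(w)  # 01010111

# A short word becomes a run of bytes:
bits = ' '.join(format(ord(c), '08b') for c in 'Hi')
print(bits)  # 01001000 01101001
```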

DNA Code

The genetic code of DNA is made up of four different molecules labeled as letters instead of numbers: A for adenine, C for cytosine, G for guanine and T for thymine. Genetic code stores and processes information digitally and has been doing it with amazing complexity for about 4 billion years now. For example, a particular sequence like TGGACTTA on the 12th chromosome might mean you will have blue eyes or brown hair. It takes strings of this code long enough to wrap around the Earth to provide enough information to build a human body.
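
Since there are four letters rather than two, each DNA base can be thought of as carrying two bits. A toy sketch of that idea (the A/C/G/T-to-bits assignment is an arbitrary illustration, not how biology actually encodes anything):

```python
# Map each of the four bases to a two-bit code (arbitrary assignment for illustration).
ENCODE = {'A': '00', 'C': '01', 'G': '10', 'T': '11'}

def dna_to_bits(seq: str) -> str:
    """Translate a DNA sequence into its two-bits-per-base binary form."""
    return ''.join(ENCODE[base] for base in seq)

print(dna_to_bits('TGGACTTA'))  # 1110100001111100
```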

OpenSource DNA

Now you see how these two technologies, if you will, are one and the same. You can take the DNA of simple lifeforms like bacteria and completely upload it into a computer. This has been done and is known as DNA software. From there you can play with the building blocks of life by adding to it, taking away from it or building completely new modules altogether. This is a new field of science called synthetic biology, and there are already open-source communities developing the software, building modules and building a repository. Once you have reconfigured the code of As, Cs, Gs and Ts, you can then print it out using the actual molecules that these letters represent and place it into a cell. It’s then put into a petri dish and given a small electrical charge to kick-start it, just as you would restart someone’s heart, and a new lifeform comes into creation. Think this sounds crazy? Well, it’s already been done, and they named the new lifeform Synthia, though the scientific name is Mycoplasma laboratorium, as shown below.

synthetic bacteria

The Future

My super system of the future will be a computer interface integrated into my DNA that allows me to control my genetic software, and I’ll call it Gemmy (pronounced Jimmy, that is). If I catch cancer, I’ll just tell those cells to stop reproducing; if I want to climb a wall, I’ll just activate my synthetic spider genes and create some silk like Spiderman. Now you must really think I’m crazy, right?! But I’m not, because this type of genetic engineering has already been done. We have located genes responsible for different types of cancer and can remove them, and in Utah, we have already put spider genes into goats to produce super-strength silk. Now you wanna mess with me and Gemmy? No, I didn’t think so.

This stuff is real and will be a big part of our everyday life sooner than you think. People will create whole open-source platforms and widgets of life that are integratable and interchangeable. Not all that could come of this is good: a computer virus could be written in this code and would essentially be a real virus by all scientific means. The possibilities are really endless, so mention your thoughts and visions for Gemmy, and what you would use it for, in the comments below.

Penguin Friendly Link Building

Post-penguin link building tips

Contrary to what you may be reading on the forums, link building is not yet dead. That being said, link building shortcuts are! The easy days of automated link building tools and buying links from spammy networks are long gone, and a new era of SEO has been ushered in. Don’t tell me you didn’t see this coming. Did you really think you would be able to continue dominating the SERPs by spinning and posting crap all over the internet? Or by using tools like Scrapebox, Xrumer and SEnuke to create 1000s of spammy links with the click of a button?

It doesn’t take a genius to look at your link footprint and determine the spamminess of your links, so of course a multi-billion dollar company like Google is going to figure it out. I’m actually surprised all this black-hat fun lasted as long as it did! I expected this sh*t to hit the fan years ago, and that is why I wrote about these Penguin friendly link building strategies before I even knew it would be called Penguin. Without further ado, here are the new Penguin link building DOs and DON’Ts.

Penguin Link Building DOs and DON’Ts

Also, check out this Penguin Recovery Case Study where I analyze backlinks of a site that was negatively affected by the Penguin update.

Comment, Forum & Profile Links

Comment, forum and profile links are still useful if done manually, using a VPN service to change up IPs according to profiles. No more spamming your way to success. Sorry guys. Scrapebox is still a very good tool and can still be helpful for scraping, harvesting and various other tasks, but as far as posting goes, it’s useless at best and dangerous at worst. Xrumer, which can post tens of thousands of links on forums, is also dead, and the same goes for auto-created profiles. These were the places where blackhat webspam ran rampant, and even the black-hatters themselves will admit the world wide web is better off without it!

Still Spinning?

Put that spinner down. Claims have been made that Google is working feverishly to identify poorly spun content by matching it against their own internal synonyms database, but if you are going to continue to spin your content (which I don’t recommend), at least be clever enough to spin your links and anchor text too. Plus, don’t just spin words; spin whole sentences and phrases. I actually have a theory that it’s not necessarily the spun content but the way you submit the content that is getting people caught. See, when you have 25 to 50 articles being published in one day, all with very similar titles and similar content, all with links pointing at the same site using the same anchor text, this tends to look a bit spammy. I stopped using spun content more than two years ago as future-proofing, because I know one day Google (with the help of Copyscape) will retroactively punish violators, but if you insist on doing it, at least upload your spun spam at different times, spread out over several months.

Buying Penguin Friendly Links

Buying links has changed in a few ways with the arrival of the new Penguin algorithms. Firstly, don’t buy links in the footer or on a links page, because links in these areas are almost always going to be what Google considers link scheming or straight-up paid links. Links in the header or sidebar can also trip the Penguin filter, because your average webmaster would never put an outbound link in these areas unless he/she was getting paid for it. Lastly, site-wide links like these tend to leave a big footprint and also have a negative effect on your anchor text diversity. For example, if you have your link in the header of a site with thousands of pages, then you will have thousands of links coming from the same site, all using the same exact anchor text, and this dilutes your anchor text diversity. It’s much better to have 100 links on 100 different sites than 100 links from the same site. My advice for link buying is to try to get your link on just one page that is relevant, has good PR, sits within the <body> tag inside some written text, and is on a site that already ranks within the first 10 pages for the keyword you are trying to target. These are called content-based links and are always going to be more Penguin friendly.

Link Networks

Building, buying, or acquiring links from networks can be risky, as we learned from one large network’s mishap: their whole network of tens of thousands of aged sites was taken down in one fell swoop back in March of 2012. The trick with large networks is to not leave behind a footprint or anything that will allow the search engines to connect the dots, and this just got a whole lot harder. Networks can be easily identified and exposed by looking at the registration details, IP addresses, identical scripts/code, central control tools, outbound linking footprints, inbound linking footprints and more. Then let’s say you followed all the guidelines and built the most perfect, undetectable network of sites; well then, what are you going to do with it? You can’t sell links on it, because the second you make it available to the public you’re going to get busted. Even trying to be sly and offering it on a black-hat forum is not discreet enough these days. Many speculate that Google detected most of these networks manually, not algorithmically, meaning they had someone on the inside. You can bet your bottom dollar they have spies and informants in every corner of the SEO community. It’s imperative for them to hunt down networks like these and take them out before they manipulate their way to the top of the SERPs. The best way to build a network of sites is by manually treating each site as its own individual entity, each with unique code, C-class IPs and content, and no central control interface.

Internal Link Structure

Google loves internal links because they help to index your site and efficiently distribute PageRank. Furthermore, they increase user activity and page views. Just as you see on Wikipedia, use keyword-rich anchor text to build a web of relevant internal links.

Outbound Linking

As I’ve been saying for years, outbound links will increase your visibility on the web, and for many reasons. One, they add credibility to your content, increasing its quality score. Secondly, Google likes to crawl as much as possible, and if you can provide a relevant link to a helpful resource and the user clicks through and sticks, then Google will reward you for it. The rules are simple: do not link to any page or site that competes for the same type of traffic, and link only to high-authority sites like .edu, .gov or any high-PR site. For an example, just look at my outbound links here.

Less is More

The lesson to learn is quality over quantity. I’ve spent countless hours looking at the backlinks of sites in different niches and, surprisingly, I have been finding sites with fewer links outranking their competitors. In each one of these cases, the sites with fewer links also have higher quality links. Since I’m currently doing adult SEO, one example is an escort site with only 98 links that is now on the first page of Google for the keyword [escorts], outranking competitors that have hundreds of thousands of links. Just one relevant, well-placed link can have a greater effect than 1000 lower-quality, poorly placed links.

How to Recover from a Panda or Penguin Slap Down

Unfortunately, trying to get back on your feet after getting slapped down by the Panda or the more recent Penguin update is not an easy task and almost always requires a complete remodeling of your entire website! This is mostly because the actions taken were not manual; instead, they were caused when something triggered an algorithmic alarm to devalue or remove your rankings, and once that happens your site can get sucked into a black hole for a very long time, or until you do something drastic to reset the algorithm. Ever since Google started rolling out their infamous Panda updates in February of 2011, site after site has been penalized, sometimes unjustly, and SEOs and webmasters alike have been scrambling for answers. In this article, I will be outlining instructions on how to recover from a Panda penalty, along with actual case studies from other Panda recovery stories.

First things first: before you can fix the problem you must identify it, and this means you must diagnose the attributes that got you penalized in the first place. In doing so, you’ll need to find out if it was on-page or off-page. The earlier Panda updates only affected your site’s on-page attributes like content, internal link structure and coding. That was until the Panda 3.5 update on April 19th, 2012 and the Penguin update on April 24th, 2012, which targeted off-page attributes like off-page content and external links. Identifying the culprit will require a deep investigation of your site’s content, code and server configuration, as well as an in-depth look at the links that point to your site. Lastly, you will need to look at all the actions taken by all members of your team. This should have been tracked in project management software and by making notes in your Google Analytics account. Identifying the problem is always the hardest part of the Panda recovery process, and second opinions can be very helpful, which is why I offer a 150-point Panda Slap Analysis that takes 5 days to complete.

Now that you know what triggered Google’s algorithms to punish you, these are some of the most common practices used to recover from the Panda:

If Your Site Was Penalized for Having Copied or Low-Quality Content

Remove the content from your site or move it to another sub-directory with 301 redirects. On April 24th, the same day as the Penguin update, Google posted this guide on how to move your content. This is a very popular technique that I mentioned back in Oct. of 2011 in a post about the Panda 2.5 update. An alternative method is changing your URL structure to force a re-indexing of the whole site, or at least of the URLs that lost ranking (not the URLs that are still listed high in the SERPs).
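
On an Apache server, the 301 redirects for moved content can be sketched in .htaccess like this (the paths and filenames are hypothetical, and this assumes the mod_alias and mod_rewrite modules are enabled):

```apache
# Permanently (301) redirect a single moved page to its new sub-directory.
Redirect 301 /old-post.html /archive/old-post.html

# Or move a whole section with a pattern-based rule.
RewriteEngine On
RewriteRule ^articles/(.*)$ /archive/$1 [R=301,L]
```

The 301 status tells Google the move is permanent, so existing link equity should follow the content to its new location.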

If Your Site Was Penalized for Having Spammy Backlinks

Cancel your subscriptions to any and all link networks or link brokering sites. This especially applies to adult sites and porn SEO. Then work feverishly to build a solid foundation of white hat links to fill the empty void of links that were de-indexed. These must be content-based links, which are considered Panda friendly, and this can be very time-consuming.

Other Panda Recovery Tactics

The two Panda penalties I listed above are the most common but your site could have been slapped down for a number of different reasons so make sure you check through this list of other very helpful Panda rescue tactics.

  • Scale down on your ads
  • Hide affiliate links by using redirects in your .htaccess file
  • Integrate with social media: YouTube, Google+, Blogspot, Twitter, Facebook and Pinterest
  • Do more internal linking, just as you see on Wikipedia
  • Clean up your code and optimize page load speed
  • Redesign your whole site
  • Enhance the user experience to increase stickiness and page views
  • Integrate rich snippets and microdata
  • Post regular content to the site

What Not to Do!

Do not file a reconsideration request! Firstly, because Google only replies to manual penalties, not the penalties that result from an algorithm change. Secondly, because you would most likely just be telling on yourself and fall victim to Google’s phishing scam.

Do not use Matt Cutts’ Penguin tattle-tale tool that he posted on Twitter, or their Penguin feedback form.

You can also use this list of past Panda updates to help you troubleshoot the cause and effect of when your SERPs dropped.

Panda 3.5 on April 19th, 2012
Panda 3.4 on March 23rd, 2012
Panda 3.3 on about February 26th, 2012
Panda 3.2 on about January 15th, 2012
Panda 3.1 on November 18th, 2011
Panda 2.5.3 on October 19th, 2011
Panda 2.5.2 on October 13th, 2011
Panda 2.5.1 on October 9th, 2011
Panda 2.5 on September 28th, 2011
Panda 2.4 on August 15th, 2011
Panda 2.3 on July 22nd, 2011
Panda 2.2 on June 18th, 2011 or so
Panda 2.1 on May 9th, 2011 or so
Panda 2.0 on April 11th, 2011 or so
Panda 1.0 on February 24th, 2011
In the end, if you are not 100% sure about your diagnosis or recovery, consult a real professional, because you can easily make the problem worse.

Expert Keyword Research (2012 Revisited)

Expert Search Engine Optimization Keyword Research

As the ever-changing algorithms evolve, so must your SEO techniques. People often ask me, “What’s your favorite SEO tactic?” And my answer is always KEYWORD RESEARCH. It’s the foundation of any SEO campaign and, therefore, its most important aspect. Here I will talk about keyword tools, longtail keywords, 2012 post-Panda changes and various keyword analysis strategies.

1. First Things First: What are Longtail Keywords?

Longtail keywords are those found toward the long tail of the search demand curve. These keywords bring in less traffic but are also less competitive. In the graph below I have mapped out two keywords, [free porn] and [free Asian milf porn], to demonstrate how longtail keywords work.


2. Studying Keyword Competition

The Google Keyword Tool gives you two pieces of data: monthly searches and competition level. The problem is that their competition ranking system is almost always inaccurate when determining the competitiveness of longtail keywords. The reason is that the true competitiveness of a keyword depends on more variables than the ones Google uses.

Since I used to work in the porn SEO arena, I’ll give this example: [sex chat] gets 4,090,000 searches per month and Google gives it a medium competitiveness ranking, while [Romanian sex chat] gets 490 searches per month yet has the same competitiveness ranking. According to this metric they have an equal amount of competition, but common sense tells us this is not true. Instead of counting on their tool, I use my own little method to measure this more accurately: compare the number of monthly searches against the number of results found when actually searching for the keyword (e.g. “About 4,220,000 results (0.19 seconds)”), against the number of inbound links to the domains within the first 10 results, and against the level of on-page optimization of those first 10 results. Lastly, throw in a little bit of common sense. With these metrics, one can get a better idea of the competition level of the keyword and a better understanding of what type of content will be needed, as well as how much link building will be needed, to achieve 1st-page rankings.
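
As a rough sketch of that method, here is one way to compare raw demand against the size of the competing result set. The indexed-result counts below are invented for illustration, and the ratio is my own toy heuristic, not anything Google publishes:

```python
def competition_ratio(monthly_searches: int, indexed_results: int) -> float:
    """Searches per competing page: higher suggests more demand per competitor."""
    return monthly_searches / max(indexed_results, 1)

# Head term vs. longtail term (result counts are made-up examples).
head = competition_ratio(4_090_000, 845_000_000)
longtail = competition_ratio(490, 12_400)

print(f'head: {head:.4f}  longtail: {longtail:.4f}')
# Even with tiny volume, the longtail term can offer more demand per competing page.
```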

3. The Best Expert Keyword Tools

While there are many keyword tools that require registration or even payment, the most trusted tool for adept keyword research is the human brain. Start by thinking of all the search terms that you would type in if you were looking for your product or service. The reason I suggest this is that you will always be able to think of search queries that would not and could not have been returned by automated tools. An example of this is the fact that when I used keyword tools for the query [strip clubs], they should have also suggested [girly bars] or [lap dance bars], which also get tons of traffic, but they didn’t, so I had to use human intuition instead. My second favorite is the Google Keyword Tool, which happens to be free and requires no signup as well. Once I use my own intuition to compile a list, I then plug those KWs into Google’s keyword tool to expand the list and get the stats for each one. I still haven’t found a keyword tool or service worth paying for, so save your money!

4. Use Knowledgeable Keyword Synonyms!

Prior to the year 2001, an expert keyword researcher could simply place the keywords on the web page hundreds of times, and even hide the keywords in the background, to trick Google. Then Google’s algorithms changed, and around 2% keyword density became the suggested practice for proficient keyword placement within your text. Nowadays, even 2% is too much in this post-Panda SEO era, as it disrupts the natural flow of writing and tends to look spammy. To become a true keyword expert, use keyword synonyms or semantically related keywords, because Google now loves synonyms. Examples for this article could be expert keyword research, adroit search query research, keyword analysis expert or keyword inquisition. Using this strategy you can really get your keyword density up to 2% or higher without catching the attention of Google’s webspam team. Use an online thesaurus if you need help finding good synonyms.
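
Keyword density is easy to measure yourself. Here is a minimal sketch (the sample sentence is made up) that counts how often a phrase appears relative to the total word count:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of words in `text` that belong to occurrences of `keyword`."""
    words = re.findall(r"[a-z']+", text.lower())
    phrase = keyword.lower().split()
    n = len(phrase)
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == phrase)
    return hits * n / max(len(words), 1)

sample = 'keyword research is the foundation of SEO and good keyword research pays off'
print(f'{keyword_density(sample, "keyword research"):.1%}')
```

Run this over your drafts and swap in synonyms wherever any one phrase creeps much past the 2% mark.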

5. What Keywords are Your Competitors Using?

Saving the best suggestion for last: it doesn’t take an expert keyword researcher to find out what keywords your competitors are going after, so let someone else do all the work for you. You can easily see what keywords your competition is trying to rank for by looking at the keywords they use in their titles, meta tags and other on-page attributes. You can also find information about what keywords your competition is ranking for by looking at tools like SEMrush and Alexa.

Penguin Recovery Case Study I

How to Recover from Google's Penguin update

Post-Penguin Update Case Study I: Is a spammy site outranking you while your site has been moved to page 5 or 6? Think it’s a Google error? Think it’s not fair? Well, the truth is that it may not seem fair, but there turn out to be very valid reasons for Google to do this. There are no ghosts in the machine, nor human error; instead, this annoying Penguin update seems to be functioning just as Google intended.

In an effort to better explain the new Penguin update, I am posting the results of my first Penguin recovery analysis, done for a colleague who lost almost 90% of his traffic. His two most affected sites were both ranked #1 in Google and were receiving over 1000 clicks per day. Now they have been all but completely removed from Google and get around 100 clicks per day.

The Diagnosis, Panda or Penguin?

The first step was to look at the traffic stats to see if the decline in traffic correlated with the Panda 3.5 update, which was released on April 19th and 20th, or if the decline happened on or around the April 24th release date of the Penguin update. In his case, the traffic actually increased between April 19th and April 24th, and then on the 24th it went from over 1000 clicks per day to around 10 clicks per day. This was obviously an effect of the Penguin update, an easy case to diagnose; however, complicated cases can easily be misdiagnosed and could require looking at other metrics.

Now that all the signs pointed to the Penguin being the culprit, I started looking at the backlinks pointing to these sites. This is because the Google Penguin update only targeted webspam, which boils down to off-page SEO and link building practices, while Panda 3.5 focuses more on the on-page SEO attributes of one’s site. I first did a Google search for “pick up lines to use on girls” and found a spammy site ranked number one!

Initially, I was appalled to see this spammy site outranking my colleague’s site. I did a CopyScape check and found that most of its content was copied, or even poorly spun, and identified on several other sites; the code is very unfriendly to the search engines, as if it had been coded with Microsoft FrontPage back in the mid-’90s; the site has too many ads above the fold; and on top of all that, it has only a handful of low-quality backlinks. This site should be on the 10th page of Google or just de-indexed altogether. It sounds just like what so many people are reporting: Google Penguin failed because spammy sites now outrank theirs. But on further inspection of the backlinks, via my favorite backlink explorer tool, I started siding with Google on this one.

The Prognosis

The spammy site has very few links, but they seem to be aged and, for the most part, look natural. My colleague’s sites, on the other hand, have thousands of backlinks, and unfortunately, I found many problems among them:

  • Multiple links from the same few sites. Too many links from one domain can look spammy and unnatural.
  • Site-wide links from other sites he owns. Links within the body of a page look more natural. When link building, especially stay away from blogroll and footer links.
  • Many links were reciprocal. It used to be that reciprocal links just canceled each other out but post Penguin, these are a clear sign of what Google considers to be a linking scheme and can hurt your site’s ability to rank.
  • Lacking anchor text diversity in the backlinks. 90% of the anchor text contained the exact same phrase and this makes it look like the result of an automated posting software or some other form of unnatural link building.
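
That last point, anchor text diversity, is easy to quantify. A small sketch (the anchor list is fabricated to mirror the 90% case described above) that reports what share of a backlink profile reuses the same anchor:

```python
from collections import Counter

# Fabricated backlink anchor texts: 9 exact-match anchors out of 10 links.
anchors = ['pick up lines'] * 9 + ['conversation openers']

top_anchor, top_count = Counter(anchors).most_common(1)[0]
share = top_count / len(anchors)
print(f'{top_anchor!r} accounts for {share:.0%} of anchors')
# A 90% exact-match share is exactly the kind of footprint Penguin targets.
```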

The Cure

Unfortunately, recovering from a Penguin slap-down is not easy. Sometimes, surprisingly, the best thing to do is absolutely nothing: just carry on with your regular publishing and make sure you don’t participate in any link schemes. Furthermore, do not participate in any communication with Google through a reconsideration request or their meaningless Penguin feedback form. They will not reply unless manual actions were taken; instead, Google just uses the data they collect to further damn you and other SEOs.

Always wait at least a week for Google’s algorithms to recalibrate before taking any drastic measures. After this grace period, however, the algorithms will have set in and will not correct themselves quickly. You may still recover naturally by doing nothing, but that can take a year or more, so this approach only works if you have time on your side and do not rely on the site for the steady cash flow that keeps your business going.

Otherwise, try these actions to correct the errors and recover from the Penguin attack:

  • First, remove the spammy links.
  • If they came from Web 2.0 sites, log in and edit them; if they were built with spun content, just delete the posts/articles and close the accounts.
  • If they were bought or traded, contact the webmaster and ask for the link(s) to be moved inside the <body> content instead of the header, sidebar, footer or blogroll.
  • Make sure to use a wide variety of keywords and employ synonyms in your anchor text.
  • If you cannot remove all your spammy links, you may need to change your URL structure without 301-redirecting to the new pages. This effectively breaks all the links pointing to every page of your site except the homepage, so it should only be used as a last resort.

If removing the spammy links doesn’t cause your site to bounce back then you may need to take more drastic measures to reset the algorithm or to completely re-index your site. Here are some ways you can do that:

  • First, try moving to a new host server or change your IP address.
  • If you didn’t already change your URL structure in the previous steps then try doing that now and this time you can use 301 redirects as long as the spammy backlinks were removed.
  • Change the domain registration and change the IP again to emulate the site being bought and sold.
  • Move your content to a new sub-directory or several sub-directories based on category, author or content matter.
  • Erase or rewrite all of the content on the site.
  • Hopefully you won’t get this far, but as a last-ditch attempt, park the domain for a month or two and start over from scratch. We’ve tested this, and the site actually bounced back to regain most of its first-page rankings; domain history and authority were not lost.
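The 301 redirects mentioned in the URL-structure step above can be set up at the server level. Here is a minimal Apache sketch, assuming a hypothetical move from an `/old-category/` path to `/new-category/`; adjust the pattern to your own structure.

```apache
# Hypothetical .htaccess rule: after changing the URL structure, send a
# permanent (301) redirect from the old paths to the new ones so that
# equity from the remaining clean backlinks follows the move.
RedirectMatch 301 ^/old-category/(.*)$ /new-category/$1
```

Only do this once the spammy backlinks have been removed; otherwise the 301s simply carry the toxic links over to the new URLs.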

In the end, this case study provides an opportunity to test and identify the actual causes and effects of the new Penguin update. I will post the results of the recovery as soon as the corrections are made.


The World’s First Adult Search Engine

The First Porn Search Engine

It’s time we address the effectiveness of Google, Bing, and Yahoo’s efforts to deliver high-quality adult search results. With ever-increasing pressure from internet censorship and the new .xxx domains, the future of adult content on the net is under attack! In this article I will discuss the following:

  • Adult content’s online supply and demand
  • Internet censorship against adult sites
  • Adult Search Engine Reviews

Just How Popular is Adult Content on the Web?

There is overwhelming demand for adult material on the internet. In fact, the world owes the porn industry for inventing e-commerce: the very first online credit card transaction, made in 1995, was for pornographic content. Within a month the same website was making over $200,000 per day, and in its prime it processed around $1.5 billion per year! Its story was beautifully illustrated in the Hollywood movie Middle Men.

The First Online Credit Card Transaction

This huge influx of business fueled one of the biggest gold rushes of the information age. As a result, adult content still makes up an overwhelming majority of all content published on the internet today. Here are a few stats to check out:

  • More than 70% of men from 18 to 34 visit a pornographic site in a typical month
  • Google’s Keyword Tool reports 618,213,105 searches per month for the word porn
  • Adult content takes in around $57 billion annually
  • Almost 1/3 of all search queries are for adult content
  • Over 1/3 of all internet downloads are for adult material
  • $3,075.64 per second is spent on online porn

Escort Search Engines

Companies like Eros, among others, earn millions of dollars each year from escort traffic. They have made it very easy to find escorts in your city, no matter where in the world you may be.

Google Crawls Facebook Comments

Comments on Facebook are crawled by Google

Google Announces That They Can Now Index Facebook Comments

Surprise, surprise: Google can now crawl JavaScript- and AJAX-based Facebook comments straight out of an <iframe> on your website. This means that comments made using plugins like Disqus and Facebook Comments for WordPress are finally being crawled and indexed! The implications are massive, not just for the ability to refresh your pages with user-generated content (which Google loves, by the way), but also because AJAX, JavaScript and <iframe>s have long been considered a dark abyss for Googlebot, which was unable to index the content within them. The battle between search engines and SEOs will never be the same!
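For context, here is roughly what an iframe-based Facebook comments embed looks like. This is a hedged sketch based on Facebook’s social plugin markup of the time, not code from this article; the `data-href` URL is a placeholder for your own page.

```html
<!-- Load the Facebook JavaScript SDK, which renders the social plugins -->
<div id="fb-root"></div>
<script async src="https://connect.facebook.net/en_US/all.js#xfbml=1"></script>

<!-- Comments plugin: the SDK replaces this <div> with an <iframe>
     containing the comment thread. That iframe content is what Google
     can now crawl and index. -->
<div class="fb-comments" data-href="http://example.com/my-post/"
     data-num-posts="5"></div>
```

Because the comments live inside a script-injected iframe rather than the page’s own HTML, they were previously invisible to Googlebot, which is exactly why this announcement matters.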

What Does Google’s Ability to Crawl FB Comments Mean for SEO?

Google inches ever closer to running a social search engine, where SEO techniques should favor freshness, social media, Web 2.0 and user-generated content. Not only is it time to get more serious about integrating social media with your own site; if you haven’t already, it’s high time to get more aggressive in practicing social media optimization techniques. SMO was always considered SEO’s little sister, but I have a feeling she might pack an increasingly strong punch as our search engines evolve toward a hyper-social web. Most of the top optimizers in the industry already agree that social media optimization is a MUST in a post-Panda world. To sum it up in one sentence: Like is the new link.

The implications go far beyond social search. Being able to index AJAX, JavaScript and <iframe>s means that APIs and rich-media integration across different websites just became search engine friendly. This could usher in a new era. What will we call it? Web 2.1, lol.