Genetic Code, Binary Code and My Visions of the Future

DNA genetic code and the future

It’s been a while since I posted my last rant, so this one ought to be good since it’s been building up.

The Code of Life is Digital

It’s amazing to think about the similarities between the genetic code that DNA is made of and the ones and zeros that make up the binary code used in computers. Just the idea leaves me in absolute awe, because being able to look into this world, and even manipulate it, seems like the kind of power only a god should possess, and in glimmering moments of genius I feel that it’s just within our reach. Before I get into the heart of this rant, I want to explain exactly how the two are so similar; then I’ll tell you what can be done with this knowledge.

Computers

Computers are able to store and process information by breaking it down into combinations of ones and zeros that represent a value. One string of eight of these characters is called a byte, and each byte can hold a small amount of information. For example, a W in binary code looks like this: 01010111. It takes tens of thousands of these bytes of information to represent text on your computer and millions to represent the colors and pixels that make up an image.
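
If you’re curious, you can check the byte-for-a-letter claim yourself. A quick Python sketch (my language choice, nothing in this post depends on it):

```python
# Convert characters to their 8-bit binary representation, as described above.
def to_byte(ch):
    """Return the 8-bit binary string for a single character."""
    return format(ord(ch), "08b")

print(to_byte("W"))  # 01010111
# A whole word is just these bytes laid end to end:
print(" ".join(to_byte(c) for c in "Hi"))  # 01001000 01101001
```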

DNA Code

The genetic code of DNA is made up of four different molecules labeled with letters instead of numbers: A for adenine, C for cytosine, G for guanine and T for thymine. The genetic code stores and processes information digitally and has been doing it with amazing complexity for about 4 billion years now. For example, a sequence like TGGACTTA at a particular spot on the 12th chromosome can mean you will have blue eyes or brown hair. It takes strings of this code long enough to wrap around the Earth to provide enough information to build a human body.
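
And since DNA’s alphabet has exactly four letters, each base fits in two bits, which makes the digital analogy concrete. A toy Python illustration (the packing scheme is my own, not how any real bioinformatics tool stores sequences):

```python
# Each of the four bases maps to a unique 2-bit code, so one 8-bit byte
# holds four bases -- the genetic alphabet really is digital.
BASE_BITS = {"A": "00", "C": "01", "G": "10", "T": "11"}

def dna_to_bits(seq):
    """Pack a DNA sequence into a string of bits, two per base."""
    return "".join(BASE_BITS[b] for b in seq)

print(dna_to_bits("TGGACTTA"))  # 1110100001111100
```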

OpenSource DNA

Now you see how these two technologies, if you will, are one and the same. You can take the DNA of a simple lifeform like a bacterium and completely upload it into a computer. This has been done and is known as DNA software. From there you can play with the building blocks of life by adding to it, taking away from it, or building completely new modules altogether. This is a new field of science called synthetic biology, and there are already open source communities developing the software, building modules and building a repository. Once you have reconfigured the code of As, Cs, Gs, and Ts, you can print it out using the actual molecules that these letters represent and place it into a cell. It’s then put into a petri dish and given a small electrical charge to kick-start it, just as you would restart someone’s heart, and a new lifeform comes into creation. Think this sounds crazy? Well, it’s already been done, and they nicknamed the new lifeform Synthia; its scientific name is Mycoplasma laboratorium, as shown below.

synthetic bacteria

The Future

My super system of the future will be a computer interface integrated into my DNA that allows me to control my genetic software, and I’ll call it Gemmy (pronounced Jimmy, that is). If I get cancer, I’ll just tell those cells to stop reproducing; if I want to climb a wall, I’ll just activate my synthetic spider genes and create some silk like Spiderman. Now you must really think I’m crazy, right?!?! But I’m not, because this type of genetic engineering has already been done. We have located genes responsible for different types of cancer and can remove them, and in Utah, spider genes have already been put into goats to produce super-strength silk. Now you wanna mess with me and Gemmy? No, I didn’t think so.

This stuff is real and will be a big part of our everyday life sooner than you think. People will create whole open-source platforms and widgets of life that are integratable and interchangeable. It’s not all good that could come of this, though: a computer virus could be written in this code and would essentially be a real virus by all scientific means. The possibilities are really endless, so mention your thoughts and visions for Gemmy, and what you would use it for, in the comments below.

Penguin Friendly Link Building

Post-penguin link building tips

Contrary to what you may be reading on the forums, link building is not yet dead. That being said, link building shortcuts are! The easy days of automated link building tools and buying links from spammy networks are long gone, and a new era of SEO is being ushered in. Don’t tell me you didn’t see this coming. Did you really think you would be able to keep dominating the SERPs by spinning and posting crap all over the internet? Or by using tools like Scrapebox, Xrumer and SEnuke to create thousands of spammy links with the click of a button?

It doesn’t take a genius to look at your link footprint and determine the spamminess of your links, so of course a multi-billion dollar company like Google is going to figure it out. I’m actually surprised all this black-hat fun lasted as long as it did! I expected this sh*t to hit the fan years ago, which is why I wrote about these Penguin Friendly Link Building strategies before I even knew it would be called Penguin. Without further ado, here are the new Penguin link building DOs and DON’Ts.

Penguin Link Building DOs and DONTs

Also, check out this Penguin Recovery Case Study where I analyze backlinks of a site that was negatively affected by the Penguin update.

Comment, Forum & Profile Links

Comment, forum and profile links are still useful if done manually, using a VPN service to change up IPs for each profile. No more spamming your way to success, sorry guys. Scrapebox is still a very good tool for scraping, harvesting and various other tasks, but as far as posting goes, it’s useless at best and dangerous at worst. Xrumer, which can post tens of thousands of links on forums, is also dead, and the same goes for auto-created profiles. These were the places where blackhat webspam ran rampant, and even the black-hatters themselves will admit the world wide web is better off without it!

Still Spinning?

Put that spinner down. Claims have been made that Google is working feverishly to identify poorly spun content by matching it against their own internal synonyms database, but if you are going to continue to spin your content (which I don’t recommend), at least be clever enough to spin your links and anchor text too. And don’t just spin words; spin whole sentences and phrases. I actually have a theory that it’s not necessarily the spun content but the way you submit the content that is getting people caught. See, when you have 25 to 50 articles being published in one day, all with very similar titles and similar content, all with links pointing at the same site using the same anchor text, this tends to look a bit spammy. I stopped using spun content more than two years ago as future-proofing, because I know one day Google (with the help of Copyscape) will retroactively punish violators. But if you insist on doing it, at least upload your spun spam at different times, spread out over several months.
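
For anyone who hasn’t used a spinner: spun content starts as “spintax” like {Hi|Hello} there, and one variant is picked per submission. A minimal expander, sketched in Python just to show the mechanics (the {a|b} syntax is the common convention; real tools add nesting levels and weights):

```python
import random
import re

def spin(text, rng=random):
    """Expand {option1|option2|...} spintax, resolving innermost groups first."""
    pattern = re.compile(r"\{([^{}]*)\}")
    while True:
        match = pattern.search(text)
        if match is None:
            return text
        # Pick one option at random and splice it back into the text.
        choice = rng.choice(match.group(1).split("|"))
        text = text[:match.start()] + choice + text[match.end():]

print(spin("{Hi|Hello} there, {reader|friend}!"))
```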

Buying Penguin Friendly Links

Buying links has changed in a few ways with the arrival of the new Penguin algorithms. Firstly, don’t buy links in the footer or on a links page, because links in these areas are almost always going to be what Google considers link scheming, or straight-up paid links. Links in the header or sidebar can also trip Penguin, because your average webmaster would never put an outbound link in these areas unless he or she was getting paid for it. Lastly, site-wide links like these leave a big footprint and also have a negative effect on your anchor text diversity. For example, if you have your link in the header of a site with thousands of pages, then you will have thousands of links coming from the same site, all using the same exact anchor text, and this dilutes your anchor text diversity. It’s much better to have 100 links on 100 different sites than 100 links from the same site. My advice for link buying is to try to get your link on just one relevant page with good PR, within the <body> tag, inside some written text, on a site that already ranks within the first 10 pages for the keyword you are targeting. These are called content-based links and are always going to be more Penguin friendly.
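
That “within the <body> text” rule can even be checked mechanically. Here’s a rough Python sketch that flags links sitting inside header/footer/aside elements (the element names are my assumption; real themes and page builders vary a lot):

```python
from html.parser import HTMLParser

RISKY = {"header", "footer", "aside"}  # areas Penguin treats as paid-link territory

class LinkLocator(HTMLParser):
    """Record each link's href along with whether it sits in a risky zone."""
    def __init__(self):
        super().__init__()
        self.stack = []   # open elements enclosing the current position
        self.links = []   # (href, "content" or "risky")

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "")
            zone = "risky" if RISKY & set(self.stack) else "content"
            self.links.append((href, zone))
        else:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if tag in self.stack:
            self.stack.remove(tag)

parser = LinkLocator()
parser.feed('<body><p><a href="/good">text link</a></p>'
            '<footer><a href="/paid">footer link</a></footer></body>')
print(parser.links)  # [('/good', 'content'), ('/paid', 'risky')]
```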

Link Networks

Building, buying, or acquiring links from networks can be risky, as we learned from BuildMyRank.com’s little mishap. Their whole network of tens of thousands of aged sites was taken down in one fell swoop back in March of 2012. The trick with large networks is to not leave behind a footprint or anything that allows the search engines to connect the dots, and this just got a whole lot harder. Networks can be identified and exposed by looking at registration details, IP addresses, identical scripts/code, central control tools, outbound linking footprints, inbound linking footprints and more. Then let’s say you followed all the guidelines and built the most perfect, undetectable network of sites; what are you going to do with it? You can’t sell links on it, because the second you make it available to the public you’re going to get busted. Even trying to be sly and offering it on a black-hat forum is not discreet enough these days. Many speculate that Google detected most of these networks manually, not algorithmically, meaning they had someone on the inside. You can bet your bottom dollar they have spies and informants in every corner of the SEO community. It’s imperative for them to hunt down networks like these and take them out before they manipulate their way to the tops of the SERPs. The best way to build a network of sites is to manually treat each site as its own individual entity, each with unique code, C-class IPs, unique content and no central control interface.
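
Those footprints are exactly what a trivial script can cluster on. A toy Python illustration with made-up data (obviously not Google’s actual method) shows how quickly shared attributes expose a network:

```python
from collections import defaultdict

# Hypothetical registration data for four "independent" sites.
sites = {
    "site-a.com": {"ip": "192.0.2.10", "analytics": "UA-111"},
    "site-b.com": {"ip": "192.0.2.10", "analytics": "UA-111"},
    "site-c.com": {"ip": "192.0.2.10", "analytics": "UA-111"},
    "site-d.com": {"ip": "203.0.113.5", "analytics": "UA-222"},
}

def group_by(attribute):
    """Group domains by a shared attribute; 2+ domains sharing a value is a footprint."""
    groups = defaultdict(list)
    for domain, info in sites.items():
        groups[info[attribute]].append(domain)
    return {value: doms for value, doms in groups.items() if len(doms) > 1}

print(group_by("ip"))  # {'192.0.2.10': ['site-a.com', 'site-b.com', 'site-c.com']}
```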

Internal Link Structure

Google loves internal links because they help to index your site and efficiently distribute PageRank. Furthermore, they increase user activity and page views. Just as you see on Wikipedia, use keyword-rich anchor text to build a web of relevant internal links.

Outbound Linking

As I’ve been saying for years, outbound links will increase your visibility on the web, and for many reasons. One, they add credibility to your content, increasing its quality score. Two, Google likes to crawl as much as possible, and if you can provide a relevant link to a helpful resource and the user clicks through and sticks, then Google will reward you for it. The rules are simple: do not link to any page or site that competes for the same type of traffic, and link only to high-authority sites like .edu, .gov or any high-PR site. For an example, just look at my outbound links here.

Less is More

The lesson to learn is quality over quantity. I’ve spent countless hours looking at the backlinks of sites in different niches, and surprisingly, I keep finding sites with fewer links outranking their competitors. In each one of these cases, the sites with fewer links also have higher quality links. Since I’m currently doing adult SEO, one example is brittanysplace.com, which with only 98 links is now on the first page of Google for the keyword escorts, outranking AdultSearch.com and Eros.com, which both have hundreds of thousands of links. Just one relevant, well-placed link can have a greater effect than 1000 lower-quality, poorly placed links.

How to Recover from a Panda or Penguin Slap Down

Unfortunately, trying to get back on your feet after getting slapped down by the Panda update, or the more recent Penguin update, is not an easy task and almost always requires a complete remodeling of your entire website! This is mostly because the actions taken were not manual; instead, something triggered an algorithm alarm to devalue or remove your rankings, and once that happens your site can get sucked into a black hole for a very long time, or until you do something drastic to reset the algorithm. Ever since Google started rolling out their infamous Panda updates in February of 2011, site after site has been penalized, sometimes unjustly, and SEOs and webmasters alike have been scrambling for answers. In this article, I will outline instructions on how to recover from a Panda penalty, along with actual case studies from other Panda recovery stories.

First things first: before you can fix the problem you must identify it, and this means diagnosing the attributes that got you penalized in the first place. In doing so, you’ll need to find out if it was on-page or off-page. The earlier Panda updates only targeted your site’s on-page attributes like content, internal link structure, and coding. That was until the Panda 3.5 update on April 19th, 2012 and the Penguin update on April 24th, 2012, which targeted off-page attributes like off-page content and external links. Identifying the culprit will require a deep investigation of your site’s content, code and server configuration, as well as an in-depth look at the links that point to your site. Lastly, you will need to look at all the actions taken by all members of your team. This should have been tracked in project management software and by making notes in your Google Analytics account. Identifying the problem is always the hardest part of the Panda recovery process, and second opinions can be very helpful, which is why I offer a 150-point Panda Slap Analysis that takes 5 days to complete.

Now that you know what triggered Google’s algorithms to punish you, these are some of the most common practices used to recover from the Panda:

If Your Site Was Penalized for Having Copied or Low-Quality Content

Remove the content from your site or move it to another sub-directory with 301 redirects. On April 24th, the same day as the Penguin update, Google posted this guide on how to move your content. This is a very popular technique that I mentioned back in Oct. of 2011 in a post about the Panda 2.5 update. An alternative method is changing your URL structure to force a re-indexing of the whole site, or at least of the URLs that lost ranking; leave the URLs that are still listed high in the SERPs alone.

If Your Site Was Penalized for Having Spammy Backlinks

Cancel your subscriptions to any and all link networks or link brokering sites. This especially applies to adult sites and porn SEO. Then work feverishly to build a solid foundation of white-hat links to fill the empty void left by the links that were de-indexed. These must be content-based links, which are considered Panda friendly, and building them can be very time-consuming.

Other Panda Recovery Tactics

The two Panda penalties I listed above are the most common but your site could have been slapped down for a number of different reasons so make sure you check through this list of other very helpful Panda rescue tactics.

  • Scale down on your ads
  • Hide affiliate links by using redirects in your .htaccess file
  • Integrate with social media: YouTube, Google Plus, Blogspot, Twitter, Facebook and Pinterest
  • Do more internal linking just like you see on Wikipedia.com
  • Clean up your code and optimize page load speed
  • Redesign your whole site
  • Enhance the user experience to increase stickiness and page views
  • Integrate rich snippets and microdata
  • Post regular content to the site
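
On the affiliate-link tip above: the usual way to hide them is a rewrite rule, so your pages link to an internal path instead of the raw affiliate URL. A minimal .htaccess sketch (the path and affiliate URL are placeholders, and your host needs mod_rewrite enabled):

```apache
# Send /go/widget to the affiliate URL with a 302 so the
# affiliate link never appears directly in your pages.
RewriteEngine On
RewriteRule ^go/widget$ https://affiliate.example.com/?ref=YOURID [R=302,L]
```

Your pages would then link to yoursite.com/go/widget rather than the affiliate URL itself.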

What Not to Do!

Do not file a reconsideration request! Firstly, because Google only replies to manual penalties, not penalties that result from an algo change. Secondly, because you will most likely just be telling on yourself and falling victim to Google’s phishing scam.

Do not use Matt Cutts’ Penguin tattle-tale tool that he posted on Twitter, or their Penguin Feedback form.

You can also use this list of past Panda updates to help you troubleshoot the cause and effect around the time your SERPs dropped.

  • Panda 3.5 on April 19th, 2012
  • Panda 3.4 on March 23rd, 2012
  • Panda 3.3 around February 26th, 2012
  • Panda 3.2 around January 15th, 2012
  • Panda 3.1 on November 18th, 2011
  • Panda 2.5.3 on October 19th, 2011
  • Panda 2.5.2 on October 13th, 2011
  • Panda 2.5.1 on October 9th, 2011
  • Panda 2.5 on September 28th, 2011
  • Panda 2.4 on August 15th, 2011
  • Panda 2.3 on July 22nd, 2011
  • Panda 2.2 around June 18th, 2011
  • Panda 2.1 around May 9th, 2011
  • Panda 2.0 around April 11th, 2011
  • Panda 1.0 on February 24th, 2011

In the end, if you are not 100% sure about your diagnosis or recovery, consult a real professional, because you can easily make the problem worse.

Expert Keyword Research (2012 Revisited)

Expert Search Engine Optimization Keyword Research

As the ever-changing algorithms evolve, so must your SEO techniques. People often ask me, what’s your favorite SEO tactic? And my answer is always KEYWORD RESEARCH. It’s the foundation of any SEO campaign and therefore its most important aspect. Here I will talk about keyword tools, longtail keywords, the 2012 post-Panda changes and various keyword analysis strategies.

1. First Things First: What are Longtail Keywords?

Longtail keywords are those found out toward the long, flat tail of the search demand curve. These keywords bring in less traffic but are also less competitive. In the graph below I have mapped out two keywords, [free porn] and [free Asian milf porn], to demonstrate how longtail keywords work.

expert-keyword-research

2. Studying Keyword Competition

The Google Keyword Tool gives you two pieces of data: monthly searches and competition level. The problem is that its competition ranking is almost always inaccurate for longtail keywords, because the true competitiveness of a keyword depends on more variables than the ones Google uses.

Since I used to work in the porn SEO arena, I’ll give this example: [sex chat] gets 4,090,000 searches per month and Google gives it a medium competitiveness ranking, while [Romanian sex chat] gets 490 searches per month yet has the same ranking. According to this metric they have an equal amount of competition, but common sense tells us this is not true. Instead of counting on their tool, I use my own little method to measure this more accurately: the number of monthly searches, versus the number of results found when actually searching for the keyword (e.g. About 4,220,000 results (0.19 seconds)), versus the number of inbound links to the domains within the first 10 results, versus the level of on-page optimization of those first 10 results. Lastly, you throw in a little bit of common sense. With these metrics, one can get a better idea of the competition level of a keyword and a better understanding of what type of content will be needed, as well as how much link building will be needed, to achieve first-page rankings.
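
To make that multi-variable idea concrete, here is one way to fold those signals into a single toy score in Python (the weights and log-scaling are entirely my own invention; the point is combining several signals instead of trusting one number):

```python
import math

def competitiveness(indexed_results, avg_top10_backlinks, avg_top10_onpage):
    """Toy competition score: higher means harder to rank for.

    avg_top10_onpage is a 0-1 judgment of how well-optimized the
    current first-page results are. Raw counts are log-scaled so
    millions of indexed results don't swamp everything else.
    """
    return (math.log10(indexed_results + 1)
            + math.log10(avg_top10_backlinks + 1)
            + avg_top10_onpage)

# Hypothetical numbers for a head term vs a longtail term.
head = competitiveness(4_220_000, 5_000, 0.9)
longtail = competitiveness(12_000, 40, 0.3)
print(round(head, 1), round(longtail, 1))  # head term scores much higher
```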

3. The Best Expert Keyword Tools

While there are many keyword tools that require registration or even payment, the most trusted tool for adept keyword research is the human brain. Start by thinking of all the search terms you would type in if you were looking for your product or service. The reason I suggest this is that you will always be able to think of search queries that would not, and could not, have been returned by automated tools. For example, when I used keyword tools for the query strip clubs, they should have also suggested girly bars or lap dance bars, which also get tons of traffic, but they didn’t, so I had to use human intuition instead. My second favorite is the Google Keyword Tool, which happens to be free and requires no signup. I use my own intuition to compile a list, and then I plug those KWs into Google’s keyword tool to expand the list and get the stats for each one. I still haven’t found a keyword tool or service worth paying for, so save your money!

4. Use Knowledgeable Keyword Synonyms!

Prior to the year 2001, an expert keyword researcher could simply place the keywords on the web page hundreds of times, and even hide keywords in the background, to trick Google. Then Google’s algorithms changed, and around 2% keyword density became the suggested practice for proficient keyword placement within your text. Nowadays, even 2% is too much in this post-Panda SEO era, as it disrupts the natural flow of writing and tends to look spammy. To become a true keyword expert, use keyword synonyms or semantically related keywords, because Google now loves synonyms. Examples for this article could be expert keyword research, adroit search query research, keyword analysis expert or keyword inquisition. Using this strategy you can get your combined keyword density up to 2% or higher without catching the attention of Google’s webspam team. Use an online thesaurus if you need help finding good synonyms.
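
If you want to sanity-check the combined density of a keyword plus its synonyms, a few lines of Python will do it (this is just one way to define density, counting phrase occurrences against total word count):

```python
import re

def density(text, phrases):
    """Combined density of a keyword and its synonyms, as a percentage."""
    lowered = text.lower()
    words = re.findall(r"[a-z']+", lowered)
    # Count every occurrence of every phrase, then divide by total words.
    hits = sum(lowered.count(p.lower()) for p in phrases)
    return 100 * hits / max(len(words), 1)

text = ("Expert keyword research starts with intuition. "
        "Good keyword analysis also checks the competition, "
        "and a search query study rounds it out.")
print(round(density(text, ["keyword research", "keyword analysis", "search query"]), 1))
```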

5. What Keywords are Your Competitors Using?

Saving the best suggestion for last: it doesn’t take an expert keyword researcher to find out what keywords your competitors are going after. Let someone else do all the work for you. You can easily see what keywords your competition is trying to rank for by looking at the keywords they use in their titles, meta tags, and other on-page attributes. You can also find out what keywords your competition is ranking for by looking at SEMrush, Alexa, and Compete.com.
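
Pulling those on-page keywords out of a competitor’s HTML is easy to script. A sketch using Python’s standard-library parser, fed a canned snippet here since fetching the live page is beside the point:

```python
from html.parser import HTMLParser

class KeywordPeek(HTMLParser):
    """Pull the title and meta keywords/description out of a page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.found = {}

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name") in ("keywords", "description"):
            self.found[attrs["name"]] = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.found["title"] = data.strip()

peek = KeywordPeek()
peek.feed('<head><title>Cheap Widget Store</title>'
          '<meta name="keywords" content="widgets, cheap widgets"></head>')
print(peek.found)
```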

Penguin Recovery Case Study I

How to Recover from Google's Penguin update

Post-Penguin Update Case Study I: Is a spammy site outranking you while your site has been moved to page 5 or 6? Think it’s a Google error? Think it’s not fair? Well, it may not seem fair, but it turns out there are very valid reasons for Google to do this. There are no ghosts in the machine and no human error; instead, this annoying Penguin update seems to be functioning just as Google intended.

In an effort to better explain the new Penguin update, I am posting the results of my first Penguin Recovery Analysis, done for a colleague who lost almost 90% of his traffic. The two of his sites that were affected the most were pickuplinestouseongirls.info and cutepickuplines.info; both were ranked #1 in Google and were receiving over 1000 clicks per day. Now they have been all but completely removed from Google and get around 100 clicks per day.

The Diagnosis, Panda or Penguin?

The first step was to look at the traffic stats to see if the decline in traffic correlated with the Panda 3.5 update, which was released on April 19th and 20th, or if the decline happened on or around the April 24th release date of the Penguin update. In his case, the traffic actually increased between April 19th and April 23rd, and then on the 24th it went from over 1000 clicks per day to around 10 clicks per day. This was obviously an effect of the Penguin update, an easy case to diagnose; however, complicated cases can easily be misdiagnosed and could require looking at other metrics.
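
This kind of date-matching diagnosis can be sketched in a few lines of Python (the traffic numbers below are modeled loosely on this case, and the “under half of yesterday” collapse threshold is my own arbitrary choice):

```python
from datetime import date

UPDATES = {
    "Panda 3.5": date(2012, 4, 19),
    "Penguin": date(2012, 4, 24),
}

def diagnose(daily_clicks):
    """Find the day traffic collapsed and name the nearest known update.

    daily_clicks: {date: clicks}. A 'collapse' here means a drop to
    under half of the previous day's clicks.
    """
    days = sorted(daily_clicks)
    for prev, cur in zip(days, days[1:]):
        if daily_clicks[cur] < daily_clicks[prev] / 2:
            culprit = min(UPDATES, key=lambda u: abs((UPDATES[u] - cur).days))
            return cur, culprit
    return None, None

traffic = {date(2012, 4, 22): 1050, date(2012, 4, 23): 1100,
           date(2012, 4, 24): 10, date(2012, 4, 25): 12}
print(diagnose(traffic))  # (datetime.date(2012, 4, 24), 'Penguin')
```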

Now that all the signs pointed to the Penguin being the culprit, I started looking at the backlinks pointing to these sites. This is because the Google Penguin update only targeted webspam, which boils down to off-page SEO and link building practices, while Panda 3.5 focuses more on the on-page SEO attributes of one’s site. I first did a Google search for “pick up lines to use on girls” and found www.pickuplinesgalore.com/cheesy.html ranked number one!

Initially, I was appalled to see this spammy site outranking my colleague’s. I did a CopyScape check and found that most of its content was copied, or poorly spun, and identified on several other sites; the code is very unfriendly to the SEs, like it was built with Microsoft FrontPage back in the mid-90s; the site has too many ads above the fold; and on top of all that, it had only a handful of low-quality backlinks. This site should be on the 10th page of Google, or just de-indexed altogether. My first reaction was just like what so many people are reporting: Google Penguin failed, because now spammy sites outrank good ones. But on further inspection of the backlinks, via my favorite Backlink Explorer Tool, I started siding with Google on this one.

The Prognosis

Pickuplinesgalore.com has very few links, but they seem to be aged and for the most part look natural. My colleague’s sites have thousands of backlinks. Unfortunately, I found many problems in them:

  • Multiple links from the same few sites. Too many links from one domain can look spammy and unnatural.
  • Site-wide links from other sites he owns. Links within the body of a page look more natural. When link building, especially stay away from blogroll and footer links.
  • Many links were reciprocal. It used to be that reciprocal links just canceled each other out, but post-Penguin, these are a clear sign of what Google considers a linking scheme and can hurt your site’s ability to rank.
  • Lacking anchor text diversity in the backlinks. 90% of the anchor text contained the exact same phrase and this makes it look like the result of an automated posting software or some other form of unnatural link building.
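
The anchor text problem in particular is easy to quantify: just measure what share of your backlinks use the single most common anchor. A quick Python sketch with hypothetical data:

```python
from collections import Counter

def anchor_diversity(anchors):
    """Share of backlinks using the single most common anchor text.

    Anything near 0.9 is the kind of uniformity described above;
    a natural profile spreads across many different anchors.
    """
    counts = Counter(a.lower() for a in anchors)
    return counts.most_common(1)[0][1] / len(anchors)

backlinks = ["cute pick up lines"] * 9 + ["pickuplines site"]
print(anchor_diversity(backlinks))  # 0.9
```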

The Cure

Unfortunately, recovering from a Penguin slap down is not easy. Sometimes the best thing to do, surprisingly, is absolutely nothing; just carry on with your regular publishing and make sure you don’t participate in any link schemes. Furthermore, do not initiate any communication with Google through a reconsideration request or their meaningless Penguin feedback form. They will not reply unless manual actions were taken, and instead Google will just use the data they collect to further damn you and other SEOs.

Always wait at least one week for Google’s algorithms to recalibrate before taking any drastic measures, but after this grace period the algos will have set in and will not correct themselves automatically. Of course, in due time you may naturally recover by doing nothing, but it could take a year or more for the algorithms to reset. This only works if you have time on your side and do not rely on a steady cash flow to keep your business going.

Otherwise, try these actions to correct the errors and recover from the Penguin attack:

  • First, remove the spammy links.
  • If they came from Web 2.0 sites, then log in and edit them; if they came from spun content, just delete the posts/articles and close the accounts.
  • If they were bought or traded, then contact the webmaster and ask for the link(s) to be moved inside the <body> tag instead of in the header, sidebar, footer or blogroll.
  • Make sure to use a wide variety of keywords and employ synonyms in your anchor text.
  • If you cannot remove all your spammy links then you may need to change your URL structure without using 301 redirects to the new pages. This will effectively break all the links pointing to every page of your site except for the homepage and should only be used as a last measure.

If removing the spammy links doesn’t cause your site to bounce back then you may need to take more drastic measures to reset the algorithm or to completely re-index your site. Here are some ways you can do that:

  • First, try moving to a new host server or change your IP address.
  • If you didn’t already change your URL structure in the previous steps then try doing that now and this time you can use 301 redirects as long as the spammy backlinks were removed.
  • Change registration of the domain name and change the IP again to emulate the site being bought and sold.
  • Move your content to a new sub-directory or several sub-directories based on category, author or content matter.
  • Erase or rewrite all of the content on the site.
  • Hopefully, you won’t get this far, but as a last-ditch attempt, park the domain for a month or two and start all over from the drawing board. We’ve tested this, and the site actually bounced back to regain most of its first-page rankings. Domain history and authority were not lost.

In the end, this case study provides the opportunity to test and identify actual causes and effects of the new Penguin update. I will be posting the results of the recovery as soon as the corrections are made.

 

The World's First Adult Search Engine

The First Porn Search Engine

It’s time we address the effectiveness of Google, Bing, and Yahoo’s efforts to deliver high-quality adult search results. With the ever-increasing pressures of internet censorship and the new .xxx domains, the future of adult content on the net is under attack! In this article I will discuss the following:

  • Adult content’s online supply and demand
  • Internet censorship against adult sites
  • Adult Search Engine Reviews

Just How Popular is Adult Content on the Web?

There is an overwhelming demand for adult material on the internet. In fact, the world actually owes the porn industry for pioneering e-commerce, as the very first online credit card transaction was for pornographic content, in the year 1995. In less than a month the same website was making over $200,000 per day, and it ended up processing around $1.5 billion per year in its prime! Their story was beautifully illustrated in the Hollywood movie Middle Men.

The First Online Credit Card Transaction

This huge influx of business fueled one of the biggest gold rushes of the information age. As a result, adult content still takes up an overwhelming majority of all content published on the internet today. Here are a few stats to check out:

  • More than 70% of men from 18 to 34 visit a pornographic site in a typical month
  • Google’s Keyword Tool reports 618,213,105 searches per month for the word porn
  • Adult content takes in around $57 billion annually
  • Almost 1/3 of all search queries are for adult content
  • Over 1/3 of all internet downloads are for adult material
  • $3,075.64 per second is spent on online porn

Escort Search Engines

Companies like BackPage.com, Rotci.com, and Eros earn millions of dollars each year on escort traffic, and they have made it very easy to find escorts in your city no matter where in the world you may be.

Google Crawls Facebook Comments

Comments on Facebook are crawled by Google

Google Announces That They Can Now Index Facebook Comments

Surprise, surprise: Google can now crawl JavaScript- and AJAX-based Facebook comments straight out of an <iframe> on your website. This means that comments made using plugins like Disqus and Facebook Comments for WordPress are finally being crawled and indexed! The implications of this are massive, not just for the ability to refresh your pages with user-generated content (which Google loves, by the way), but also because AJAX, JavaScript and <iframe>s have long been considered the dark abyss for Googlebot, which was unable to index the content within them. The battle between search engines and SEOs will never be the same!

What Does Google’s Ability to Crawl FB Comments Mean for SEO?

Google inches even closer to running a social search engine, where SEO techniques should favor freshness, social media, Web 2.0 and user-generated content. Not only is it time to get more serious about social media integration on your own site, but if you haven’t already, it’s high time to get more aggressive with social media optimization techniques. SMO was always considered SEO’s little sister, but I have a feeling she might pack an increasingly strong punch as our search engines evolve toward a hyper-social web. Most of the top optimizers in the industry already agree that social media optimization is a MUST in a post-Panda world. To sum it up in one sentence: Like is the new link.

The implications go far beyond social search. Being able to index AJAX, JavaScript and <iframe>s means that APIs and rich media integration across different websites just became search engine friendly. This could usher in a new era. What will we call it? Web 2.1, lol.

SMO Outsourcing

Social Media Optimization Outsourcing, a.k.a. SMO Outsourcing, makes a world of sense. The reason is that it’s just not feasible to pay American salaries to have an employee stumble around on his or her Facebook account all day, but if the employee is based in Romania, India, or the Philippines, you can pay pennies on the dollar and actually make a nice return on your investment!

Everyone agrees that social media is where it’s at. You can thank Mark Zuckerberg, the world’s youngest billionaire, because the average internet user spends most of their time on social media websites like Facebook, Twitter, Friendster, Blackplanet, and yes, even Myspace, which still brings some good link juice. That being said, it shouldn’t be hard to imagine how this can benefit your company.

Offshore SMO Outsourcing

While Google still gets more unique visitors per day, statistics show that people actually spend more time on Facebook than on any other website, overtaking Google for the first time in 2010. Now it’s time to put this into perspective. With social media comes social targeting, and it’s just this way of thinking that got VCs to front hundreds of millions of dollars to help develop social media sites worldwide. See, with social targeting you can easily identify and deliver content to a highly targeted audience, and it even works for ultra-specific niches.

Of course, if you are just selling online games then almost everyone is a target, and Farmville is proof of that, with over $300 million in revenue in its first year. Another wide-open target is muscle-building products, because you can target any male between the ages of 18 and 45 and see great results. But how about something much more specific, like B2B selling in an industry like insurance? That should be much harder, right?

WRONG, think again, SMO Outsourcing works for that too!

I have recently put this to the test by searching social media platforms to find professionals associated with the insurance industry. I easily found tens of thousands of professionals and ended up selling them insurance leads. No matter what your product or service is, SMO outsourcing can increase your revenues.

How to Outsource SMO Work

It’s a lot easier than you think. The reason is that in developing countries like India, the Philippines, and Romania, the younger, college-aged generation tends to be full of complete internet junkies trying to reach out to the rest of the world via social media. You can go to any of these countries and it will be hard to find anyone between the ages of 18 and 25 who doesn’t already have a Facebook, Twitter, Friendster, or Hi5 account. This makes recruiting and training an SMO outsourcing team super easy, because they are already familiar with the sites, so all you have to do is channel their social media activities in the right direction.

I have been assembling and managing SMO outsourcing teams in Romania, India and the Philippines for years now and with great results. Some of my most recent social media campaigns resulted in hundreds of thousands of clicks to my customers’ sites and huge increases in revenues. To find out more about Social Media Marketing services or for a free consultation on SMO outsourcing, contact me directly by visiting my contact page.

Is Buying +1s Good for SEO? Buy Plus Ones

The Plus One service I utilize is listed here on my SEO Software, Tools, and Services page. While waiting for your +1s to be delivered, here’s a tried and tested guide on how to buy Plus Ones. The best way to find out the impact of a Plus One campaign is to take a look at the Google Webmaster Tools section called +1 Metrics. There you will find a glimpse of how Google collects +1 data and uses it to aid their search results.

There you will also find Search Impact, Activity, and Audience, and definitely check out the +1 Metrics FAQs too. Without getting too far into the explanation, I can sum it up by saying Google’s Plus One algorithm measures the value of a +1 by the user’s activity. In other words, if someone with a weak Google Plus profile pluses a page and then never comes back, it’s a very weak plus, but if the Google account they are plus-ing from has many friends, followers, and lots of activity, and on top of that they continue to revisit the site, then it will boost your SERPs faster than links.

The best search engine algorithms are always the simplest. The +1 system isn’t 100% spam-proof, but Google has found a simple way to combat abuse by using data that is hard to replicate or automate. In the beginning, Google floundered in the social search arena, but this time they found a way to 1up (no pun intended) Bing’s social search.

Buy +1s From a Trusted Network

This being said, it looks like the best way to buy +1s is to make sure the network of Google accounts is made up of PVAs (phone-verified accounts) that not only look authentic but also show natural behavior in their Google Plus profiles as well as in other Google services. Here are a few signs of a good +1: many friends, many followers, and recent posts on Plus. Some have even suggested that setting up Blogger, YouTube, search history, and even email activity can add power and legitimacy to a Google account. As my colleague Dean put it, “I want to get a lot of good followers on Plus; that way it will increase my Plus power. I’d be a PR5 in the Plus world.”

So is Buying Plus Ones Good or Bad for SEO?

Sure, buying +1s can be great for SEO, but only if they are coming from good Google accounts. If you are buying them from an exchange network, this will be easier to verify, but if you are buying them from a company or some dude on a forum somewhere, then make sure their network of Google accounts is worth a damn.

Tutorial On-page SEO Attributes

Just about every day I get asked how to analyze On-page SEO Attributes, so to make things easier I figured I would write this post in an effort to save time and breath. This is a comprehensive, super easy, step-by-step checklist for making sure your site is fully optimized. I am hoping that this page will serve as an On-page SEO tutorial, or at least a prerequisite to further conversations. That being said, not only have I written about it, but you can also use this post as a shining example of a completely optimized page.

Keyword Research for On-page SEO Attributes

For starters, you need to pick your keywords. Use one keyword per page, and when selecting the right keywords, compare the competition level with the number of monthly searches. You can use Google’s keyword tool for this; find it here, keyword tool, or by searching Google for the term “keyword tool.” In case you haven’t noticed, the keyword that I have chosen for this post is On-page SEO Attributes.

Meta Content for On-page SEO Attributes

Now that we have our keywords selected, here’s how we implement them. Start adding On-page SEO Attributes by including the keywords in the page title, meta description, and tags. The title should be catchy, the description should be under 160 characters (Google truncates anything longer in its snippets), and you should use around 3 to 5 tags, making sure they include the main keywords and that the others are relevant. Meta keyword tags are no longer read by Google, but many of the other search engines still use them, so you might as well include them. Lastly, a very important On-page SEO Attribute is making sure that you do not have any duplicate titles or descriptions anywhere within your site, or you will see a warning in Google’s Webmaster Tools telling you to uniqueify them (yes, uniqueify is a real word now).
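To make the meta-content step concrete, here’s a rough sketch in Python of how you might pull the title and meta description out of a page and sanity-check the lengths. The `MetaChecker` class and the sample page are my own illustration, not a real tool mentioned in this post.

```python
from html.parser import HTMLParser

class MetaChecker(HTMLParser):
    """Pull the <title> and meta description out of a page's HTML."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

page = """<html><head>
<title>On-page SEO Attributes Tutorial</title>
<meta name="description" content="A step-by-step checklist of On-page SEO Attributes.">
</head><body></body></html>"""

checker = MetaChecker()
checker.feed(page)
print(checker.title)                   # On-page SEO Attributes Tutorial
print(len(checker.description) <= 160) # True
```

Run something like this across your site and any page with an empty or over-long description (or a title duplicated on another page) is a candidate for cleanup.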

Optimize Images for On-page SEO Attributes

Next, upload an image or two and use the keyword as the image file name, image title, image description, and, most importantly, the image’s alt text. Not only is this considered good bot food, but you would also be surprised how many clicks you can get from having your image show up as the first result in Google’s image search. For an example of how I optimize images for On-page SEO Attributes, just check out the image to the left. You can even download it to see the file name I gave it and the other attributes.

On-page SEO Attributes Keyword Density

Now we are ready for the textual content, which is probably one of the most important On-page SEO Attributes, along with the HTML code that you use to format the text. Each page should have a minimum of 250 words, but you should really try to write 500 words or more. Then make sure the keyword is mentioned around 2% of the time within the text, so if you write 500 words, the keyword should appear around 10 times. Any more than 2% and it will look very spammy in the eyes of Google; any less and the page may not be picked up by the bots at all. This is called keyword density. I like to use this online text analysis tool, but when writing on my WordPress sites I also use a cool little plugin called SEOpressor. It’s best to just write your article without thinking about the On-page SEO Attributes and, after you have finished, go back over it and squeeze the keywords in.
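The 2% rule above is easy to check yourself. Here’s a minimal sketch of a keyword density calculator in Python; the `keyword_density` helper is my own illustration, not the online tool or plugin mentioned above.

```python
import re

def _words(text):
    """Split text into lowercase word tokens."""
    return re.findall(r"[a-z0-9]+", text.lower())

def keyword_density(text, keyword):
    """Return the keyword phrase's share of total words, as a percentage."""
    words = _words(text)
    kw = _words(keyword)
    if not words or not kw:
        return 0.0
    n = len(kw)
    # Count every position where the full keyword phrase appears.
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == kw)
    return 100.0 * hits / len(words)

# A 500-word article mentioning the keyword 10 times lands right at 2%.
article = ("seo attributes " * 10) + ("filler " * 480)
print(keyword_density(article, "seo attributes"))  # 2.0
```

Paste in your draft and your keyword; if the number comes back well above 2, trim some mentions before you publish.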

After you have your article written and the keyword density at 2%, you are ready to make your keywords stand out more. You do this by adding the following formatting features. Mention the keyword in the first and last sentence of the page. Put the keyword inside h1, h2, and h3 tags; I went a little header-tag crazy on this page, as you can see if you view the source. The h stands for “header,” which is just a type of text formatting that puts your headers in a larger font. The tag looks like this: <h1>. Bold your keywords using the strong tag, like this: <strong>On-page SEO Attributes</strong>, and don’t use <b>! Next, underline and italicize a few of your keywords, as I have done on this page.

HTML & CSS On-page SEO Attributes

As for HTML, you can use this validation tool from W3C, which will point out errors. Other than that, be sure to use <div> tags instead of <table> tags. Tables are super old school and actually give the search engines trouble when trying to crawl and index the text within.

Next is the navigation. The bots should be able to follow your navigation so they can reach the other pages on your site, but a lot of people make the mistake of creating a Flash- or Java-based navigation. The best and most crawlable solution is a pure CSS navigation using list items for all the links within. The list item tag looks like this: <li>. W3C also has a sweet little validation tool for CSS, so you should check that for errors as well. Another great tool that checks for clean CSS and many more On-page SEO Attributes is this Web Page Analysis tool. You want to keep the code light and clean to decrease load times and increase crawlability.
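To see why a plain <ul>/<li> navigation is so crawlable, here’s a tiny Python sketch of what a crawler effectively does: walk the HTML and collect every href. The `LinkExtractor` class and the sample nav markup are my own illustration. A Flash or Java menu gives a bot nothing like this to chew on.

```python
from html.parser import HTMLParser

# A minimal pure-HTML list-item navigation, the kind bots can follow.
NAV = """
<ul id="nav">
  <li><a href="/apples">Apples</a></li>
  <li><a href="/oranges">Oranges</a></li>
</ul>
"""

class LinkExtractor(HTMLParser):
    """Collect href values, roughly the way a crawler discovers pages."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

parser = LinkExtractor()
parser.feed(NAV)
print(parser.links)  # ['/apples', '/oranges']
```

Every internal page shows up as a plain anchor, which is exactly what you want your navigation to look like to a bot.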

Siloing On-page SEO Attributes

Siloing was first written about by another veteran SEO, Bruce Clay, who claims it is one of the most important On-page SEO Attributes. You can read what he has to say about it in his article on Bruce Clay siloing, but while his theory is widely respected, I personally don’t agree with everything he has to say about it. The On-page SEO Attributes of site siloing work like this: if you have a site that sells fruit, then keep all the content about apples on the apples page and all the content about oranges on the oranges page. You never want to assign or optimize a page to fit into more than one category, and furthermore, you don’t want to link these pages or categories together. Instead, if someone is on your apples page, they should have to go back to the homepage to get to the oranges category. You can also prevent these silos from getting mixed up by using the controversial nofollow attribute (rel="nofollow"). This sounds simple enough, but it is easy to get your site’s silos mixed up.

URL Structuring for On-page SEO Attributes

Just look up in the address bar and notice that the keyword is also in my URL. Don’t use capital letters, and always separate words with dashes, not underscores, and especially not spaces. If you see a URL with a question mark or equals sign, it is what’s known as a query string URL, which means that page doesn’t actually exist in static form, so every time someone visits it, the page must be generated on the fly, aka dynamically. On-page SEO Attributes with URLs are critical. Since the content for a dynamic page is actually stored in a database, most search engines simply cannot index it, and the ones that can, like Google and Bing, still have a hard time doing so.
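The URL rules above (lowercase, dashes instead of underscores or spaces) can be sketched in a few lines of Python. The `slugify` helper here is my own illustration of the convention, not a function from any tool mentioned in this post.

```python
import re

def slugify(title):
    """Build an SEO-friendly URL slug: lowercase, words joined by dashes."""
    slug = title.lower()
    # Collapse spaces, underscores, and punctuation into single dashes.
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    return slug.strip("-")

print(slugify("On-page SEO Attributes"))  # on-page-seo-attributes
```

Most blogging platforms, WordPress included, apply this kind of transformation to post titles automatically, but it’s worth checking the result before you publish.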

In my next On-page SEO Attributes post, look out for more details on siloing, internal link structure, and my new and exciting Flash optimization techniques.