The many dangers of NoFollow

NoFollow linking has never been so prominent, and never has it been so dangerous for both ethical and practical reasons.

I don’t like the NoFollow attribute.  When it was introduced in 2005, it made so much sense.  But since then it has been abused by both webmasters and the search engines, and that abuse looks poised to make a quantum leap sometime soon.

Therefore, there are two mes that don’t like NoFollow:

  • The ethical me, who much prefers to be honest when I promote a website.
  • The practical me, who much prefers not to be slapped down, tied up and fed to a herd of half-starved ninja gators when Google wakes up in 2015 or 2016, or gets displaced by an upstart.

I will cover three things in this blog post.  Yes, I’m organized!

  1. The history of NoFollow, which many newer marketers today are unaware of, and many who were around in 2005 might have forgotten.
  2. The ethical case against using NoFollow (and yes, ethics actually matter here).
  3. The practical case against using an attribute that could blow up in your face in a few years.

The short, tumultuous history of NoFollow

The NoFollow “tag”, as it has often been called, is not a tag.  It is an “attribute” (for those interested in correct use of language), which can be added to any <a href=””> tag.  It tells the search engines not to follow the link, because the owner of the website on which it appears cannot vouch for its trustworthiness.  Just to be clear, NoFollow does not necessarily mean that a link is bad.  It only means that the link has not been vetted by the website’s owner or administrator.

NoFollow's sordid history

The NoFollow attribute was introduced in early 2005 to stop blog comment spam, or at least to make it easier for the search engines to distinguish between links from legitimate comments and links from spam-happy bots.

Here is the direct quote from the Official Google Blog:

Q: How does a link change?
A: Any link that a user can create on your site automatically gets a new “nofollow” attribute. So if a blog spammer previously added a comment like

Visit my <a href=”http://www.example.com/”>discount pharmaceuticals</a> site.

That comment would be transformed to

Visit my <a href=”http://www.example.com/” rel=”nofollow”>discount pharmaceuticals</a> site.

Q: What types of links should get this attribute?
A: We encourage you to use the rel=”nofollow” attribute anywhere that users can add links by themselves, including within comments, trackbacks, and referrer lists. Comment areas receive the most attention, but securing every location where someone can add a link is the way to keep spammers at bay.

Matt Cutts, Google’s chief “web spam” spokesperson, said:

“Wherever it means that another person placed a link on your site, that would be appropriate.”

Matt Cutts confirmed this in 2009 on his own blog:

“Nofollow is a method (introduced in 2005 and supported by multiple search engines) to annotate a link to tell search engines ‘I can’t or don’t want to vouch for this link.’ In Google, nofollow links don’t pass PageRank and don’t pass anchortext.”

In other words, if you are not moderating your blog comments or other user-generated content, this allows you to continue being careless or lazy or otherwise occupied without gumming up Google’s rankings.  And it’s not just Google.  MSN and Yahoo announced their support of the attribute at the same time.  In 2005, Google had about 37 percent market share, Yahoo had 30 percent, and MSN had 16 percent.  AOL and Ask Jeeves were still players, with ten and six percent respectively.

PageRank Sculpting

It was not long before some webmasters with overactive imaginations found a way to use NoFollow to their advantage through a method that came to be called “PageRank Sculpting”.

As you are probably aware, PageRank is the relative value of a page, and is the most visible of over 100 ranking signals.  Very roughly, the PageRank of a page is calculated based on the value of all the pages linking to it.  Each of those pages has its own PageRank, which it divides up evenly between all the pages it links to.  If you need to read up on the subject, I suggest this post by Danny Sullivan.

The key thing to understand about PageRank is that if a page contains 20 links, it divides its power 20 ways.  However, if it contains only 15 links, it divides its power 15 ways, sending more PageRank power to each of the 15 pages.

PageRank sculpting is the process of NoFollowing certain internal links, so that other internal links are more powerful.  The theory is that if every page of your website points to the contact, about, terms, and other administrative pages, that means a lot of PageRank power that could be going to money pages is being poorly directed.  By adding the NoFollow attribute to those admin links, webmasters believed that they were funneling more PageRank to their money pages.
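To make the arithmetic concrete, here is a small back-of-the-envelope sketch in Python.  It is a toy model with arbitrary numbers, not Google’s actual formula; it shows why NoFollowing five admin links out of twenty looked like it would fatten the share flowing through the other fifteen, and (jumping ahead to the change described in the next paragraph) why that stopped being true in 2009.

# Toy model of PageRank "sculpting" arithmetic -- illustrative only,
# not Google's actual algorithm, and the numbers are arbitrary.

def per_link_share(pagerank_to_pass, total_links, nofollowed_links, pre_2009=True):
    """Return the share of PageRank each followed link on the page receives."""
    followed = total_links - nofollowed_links
    if followed <= 0:
        return 0.0
    if pre_2009:
        # What sculptors assumed: NoFollowed links are ignored, so the whole
        # amount is split among the followed links only.
        return pagerank_to_pass / followed
    # Post-2009 behaviour (as described by Matt Cutts): the split is still
    # calculated over *all* links, and the NoFollowed share simply evaporates.
    return pagerank_to_pass / total_links

pr = 1.0  # arbitrary amount of PageRank this page can pass along
print(per_link_share(pr, 20, 0))                  # 0.05   -- 20 links, no sculpting
print(per_link_share(pr, 20, 5, pre_2009=True))   # ~0.067 -- why sculpting looked clever
print(per_link_share(pr, 20, 5, pre_2009=False))  # 0.05   -- why it stopped working in 2009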

To the best of my knowledge, nobody got penalized for doing this, but in 2009 Google changed the way it read NoFollow links to make PageRank sculpting useless.  Webmasters got the idea, and PageRank Sculpting quickly went out of style.  But an important question about all this PageRank sculpting has to be asked: What were they thinking?!?  NoFollowing links to their own pages from their own pages?  Telling the search engines that they can’t vouch for their own contact and about pages?  Saying, “Hey Google, I am such a shifty character that I don’t even trust myself”?  </rant>

Just wait for the other shoe to drop.

NoFollowing paid links

It was not only webmasters who played fast and loose with the rules.  Google took its turn, too.  In fact, Google now advises:

“In order to prevent paid links from influencing search results and negatively impacting users, we urge webmasters to use nofollow on such links.”

Quite apart from the inconvenient truth that almost every link has been paid for in one form or another (yes, “earned” links can be very costly to “earn”), the fact is that there is no link more firmly vetted than a paid link.  A webmaster placing a paid link has to think much harder (“Is this money really worth possibly harming my site’s trust with visitors and the search engines?”) than one linking for free.

NoFollowing unnatural/suspicious/random links

But Google seems to have moved past encouraging NoFollow just on paid links.  They seem to be quietly encouraging people to add NoFollow to a very widely defined array of low-quality links, unnatural links, suspicious links (those that might actually be natural, but Google really can’t tell the difference, so why not discredit them just in case) and seemingly random links.

Oh, and press releases.

These days, it seems that almost any link could be flagged as “unnatural” by Google, with so-called “manual” penalties being the result.  Many of Google’s recent manual penalties seem designed to upstage Monty Python.  Recovery from some of the more ridiculous penalties seems almost as random, and I have heard many people saying that by simply adding NoFollow to links, they have been able to recover.

In fact, many people writing about manual penalty recovery can be seen offering advice like this:

“After disavowing or no-following links, webmasters must submit a reconsideration request to Google. If the problem is not completely cleared, Google will send a denial message.”

Or advice like this:

“If it’s high quality, but just linked in the wrong way, ask the webmaster to add a nofollow attribute assigned to it.”

If you are wondering, “What’s next?”, so am I.  At this point, I have seen at least one example of almost every type of link drawing a penalty, and Google seems to be accepting the NoFollow attribute as a way of crossing the blurry line of what is and is not acceptable on every third Tuesday, if the wind is blowing from the northeast with a faint whiff of lavender in the air.  In fact, Google has said that the Disavow tool is like a huge NoFollowifier.  Here is what Google’s John Mueller has to say on the matter:

“You don’t need to include any nofollow links…because essentially what happens with links that you submit as a disavow, when we recrawl them we treat them similarly to other nofollowed links.  Including a nofollow link there wouldn’t be necessary.”

Which brings us to today.  I watch, mouth hanging wide open (but not drooling on myself, just to reassure you), the mass NoFollowing of links that some desperate webmasters are doing.  There are WordPress plugins, such as WP External Links and External Links, that will NoFollow every outbound link on a site in one stroke.
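To see what that blanket treatment amounts to, here is a rough Python sketch of the behaviour.  It is my own illustration, not the actual code of either plugin, and “mysite.example” stands in for your own domain: it finds every link pointing away from your domain and stamps rel="nofollow" on it.

# Rough sketch of what a "NoFollow all external links" plugin does.
# This illustrates the behaviour; it is not the plugins' real code.
from urllib.parse import urlparse

from bs4 import BeautifulSoup

def nofollow_external_links(html, my_domain):
    """Add rel="nofollow" to every link that points outside my_domain."""
    soup = BeautifulSoup(html, "html.parser")
    for a in soup.find_all("a", href=True):
        host = urlparse(a["href"]).netloc
        if host and not host.endswith(my_domain):
            rel = a.get("rel", [])
            if "nofollow" not in rel:
                a["rel"] = rel + ["nofollow"]
    return str(soup)

page = '<p>Visit <a href="http://www.example.com/">a perfectly good site</a>.</p>'
print(nofollow_external_links(page, "mysite.example"))
# The perfectly good link now carries rel="nofollow" -- vouched for by nobody.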

I’ll go into why I think this is crazy below, but some highly respectable people have been driven by Google’s seemingly random penalties to actually use these tools.  Lisa of Inspire to Thrive  explains why she installed the WP External Links Plugin:

“I don’t agree with their nofollow policy or shall we say HINT of it but I don’t want to be penalized by this giant and I’d love to see how long the process takes so we can all learn something from this one.”

Why NoFollow is unethical

You should not tell a lie.  NoFollow is ethical on user generated content, not because that is why it was created in the first place, but because it tells the truth.  Unless the website administrator moderates all user-generated content, such as on good quality blogs, the truth is that he or she cannot vouch for the links.  NoFollow truthfully communicates that to anyone who wishes to read that attribute, including search engines.

If NoFollow communicates that you cannot vouch for a link that you have in fact approved, that is a blatant lie.

NoFollow - what about your users?

Google is not the Internet. The main reason most people are adding the NoFollow attribute where it does not belong is Google’s displeasure with certain links, or the website administrator’s fear of that displeasure.  Numerous statements by Google have led people to believe that Google wants them to add NoFollow to the links that Google has chosen to find irritating.

The problem is that Google is not the Internet.  There are other search engines, and possibly other applications, that will read your NoFollow attribute as a signal.  NoFollow tells them, too, that the link is not trustworthy.  It’s not just Google being lied to.

Read Google’s Webmaster Guidelines.  Google’s official guidelines, as vague as they are, are a lot more ethical than its enforcement is (Oh, that’s a whole other ethics topic that the company whose motto is “Don’t be evil” probably would rather I don’t get into).  Let’s see what the Quality Guidelines say:

“Don’t deceive your users.”

So if you are telling the search engines, I won’t vouch for this link, are you telling your users, too?  Just asking.

“Avoid tricks intended to improve search engine rankings.”

Like adding a hidden attribute, for instance.  Those who are old enough to remember what a search engine penalty meant before 2011 will recall that it meant you had done something sneaky and deceptive.  You were a dirty rotten crook, serving up different information to the search engines than to real people:

  • Doorway pages.
  • Hidden text.
  • Hidden links.

Search engines penalized websites for serving up different content to users and to robots, and rightly so.  So what about NoFollow links, where the link works for your users but you tell the search engines not to trust it?

“A good rule of thumb is whether you’d feel comfortable explaining what you’ve done to a website that competes with you, or to a Google employee.”

“So, you see, I inserted this hidden NoFollow attribute because I don’t want to get in trouble with Google, but I’m OK sending my readers there.  Yes, I know that means I’m either a scammer sending users to a crap link, or a total wuss allowing Google to bully me into blocking robots from following a perfectly good link.” Hmm.  Sure, that’s what I would tell my competitor or a Google employee.

“Another useful test is to ask, ‘Does this help my users? Would I do this if search engines didn’t exist?'”

Seriously, would you put NoFollow in a link if search engines didn’t exist?

NoFollow means not taking responsibility for actions.  There are two main constraints that keep us from linking to bad neighbourhoods: users might follow the links, and search engines might follow the links.  Putting NoFollow on bad links does not solve any real problem (it might help Google solve its problems), and it makes posting a bad link 50% more tempting.  In other words, far from cleaning up the Web, it is likely increasing the number of poor quality links, especially those posted on poor quality sites.

Why NoFollow is dangerous

Now, it might be that ethics are a less pressing worry on your mind than where you’ll find money to pay the rent, so maybe you are more interested in getting back lost rankings than in being 100 percent authentic and ethical.  Well, here are five reasons why NoFollow could bite you in that soft fleshy padding you sit on.

1. It might not work.  I have seen no official statistics on how many websites recover from different types of penalties, but it certainly sounds like a majority of those that bother trying don’t succeed on the first or second try.

2. You might lose rankings at Bing, Yahoo and other search engines.  Of course, they don’t have the same market share, so you might be willing to sacrifice all of them in order to access the 66 percent of searches that Google delivers.  But what if you NoFollow all your links and instead of winning Google’s approval, you simply lose your Bing and Yahoo rankings? Oops.

But that is just the short-term, and short-term is short-sighted, even if your main concern is next month’s rent.  The long term is what really counts.

3. Google might get you later on. At the current rate, half the Internet will be disassembled, Disavowed or NoFollowed before long, all because Google doesn’t want to count certain links in its algorithm.  What then?  The disassembled part (links people have removed) will no longer be there, but Google will have a huge database of domains that have been disavowed once, twice, thrice or 673 times.  Google will have a huge database of websites that have tons of NoFollow links pointing to them.  It won’t be hard to add into its algorithm a trust factor to account for how often a particular domain has been disavowed or NoFollowed.

Google will also have data on which websites NoFollow their links.  Ah, let’s follow the logic trail.  Google tells websites to NoFollow crappy links.  Website A has 300 NoFollow links on its site.  Website B has 3 NoFollow links on its site.  In Google’s mind, NoFollow means crappy links.  Hmmm.  Which site will Google consider more trustworthy?  Which site will Google see as less trustworthy?  It’s kind of a NoBrainer.  When you look at it from that perspective, is it worth sending such a negative message about your own website? Wikipedia will always be able to get away with it, but could your website?

Don’t believe this could ever happen?  Go back a few years, when the best practice was to have keyword-rich anchor text in most of your inbound links, as long as you made sure to vary the text.  Now, websites are getting penalized for doing just that.  Go back a few more years, when the best practice was exact-match keyword anchor text.  That will land you in even more trouble today.

Google is now punishing websites for links that were built in accordance with their guidelines as far back as 2004 and 2005.  What you do today can come back to bite you tomorrow and even a decade from now.

4. Other search engines might get you later on. It’s just too easy.  Not every NoFollow link is crap and not every DoFollow link is amazing.  But if a search engine plays the averages, it can reduce the trust of websites littered with NoFollow links and increase the trust of websites that are clean.

5. Google’s rule is ephemeral.  I know it seems like Google rules the world.  But there are other search engines like Blekko and Duck Duck Go (as I wrote about here), and who knows where Bing or Yahoo might be headed?  Google controls 66 percent of search traffic now, but what if that share was to fall?

Can’t happen?  Think again.

Remember when Alta Vista ruled search?

Remember when MySpace was social networking?

Remember when Netscape was everybody’s browser of choice?

Remember when Digg was synonymous with social bookmarking?

Remember when Google ruled search?  It still does, but sooner or later, that question will come up.  And all the NoFollow attributes placed just for Google’s sake will serve as … what?

Your turn.  What do you think?

I would love to hear from you.  I certainly don’t have the last word on this.  I have not liked NoFollow from the start.  I called out Wikipedia on this in 2007, even going so far as to say Wikipedia should be spanked (I really do like the site; I just don’t like their NoFollow policy).

NoFollow made sense for what it was designed to do, but I have always thought that it sends a very bad signal to anyone watching, including search engines.  Obviously, not everybody feels the same way.  Some people might even today be using it for PageRank sculpting.

I would love to hear your thoughts in the comments below, or on blog posts of your own.  Support me.  Refute me.  Let’s get this out in the open and discuss it logically.






Google Disavow. Why I actually like it.

Google’s Disavow Tool is more than just a quick fix for a high-strung website owner. Used properly, it can help a website regain Google’s favour or possibly even avoid falling victim to Google’s link jailor Penguin mascot.  (post updated with video from Matt Cutts)

There is a lot of debate about whether or not it is a good idea to disavow backlinks. Some people think it is an admission of “guilt”. Others worry that, in using the disavow tool, people will end up losing valuable links that are not actually causing them any problems.

I will not dispute the validity of either of these views.

What if you know you have a backlinks problem?

Let us assume for a moment that you know you have a backlink problem. Perhaps you have received the infamous “unnatural backlinks” letter from Google. Perhaps your rankings have tanked, and you have ruled out other causes. Let’s assume that you need to clean up your backlink profile, one way or the other.

Basically, you have two choices. The first is to get rid of the backlinks. The second is to leave them up and use Google’s Disavow Tool.

Let’s be clear – Google prefers you to get rid of them. Let’s also be clear – most webmasters ignore requests to remove links. The first benefit of the Disavow Tool is that it lets you deal with the majority of links that you cannot get removed.

Remember, in this case, you will not be losing any valuable links with the disavow tool that you would not be losing if your begging, bribing, threatening and temper tantrums had worked with the website owners linking to you.
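For anyone who has never looked at one, the disavow file itself is nothing exotic: a plain text file with one URL or domain per line, lines starting with # treated as comments, uploaded through the Disavow Links tool in Google Webmaster Tools.  Here is a minimal sketch; the domains are invented placeholders, not real sites.

# Minimal sketch of a disavow file -- plain text, one entry per line.
# All domains below are invented placeholders, not real recommendations.
disavow_entries = """\
# Spammy directories that ignored my removal requests
domain:cheap-links-directory.example
domain:autoblog-network.example

# A single bad page on an otherwise decent site
http://decent-site.example/paid-links-page.html
"""

with open("disavow.txt", "w") as f:
    f.write(disavow_entries)
# Then upload disavow.txt through the Disavow Links tool in Webmaster Tools.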

Read also: How Google reads your backlinks

There are also those links that you think are actually pretty good, but you are also pretty sure that Google disagrees with you. You probably should get rid of them to get back into Google’s good books…but what if those links are the reason you are still getting traffic from Bing and Yahoo? Or what if those links are sending you real traffic? Sure, Google is better than Bing, but Bing is better than nothing. And nothing is very realistically what you could end up with if you remove a whole bunch of links that Bing likes, and the Penguin still isn’t satisfied.

What if you do not have a backlinks problem…yet?

The Disavow tool is also a great way to strike pre-emptively and stay out of Google’s bad books. It has been my observation that it is a lot harder to get out of a penalty these days than to stay out. It’s sort of like falling into a well: it’s much easier to avoid being pushed in by a passing Penguin than to scramble out once you hit the bottom.

Tweet this quote: “It’s much easier to avoid a Penguin penalty than to get out of one.”

I am not suggesting a pre-emptive strike against just any links, but I have twice seen websites attacked by what you might call negative SEO. This very blog was used by a black hatter to try to funnel PageRank to some websites through random text and image links pointing to blog comment URLs that did not exist (they left comments on this blog that were never published, but they pointed links at the non-existent URLs anyway). Their attempt was unsuccessful, but hundreds of pure spam links from toxic domains were still piling up fast, pointing to this blog and to my domain.

Read Also: Monitor Backlinks – 7 juicy inside- and outside-the-box strategies

In another case, I worked with a website that was burdened with hundreds of new links pointing to it every day. The links were using pharmaceutical text (it was not a pharmaceutical site) and were in the company of dozens of other links all being placed invisibly in the code of blogs that the black hatters hacked into. The host blog owners never even knew the links were there, pointing to my client’s site or pointing to the many other sites.

In both these cases, spam attacks got the sites into Google’s bad books, and far more clean-up was required to fix the link profiles after the fact than it would have taken to deal pre-emptively with the links that got them into trouble. A pre-emptive disavow might have prevented huge headaches and a fortune in lost income for each of these websites.

The disavow tool should not be a crutch to lean on for worried website owners. If you know you have some really bad backlinks, do whatever you can to get them removed. But don’t be afraid to use the Disavow tool if that’s the best tool for your situation.

UPDATE: Google’s Matt Cutts has now confirmed that “If you’re at all worried about, you know, someone trying to do negative SEO or, you know, it looks like there’s some weird bot that’s building up a bunch of links to your site and you have no idea where it came from, that’s the perfect time to use Disavow, as well.”

Here is the video: [embedded video of Matt Cutts discussing when to use the Disavow tool]

[Image: Disavow corrosive links]

How Google reads your backlinks

People spend a lot of time scratching their heads, trying to understand how Google reads their backlinks.  They want to know which links to their websites are still “safe” to pursue.

With all the turmoil over unnatural links and Penguin penalties over the past year or two, ever more people are sorting through their backlink profiles trying to understand which links to keep and which to try to cull.  What confuses many people the most is why some links would be valued over others.  “Why doesn’t Google like the links I worked so hard to build?”

The problem is that people are used to assuming that:

  • Every link is good.
  • High PageRank is what counts the most.
  • Automation is good, because more is better.

These are wrong assumptions.  Remember that Google looks at each link to your website as a vote of confidence or a recommendation.  And not all recommendations are of equal value.  For instance, suppose you need headache medication…

 

[Infographic: feel free to add it to your own site!]

 

If one person recommends a headache medication, you might be inclined to try it. But if several people recommend a different headache medication…yes, exactly.  More is better.

But wait!  What if a doctor recommends a different headache medication?  Yup, authority trumps quantity.  And if several doctors recommend a completely different headache medication…exactly!  More is better, after all, especially when it comes with authority.

Now, what if the drug pusher around the corner offers his recommendation?  No thanks.  But what if a dozen drug pushers all recommend the same headache medication?  Of course you’ll take their advice, because more is better, right?

No way!

And Google is at least as smart as you are.  If hundreds of spammy sites link to your website, that is not a better recommendation than if one spammy website links to yours.  The more “drug pusher” websites recommend your website, the more likely Google is to label your website…

So, just as you would not want a throng of drug pushers recommending your product, make sure there is no throng of spammy websites recommending your website. Google will see more value in your website if inbound links come from trusted or – even better – highly trusted sources.
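If it helps, here is the same analogy as a toy Python calculation.  The categories and weights are invented for illustration (Google’s real trust signals are unknown and far more complicated), but the point survives: links are weighted votes, and a pile of votes from the wrong crowd counts against you.

# Toy illustration of "not all recommendations are equal".
# The categories and weights are invented; this is not Google's algorithm.
TRUST_WEIGHT = {
    "ordinary site": 1,     # a regular recommendation
    "authority site": 10,   # the "doctor" in the analogy
    "spammy site": -5,      # the "drug pusher"
}

def link_profile_score(linking_sites):
    """Sum the trust-weighted votes from every site linking to you."""
    return sum(TRUST_WEIGHT[kind] for kind in linking_sites)

print(link_profile_score(["ordinary site"] * 5))   # 5   -- more is better...
print(link_profile_score(["authority site"]))      # 10  -- ...but authority beats quantity
print(link_profile_score(["spammy site"] * 12))    # -60 -- a dozen pushers make things worse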

More on links from statistics websites

A couple of weeks ago, I wrote about links from statistics and valuation websites, and posed the question of whether it is worthwhile to pay five dollars to have one’s domain submitted to 5000 of them. You might want to read the post before continuing.

There was an interesting comment by Graeme, who asked:

“Did you check how many of these sites already had a link to yours? If I search for any of my domain names I get lots of these that I have never asked to list me.”

This is a good question, but it is hardly the full question. Some of these sites have static pages, and might already be linking back to you. But many of these sites create the pages on request. You punch in a domain name, and they grab the information from authority stats sites like Alexa or Compete, or from search engines like Google or Baidu.

So in most cases, the answer is “no” – most of these sites were not already linking to the site I submitted.

But as I said, that is not really the full question.

Think about it for a moment.

Keep thinking…

Aha! That’s it. If the linking page exists only because I requested it, what happens when I leave? Does the page still exist? Or more to the point, is the page stored somewhere for the search engines to find it? Yes, the gig generated 280 links or so, but are they on pages that really exist? A few, perhaps, but not most.

So you probably think I am about to change my mind and pooh-pooh the $5.00 I spent on this? Not quite. You see, $5 for 100 or 300 or more links (we are not sure exactly how many, remember?) is actually a pretty good deal. Any professional SEO consultant knows how much time it can take and how many failed attempts one has to go through to build just a couple of links. What if we could take the pages we created on the fly and freeze them in time? Or in space? Or in cyberspace?

Here’s what you do:

  1. Check which of the pages actually have a live link to your domain (a quick script, sketched below, can do this for you).
  2. Save the list of those pages.
  3. Build links directly to those pages.
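Step 1 does not have to be done by hand.  Here is a rough Python sketch that fetches each statistics page from the gig report and keeps only the ones that still contain a live link to your domain.  The domain and file names are placeholders, and it assumes you have the requests and beautifulsoup4 libraries installed.

# Rough sketch for step 1: keep only the statistics pages that still contain
# a live link to your domain. Domain and file names are placeholders.
import requests
from bs4 import BeautifulSoup

MY_DOMAIN = "example.com"                     # your domain

def has_live_link(page_url, domain):
    """Return True if the page loads and contains a link mentioning domain."""
    try:
        resp = requests.get(page_url, timeout=10)
        resp.raise_for_status()
    except requests.RequestException:
        return False                          # dead site, 404, timeout, etc.
    soup = BeautifulSoup(resp.text, "html.parser")
    return any(domain in (a.get("href") or "") for a in soup.find_all("a"))

with open("stat_pages.txt") as f:             # the list of pages from the gig report
    candidates = [line.strip() for line in f if line.strip()]

live_pages = [url for url in candidates if has_live_link(url, MY_DOMAIN)]

with open("pages_worth_boosting.txt", "w") as f:   # step 2: save the list
    f.write("\n".join(live_pages))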

As for step 3, how do you build those links? Here are four ways, depending on your comfort level.

  • Create a page on your website just for “Hey, look who thinks they know what our website is worth”.
  • Use these URLs when making blog comments.
  • Include these URLs in article marketing and blog posting (The Free Traffic System is ideally set up for this.)
  • Try some social bookmarking; there are many minor social bookmarking sites that are not as particular as Digg and Reddit are.

When the links you create are spidered, the pages evaluating your domain “exist” for the search engines. Plus, they actually have some small amount of link juice, which probably places them in the top 1% of pages on each site for link popularity. Remember that most of these pages link only to your website, not to 30 or 40 or 50 other websites on some link exchange page. The more link juice these pages get, the better for your website.

So, the big question I am sure you all want answered is whether there was actually any improvement in rankings as a result of this little experiment. Well, here are the results at Google, keeping in mind that no links have been built to these pages yet.

Keyword One before: around #70 (I did not take an exact reading)
Keyword One after: #60 (a couple days ago, I saw this at #55)

Keyword Two before: around #70 (I did not take an exact reading)
Keyword Two after: #65 and #66

Keyword Three before: around #70 (I did not take an exact reading)
Keyword Three after: #59 and #60

So these readings are positive in that it appears the site climbed a bit for all three search terms, even adding an interior page to two of the searches. The movement, however, is not phenomenal and it is possible that it is explained by other factors. It will be interesting to see if there is further movement once some links are built into a few of these pages. I might just have to report back to you again…

Links from statistics websites

I saw a gig over on Fiverr that caught my eye.  For those of you who don’t know about Fiverr, it is where anyone can offer to do anything (almost) for five dollars.  It’s a bit like The Dollar Store of online services.  You can get some amazing deals on Fiverr – stuff you would expect to pay $25 for.  Or $50.  Or even $100.  You also get some blatant scams.

Some great deals.  Some rip-offs.  But either way, five dollars isn’t much.  Like I said, it’s like The Dollar Store.

The gig that caught my eye was:

I will submit your main domain URL to well over 5000 statistic sites. How This Works. I will submit your URL to various statistic sites. These give a value of your site/blog, and also provide a free link back to your site. My software sends your URL to over 5000 sites which gives you that many one way backlinks and Rapidly gets your site indexed by Google! I will send you a text doc to prove works done too. Order now and get indexed.

Anything that generates hundreds or thousands of links automatically can’t be particularly useful for a professional SEO campaign.  But it did occur to me that a few of these sites might be useful, and the links would most likely be either the domain (some with www, some with http, some with both, some with neither?) or the title tag, so not the usual keyword style links you see in blog comment spam and forum profile spam.  And not from the type of sites my clients would usually get links from, so perhaps it would add a nice little variety to a site’s link profile.

With low expectations and high curiosity, I laid down my five bucks.

OK, first off I must say that I did not check through the full list of 7861 entries (representing 36782 sites?  I think there was a typo), but with domain duplications taken into consideration, it is still likely that the promised 5000+ were delivered.

The first thing I noticed was how many of the statistic sites were obviously scraping results from Google, Yahoo, Bing and most of all Baidu (If you think China wants to buy up all Western real estate, what does this say about China’s hunger for Internet property?).  To be expected, I suppose, but irrelevant to this review.

I checked through 3 dozen entries, being careful not to duplicate any sites.  I guess my first disappointment was how many came up dead (sites for sale, 404 error pages, servers that would not connect, etc.) – nearly half.  But I suspect that for five bucks a gig, nobody will bother to check 5000 sites for deadwood (although maybe the software should be set up to remove dead sites).

My second disappointment was how many of them did not link to the domain they were reviewing.  They tended mostly to link to other pages about the domain within their own site, in an internal web of sorts.

Did the gig live up to the promise of “over 5000 sites which gives you that many one way backlinks”?  Not a chance.  One of the pages gave a NoFollow link.  Another gave a link from a secondary page (which might have been one of the 7861 entries that I did not check).  The sample of 36 sites – less than one percent of the total – is too small for an accurate extrapolation, but it implies that the site did get over 280 new backlinks, from new pages on established sites.  Even if I am off by 50%, that is still 140 links for $5, with at least a couple of the links probably reasonably good.

Five bucks for 140+ links that took me just a few minutes to order (and a couple hours to blog about, but that’s another story).  I would say that it is worth it.

But there was another residual benefit, too.  A few of the statistic sites (2, 3, 4? – I didn’t keep track)  linked to various authority profiles that link back to your domain.  For instance, a profile on Surcentro.net will not link to your website, but it will link to your profile at:

  • Alexa
  • WayBackMachine
  • Robtex

And each of these links back to your site.  So we can assume that at least another 140 links have been built to your domain’s profiles on authority statistics sites that already link to your site, and that is also worthwhile.

Would I use this gig again? Yes. I wish more Fiverr gig sellers would cut the hyperbole and be more accurate about what they are offering.  But inaccuracy aside, I would call this gig a worthwhile addition to a comprehensive link-building campaign.