BrowseRank Goes Beyond PageRank

I am just back from vacation and wading through three weeks of emails, but while I was gone a story broke that I just can’t let pass.  You might have heard me say it before, but sooner or later the search engines will shift their algorithms from focusing just on relevance and importance to include a third pillar: usefulness. 

This story, entitled Microsoft Talks about BrowseRank Beyond PageRank, shows that Microsoft is well on its way to developing just such an algorithm.  The article mentions a few ways a search engine can determine how useful searchers find a result, but there are more that are not mentioned in the article:

  1. Click-through rates.
  2. Number of people who bounce back to the search page.
  3. Time before a person bounces back.
  4. Number of pages a user visits before bouncing back.
  5. Time spent on the specific page clicked.
  6. Whether the person bothered to scroll down on the page.
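Signals like these could, in principle, be folded into a single "usefulness" score per result. Here is a minimal sketch in Python of how that might look — the field names, weights, and the 60-second saturation cap are all hypothetical illustrations, not Microsoft's actual BrowseRank formula:

```python
# Toy usefulness score built from the six behavioral signals above.
# All weights and field names are made up for illustration.

from dataclasses import dataclass

@dataclass
class PageBehavior:
    click_through_rate: float     # 1. fraction of impressions clicked (0..1)
    bounce_rate: float            # 2. fraction who bounced back to the results page (0..1)
    seconds_before_bounce: float  # 3. average time before bouncing back
    pages_per_visit: float        # 4. average pages viewed before bouncing back
    seconds_on_page: float        # 5. average time spent on the clicked page
    scroll_rate: float            # 6. fraction of visitors who scrolled (0..1)

def usefulness_score(b: PageBehavior) -> float:
    """Combine the signals into a single 0..1 score (weights sum to 1).

    Time signals are capped at 60 seconds, which also blunts the effect
    of people (like the author) who leave windows open forever."""
    squash = lambda secs: min(secs, 60.0) / 60.0
    return (0.25 * b.click_through_rate
            + 0.20 * (1.0 - b.bounce_rate)
            + 0.15 * squash(b.seconds_before_bounce)
            + 0.10 * min(b.pages_per_visit, 5.0) / 5.0
            + 0.20 * squash(b.seconds_on_page)
            + 0.10 * b.scroll_rate)

engaging = PageBehavior(0.4, 0.2, 90, 4, 120, 0.8)
shallow  = PageBehavior(0.4, 0.9, 5, 1, 8, 0.1)
print(usefulness_score(engaging) > usefulness_score(shallow))  # True
```

A real system would of course have to learn the weights from data and defend against the manipulation discussed below, but the sketch shows why a page that holds visitors ranks above one that bounces them straight back.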

Of course, people like me would totally mess up the algorithm; I leave my windows open forever.  And if you think that user behavior is hard to manipulate, think again.  Usability will now be more important for SEO, and so will coaxing users to spend more time on the website and go deeper into it.

But the biggest change we will see is that website owners will have to focus on not letting their visitors bounce back to Google.  Suddenly having links to other useful sites will be a good thing, to the dismay of so many website owners who are terrified of placing a link to anybody else, for fear they might bleed customers, PageRank or both.

As all major search engines move toward measuring user behavior, new strategies will be required.  I will report on some of those shortly.

Stay tuned… 


  1. Very interesting, although this has been tried before. DirectHit had a search engine built entirely on clickstream data (acquired in 2000). They got the data from ISPs in those days. The end result is really not that much better than PageRank.

    We at Me.dium, on the other hand, are processing our users’ clickstream data in real time to create a different lens based on what’s going on now. E.g., do a search for John Edwards on Google or Live, and you get and wiki/johnedwards. Do the same search on Me.dium and you learn that today people care about his love child, pictures of his mistress, etc.

    The difference is real-time (what people are browsing now) vs. historical (what they browsed in the past). Social vs. Old School. Check it out and let us know your thoughts.

  2. Ugh, this is going to be more of the same, similar to how people have moaned and groaned about PageRank and even gone as far as to use redirects to fake PageRank within the link-buying sector. I think it is great that Microsoft is building out their capabilities to find better ways to rank pages, but do we really need another green bar in our browsers?

    As an SEO services company, we find it frustrating when a client measures success by PageRank. Frequently, clients forget about the only data point they should care about (volume) and are just slaves to the little green bar!

  3. The concept is interesting. I also leave my pages open all the time, but I guess this is part of the logic behind it: if you leave a page open, it is most likely because you think you might have to use it again later… therefore that page you are keeping open must be useful. Looking at my benchmark page in Google Analytics, I wouldn’t mind that system. 🙂

  4. I absolutely agree with what you’ve mentioned here.

    To me, I’m more concerned about returning visitors than the click-through rate…

  5. It’s mind-boggling how much information the search engines can track from our search usage. This is important for developing the next strategy for getting visitors to stay longer on our sites.

  6. I think you’ll find a combination of both PR and browsing behavior, especially as Google gathers data with more tools like Google Analytics. SEO as just a set of rules and best practices will fade, and creating content and usability will become king. There are so many good sites that are poorly laid out and under-utilized that a lot of the optimization will become focused on usability and conversions, and less on links and title tags.

  7. It is amazing to see that they are getting beyond just what PR can tell us. It seems like the time before people bounce back could be very easily manipulated, though.

  8. Sounds like MS may be on to a better mousetrap. Placing so much weight on link love alone is pretty myopic.

  9. Yes, it will be interesting to see if over the long run the results MSN displays are more pleasing to searchers and whether it helps them retain searchers more than Google or Yahoo.

    Even more interesting will be whether they can spin it to attract more searchers, as Google did with PageRank. And if they do try to spin it, will they be able to explain it very clearly to the public without creating Big Brother fears that they are watching where we are all surfing?

  10. Interesting concepts of BrowseRank developed by Microsoft. I’ve only heard fleeting discussion but never seen much written about it. I suppose it adds the usability factor, which is an often overlooked subject. I actually kinda like it, but MSN is so far behind Google it seems a bit too little, too late. I hope it can make some impact and help people improve their sites.