Fabrice Canel, the principal program manager for Bing Webmaster Tools, has published an update on his team’s efforts to improve the efficiency of bingbot, the company’s web crawler.

Canel gave a talk at SMX Advanced back in June in which he announced an 18-month effort to improve bingbot, and this update is the follow-up to that announcement.

Canel said in a blog post that numerous improvements were made based on the feedback provided at SMX Advanced. He said the team is “continuing to improve” the crawler and will share what it has done in a new “bingbot series” on the Bing Webmaster Blog.

In his first post, Canel outlined bingbot’s goal: to use an algorithm to determine “which sites to crawl, how often, and how many pages to fetch from each site.” To make sure a site’s servers aren’t overloaded by the crawler, bingbot limits its “crawl footprint” on the site while keeping the content in Bing’s index as fresh as possible.
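As a rough illustration of that idea (a sketch, not Bing’s actual algorithm), a crawl scheduler might cap how many pages it fetches from a site per hour and spend that budget on the stalest, fastest-changing pages first. The budget value and the `change_rate` field below are invented for the example:

```python
import time

# Illustrative per-site crawl-footprint budget; this number is invented
# for the example, not a real bingbot parameter.
MAX_FETCHES_PER_HOUR = 100

def next_page_to_crawl(pages, fetches_used):
    """Pick the stalest, fastest-changing page, but stop once the
    site's hourly crawl-footprint budget is spent."""
    if fetches_used >= MAX_FETCHES_PER_HOUR:
        return None  # budget exhausted; wait for the next window
    # Staleness = seconds since the last fetch, weighted by how often
    # the page has been observed to change.
    return max(pages, key=lambda p: (time.time() - p["last_fetch"]) * p["change_rate"])

pages = [
    {"url": "https://example.com/news", "last_fetch": time.time() - 3600, "change_rate": 0.9},
    {"url": "https://example.com/about", "last_fetch": time.time() - 86400, "change_rate": 0.01},
]
print(next_page_to_crawl(pages, fetches_used=12)["url"])  # -> .../news
```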

Bing is still working to strike this “crawl efficiency” balance at scale. Canel said, “We’ve heard concerns that bingbot doesn’t crawl frequently enough and their content isn’t fresh within the index; while at the same time we’ve heard that bingbot crawls too often causing constraints on the websites resources.”
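Site owners do have a lever here: Bing has long documented support for the Crawl-delay directive in robots.txt, which asks the crawler to slow down. As a small sketch, Python’s standard urllib.robotparser can show what a site is requesting of bingbot (example.com is just a placeholder domain):

```python
import urllib.robotparser

# Read a site's robots.txt and report the Crawl-delay it requests
# of bingbot.
rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()

delay = rp.crawl_delay("bingbot")  # None if no Crawl-delay is set
print(f"bingbot asked to wait {delay or 0} seconds between requests")
```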

The takeaway: if you add new content to your site and bingbot isn’t able to see it, it will not rank. Put simply, searchers using Bing won’t find your new content.
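One way to help bingbot discover fresh content quickly is to keep a sitemap current and ping Bing when it changes. A minimal sketch, assuming Bing’s standard sitemap-ping endpoint and a placeholder sitemap URL:

```python
import urllib.parse
import urllib.request

# The sitemap URL is a placeholder; the ping endpoint is Bing's
# documented sitemap-ping mechanism.
sitemap = "https://example.com/sitemap.xml"
ping_url = "https://www.bing.com/ping?sitemap=" + urllib.parse.quote_plus(sitemap)

with urllib.request.urlopen(ping_url) as resp:
    print("Bing ping responded with HTTP", resp.status)
```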
