
Category: Crawling & Robots


The Fundamentals of Crawling for SEO

What is crawling? Crawling is the process by which Google deploys a web crawler, or internet bot, to publicly available web pages to “crawl,” or read, each page. When this happens, it downloads all the text, images, and videos found on the page. As crawlers, or spiders, visit these websites, they utilize the links on those …

The Fundamentals of Crawling for SEO Read More »
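The link-following behavior described above can be sketched in a few lines of Python: a crawler downloads a page's HTML, extracts the links, and queues them for future visits. This is a minimal illustration using the standard library, not Googlebot's actual implementation, and the HTML and paths are made-up examples:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag, mimicking how a crawler discovers new URLs."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A crawler would fetch this HTML over the network; here it is hardcoded.
html = '<p>Read <a href="/fundamentals">our guide</a> and <a href="/robots">robots.txt tips</a>.</p>'
parser = LinkExtractor()
parser.feed(html)
print(parser.links)  # → ['/fundamentals', '/robots']
```

Each discovered URL would then be added to the crawl queue, which is how crawling one page leads to more pages being crawled.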


Does Google Crawl URLs In Structured Data?

Getting your links discovered, crawled, and indexed by Google is important to your SEO, and can ultimately be advantageous: once one URL gets crawled, more pages on your website can get crawled from it. In an episode of SEO Office Hours, John Mueller spoke about whether or not Google used links …

Does Google Crawl URLs In Structured Data? Read More »
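For context, the structured data in question is typically JSON-LD embedded in a page, which can contain URLs in properties like `url` and `sameAs`. A minimal sketch of such markup (all values here are made-up examples, not from the episode):

```json
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Does Google Crawl URLs In Structured Data?",
  "url": "https://www.example.com/articles/structured-data-crawling",
  "sameAs": ["https://www.example.com/about"]
}
```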

Six Common Robots.txt Issues & How To Fix Them

Robots.txt files are a wonderful tool that you can use to instruct search engine crawlers how you want them to crawl your website. They are used mainly to avoid overloading your site with requests; robots.txt is not a mechanism for keeping a web page out of Google. To keep a web page out of Google, …

Six Common Robots.txt Issues & How To Fix Them Read More »
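As a reference point for those crawl instructions, here is a minimal, well-formed robots.txt (the paths and domain are hypothetical):

```txt
# Allow all crawlers, but keep them out of internal search results
User-agent: *
Disallow: /search/

# Rules for a specific crawler
User-agent: Googlebot
Disallow: /staging/

Sitemap: https://www.example.com/sitemap.xml
```

Note that `Disallow` only blocks crawling; a disallowed URL can still appear in Google if it is linked from elsewhere.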

Are Big Robots.txt Files A Problem For Google?

In case you weren’t in the know, robots.txt files may be a small part of SEO, but they’re still important for your website. They tell search engines which pages to access and index on your website and which pages not to. Not only that, robots.txt files are great for keeping search engines from accessing …

Are Big Robots.txt Files A Problem For Google? Read More »

The Past, Present, And Future Of Robots.txt

In this Google Search Central episode of Search Off the Record, Gary, Lizzi, and Martin from the Google Search team are joined by special guest David Price, a Product Counsel for Google. They discuss human-initiated crawl requests vs. bot-initiated crawl requests, the robots.txt file, and more!

New Crawl Stats Report Launched For Google Search Console

A new crawl stats report has been launched by Google for Search Console. The company says this new report will make it easier for Search Console users to find issues with Google crawling. Here is what’s new: total number of requests grouped by response code, crawled file type, crawl purpose, and Googlebot type. Detailed information …

New Crawl Stats Report Launched For Google Search Console Read More »

Robots.txt Tester Is Added To Bing Webmaster Tools

A new feature has been added to the new Bing Webmaster Tools – a robots.txt tester. You can take the tester for a ride over here. This feature was added by Bing back in 2009, but was dropped. Now, it has officially returned. What does the Tester do? According to Bing, it will analyze Webmaster’s …

Robots.txt Tester Is Added To Bing Webmaster Tools Read More »

Server Migrations Are “Uneventful For Google Systems”

According to Webmaster Trends Analyst John Mueller, during the September 24 edition of #AskGoogleWebmasters, server migrations “are pretty uneventful for Google systems” so long as everything remains the same, even though Googlebot will readjust how often it crawls your site. Twitter user @JSAdvertiseMint asked: “Our site is changing servers and I’ve had this go disastrously in …

Server Migrations Are “Uneventful For Google Systems” Read More »

Google Notifying Webmasters To Remove Noindex From Robots.txt Files

Now that Google is completely removing support for the noindex directive in robots.txt files, it is sending out notifications to sites that have such directives. On the morning of July 29, a number of folks in the SEO community began to receive notifications from Google Search Console with the subject line, “remove “noindex” statements from …

Google Notifying Webmasters To Remove Noindex From Robots.txt Files Read More »
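With noindex unsupported in robots.txt, Google's documented alternatives are a robots meta tag in the page's HTML or an X-Robots-Tag HTTP response header. A minimal sketch of the meta tag option:

```html
<!-- In the page's <head>: blocks indexing without blocking crawling -->
<meta name="robots" content="noindex">
```

For non-HTML resources, the equivalent is an `X-Robots-Tag: noindex` HTTP response header. In either case, the page must remain crawlable (not disallowed in robots.txt) for Googlebot to see the directive at all.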

Google Adds JavaScript SEO Basics To Its Search Developer’s Guide

On July 18, Google added a JavaScript SEO basics section to its Search developer’s guide. It includes general descriptions of how Google processes JavaScript as well as some best practices. Not only is there an overview of how Googlebot crawls, renders, and indexes JavaScript web apps, the guide also provides basic tips, which are accompanied by links to …

Google Adds JavaScript SEO Basics To Its Search Developer’s Guide Read More »