Spiders and bots are constantly crawling your site and indexing its content as best they can. But how do you test your crawlability to make sure the information you're providing actually gets indexed properly?
Enter SpiderTest and SEOBook.com’s Spider Test Tool.
Both crawl your site, but each reports back a different set of info to help you improve your crawlability. SpiderTest gives a bulleted list of small technical fixes that address the issues Googlebot has crawling your site, along with some keyword statistics for the page you point it at. SEOBook.com's Spider Test Tool, on the other hand, gives a blunter and less helpful review of the page, though it does list out some important basics: the links on the page, whether search engines have the page cached, and some raw, unrefined keyword info.
I would say SpiderTest is the more accurate of the two for seeing what Googlebot sees, while Spider Test Tool is more like what a crappier bot would see. But they are both worth a try!
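If you're curious what "what a bot sees" boils down to, here's a rough sketch of the kind of raw data these tools pull from a page: the links and some crude keyword counts. This is just an illustration, not how either tool (or Googlebot) actually works. It assumes the requests and beautifulsoup4 packages, and https://example.com is a placeholder URL.

```python
# A minimal sketch of what a basic spider-test tool might report: fetch a
# page, pull out the links, and tally rough keyword counts. Nowhere near
# what Googlebot actually does -- just an illustration of the raw data.
from collections import Counter

import requests
from bs4 import BeautifulSoup

url = "https://example.com"  # placeholder: the page you want to check
html = requests.get(url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

# Links on the page, like Spider Test Tool lists out.
links = [a.get("href") for a in soup.find_all("a") if a.get("href")]
print(f"Found {len(links)} links:")
for href in links:
    print("  ", href)

# Crude keyword stats from the visible text, similar to the unrefined
# keyword info both tools report.
words = [w.lower() for w in soup.get_text().split() if w.isalpha()]
print("Top 10 words:", Counter(words).most_common(10))
```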