Monday, October 12, 2009

Understanding robots.txt

Matt Cutts, head of Google's Webspam team, presents and discusses how Google handles the robots.txt file. Watch the video below, follow and share comments on Matt's blog, or join the discussion on YouTube.


  1. Robots exclusion standard (Wikipedia)
  2. Block or remove pages using a robots.txt file (Google Webmaster Tools)
  3. Robot Control Code Generation Tool
  4. Search Engine Optimization 101 (Nettuts)
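For readers new to the standard, a minimal robots.txt might look like the sketch below. The user agent name and paths are purely illustrative, not taken from the video:

```
# Illustrative example: all crawlers may index the site except /private/,
# while one hypothetical crawler ("BadBot") is blocked entirely.
User-agent: *
Disallow: /private/

User-agent: BadBot
Disallow: /

# Optionally point crawlers at the sitemap
Sitemap: https://www.example.com/sitemap.xml
```

The file lives at the site root (e.g. https://www.example.com/robots.txt); crawlers that honor the robots exclusion standard fetch it before crawling and skip the disallowed paths.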
