User-agent: Googlebot
Disallow: /nospam
Disallow: /nospam/
Disallow: /nomirror
Disallow: /nomirror/
Disallow: /admin/admin.php
Disallow: /submit
Allow: /users
Allow: /cgi/users.pl
Allow: /comments
Allow: /cgi/comments.pl
Allow: /cgi/fark/comments.pl
Allow: /cgi/farkit.pl
Disallow: /cgi/

User-agent: Mediapartners-Google
Disallow: /nospam
Disallow: /nospam/
Disallow: /nomirror
Disallow: /nomirror/
Disallow: /admin/admin.php
Disallow: /submit
Allow: /users
Allow: /cgi/users.pl
Allow: /comments
Allow: /cgi/comments.pl
Allow: /cgi/fark/comments.pl
Allow: /cgi/farkit.pl
Disallow: /cgi/

User-agent: *
Crawl-delay: 1
Disallow: /nospam
Disallow: /nospam/
Disallow: /nomirror
Disallow: /nomirror/
Disallow: /admin/admin.php
Disallow: /submit
Disallow: /users
Allow: /comments
Allow: /cgi/comments.pl
Allow: /cgi/fark/comments.pl
Disallow: /cgi/

# $Id: robots.txt 16280 2013-03-10 04:23:29Z mandrews $
#
# IMPORTANT NOTE:
# User profiles have a meta tag on the page telling search engines NOT to
# index them. But to read that meta tag, the engines have to be able to crawl
# the page. A Disallow means "don't crawl", NOT "don't index" -- if another
# site links to a URL on our Disallow list, search engines may still index it
# anyway. So, counterintuitively, the reason we Allow Googlebot to crawl user
# profiles is so that they WON'T index them. Our intent is that profiles NOT
# appear in search engines.
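#
# For reference, the noindex signal the note above relies on looks roughly
# like the snippet below. The exact markup is illustrative, not copied from
# our profile templates:
#
#   <meta name="robots" content="noindex">
#
# The same effect is available for non-HTML responses via an HTTP header:
#
#   X-Robots-Tag: noindex
#
# Either way, the crawler must be permitted to fetch the page (hence the
# Allow rules above) or it never sees the noindex directive at all.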