Robots.txt example:

User-agent: *
Disallow: /forbidden/
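
As a quick sanity check of what these rules mean, Python's standard urllib.robotparser shows what a compliant crawler is allowed to fetch (a minimal sketch; the robots.txt lines are pasted inline rather than fetched from the server):

from urllib.robotparser import RobotFileParser

# Parse the robots.txt shown above.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /forbidden/",
])

print(rp.can_fetch("msnbot", "/legitpage1.php"))  # True  - fine to spider
print(rp.can_fetch("msnbot", "/forbidden/"))      # False - a compliant crawler skips it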

Log example:

65.55.106.209 - - [31/Aug/2009:16:53:15 -0600] "GET /robots.txt HTTP/1.1" 200 294 "-" "msnbot/2.0b (+http://search.msn.com/msnbot.htm)"
65.55.106.209 - - [31/Aug/2009:16:54:03 -0600] "GET /legitpage1.php HTTP/1.0" 200 14161 "-" "msnbot/2.0b (+http://search.msn.com/msnbot.htm)"
65.55.106.162 - - [31/Aug/2009:18:20:12 -0600] "GET /robots.txt HTTP/1.1" 200 294 "-" "msnbot/2.0b (+http://search.msn.com/msnbot.htm)"
65.55.106.162 - - [31/Aug/2009:18:21:03 -0600] "GET /forbidden/ HTTP/1.0" 403 3893 "-" "msnbot/2.0b (+http://search.msn.com/msnbot.htm)"
65.55.106.187 - - [31/Aug/2009:18:46:27 -0600] "GET /legitdir1/ HTTP/1.0" 200 9835 "-" "msnbot/2.0b (+http://search.msn.com/msnbot.htm)"

65.55.51.70 - - [02/Sep/2009:18:18:00 -0600] "GET /robots.txt HTTP/1.1" 200 179 "-" "msnbot/2.0b (+http://search.msn.com/msnbot.htm)"
65.55.207.95 - - [02/Sep/2009:19:06:32 -0600] "GET /robots.txt HTTP/1.1" 200 294 "-" "msnbot/2.0b (+http://search.msn.com/msnbot.htm)"
65.55.207.95 - - [02/Sep/2009:19:07:34 -0600] "GET /legitpage2.php HTTP/1.0" 200 6494 "-" "msnbot/2.0b (+http://search.msn.com/msnbot.htm)"
65.55.51.70 - - [02/Sep/2009:19:27:45 -0600] "GET /robots.txt HTTP/1.1" 200 179 "-" "msnbot/2.0b (+http://search.msn.com/msnbot.htm)"
65.55.106.138 - - [02/Sep/2009:20:24:36 -0600] "GET /forbidden/ HTTP/1.0" 403 3893 "-" "msnbot/2.0b (+http://search.msn.com/msnbot.htm)"
65.55.207.120 - - [02/Sep/2009:20:33:05 -0600] "GET /legitpage3.phpforbidden/ HTTP/1.0" 404 5514 "-" "msnbot/2.0b (+http://search.msn.com/msnbot.htm)"

For clarity, the log entries above have been modified as follows:

forbidden = honeypot location (should NOT be spidered)
legitpage# = legitimate web page (should be spidered)
legitdir# = legitimate directory/folder (should be spidered)
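
To pick the honeypot hits out of logs like these, a short Python sketch along the following lines works against the standard Apache combined log format (the access.log filename and the /forbidden/ prefix are assumptions taken from the examples above):

import re

# Apache combined log: IP, identd, user, [timestamp], "request", status, bytes, "referer", "user-agent"
LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ '
    r'"[^"]*" "(?P<agent>[^"]*)"'
)

HONEYPOT_PREFIX = "/forbidden/"  # the Disallow'd honeypot path from robots.txt

def flag_honeypot_hits(log_lines):
    """Yield (ip, time, path, agent) for every request into the honeypot directory."""
    for line in log_lines:
        m = LOG_LINE.match(line)
        if not m:
            continue
        if m.group("path").startswith(HONEYPOT_PREFIX):
            yield m.group("ip"), m.group("time"), m.group("path"), m.group("agent")

if __name__ == "__main__":
    with open("access.log") as fh:  # hypothetical log file name
        for ip, when, path, agent in flag_honeypot_hits(fh):
            print(f"{ip} hit honeypot {path} at {when} claiming '{agent}'")

Run against the entries above, this would flag only the 65.55.106.162 and 65.55.106.138 requests for /forbidden/, since those are the only paths matching the Disallow'd prefix.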