Mind explaining how?
Yes, please elaborate, I can use this info.
Sure. With Apache, you can use an .htaccess file as an added layer of protection. I suggest combining password and IP-based protection. If you will only access /admin from your local subnet, then only allow access from that subnet. If you are connecting to a remote server, but from a fixed IP address, only allow access from that IP address.
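To make that concrete, here's a minimal .htaccess sketch using Apache 2.4 syntax — the password file path and the IP range are placeholders you'd swap for your own:

```
# Require BOTH a valid login and a trusted source address
AuthType Basic
AuthName "Admin area"
AuthUserFile /etc/apache2/.htpasswd   # created with: htpasswd -c /etc/apache2/.htpasswd youruser
<RequireAll>
    Require valid-user
    Require ip 192.168.1.0/24         # your local subnet, or a single fixed IP
</RequireAll>
```

Drop that in /admin/.htaccess (and make sure AllowOverride permits AuthConfig for that directory). Anyone outside the listed range is refused before they even see the password prompt.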
You can do something similar with IIS and Windows-based authentication.
I'm confused, Rob, please explain. So do I really need robots.txt?
Well, this is my point. The Computer Hope article explains it quite well. You tell spiders not to index parts of the web tree that would be pointless for them to access - e.g. recursive directories (not something you're likely to encounter for a while) or any thread on YaBB that contains a post by Mac...
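For example, a minimal robots.txt sketch - the paths here are just illustrations, not anything from your actual site:

```
# Applies to all well-behaved crawlers
User-agent: *
Disallow: /admin/
Disallow: /cgi-bin/
```

Keep in mind it's purely advisory: polite spiders honor it, but it's not access control - which is why the .htaccess protection above still matters for /admin.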