Robots.txt – There’s gold in them thar files

Posted in SecConf

Abstract:

Web penetration testing has long benefited from sites that publish a ready-made list of sensitive areas they don't want crawled: robots.txt. I pulled and analyzed the robots.txt files from numerous sites to determine the most common user-agents and disallowed locations. From the results, I have derived an improved listing of directories to use with tools like DirBuster, and for better reconnaissance in general.
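As a rough illustration of the kind of collection the abstract describes, the sketch below fetches robots.txt from a handful of hosts and tallies the most common User-agent and Disallow entries. The hostnames are placeholders and the script is not the author's actual tooling; it only shows the general approach of turning aggregated Disallow paths into a candidate wordlist.

```python
# Minimal sketch: fetch robots.txt from a few sites and tally the most
# common User-agent and Disallow values. Hostnames are placeholders.
from collections import Counter
from urllib.request import urlopen
from urllib.error import URLError

hosts = ["example.com", "example.org"]  # replace with the sites to survey

user_agents = Counter()
disallows = Counter()

for host in hosts:
    try:
        with urlopen(f"https://{host}/robots.txt", timeout=10) as resp:
            body = resp.read().decode("utf-8", errors="replace")
    except (URLError, OSError):
        continue  # skip hosts with no reachable robots.txt

    for line in body.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments
        if ":" not in line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "user-agent":
            user_agents[value] += 1
        elif field == "disallow" and value:
            disallows[value] += 1

print("Most common user-agents:", user_agents.most_common(10))
print("Most common disallowed paths:", disallows.most_common(10))
```

The aggregated Disallow paths can then be sorted by frequency and written out as a directory wordlist for brute-forcing tools such as DirBuster.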



Video:

Source: BSidesPGH
