At the beginning of this month, Parsero v0.71 was featured by ToolsWatch Hacker Arsenal on their blog. That is something I really appreciate...

Today, I would like to introduce Parsero v0.75. Before getting into the details, let me give a brief summary.

As described in the OWASP Testing Guide v4, Review Webserver Metafiles for Information Leakage (OTG-INFO-003), the robots.txt file can be a source of "information leakage of the web application's directory or folder path(s)".
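For example, a robots.txt like the following (the paths are invented for illustration) asks well-behaved crawlers to stay out of certain folders, but at the same time tells an attacker exactly which paths may be worth visiting:

    User-agent: *
    Disallow: /admin/
    Disallow: /backup/
    Disallow: /private/old-site/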

In order to extract that sensitive information from this file automatically, I developed Parsero.
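The core idea can be summarized in a few lines of Python 3. This is only a simplified sketch, not Parsero's actual code, and http://www.example.com is just a placeholder:

    import urllib.request

    def get_disallow_entries(url):
        """Fetch robots.txt from the given site and return its Disallow paths."""
        robots = urllib.request.urlopen(url + "/robots.txt").read().decode("utf-8", errors="ignore")
        entries = []
        for line in robots.splitlines():
            line = line.strip()
            if line.lower().startswith("disallow:"):
                path = line.split(":", 1)[1].strip()
                if path:
                    entries.append(path)
        return entries

    # Print the full URL of every Disallow entry so it can be visited or searched.
    for path in get_disallow_entries("http://www.example.com"):
        print("http://www.example.com" + path)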

What is new?

Several problems have been fixed in this version, which also brings three new features that I would like to talk about.

  • In the last version, Parsero was able to detect whether the content of the Disallow entries had been indexed by Bing by searching for them in this search engine. Now we can also check whether those indexed links are actually available or not (a sketch of this check appears after this list). Notice that Parsero only checks the links on the first Bing results page, which means the first 10 results are analyzed.


  • Parsero is now able to detect whether there are Disallow entries repeated in the robots.txt file, so that each one is checked only once to save time. The picture below shows a robots.txt file with the same links repeated.

And this is how Parsero detects them and checks each Disallow entry only once.


  • In the last version, Parsero downloaded the robots.txt file to the local machine in order to parse it. Now it parses the file on the fly, without saving anything to disk (see the sketch of this and the previous point after this list).
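Regarding the first point, the availability check boils down to requesting each link that Bing returns and looking at the HTTP status code. A minimal sketch of that idea (not Parsero's actual code, and the link list is just a placeholder) could look like this:

    import urllib.request
    import urllib.error

    def check_availability(links):
        """Request each indexed link and report its HTTP status code."""
        for link in links:
            try:
                response = urllib.request.urlopen(link, timeout=10)
                print(link, response.getcode())
            except urllib.error.HTTPError as e:
                print(link, e.code)           # the server answered, but with an error status
            except urllib.error.URLError as e:
                print(link, "unreachable:", e.reason)

    # Example: links that Bing returned for a Disallow entry (placeholders).
    check_availability(["http://www.example.com/admin/", "http://www.example.com/backup/"])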
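Regarding the second and third points, removing duplicated Disallow entries and parsing the file in memory instead of saving it to disk can be sketched like this (again a simplification with a placeholder URL, not the tool's real code):

    import urllib.request

    def unique_disallows(url):
        """Read robots.txt straight from the HTTP response (nothing written to disk)
        and return its Disallow paths with duplicates removed, order preserved."""
        with urllib.request.urlopen(url + "/robots.txt") as response:
            seen = []
            for raw in response:  # iterate over the body line by line, in memory
                line = raw.decode("utf-8", errors="ignore").strip()
                if line.lower().startswith("disallow:"):
                    path = line.split(":", 1)[1].strip()
                    if path and path not in seen:  # skip entries already collected
                        seen.append(path)
            return seen

    print(unique_disallows("http://www.example.com"))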

You can download Parsero here: https://github.com/behindthefirewalls/Parsero

More info here: http://www.behindthefirewalls.com/search/label/Parsero