Hidden Path Recon
Fetches robots.txt (parses all Disallow paths and Sitemap references), recursively follows sitemap indexes, then probes 50+ common sensitive paths (/.env, /.git/config, /admin, /backup, /debug, /wp-admin, /phpinfo.php, /.aws/credentials). Reports what's accessible, redirected, or forbidden.
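The robots.txt parsing step described above can be sketched in Python (the bookmarklet itself runs as browser JavaScript; this is an illustrative analogue, and the function name `parse_robots` is an assumption, not the tool's actual code). A full parser would also track `User-agent` groups; this sketch collects every `Disallow` and `Sitemap` line regardless of agent:

```python
def parse_robots(text: str):
    """Extract Disallow paths and Sitemap URLs from robots.txt text.

    Simplified sketch: ignores User-agent grouping and collects
    every Disallow/Sitemap directive in the file.
    """
    disallows, sitemaps = [], []
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments
        if ":" not in line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "disallow" and value:
            disallows.append(value)
        elif field == "sitemap" and value:
            sitemaps.append(value)
    return disallows, sitemaps
```

Sitemap URLs returned here feed the recursive sitemap-index walk; Disallow paths are added to the probe list.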
Drag to your bookmarks bar:
🤖 Recon Paths
Runs on any website — all processing in your browser.
Install the bookmarklet, then use it on any website
Hidden Path Recon
Every web server has paths beyond what's linked from the homepage. Robots.txt reveals what organizations want search engines to skip, sitemaps show site structure, and common path probing reveals accidentally exposed configuration files, admin panels, backups, and debug endpoints.
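The three-way report (accessible, redirected, forbidden) boils down to bucketing HTTP status codes. A minimal sketch of that classification, assuming the tool treats 401 like 403 and everything else as absent (the function name and exact buckets are assumptions for illustration):

```python
def classify(status: int) -> str:
    """Bucket an HTTP status code into the report categories."""
    if 200 <= status < 300:
        return "accessible"
    if status in (301, 302, 303, 307, 308):
        return "redirected"
    if status in (401, 403):
        return "forbidden"  # resource exists but is gated
    return "absent"         # 404, 410, 5xx: treat as not found
```

Note that "forbidden" is still a positive finding: the server confirmed the resource exists.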
Critical Paths
Exposed .env files leak database credentials and API keys. An accessible .git/config reveals the repository URL (and often enables full repository reconstruction). phpinfo.php exposes server configuration. /server-status (Apache mod_status) reveals active connections and in-flight request URLs.
🤖 Hidden Path Recon — FAQ
Does this send many requests to the target?
It sends HEAD requests to ~50 paths. This is lightweight reconnaissance — similar to what a search engine crawler does.
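The HEAD-probing loop can be sketched in Python (again, an analogue of the browser-side logic, not the bookmarklet's actual code; `COMMON_PATHS` here is a small excerpt of the ~50-path list):

```python
from urllib.parse import urljoin
from urllib.request import Request, urlopen
from urllib.error import HTTPError

# Excerpt of the probe list; the real tool checks ~50 paths.
COMMON_PATHS = ["/.env", "/.git/config", "/admin", "/backup", "/wp-admin"]

def probe_urls(base: str, paths=COMMON_PATHS):
    """Join each candidate path onto the site's base URL."""
    return [urljoin(base, p) for p in paths]

def head_status(url: str, timeout: float = 5.0) -> int:
    """Issue one HEAD request and return the HTTP status code."""
    try:
        with urlopen(Request(url, method="HEAD"), timeout=timeout) as resp:
            return resp.status
    except HTTPError as err:
        return err.code  # urllib raises on 4xx/5xx; the code is the answer
```

HEAD requests fetch headers only, which is why the scan stays lightweight compared to downloading each resource.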
What does 403 Forbidden mean?
The path exists but access is denied. This confirms the resource is present, which is itself useful intelligence.
Can it find admin panels?
It probes common admin paths (/admin, /wp-admin, /dashboard, /panel). Custom or non-standard admin paths won't be found.
Is robots.txt a security mechanism?
No — robots.txt is advisory for search engines. It has no access control. Disallowed paths are publicly readable.
Should I probe paths I don't own?
This tool is for authorized security testing. Probing sites you don't own may violate terms of service or laws in your jurisdiction.