Github: github.com/Spix0r/fback
#CyberSecurity #bugbountyTools #bugbounty #Recon #reconnaissance #bugbountytips
5/5
echo https://example[.]com/files/config.php | fback -y 2020-2024 -m 1-12
Example Output:
config.php.bak
config_backup.php
config_2024.php
files_config.php
4/5
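From there, a common next step (not part of fback itself) is to feed the generated candidates into a fuzzer such as ffuf. A minimal sketch, assuming fback prints the candidate filenames to stdout as in the example above:

echo https://example[.]com/files/config.php | fback -y 2020-2024 -m 1-12 > backups.txt
ffuf -u https://example[.]com/files/FUZZ -w backups.txt -mc 200

Any 200 hit on a backup filename is worth downloading and inspecting for credentials.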
You know those static websites, especially WordPress sites, where you encounter paths like:
example[.]com/files/config.php
But you don't have access to config.php, so now what? What should you test here?
3/5
It’s a tool that generates target‑specific wordlists to fuzz for backup files—think config.php.bak, config_backup.php, etc. It’s perfect for hunting juicy, forgotten backups on static or WordPress sites.
2/5
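To picture what "target-specific" means, here is a rough shell illustration of the kind of permutations such a wordlist contains (just an illustration, not fback's actual generation logic):

for suffix in .bak .old .zip .tar.gz; do echo "config.php${suffix}"; done
for year in 2020 2021 2022 2023 2024; do echo "config_${year}.php"; done

The real tool builds these variants from the path, filename, and the year/month ranges you pass in.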
github.com/Spix0r/cloud...
#CyberSecurity #BugBounty #BugBountyTools #pentest #infosec #Certificate #bugbountytips #reconnaissance #Recon
[Passive Search] If you lack the necessary resources, you can use the kaeferjaeger provider to run a passive search instead. 2/3
Because it's possible that the site you are investigating once listed numerous paths in its robots.txt file that were removed in later updates. Despite their removal, those paths, files, and parameters may still be accessible.
3/3
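A quick way to verify that is to request each recovered path and check the status codes. A minimal sketch, assuming the old paths are saved one per line (starting with /) in old_paths.txt:

while read -r p; do
  printf '%s %s\n' "$(curl -s -o /dev/null -w '%{http_code}' "https://example[.]com$p")" "$p"
done < old_paths.txt

Anything that still returns 200 (or an interesting 401/403) goes on the list for deeper testing.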
I’ve created a tool called RoboFinder, which allows you to locate historical robots.txt files.
RoboFinder on GitHub: github.com/Spix0r/robof...
2/3
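For context, you can also pull historical robots.txt snapshots by hand from the Wayback Machine CDX API (just an illustration, not necessarily how RoboFinder does it):

curl -s "https://web.archive.org/cdx/search/cdx?url=example.com/robots.txt&filter=statuscode:200&collapse=digest"

Each result line includes a timestamp; fetch the raw snapshot for that timestamp with:

curl -s "https://web.archive.org/web/<TIMESTAMP>id_/https://example.com/robots.txt"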