Fuzzing / Wordlists
Param Bruteforce - https://twitter.com/HusseiN98D/status/1166759438503620610 - Arjun
Check Status Codes - https://github.com/Sy3Omda/dotfiles/blob/master/fetcher.sh
Robots disallowed - https://github.com/danielmiessler/RobotsDisallowed
Content Discovery - https://twitter.com/Alra3ees/status/1208502084246671366 (also downloaded a local copy)
Dirbuster - When you're brute forcing for endpoints, don't forget to add extensions. You can also use this method to discover backup files. Here's a command I use frequently:
dirsearch -e php,asp,aspx,jsp,py,txt,conf,config,bak,backup,swp,old,db,sql -u - https://twitter.com/i/status/1221792235215151104
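The dirsearch `-e` flag above works by pairing every wordlist entry with each extension. A minimal sketch of that expansion, useful when a tool lacks such an option (the extension list mirrors the command above; the sample words "admin" and "config" are illustrative placeholders, not from any specific wordlist):

```python
# Expand a base wordlist with common extensions, the way dirsearch's
# -e flag does. Each word is emitted bare first, then once per extension.
EXTENSIONS = ["php", "asp", "aspx", "jsp", "py", "txt", "conf", "config",
              "bak", "backup", "swp", "old", "db", "sql"]

def expand_wordlist(words):
    """Yield each word as-is, then every word.ext combination."""
    for word in words:
        yield word
        for ext in EXTENSIONS:
            yield f"{word}.{ext}"

if __name__ == "__main__":
    for entry in expand_wordlist(["admin", "config"]):
        print(entry)
```

Feed the output to any directory brute forcer that takes a plain wordlist; the bare word plus backup-style extensions (`.bak`, `.old`, `.swp`) is what surfaces forgotten backup files.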
PathBrute - https://github.com/milo2012/pathbrute
Status codes - hakluke/hakrawler - gofetch, statusparser
What are your normal testing steps when you see a 401? - https://twitter.com/nomanAli181/status/1146411693590736896
https://github.com/deibit/cansina - web content discovery
New dirs to bruteforce - https://twitter.com/nullenc0de/status/1249804904790732802
Jhaddix -
Tools - Fast web fuzzer written in Go - https://github.com/ffuf/ffuf
Thread related - https://twitter.com/search?q=FFuF&src=typed_query
Any tool to dedupe a list of URLs according to their parameters? I.e. keep only one URL if it appears several times with the same params, regardless of their values - https://twitter.com/gwendallecoguic/status/1207435306410168322
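The core of the dedupe question above is keying each URL on its parameter *names* rather than values. A minimal sketch (purpose-built tools for this exist too; this just shows the idea, and the example hosts are hypothetical):

```python
# Deduplicate URLs: keep only the first URL seen per
# (scheme, host, path, sorted set of parameter names),
# ignoring the parameter values entirely.
from urllib.parse import urlsplit, parse_qsl

def dedupe_by_params(urls):
    seen = set()
    kept = []
    for url in urls:
        parts = urlsplit(url)
        # keep_blank_values=True so "?id=" still counts as param "id"
        names = tuple(sorted(k for k, _ in
                             parse_qsl(parts.query, keep_blank_values=True)))
        key = (parts.scheme, parts.netloc, parts.path, names)
        if key not in seen:
            seen.add(key)
            kept.append(url)
    return kept
```

Sorting the names means `?a=1&b=2` and `?b=9&a=8` collapse to one entry, while a different path or an extra parameter keeps the URL in the list.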
wordlist by random robbie - https://gist.github.com/random-robbie/0f9d24a7b3c7268ee0c1ecdbe280611b
Subdomain bruteforce list - https://twitter.com/Alra3ees/status/1068079409117188096
Interesting file extensions to look for: - https://twitter.com/s0md3v/status/1271241942576185344
Check out subs_all.txt here - https://drive.google.com/file/d/12nABC1GUL7lBsPuzC0pWJrPRzHMHqe8X/view?usp=sharing
Tool - wordlistgen
Huge_DIR_wordlist - https://github.com/emadshanab/Huge_DIR_wordlist
Exploiting: