URLies is a powerful, minimalistic passive recon tool that combines the strengths of tools like waybackurls and gauplus under one CLI interface. It is built with concurrency, speed, and modular recon in mind.
Made by me for now...
- Wayback Machine Recon (paged & concurrent)
- Search Engine scraping (Google-style)
- JS-based link extraction (headless recon)
- Fully customizable: `--depth`, output directory, and more
```bash
git clone https://github.com/black/urlies.git
cd urlies
go mod tidy
go build -o urlies main.go
./urlies -u https://example.com -m archive -o output/
```
```bash
./urlies -u https://target.com -m all --depth 10 -o output/
```
- `-u`: Target domain (required)
- `-m`: Mode (`archive`, `engine`, `headless`, or `all`)
- `-o`: Output directory
- `--depth`: Page depth for archive crawling (default: 20)
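For a feel of how a flag surface like this can be wired up in Go, here is a minimal sketch using the standard `flag` package; urlies itself uses urfave/cli/v2, so the parsing details differ, and the defaults shown are taken from the flag descriptions above.

```go
package main

import (
	"flag"
	"fmt"
	"os"
)

// options mirrors the documented CLI flags.
type options struct {
	target string // -u
	mode   string // -m
	outDir string // -o
	depth  int    // --depth
}

// parseOptions parses args (without the program name) into options.
// The flag package accepts both -depth and --depth.
func parseOptions(args []string) (options, error) {
	fs := flag.NewFlagSet("urlies", flag.ContinueOnError)
	var o options
	fs.StringVar(&o.target, "u", "", "target domain (required)")
	fs.StringVar(&o.mode, "m", "all", "mode: archive, engine, headless, or all")
	fs.StringVar(&o.outDir, "o", "output/", "output directory")
	fs.IntVar(&o.depth, "depth", 20, "page depth for archive crawling")
	if err := fs.Parse(args); err != nil {
		return o, err
	}
	if o.target == "" {
		return o, fmt.Errorf("-u is required")
	}
	return o, nil
}

func main() {
	o, err := parseOptions(os.Args[1:])
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Printf("recon %s in mode %s (depth %d) -> %s\n", o.target, o.mode, o.depth, o.outDir)
}
```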
Results are saved in:
```
output/
├── example.com-wayback.txt
├── example.com-engine.txt
└── example.com-headless.txt
```
- `archive/wayback.go`: Wayback Machine, paged API
- `engine/googlite.go`: Google scraping with regex
- `headless/jsfinder.go`: JS + DOM link detection
- `core/*`: Shared tools (client, helpers, output)
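Since the engine module is described as regex-based scraping, here is a minimal sketch of pulling links out of raw HTML with a regex; the exact pattern urlies uses is an assumption, this just illustrates the technique.

```go
package main

import (
	"fmt"
	"regexp"
)

// hrefRe matches absolute http(s) URLs inside href attributes,
// with or without surrounding quotes.
var hrefRe = regexp.MustCompile(`href=["']?(https?://[^"'\s>]+)`)

// extractLinks returns every href target found in the given HTML.
func extractLinks(html string) []string {
	var links []string
	for _, m := range hrefRe.FindAllStringSubmatch(html, -1) {
		links = append(links, m[1])
	}
	return links
}

func main() {
	page := `<a href="https://example.com/a">a</a> <a href='https://example.com/b'>b</a>`
	fmt.Println(extractLinks(page))
}
```

A regex extractor like this is fast and dependency-free, which is why it suits a lightweight scraping path; the headless module handles the cases (JS-rendered links) that a regex cannot see.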
If you're new to Go or have never installed a CLI tool before, don't worry, just follow this and you'll be running `urlies` in no time.
```bash
go version
```
You should see something like `go version go1.20.6 linux/amd64`.
If not, install Go first.
```bash
git clone https://github.com/black/urlies.git
cd urlies
go mod tidy
```
This fetches the required packages:
- `goquery`: DOM parsing
- `urfave/cli/v2`: CLI flag parsing
- `cascadia`: CSS selectors
- `golang.org/x/net`: HTTP extensions
```bash
go build -o urlies main.go
./urlies -u https://example.com -m all --depth 15 -o output/
```
MIT License. See LICENSE for details.
Don't worry, I'll take care of it. If you have any questions, contact me on Telegram.