Retrieve a single url in headless mode #488
-
I have a list of URLs stored in a file. Can katana retrieve the related JS and static files for a single URL in headless mode, follow redirects if any, and not follow http links within the retrieved web page? I am looking for a simple command, with a little help please.
Replies: 2 comments
-
@Sim4n6 thanks for your question! I think I've developed a command to get you what you want. The command is:
katana -list urls.txt -d 2 -jc -hl -sr -mr '(.*)\.js'
Where:

-list urls.txt: uses your URL list; the format here shouldn't matter
-d 2: sets the depth to 2, so you only crawl pages directly reachable from the URLs you sent
-jc: turns JavaScript crawling on (optional), to crawl within those JS files
-hl: runs in headless mode
-sr: saves responses (optional)
-mr '(.*)\.js': matches a regex so only files ending in .js are crawled; you could modify this regex as needed

I hope this helps - and even if it isn't perfect, it at least gives you some idea of what you can do!
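Since the original question asked about a single URL rather than a list, a minimal variant of the same command is sketched below. It assumes katana's -u flag for specifying a target directly (katana accepts -u as an alternative to -list); https://example.com is a placeholder, not a real target from the thread.

```shell
# Sketch: crawl one URL in headless mode (-hl), with JS crawling (-jc),
# saving responses (-sr), depth 2 (-d 2), and keeping only .js matches (-mr).
# Replace https://example.com with your actual target URL.
katana -u https://example.com -d 2 -jc -hl -sr -mr '(.*)\.js'
```

The regex '(.*)\.js' matches any URL containing ".js"; anchor it with \.js$ if you want to be strict about the extension appearing at the end of the path.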