Forum Thread: How to Use Curl for Web Crawling?

Hello, recently I started looking for a way to write a program that searches a website/forum and saves (downloads) webpages based on a keyword. For example, I would like to search all of Null Byte for posts that contain a keyword (for example, "wifi hacking") and then download any page that has that keyword. From my Google search I found out about curl; a user said that it can do this, but I didn't manage to find a way to do it. Could someone point me in the right direction to do this with curl, or any other way? Thanks in advance :)

3 Responses

Hmm... I asked the same question on Stack Overflow. Didn't get anything about cURL yet...

From what I've found in relation to your post, it seems like 'wget' would be a better executable for this use case. To the best of my understanding, cURL is best suited for single requests: getting (or posting) pretty much whatever you want to/from the server you specify, without having to use your web browser as the intermediary. I was personally looking for an easier way to get all the resources (pictures) from a specific URI, and it seems that 'wget' makes this specific task of gathering multiple resources much more convenient. I'm pretty certain there's a variety of ways to do this with cURL, but I'm doing this to save time, ergo 'wget'. Hope that helps.
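To make that concrete, here's a rough sketch of the two-step approach: recursively mirror pages with wget, then throw away any page that doesn't mention the keyword. The URL, crawl depth, and keyword below are placeholders (not Null Byte specifics), and the flags assume GNU wget and GNU grep; adapt them to the actual site and respect its robots.txt.

```shell
#!/bin/sh
# Sketch: crawl a site with wget, then keep only pages containing a keyword.
# The URL, depth, and keyword used in the usage lines are placeholders.

# Step 1: recursively download HTML pages into a directory.
# --no-parent stops wget from climbing above the start URL;
# -A limits downloads to HTML-like files.
crawl_site() {
    wget --recursive --level=2 --no-parent -A 'html,htm' \
         --directory-prefix="$2" "$1"
}

# Step 2: delete every downloaded page that does NOT mention the keyword.
# grep -L lists files with no match; -r recurses, -i ignores case.
keep_matching() {
    grep -riL --include='*.html' -- "$1" "$2" | while read -r f; do
        rm -- "$f"
    done
}

# Usage (commented out so the sketch is safe to paste and adapt):
# crawl_site "https://null-byte.wonderhowto.com/" pages
# keep_matching "wifi hacking" pages
```

The nice part of splitting it into two steps is that the filtering works on whatever is already on disk, so you can rerun it with a different keyword without crawling again.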

Share Your Thoughts
