Forum Thread: Scan a Website Which Blocks the Scan with robots.txt

When I scan a website with Vega, the website just goes offline. I think this is because of a honeypot, maybe something in the robots.txt.

Is there any way to scan it?
Thanks in advance.

7 Responses

What do you mean by "the website just goes offline"? Like you just cannot reach it anymore?
If it were a honeypot, it wouldn't necessarily be blocking you; it would be tracking you (kind of).

The robots.txt file doesn't block you either. It simply tells legitimate web crawlers not to crawl certain pages or resources.
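
To make that concrete, robots.txt is purely advisory. Here's a minimal Python sketch (hypothetical target URL) of how a polite crawler consults it; a scanner is free to skip this step entirely:

    # Polite crawlers check robots.txt before fetching; nothing enforces the answer.
    import urllib.robotparser

    rp = urllib.robotparser.RobotFileParser()
    rp.set_url("https://example.com/robots.txt")  # hypothetical target
    rp.read()

    # True or False depending on the site's rules; ignoring it is trivial.
    print(rp.can_fetch("MyCrawler/1.0", "https://example.com/admin/"))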

More than likely the site is deploying some sort of defense against automated scans, like a WAF (Web Application Firewall). Automated scanners tend to be loud, fast, and heavy-handed, which makes it easy for defense mechanisms to tell the difference between real user traffic and an automated scan.
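
To give a sense of why that's easy, here's a toy Python sketch of per-IP rate checking, the kind of signal a WAF might key on (the thresholds are made up, and real WAFs use far more signals than raw request rate):

    import time
    from collections import defaultdict, deque

    WINDOW = 10.0      # sliding window, in seconds
    MAX_REQUESTS = 30  # more than this per window looks automated

    recent = defaultdict(deque)  # ip -> timestamps of recent requests

    def looks_automated(ip):
        now = time.monotonic()
        q = recent[ip]
        q.append(now)
        # Drop timestamps that have fallen out of the sliding window.
        while q and now - q[0] > WINDOW:
            q.popleft()
        # A human clicking around rarely exceeds a few requests per second;
        # a fast scanner blows past this threshold immediately.
        return len(q) > MAX_REQUESTS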

Well, you should know some websites will go offline because the scan is effectively DoSing the site as a side effect.
Does it come back online shortly after the scanning stops?

And yes, a single box scanning a site can DoS it.
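
If you want to test that theory, here's a minimal Python probe (hypothetical URL) you can run after stopping the scan to see whether the site recovers on its own:

    import time
    import urllib.error
    import urllib.request

    URL = "https://example.com/"  # hypothetical target

    # Poll every 30 seconds; if the scan was DoSing the site as a side effect,
    # it should start answering again shortly after the scan stops.
    for attempt in range(10):
        try:
            with urllib.request.urlopen(URL, timeout=5) as resp:
                print("attempt", attempt, "- HTTP", resp.status, "- back up")
                break
        except (urllib.error.URLError, TimeoutError) as exc:
            print("attempt", attempt, "- still down:", exc)
        time.sleep(30)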

Does the site actually go down, or can you just not reach it anymore?

More information would be needed to help you. Depending on the security measures in use, you may have to dial down the speed and aggressiveness of the scan. You could always test the site manually.
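
For the speed part, the idea is just to pace your requests. A minimal Python sketch (hypothetical URL and paths) with randomized delays, so the traffic looks less like a burst-mode scanner:

    import random
    import time
    import urllib.request

    BASE = "https://example.com"        # hypothetical target
    PATHS = ["/", "/login", "/search"]  # hypothetical paths to probe

    for path in PATHS:
        try:
            with urllib.request.urlopen(BASE + path, timeout=10) as resp:
                print(path, resp.status)
        except Exception as exc:
            print(path, "error:", exc)
        # Sleep 2-6 seconds between requests instead of hammering the server.
        time.sleep(random.uniform(2, 6))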

Try changing the user agent.
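
For example, in Python (the User-Agent string below is just an illustration, and the URL is hypothetical):

    import urllib.request

    req = urllib.request.Request(
        "https://example.com/",  # hypothetical target
        # A browser-like UA instead of a scanner's default, which is easy to flag.
        headers={"User-Agent": "Mozilla/5.0 (X11; Linux x86_64; rv:115.0) Gecko/20100101 Firefox/115.0"},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        print(resp.status)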

And slow the scan down. With nmap, try a stealth scan with one of the slower timing templates.
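
A minimal sketch of kicking that off from Python, to stay consistent with the other snippets (the target host is hypothetical; -sS is nmap's SYN "stealth" scan, which usually requires root, and -T2 is the slower "polite" timing template):

    # Run an nmap SYN scan with slow, polite timing against a hypothetical host.
    # Assumes nmap is installed and on PATH.
    import subprocess

    subprocess.run(["nmap", "-sS", "-T2", "example.com"], check=True)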
