No.
We follow `Disallow:` directives, with one exception: the home page. As an SEO analysis tool, we need to read at least the home page, and we then take further actions based on the `Disallow:` directives in your robots.txt.
Ex. A robots.txt containing:

```
User-Agent: *
Disallow: /
```

produces access log entries like these (only `/` and `/robots.txt` are fetched):

```
172.104.138.120 - - [15/Sep/2018:10:11:45 +0200] "GET / HTTP/1.1" 301 546 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_5) AppleWebKit/600.1.17 (KHTML, like Gecko) Version/7.1 Safari/537.85.10 MarketGoo/2.1"
172.104.138.120 - - [15/Sep/2018:10:11:45 +0200] "GET / HTTP/1.1" 200 35620 "http://district1071.nl/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_5) AppleWebKit/600.1.17 (KHTML, like Gecko) Version/7.1 Safari/537.85.10 MarketGoo/2.1"
172.104.138.120 - - [15/Sep/2018:10:11:46 +0200] "GET /robots.txt HTTP/1.1" 301 566 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_5) AppleWebKit/600.1.17 (KHTML, like Gecko) Version/7.1 Safari/537.85.10 MarketGoo/2.1"
172.104.138.120 - - [15/Sep/2018:10:11:46 +0200] "GET /robots.txt HTTP/1.1" 301 5770 "http://district1071.nl/robots.txt" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_5) AppleWebKit/600.1.17 (KHTML, like Gecko) Version/7.1 Safari/537.85.10 MarketGoo/2.1"
172.104.138.120 - - [15/Sep/2018:10:11:46 +0200] "GET /robots.txt HTTP/1.1" 200 5806 "https://district1071.nl/robots.txt" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_5) AppleWebKit/600.1.17 (KHTML, like Gecko) Version/7.1 Safari/537.85.10 MarketGoo/2.1"
```
If your `Disallow:` directive is as restrictive as `Disallow: /`, we won't read any further pages. Please note, however, that this prevents our tool from giving you a more in-depth analysis of your website: we need to read as much of your content as possible in order to build an accurate and useful task list for you.
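To illustrate how a directive like `Disallow: /` is interpreted, here is a minimal sketch using Python's standard `urllib.robotparser` (this shows generic robots.txt semantics, not our crawler's actual implementation; the user-agent string and URLs are just examples):

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt content with the most restrictive directive.
robots_txt = """User-Agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# With "Disallow: /", every path is blocked for every user agent,
# so a rule-following crawler fetches no further pages.
print(parser.can_fetch("MarketGoo/2.1", "https://example.com/"))       # False
print(parser.can_fetch("MarketGoo/2.1", "https://example.com/about"))  # False
```

In our case the home page is the one deliberate exception to this rule, since we cannot analyze a site without reading at least that page.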
If you believe our crawler is behaving otherwise, kindly send us the full access log so we can check with our development team.