log when blocked by robots.txt

Stefan Groschupf-2
Hi Developers,
Another point in the discussion about being more polite:
I suggest that we log a message when a requested URL is blocked
by robots.txt.
Ideally, we would only log this message when the currently used
agent name is specifically blocked, rather than when all agents
are blocked in general.

Should I create a patch?

Stefan

Re: log when blocked by robots.txt

Piotr Kosiorowski
I think I would log in both situations, but with different messages.
+1
P.

On 7/21/06, Stefan Groschupf <[hidden email]> wrote:

> Hi Developers,
> Another point in the discussion about being more polite:
> I suggest that we log a message when a requested URL is blocked
> by robots.txt.
> Ideally, we would only log this message when the currently used
> agent name is specifically blocked, rather than when all agents
> are blocked in general.
>
> Should I create a patch?
>
> Stefan
>
>
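
A minimal sketch of how the two cases could be distinguished and logged,
assuming a hypothetical per-agent map of parsed disallow rules (this is
not Nutch's actual robots.txt API; the class, method, and parameter names
such as RobotsBlockLogger and disallowedByAgent are purely illustrative):

// Illustrative sketch only, not Nutch code. Assumes robots.txt rules
// have already been parsed into disallow-prefix sets keyed by agent
// name, with "*" standing for the catch-all group.
import java.util.Map;
import java.util.Set;
import java.util.logging.Logger;

public class RobotsBlockLogger {

  private static final Logger LOG =
      Logger.getLogger(RobotsBlockLogger.class.getName());

  /**
   * Logs why a blocked URL was skipped, distinguishing an agent-specific
   * block (a rule group naming our agent disallows the URL) from a
   * general block that applies to all agents via the "*" group.
   */
  public static void logBlocked(String url,
                                String agentName,
                                Map<String, Set<String>> disallowedByAgent) {
    Set<String> agentRules = disallowedByAgent.get(agentName.toLowerCase());
    boolean blockedForAgent = agentRules != null && matches(url, agentRules);

    if (blockedForAgent) {
      LOG.info("robots.txt: URL " + url
          + " blocked specifically for agent '" + agentName + "'");
    } else {
      LOG.info("robots.txt: URL " + url
          + " blocked for all agents (wildcard rule)");
    }
  }

  // Very rough prefix match against the disallow entries of one group.
  private static boolean matches(String url, Set<String> disallowPrefixes) {
    for (String prefix : disallowPrefixes) {
      if (url.contains(prefix)) {
        return true;
      }
    }
    return false;
  }

  public static void main(String[] args) {
    // Usage example with illustrative data: "nutch" is disallowed
    // explicitly for /private/, everything else only by the "*" group.
    Map<String, Set<String>> rules = Map.of(
        "nutch", Set.of("/private/"),
        "*", Set.of("/"));

    // Logs the agent-specific message.
    logBlocked("http://example.com/private/page.html", "nutch", rules);
    // Logs the general (all agents) message.
    logBlocked("http://example.com/public/page.html", "someother", rules);
  }
}

Logging both cases with different messages, as suggested above, keeps the
log useful for diagnosing why a fetch was skipped while still making the
agent-specific case easy to spot.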