XOOPS Forums

irmtfan (Module Developer)
Posted on: 2012/5/30 2:23
Posts: 3419 | Since: 2003/12/7
#1

protector 3.51 in xoops 2.5.5 did not exclude webmasters

In brief, I believe Protector should exclude the webmasters group from all of its actions, but it recognizes members of the webmasters group as a "CRAWLER" when they load too many pages, and shows them a blank page.

Is this a bug, or is it normal behavior that should be corrected in a later version?

I even think it should exclude certain groups, such as registered users, from being treated as "CRAWLER" attackers.
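
For clarity, here is a minimal sketch of the kind of group exemption being requested, written as ordinary XOOPS PHP. It is not Protector's actual code, and where such a check would hook into Protector's crawler detection is an assumption; the group constants and the $xoopsUser calls are standard XOOPS API, while the function name is made up for illustration.

<?php
// Sketch only: the group exemption being requested, not Protector's code.
// XOOPS_GROUP_ADMIN (webmasters) and XOOPS_GROUP_USERS (registered users)
// are standard XOOPS group-ID constants; the function name is hypothetical.
function protector_should_skip_crawler_check()
{
    global $xoopsUser;

    $exempt_groups = array(XOOPS_GROUP_ADMIN, XOOPS_GROUP_USERS);

    // Exempt logged-in members of the listed groups from crawler detection.
    return is_object($xoopsUser)
        && count(array_intersect($xoopsUser->getGroups(), $exempt_groups)) > 0;
}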


irmtfan (Module Developer)
Posted on: 2012/6/9 1:42
Posts: 3419 | Since: 2003/12/7
#2

Re: protector 3.51 in xoops 2.5.5 did not exclude webmasters

Any idea how to solve this?
Right now users, and even webmasters, get a blank page.
I tried to exclude the forum module from the crawler check, but that resulted in uncontrolled high traffic and, eventually, a "database connection failed" error.

irmtfan (Module Developer)
Posted on: 2012/7/10 4:00
Posts: 3419 | Since: 2003/12/7
#3

Re: protector 3.51 in xoops 2.5.5 did not exclude webmasters

Is this a bug?
If not, I want to add it as a feature request.
Please confirm.

Mamba (Moderator)
Posted on: 2012/7/10 4:18
Posts: 10334 | Since: 2004/4/23
#4

Re: protector 3.51 in xoops 2.5.5 did not exclude webmasters

I don't think it is a bug. The current value for "Bad counts for Crawlers" is set at 40, i.e. if you go above 40, you'll get a blank page.

I normally have tons of tabs open with XOOPS pages, but I have never gotten a blank page, so I guess I stayed below 40.

If this is on your own site, try increasing the value of "Bad counts for Crawlers" and see if it helps...

irmtfan (Module Developer)
Posted on: 2012/7/10 4:42
Posts: 3419 | Since: 2003/12/7
#5

Re: protector 3.51 in xoops 2.5.5 did not exclude webmasters

I think you are wrong; the relevant setting here is "Watch time for high loadings (sec)".
The default is 60, but on my site it is set to 20.

Also, 40 is extremely high for "Bad counts for Crawlers"; on my site I reduced it to 5.
If I increase these values, the result is a "database connection failed" error, resource abuse, and eventually a suspended account on the server.

Today I can see the "database connection failed" error on xoops.org, which has a dedicated server.
What would happen to my poor shared hosting?
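
To make the interaction of the two settings concrete, here is a rough sketch of the check they describe, using my current values. This is illustrative only, not Protector's actual implementation; count_recent_requests() is a hypothetical helper standing in for Protector's own per-IP request log.

<?php
// Illustrative sketch of how the two preferences interact; not the actual
// Protector code. count_recent_requests() is a hypothetical helper that
// stands in for Protector's own per-IP request log.
$watch_time = 20; // "Watch time for high loadings (sec)" - default 60
$bad_count  = 5;  // "Bad counts for Crawlers"            - default 40

// How many pages has this visitor loaded in the last $watch_time seconds?
$recent_hits = count_recent_requests($_SERVER['REMOTE_ADDR'], $watch_time);

if ($recent_hits > $bad_count) {
    // Treated as a crawler: the configured action fires (e.g. a blank page),
    // and at present this happens even for logged-in webmasters.
    exit;
}

With these values, a sixth page load within 20 seconds is enough to trigger the blank page, which a human opening several forum tabs can easily hit.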