1
irmtfan
protector 3.51 in xoops 2.5.5 did not exclude webmasters
  • 2012/5/30 6:23

  • irmtfan

  • Module Developer

  • Posts: 3419

  • Since: 2003/12/7


In brief, I believe Protector should exclude the webmasters group from all actions, but it recognises members of the webmasters group as a "CRAWLER" when they load too many pages, and shows them a blank page.

Is this a bug, or is it normal behaviour that should be corrected in later versions?

I even think it should exclude certain groups, like registered users, from being treated as "CRAWLER" attackers.



2
irmtfan
Re: protector 3.51 in xoops 2.5.5 did not exclude webmasters
  • 2012/6/9 5:42

  • irmtfan

  • Module Developer

  • Posts: 3419

  • Since: 2003/12/7


Any idea how to solve this?
Right now users and even webmasters get a blank page.
I tried to exclude the forum module from crawler checking, but that resulted in unchecked high traffic and finally a "db connection failed" error.

3
irmtfan
Re: protector 3.51 in xoops 2.5.5 did not exclude webmasters
  • 2012/7/10 8:00

  • irmtfan

  • Module Developer

  • Posts: 3419

  • Since: 2003/12/7


Is this a bug?
If not, I want to add this as a feature request.
Please confirm.

4
Mamba
Re: protector 3.51 in xoops 2.5.5 did not exclude webmasters
  • 2012/7/10 8:18

  • Mamba

  • Moderator

  • Posts: 11409

  • Since: 2004/4/23


I don't think it is a bug. The current value for "Bad counts for Crawlers" is set at 40, i.e. if you exceed 40, you'll get a blank page.

I normally have tons of tabs open with XOOPS pages, but I have never got a blank page, so I guess I was below 40.

If this is happening on your own site, try increasing the value for "Bad counts for Crawlers" and see if it helps...
Support XOOPS => DONATE
Use 2.5.11 | Docs | Modules | Bugs

5
irmtfan
Re: protector 3.51 in xoops 2.5.5 did not exclude webmasters
  • 2012/7/10 8:42

  • irmtfan

  • Module Developer

  • Posts: 3419

  • Since: 2003/12/7


I think you are wrong. The effective value here is "Watch time for high loadings (sec)".
The default is 60, but on my site it is set to 20.

Also, 40 is extremely high for "Bad counts for Crawlers"; on my site I reduced it to 5.
If I increase the values, the result is a "db connection failed" error, resource usage abuse, and finally a suspended account on the server.

Today I can see the "db connection failed" error even on xoops.org, which has a dedicated server.
Imagine what happens to my poor shared hosting.

