Modules: Spiders 2.60 - Robot/Spider/Crawler Manager

Posted by: wishcraft on 2010/10/18 22:25:13 (5177 reads)
Spiders 2.60

Spiders is a tool for managing the spiders, crawlers and robots that visit your site. It also uses a cloud service to share robot definitions across the network, turning the cloud into an SEO advantage.

This allows you to assign a robot's User-Agent to a user and user group, which means you can control the robot's crawl radius on your site with XOOPS permissioning. It is also a useful tool for dealing with aggressive bots: generate a secondary group for banned bots and assign each offending bot's user to it.
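The idea behind the User-Agent-to-group mapping can be sketched roughly as follows. This is illustrative Python, not the module's actual PHP code, and the names (`BOT_GROUPS`, `ALLOWED_PATHS`) are assumptions for the sketch:

```python
# Hypothetical sketch of mapping robot User-Agents to user groups and
# enforcing a per-group "crawl radius" via allowed path prefixes.

BOT_GROUPS = {
    "Googlebot": "robots",        # well-behaved crawlers
    "AhrefsBot": "banned_bots",   # aggressive bots go to a secondary group
}

# Per-group crawl radius: path prefixes the group may fetch.
ALLOWED_PATHS = {
    "robots": ["/", "/news/", "/modules/"],
    "banned_bots": [],            # the banned group may crawl nothing
}

def group_for(user_agent: str):
    """Return the group assigned to this User-Agent, if any."""
    for bot, group in BOT_GROUPS.items():
        if bot.lower() in user_agent.lower():
            return group
    return None

def may_crawl(user_agent: str, path: str) -> bool:
    """Decide whether a request from this User-Agent may reach this path."""
    group = group_for(user_agent)
    if group is None:
        return True  # unknown clients are treated as ordinary visitors
    return any(path.startswith(prefix) for prefix in ALLOWED_PATHS[group])
```

In the real module the groups and permissions live in the XOOPS permission system rather than in dictionaries, but the lookup-then-check flow is the same.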

There have been some improvements to the maths used to identify a robot. I suggest a match threshold of around 53%, which is the last option in the preferences. You can fetch a list of robots from the cloud, and add a robot or change a robot's details in the cloud, using the send option in the installed-robots section of the software. This lodges a modification notice, and the robot or crawler is added when maintenance next falls due.
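To make the 53% figure concrete, here is a minimal sketch of a percentage-based User-Agent match. It assumes the module scores an incoming User-Agent against known robot signatures and accepts the best match above the threshold; `difflib` stands in for the module's own maths, and the signature list is illustrative:

```python
# Sketch: score an incoming User-Agent against known robot signatures and
# identify it as a robot only when the best score clears a threshold.
from difflib import SequenceMatcher

KNOWN_ROBOTS = [
    "Googlebot/2.1 (+http://www.google.com/bot.html)",
    "msnbot/2.0b (+http://search.msn.com/msnbot.htm)",
]

def match_score(user_agent: str, signature: str) -> float:
    """Similarity between a UA string and a robot signature, as a percentage."""
    return 100.0 * SequenceMatcher(None, user_agent.lower(), signature.lower()).ratio()

def identify_robot(user_agent: str, threshold: float = 53.0):
    """Return the best-matching signature, or None below the threshold."""
    best = max(KNOWN_ROBOTS, key=lambda sig: match_score(user_agent, sig))
    return best if match_score(user_agent, best) >= threshold else None
```

A threshold like this is what keeps an ordinary Mozilla browser string from scoring high enough to be logged in as a robot.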

It will not only import robots from the cloud but can also import a robots.txt file. There are other options with this tool as well: the cloud maintains a list of current subscribers being crawled, and populates an SEO footer (on the robot page only) that catches bots and spreads them around the network, which will also improve your SEO score. You can turn sharing off in the preferences, but I recommend you use this together with Xortify.
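As a rough idea of what importing a robots.txt could involve, the sketch below pulls the User-agent names out of a robots.txt so they could then be registered as robots. This is illustrative Python under my own assumptions, not the module's actual PHP implementation:

```python
# Sketch: collect the distinct User-agent values listed in a robots.txt.

def user_agents_from_robots_txt(text: str) -> list:
    """Parse robots.txt text and return its User-agent names, in order."""
    agents = []
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()      # drop comments and padding
        if line.lower().startswith("user-agent:"):
            agent = line.split(":", 1)[1].strip()
            if agent and agent not in agents:
                agents.append(agent)
    return agents

sample = """\
# Example robots.txt
User-agent: Googlebot
Disallow: /private/

User-agent: msnbot
User-agent: *
Disallow:
"""
```

Each harvested name could then be mapped to a user and user group as described above.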

Where can I download this from?

Direct: (103Kb)
Mirror: (103Kb)
SVN: (103Kb)

What is fixed in this version?
  • Mozilla clients logging in as robots - fixed via the scoring-system change
  • Googlebot identity passing unnoticed
  • Cloud Functions & Plugins
  • CURL API use
  • furl open API use
  • SEO Footer
  • Match Scoring System