1
Rincewind
Re: Polish your theme CSS files
  • 2003/2/24 0:31

  • Rincewind

  • Just popping in

  • Posts: 16

  • Since: 2002/1/14


Some more resources for good CSS.

W3Schools' CSS pages.

The w3c.org CSS Validator. I run all my CSS through this.

The CSS forums at WebmasterWorld.



2
Rincewind
Re: Hacked the REF Hack :o)
  • 2002/3/1 2:49

  • Rincewind

  • Just popping in

  • Posts: 16

  • Since: 2002/1/14


Maybe I didn't make myself clear. I'm not against cloaking. It is, as I said, very, very useful. Just don't get caught, that's all.

Also, I would still like to run the script in this hack even when users view the page, as the changes to the title tag are the most important part of the hack. Let me explain: XOOPS normally displays the same title for every single page. So even if I get a page ranked well for a keyword, it could fail to deliver decent traffic, because some search engines build their results text from your page's title and description tags. Currently these would describe the general theme of the site that you keyed into admin/preferences. But if the keyword is more specific, you would want the title and description to be more specific too, which is where the hack becomes useful in changing the title. You would still want the same title to appear in the browser when the visitor arrived, so maybe that part of the hack should remain.
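To illustrate the idea (in Python for brevity rather than the hack's PHP, and with made-up names, not actual XOOPS functions), a per-page title could be assembled like this:

```python
# Hypothetical sketch: build a page-specific <title> instead of reusing
# the sitewide one. Names are illustrative, not XOOPS code.

def page_title(page_topic, site_name, max_len=60):
    """Put the page's own keywords first, where search results show them,
    and fall back to the sitewide name when there is no topic."""
    title = f"{page_topic} - {site_name}" if page_topic else site_name
    return title[:max_len]  # keep it short enough for results pages

print(page_title("Aromatherapy oils", "MySite"))  # page-specific title
print(page_title("", "MySite"))                   # sitewide fallback
```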

If you really want to save on server load, you could leave out the keyword-creation part completely. Only the smaller, or older, SEs still use meta keywords in ranking. Instead they parse the real text of your page, the same way your script does, and decide on their own keywords to use. So if you dropped all the meta keywords and just stuck with title and description, few SEs would notice (and any that did would not be worth bothering about).



3
Rincewind
Re: Hacked the REF Hack :o)
  • 2002/2/28 20:57

  • Rincewind

  • Just popping in

  • Posts: 16

  • Since: 2002/1/14


Whoa there, Half-dead. Sounds like you're talking about cloaking. From Search Engine World:

Quote => Using some system to hide code or content from a user, and deliver custom content to a search engine spider. The word "cloak" comes from Star Trek, where the Klingons were capable of "cloaking" their ships to make them invisible. There are three main types of cloaking: IP based, User Agent based, and the combination of the two. IP based cloaking delivers a custom page based on the user's IP address (this can be used to deliver language-specific sites or to target groups of users from particular ISPs such as AOL or @Home). User Agent cloaking sends a custom page based upon the user's agent (most often used to take advantage of a particular agent's strengths or features). Finally, the combination of agent and IP cloaking is used to target specific users. <= end quote
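As a rough illustration of the user-agent variant described in the quote (the spider names are just examples, and a real cloaking script of the time would have been PHP or CGI, not Python):

```python
# Toy user-agent cloaking check: serve one page to known spiders and
# another to everyone else. The spider list is illustrative only.
SPIDER_AGENTS = ("googlebot", "slurp", "scooter")

def is_spider(user_agent):
    ua = user_agent.lower()
    return any(bot in ua for bot in SPIDER_AGENTS)

def choose_page(user_agent):
    return "optimized.html" if is_spider(user_agent) else "normal.html"

print(choose_page("Googlebot/2.1"))                       # page for the spider
print(choose_page("Mozilla/4.0 (compatible; MSIE 5.5)"))  # page for a human
```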

There are two schools of thought when it comes to cloaking. The search engines say "Don't do it. It's spamming and we don't like it," and the webmasters say "The search sites use cloaking themselves (ever tried to get to Altavista US and ended up on the UK version, or wherever you're from?), so it must be OK. Besides, if you don't catch us, what you don't see can't hurt."

You're probably thinking: if the site is cloaked, how could the SEs ever see you? Well, the occasional human does actually review a proportion of listings. They compare your real web page to the one in the spider's cache, and if they don't match, you're in trouble. "But if the only difference is the meta tags, a human will never notice, right?" Wrong. One of the first checks is a simple bit count. If the cached page has a different bit count from the real page, further investigation follows.
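That size comparison is trivial to express. A sketch, assuming the reviewer has both the cached copy and the live page as strings:

```python
# Sketch of the size check described above: flag a listing for further
# review when the live page's size differs from the cached copy.
def needs_review(cached_page, live_page, tolerance=0):
    return abs(len(cached_page) - len(live_page)) > tolerance

print(needs_review("<html>same</html>", "<html>same</html>"))
print(needs_review("<html>short</html>", "<html>a much longer page</html>"))
```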

Basically, what I'm saying is that cloaking sections of your site can be very, very useful, but also fraught with danger. For more info, check out these two articles: Here and here.



4
Rincewind
Re: Hacked the REF Hack :o)
  • 2002/2/28 12:11

  • Rincewind

  • Just popping in

  • Posts: 16

  • Since: 2002/1/14


This is a very useful hack. However, there are a few more features I would find useful.

Your script cuts out short words like "the", "and", and "to", but many English words are longer than three letters and should also be cut out, such as "from", "because", "also", and others. Would it be possible to create a data file of words to exclude? Or to cross-reference the keywords with a dictionary file and only include nouns and verbs, thus excluding all the filler words. I understand this is a bit more than just a hack and could be a lengthy mod. Plus it would have to be updated with each fresh language supported. However, it could be significantly beneficial.
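The exclude-file idea can be sketched like this (in Python for brevity; the stop-word set is a tiny illustrative sample, and a real one would live in a per-language data file as suggested):

```python
# Filter keywords against a configurable stop-word list instead of
# relying only on a fixed word-length cutoff.
STOP_WORDS = {"the", "and", "to", "from", "because", "also"}

def extract_keywords(text, stop_words=STOP_WORDS, min_len=3):
    words = [w.strip(".,;:!?").lower() for w in text.split()]
    return [w for w in words if len(w) >= min_len and w not in stop_words]

print(extract_keywords("Also useful tips from the CSS forums"))
# ['useful', 'tips', 'css', 'forums']
```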

Secondly: would it be possible for someone out there to create a module that parsed the text on your pages and came back with a count of each keyword used? This would be useful for determining how well the search engines may rank you in their listings. You would be able to alter the keyword density of your documents to optimise your ranking, without the overkill of spamming keywords on pages or the underkill of too few keywords.

I have run into underkill before. I once had a web design company site which ranked better under aromatherapy than under web design. This was because one of the clients was an alternative medicine group, so lots of my pages were discussing the client's site and not mine. Indeed, at one point my web design site ranked higher for aromatherapy than the aromatherapy site itself. A mod like the one described above would help predict how search engines will rank your pages before you submit them, so you can correct things before it's too late. It's hard enough getting listed in search engines without worrying about listing incorrectly.
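A keyword-density counter along the lines suggested could start as simply as this (a hypothetical sketch, not an existing module):

```python
from collections import Counter

# Count how often each word appears and report it as a fraction of all
# words on the page, which is roughly what "keyword density" means here.
def keyword_density(text):
    words = [w.strip(".,;:!?").lower() for w in text.split()]
    total = len(words)
    return {word: count / total for word, count in Counter(words).items()}

density = keyword_density("web design and more web design tips")
print(density["web"])  # "web" appears 2 times out of 7 words
```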



