1
brash
Minimizing HTTP Requests?
  • 2004/2/26 23:43

  • brash

  • Friend of XOOPS

  • Posts: 2206

  • Since: 2003/4/10


Hi All,

I host my site over my home 512/512 kilobit SDSL connection, and am looking for ways to minimize site loading times. One of the ways I'm looking into is trying to minimize the total number of HTTP requests when loading my site, as I currently have 39.

Upon looking at what the HTTP requests are calling for, I noticed that most of them request a single image or javascript file each. What I'm wondering is: is it possible to fetch more than one object in a single HTTP request, and if so, how? If I could do this, I could group all the images and javascript into their own groups/blocks and easily reduce the number of HTTP requests to under 20.
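Short of changing the protocol, the usual workaround is to merge files on the server so the browser only has to ask once. A minimal sketch in Python (the filenames are made up for illustration, not taken from the site):

```python
from pathlib import Path

def bundle_scripts(paths, out_path):
    """Concatenate several JavaScript files into one bundle,
    turning N script requests into a single request."""
    merged = "\n".join(Path(p).read_text() for p in paths)
    Path(out_path).write_text(merged)
    return merged

# Demo with throwaway files standing in for the site's real scripts.
Path("menu.js").write_text("function showMenu() {}")
Path("fx.js").write_text("function fadeIn() {}")
combined = bundle_scripts(["menu.js", "fx.js"], "bundle.js")
```

The page then references `bundle.js` in one `<script>` tag instead of one tag per file, so the browser issues one request for all the javascript.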

Also, in this post OldSwede posted some very interesting stats regarding site loading times. The stats he included were time until first byte received, time from first byte until last byte, total data transferred, and transfer rate. Does anyone know what program(s) might have been used to produce these stats, as I'd find it a very useful tool to have while trying to optimize my site. Thanks.

2
sunsnapper
Re: Minimizing HTTP Requests?

I doubt you need to worry about the request for the javascript file, since after it is requested once, it is cached on the client computer. Subsequent requests by that user will not cause the server to resend the file. (It will log the request, but not retransmit the file.)

It can be advantageous to avoid using images when possible. However, image files cache, too. So, if you create a design that reuses the same small image, the file only needs to be sent to the user once.
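To put the caching point in concrete terms: the browser decides whether it can reuse a cached copy by comparing the response's Expires date against the current time. A rough sketch of that freshness check (`is_fresh` is a made-up helper name; real browsers also honour Last-Modified and Cache-Control, which this ignores):

```python
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime

def is_fresh(expires_header, now=None):
    """True if a cached response's Expires header lies in the future,
    meaning the browser can reuse its copy without asking the server."""
    if now is None:
        now = datetime.now(timezone.utc)
    return parsedate_to_datetime(expires_header) > now
```

So a small image reused all over the design costs one request on the first page view and, while it stays fresh, zero requests afterwards.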

However, let me see if I can find more about the stats, to help you with your quest.

3
brash
Re: Minimizing HTTP Requests?
  • 2004/2/27 5:11

  • brash

  • Friend of XOOPS

  • Posts: 2206

  • Since: 2003/4/10


Hi sunsnapper,

You are right that javascript and images will be cached by the client once the site has been loaded. However, the theory I'm working on (which might be wrong) is that my site suffers from high latency, so reducing the number of HTTP requests should speed up loading regardless. Wouldn't it be quicker if one HTTP request could fetch 5 or 6 images at once, rather than requesting them one at a time? I suppose it would also have a lot to do with how the requests are processed. If a request isn't started until the previous one has completed, then reducing the total number of HTTP requests would have a large impact on performance. However, I can't imagine this is the case, as it wouldn't make sense; I imagine all the requests are sent off at the same time. Still, the pure fact of having less to process should help, shouldn't it?
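For what it's worth, the sequential-versus-parallel question can be put to rough numbers. HTTP/1.1 browsers of the day typically opened only two connections per host, so requests do queue behind one another. A back-of-the-envelope model (the 150 ms round trip is a made-up figure, and transfer time is ignored entirely):

```python
import math

def estimated_latency_ms(num_requests, rtt_ms, connections=2):
    """Crude model: requests queue over a fixed number of parallel
    connections, each request costing one full round trip."""
    rounds = math.ceil(num_requests / connections)
    return rounds * rtt_ms

# 39 requests over 2 connections at a hypothetical 150 ms round trip:
estimated_latency_ms(39, 150)   # 20 rounds of waiting -> 3000 ms
# versus under 20 requests:
estimated_latency_ms(19, 150)   # 10 rounds -> 1500 ms
```

Under those assumptions, halving the request count roughly halves the pure-latency portion of the load time, independent of bandwidth.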

I've tried to use as few images as possible, and control colour content using HTML and CSS as much as possible. The images I have used have been optimized to the nth degree using the web function in PS7, which got the total graphics content down to about 30K. However, there are a few images I don't think are relevant anymore which I'm going to get rid of, which should reduce it by nearly a further 10K.

One thing I'll be doing over the weekend is relocating all the images back to my server. A few months ago I moved them all to free webspace hosted by my server's ISP to try and minimize the bandwidth used by my site. I can't help but wonder if this has actually slowed it down, as all the images hosted on this free webspace are the last to render when loading the site.

I found a few websites and tools that help with analysing site performance. The sites I used are:

http://www.netmechanic.com/toolbox/html-code.htm

http://www.websiteoptimization.com/services/analyze/

http://www.searchengineworld.com/cgi-bin/page_size.cgi

I also downloaded web stress testing tools from Paessler and Microsoft's Web Application Stress Tool, which look like they do a bit more of what I'm looking for. However, I've yet to find a simple utility that will gather these stats on a one-off basis with minimal setup and configuration.
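In the meantime, the basic numbers (time to first byte, total time, bytes received, transfer rate) can be gathered with a few lines of standard-library Python. This is a hypothetical one-off sketch, not any of the tools mentioned above:

```python
import http.client
import time

def fetch_stats(host, path="/"):
    """One-off stats for a single GET: time to first byte, total time,
    bytes received, and average transfer rate."""
    conn = http.client.HTTPConnection(host, timeout=10)
    start = time.perf_counter()
    conn.request("GET", path)
    resp = conn.getresponse()
    first = resp.read(1)                 # first byte of the body arrives
    ttfb = time.perf_counter() - start
    body = first + resp.read()           # drain the rest of the body
    total = time.perf_counter() - start
    conn.close()
    return {"ttfb_s": ttfb, "total_s": total, "bytes": len(body),
            "rate_kbps": (len(body) / 1024) / total if total > 0 else 0.0}
```

Calling `fetch_stats("example.com")` against your own site would give a quick read on whether latency (high TTFB) or transfer rate is the bigger problem.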

4
sunsnapper
Re: Minimizing HTTP Requests?

I still blame the visitor stats block (though it does seem to work faster now). You are right though, some of the graphics (like the translation flags) take a while to load.

Wish I could be of more help. I came across that web site analyzer page referred to in the previous thread. But, I couldn't find anything that would provide stats like OldSwede had. Sorry.

5
brash
Re: Minimizing HTTP Requests?
  • 2004/2/27 5:46

  • brash

  • Friend of XOOPS

  • Posts: 2206

  • Since: 2003/4/10


I'm going to try and get my head around those web stress tools I mentioned in my last post tonight and see if I can nail down the loose ends. Hopefully it won't be that stats block, but who knows. Thanks for trying though, sunsnapper, I really appreciate the effort!

6
Bunny
Re: Minimizing HTTP Requests?
  • 2004/2/27 6:10

  • Bunny

  • XOOPS Advisor

  • Posts: 57

  • Since: 2002/10/21


Quote:
Does anyone know what program(s) might have been used to produce these stats as I'd find it a very useful tool to have while trying to optimize my site.

This kind of info (and a lot more) you can get with IBM's Page Detailer tool, which you can download free here:

http://www.alphaworks.ibm.com/tech/pagedetailer

(Make sure to kill the process after you're done.)

7
brash
Re: Minimizing HTTP Requests?
  • 2004/2/27 8:07

  • brash

  • Friend of XOOPS

  • Posts: 2206

  • Since: 2003/4/10


Nice tool, Bunny! This is a lot more like what I'm after for gathering quick stats. Thanks!
