Degrading performance over time

I’m having a very weird problem: the more scripts I access on my site, the longer they take to run. Pages start out at about 1-2 seconds (still longer than they ought to take, since the combined MySQL queries only take 0.5 seconds), and, for example, after I left the site running overnight, pages were taking 40 seconds by morning. If I block access to the site for a while, the problem almost resets: when I open it up again, load times are lower than before. What’s also peculiar is that even when pages were taking 40 seconds to load, a script I wrote that does nothing but time a batch of queries showed they were just as fast as ever.
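That timing script is roughly along these lines (simplified; the connection details and table name here are just placeholders):

<?php
// Rough shape of the query-timing test (simplified; names are placeholders).
$db = mysql_connect('localhost', 'user', 'pass') or die(mysql_error());
mysql_select_db('mydb', $db);

$start = microtime();
for ($i = 0; $i < 20; $i++) {
    $result = mysql_query("SELECT * FROM some_table LIMIT 100", $db);
    while ($row = mysql_fetch_assoc($result)) {
        // just fetch the rows, do nothing with them
    }
}
$end = microtime();

// In PHP 4, microtime() returns "msec sec", so add the two parts together.
list($su, $ss) = explode(' ', $start);
list($eu, $es) = explode(' ', $end);
echo 'Queries took ' . (($es + $eu) - ($ss + $su)) . " seconds\n";
?>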

Has anyone seen this kind of thing before, or have any guesses what might be causing it? I would appreciate any help you all could give me.

Thanks,
Max

Without knowing much about your OS or web server, it sounds like it could possibly be a virus.

If you are running Windows with IIS or PWS, then I would strongly recommend that you become really familiar with the Windows Update page. There are many exploitable holes in IIS and PWS (and Windows in general), and if you don’t keep current with the patches, you could have problems.

Of course, you could run Linux 8) and avoid many of these issues (it isn’t immune to them, but there are substantially fewer).

Beyond that, to be more specific we would need to know more about your script: which OS, web server, database, and scripting language you are using, and their version numbers.

OS: Linux (kernel 2.4.26-HN-1.6-P4)
PHP: 4.3.7
MySQL: 4.0.18

It’s a shared host. Truthfully, it’s inadequate for my site, and I depend on a lot of mirrors to make it all work, but it’s all I can afford.

With regard to the scripts: I could give you the source code if you wanted, but there’s a lot of it. A general description, though: almost every script starts with a bunch of includes - a file that defines all the functions, one that checks whether the user is logged in, and so on. After that, it’s the usual stuff: validate inputs, query the database. From there it goes one of two ways. If validation fails, a function is called that includes the open script (which just outputs the beginning of the page), writes an error message, and then includes the close script. Otherwise, the open script is included, most of the output is done by a few functions, and then the close script is included.

Generally speaking, there aren’t more than 10 queries per page (except for a few infrequently run pages that have to change a lot of things in the database). At the moment I use a persistent connection, and all of the data is in one database, but I’ve tried alternatives. There’s no recursion, and for the most part loops only cycle through MySQL results, so I don’t -think- it’s an unterminated loop. For the reasons stated before, I don’t think it’s a MySQL performance issue. It happens on just about every page, so I assume it’s something in one of my includes or in a function I use a lot.
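To make that concrete, the skeleton of a typical page looks roughly like this (heavily simplified; the file, table, and function names are placeholders, and show_error() and print_item() stand in for functions defined in the functions include):

<?php
// Heavily simplified skeleton of a typical page; names are placeholders.
include 'functions.php';   // defines all the shared functions
include 'checklogin.php';  // checks whether the user is logged in

// The persistent connection mentioned above; one alternative I tried was a
// plain mysql_connect() here instead.
$db = mysql_pconnect('localhost', 'user', 'pass') or die(mysql_error());
mysql_select_db('mydb', $db);

$id = isset($_GET['id']) ? intval($_GET['id']) : 0;
if ($id <= 0) {
    show_error('Invalid ID');  // includes open.php, prints the message, includes close.php
    exit;
}

include 'open.php';            // outputs the beginning of the page
$result = mysql_query("SELECT title, body FROM items WHERE id = $id", $db);
while ($row = mysql_fetch_assoc($result)) {
    print_item($row);          // one of the few output functions
}
include 'close.php';           // outputs the end of the page
?>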

I hope that helps. If there’s any other information you need or you’d like to see some code, just let me know. Thanks for offering your help.

How many entries do your tables contain? You might have reached the point where the hard drive is slowing down the searches. Make sure the fields you compare with an equals sign (join conditions, most of the time) are indexed; the performance boost is incredible. We ran some tests on tables with 1.5 million entries (a few gigs of data). With un-indexed fields, a dual P4 server with 2 GB of RAM was going down - the hard drives just couldn’t keep up, and each request took around 10 seconds. With indexed fields, each request took 0.001 seconds.
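For example, adding an index on a join column is a one-line change, and EXPLAIN will show whether MySQL actually uses it (the table and column names here are just examples):

<?php
// Example only -- table and column names are made up.
$db = mysql_connect('localhost', 'user', 'pass') or die(mysql_error());
mysql_select_db('mydb', $db);

// Index the column used in the equality / join condition.
mysql_query("ALTER TABLE posts ADD INDEX idx_user_id (user_id)", $db) or die(mysql_error());

// EXPLAIN shows whether the query now hits the index instead of scanning the whole table.
$result = mysql_query("EXPLAIN SELECT * FROM posts WHERE user_id = 42", $db);
while ($row = mysql_fetch_assoc($result)) {
    print_r($row);
    echo "\n";
}
?>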

Make sure you index everything correctly and try to reduce the number of queries (use inner joins as much as possible).
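For instance, instead of looping over one table and running a second query per row, one INNER JOIN fetches everything at once (again, the table names are just examples):

<?php
// Example only -- one joined query instead of a query per row.
$db = mysql_connect('localhost', 'user', 'pass') or die(mysql_error());
mysql_select_db('mydb', $db);

$sql = "SELECT p.title, u.username
        FROM posts AS p
        INNER JOIN users AS u ON u.id = p.user_id
        WHERE p.created > '2004-01-01'";

$result = mysql_query($sql, $db) or die(mysql_error());
while ($row = mysql_fetch_assoc($result)) {
    echo $row['username'] . ': ' . $row['title'] . "\n";
}
?>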

The problem might also be the other users. You might want to ask your host to check the other clients’ usage and ask them to optimize their scripts.
