Curing timeout problems with large scripts?

Hi all :)

I was writing a large script to update a large-scale database, as a stopgap until a C++ version could be built.

Now the problem is that the script takes approximately 30 minutes to complete, and obviously that's way above the default timeout.

Is there a way to disable the timeout for a particular script? Maybe a workaround? Or could it be run via the command-line version of the PHP interpreter?

Cheers.
Miles

Hmm - maybe use ini_set to change your PHP setting - or use exec to call the script from the command line…

And yeah - I’m talking out of my butt. ;)
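To flesh out the ini_set suggestion a bit, here's a minimal sketch of lifting the limit from inside the script itself (note that some hosts disallow overriding this setting, and it won't work with safe mode on):

```php
<?php
// Lift the per-script execution limit; 0 means "no limit".
set_time_limit(0);                        // resets the timer and removes the cap
ini_set('max_execution_time', '0');       // equivalent ini-level override

// ... the long-running database update would go here ...

echo ini_get('max_execution_time'), "\n"; // "0" if the override took effect
```

Worth knowing: the CLI version of PHP defaults to no execution limit at all, so running the script from the command line avoids the problem entirely.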

[quote author=“lig”]Hmm - maybe use ini_set to change your PHP setting - or use exec to call the script from the command line…

And yeah - I’m talking out of my butt. :wink:[/quote]

I think I’m a bit screwed on this :/. I may just have to recode it in C or “work around” it.

30 minutes to load? Whoa … sounds like you have some serious loops to go through (for some reason I have trouble accepting that this is not caused by loops in loops, or at least large loops, but correct me if I’m wrong). If possible, you could split up the code into various pages and have the first page call the second, etc.
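A sketch of that "split it into pages" idea: each request handles one slice of the work, then hands off to the next step before the timeout hits. The function name, slice size, and step count below are all made up for illustration:

```php
<?php
// Each HTTP request processes one slice, then redirects to the next step.
// process_slice(), SLICE, and STEPS are placeholders, not real API.

const SLICE = 1000;   // rows handled per request
const STEPS = 10;     // total number of slices

function process_slice(int $step): array {
    $start = $step * SLICE;
    // ... real work on rows [$start, $start + SLICE) would go here ...
    return [$start, $start + SLICE];
}

$step = isset($_GET['step']) ? (int) $_GET['step'] : 0;
[$from, $to] = process_slice($step);
echo "step $step handled rows $from..$to\n";

if ($step + 1 < STEPS) {
    // On a web server you'd redirect to the next step here, e.g.:
    // header('Location: update.php?step=' . ($step + 1));
}
```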

You could also change the timeout in php.ini (and in your server app’s config file), but (older) servers may choke on it anyway.
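For reference, these are the relevant php.ini directives (the values here are examples; 0 and -1 both mean "no limit"):

```ini
; php.ini
max_execution_time = 0    ; per-script execution limit in seconds (0 = no limit)
max_input_time = -1       ; time allowed for parsing request input (-1 = no limit)
```

As noted above, the web server may impose its own limit on top of this (e.g. Apache's Timeout directive), so both may need changing.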

Well, it’s mainly due to massive data handling, and parsing that data several times, finishing with various updates to MySQL…

Yeah, the old system we had would refresh the page and do the next “step”, but that seems like a cheap way around it. Plus, how can you get that to work with cron jobs?

Cheap indeed … and cron jobs? That’s over my head :-?
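For what it's worth, a cron job just runs a command on a schedule, so it can call the PHP CLI binary directly and skip the web server (and its timeout) entirely. A sketch, with made-up paths:

```
# Example crontab entry: run the updater nightly at 2:00 through the CLI
# interpreter, which has no execution time limit by default.
0 2 * * * /usr/bin/php /path/to/update_script.php >> /tmp/update.log 2>&1
```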

Truthfully - it sounds like you have some inefficiencies in your code (parsing the data multiple times, for example)… maybe optimize it. One idea that comes to mind - if the DB allows it - is doing multiple updates in one query, or writing a batch file and then running that. What is the logic flow of the current script?
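To illustrate the "multiple updates in one query" idea: MySQL lets you collapse many single-row updates into one `INSERT … ON DUPLICATE KEY UPDATE` statement, which saves a round-trip per row. The table and column names below (`items`, `id`, `qty`) are made up; the sketch just builds the SQL string:

```php
<?php
// Collapse many single-row UPDATEs into one multi-row statement.
// Table/column names are hypothetical.
$rows = [
    ['id' => 1, 'qty' => 5],
    ['id' => 2, 'qty' => 8],
];

$values = [];
foreach ($rows as $r) {
    // Cast to int here stands in for real escaping/parameter binding.
    $values[] = sprintf('(%d, %d)', $r['id'], $r['qty']);
}

// One round-trip instead of count($rows) separate UPDATE queries.
$sql = 'INSERT INTO items (id, qty) VALUES '
     . implode(', ', $values)
     . ' ON DUPLICATE KEY UPDATE qty = VALUES(qty)';

echo $sql, "\n";
```

For a 30-minute job the win from batching like this (or from `LOAD DATA INFILE` for bulk loads) can be dramatic compared to issuing one query per row.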
