Check this code. It's written in PHP but it takes more than 7 hours to run.

ini_set('max_execution_time', 15000);

mysql_query('TRUNCATE TABLE scrip_average');

$sql_all = mysql_query("SELECT * FROM all_scrip LIMIT 8000");
while ($row_all = mysql_fetch_array($sql_all)) {
    $SC_CODE = $row_all['SC_CODE']; // taken from the current all_scrip row

    $sql_last = mysql_query("SELECT * FROM scrip WHERE SC_CODE='$SC_CODE' ORDER BY ID DESC LIMIT 1");
    while ($row_last = mysql_fetch_array($sql_last)) {

        for ($n = 200; $n > 0; $n--) {
            $result_open  = mysql_query("SELECT SUM(OPEN) AS total_value FROM scrip WHERE SC_CODE='$SC_CODE' ORDER BY ID DESC LIMIT $n");
            $result_high  = mysql_query("SELECT SUM(HIGH) AS total_value FROM scrip WHERE SC_CODE='$SC_CODE' ORDER BY ID DESC LIMIT $n");
            $result_low   = mysql_query("SELECT SUM(LOW) AS total_value FROM scrip WHERE SC_CODE='$SC_CODE' ORDER BY ID DESC LIMIT $n");
            $result_close = mysql_query("SELECT SUM(CLOSE) AS total_value FROM scrip WHERE SC_CODE='$SC_CODE' ORDER BY ID DESC LIMIT $n");

            //'img_postdate' => date("y,m,d"),
        }
    }
}



So many problems.
You are using dangerous, obsolete mysql_ code that has been completely removed from PHP. You need to use PDO with prepared statements, and you need to learn how to JOIN tables. Multiple nested queries are a big no-no.

Are you really using eight thousand records at one time?

Bottom line, the whole script is junk and needs to be tossed.

How about telling us what the actual problem is that you are trying to solve, not your attempt at solving it?


Is this a script for a game?

There has to be a better way than whatever it is that this does.


To optimize code/queries, you need to first find out what part of the code/queries is taking the most time, by profiling the code execution.

To do this, you need to add program logic that keeps track of and then displays the time that various parts of the code and each query takes to execute. You can then concentrate on optimizing the parts of the code/queries that take the most time.
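A rough sketch of that idea (the section labels and the busy-work loop below are placeholders, not part of the original script):

```php
// Minimal profiling sketch: record how long each section/query takes.
$timings = [];

$start = microtime(true);
// ... section 1: e.g. run one of the SUM() queries here ...
for ($i = 0, $x = 0; $i < 100000; $i++) { $x += $i; } // placeholder work
$timings['section 1'] = microtime(true) - $start;

$start = microtime(true);
// ... section 2: e.g. the per-SC_CODE lookup ...
$timings['section 2'] = microtime(true) - $start;

// Show the slowest sections first, so you know where to optimize.
arsort($timings);
foreach ($timings as $label => $seconds) {
    printf("%-12s %.4f sec\n", $label, $seconds);
}
```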

Some things that will have the biggest impact -

  1. The indexes in the database table(s). Are the columns you are referencing in the WHERE clause and the ORDER BY clause indexed?
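For example, based on the queries in the original post (table and column names are taken from there; the index name is made up), a composite index would let MySQL both filter on SC_CODE and sort by ID without a filesort:

```sql
-- Hypothetical index for the posted queries.
ALTER TABLE scrip ADD INDEX idx_sc_code_id (SC_CODE, ID);

-- Check that it is used: look at the EXPLAIN output for the absence
-- of "Using filesort" and a key of idx_sc_code_id.
EXPLAIN SELECT * FROM scrip WHERE SC_CODE = 'ABC' ORDER BY ID DESC LIMIT 1;
```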

  2. Running queries inside of loops. This is a big performance killer, since it takes time to communicate each query between php and the database server and then for the execution time on the database server. Whenever possible, you should avoid running queries inside of loops, by using single/fewer JOINed queries that get the data you want in the order that you want it, and if you do have a case where you need to run queries inside of a loop, you want to run the fewest queries and make each one as efficient as possible.
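As an illustration of replacing a per-row query loop with one JOINed query (this is the standard greatest-row-per-group pattern, adapted to the table names in the original post, not the OP's exact logic):

```sql
-- Fetch the latest scrip row for every SC_CODE in ONE query,
-- instead of one "ORDER BY ID DESC LIMIT 1" query per code in a PHP loop.
SELECT s.*
FROM scrip s
JOIN (
    SELECT SC_CODE, MAX(ID) AS max_id
    FROM scrip
    GROUP BY SC_CODE
) latest ON latest.SC_CODE = s.SC_CODE
        AND latest.max_id  = s.ID;
```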

  3. Each query that gets executed more than once should be a prepared query, so that you save the time needed to send the full query to the database server and for the server to plan its execution. To accomplish this, the code will need to be converted to use the PHP PDO extension, which supports prepared queries; and since the mysql_ extension is obsolete and has been removed from the latest PHP versions, you will want to convert the code anyway so that it keeps working should you or your web host update the PHP version. The PDO extension also uses newer driver code that is more efficient than what the old mysql_ extension used, so this conversion by itself will result in a performance increase.
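A minimal sketch of the prepare-once / execute-many pattern with PDO. It uses an in-memory SQLite database purely so the example is self-contained; with MySQL only the DSN and credentials would change, and the sample rows are made up:

```php
// Prepare-once, execute-many with PDO.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$pdo->exec('CREATE TABLE scrip (ID INTEGER PRIMARY KEY, SC_CODE TEXT, CLOSE REAL)');

// The INSERT is parsed and planned only once, then executed repeatedly.
$insert = $pdo->prepare('INSERT INTO scrip (SC_CODE, CLOSE) VALUES (?, ?)');
foreach ([['AAA', 10.0], ['AAA', 12.0], ['BBB', 5.0]] as $row) {
    $insert->execute($row);
}

// Bound placeholders also remove the SQL-injection risk of string interpolation.
$sum = $pdo->prepare('SELECT SUM(CLOSE) FROM scrip WHERE SC_CODE = ?');
$sum->execute(['AAA']);
$total = (float) $sum->fetchColumn();
echo $total; // prints 22
```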

Lastly, since you are repeatedly operating on the data in the scrip table, I would just retrieve the last 200 rows of data, in turn, for each SC_CODE value, store them in a PHP array, and then operate on the data in this array using PHP code. Inside the for(){} loop, you can use PHP's array_column() and array_sum() to get the sum of each column of data. You can then remove a row from the data array in each iteration of the loop and operate on the remaining rows in the next iteration.
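A sketch of that in-PHP approach, using three made-up rows in place of the 200 rows for one SC_CODE:

```php
// $rows stands in for the rows fetched for one SC_CODE (newest first).
$rows = [
    ['OPEN' => 10, 'CLOSE' => 11],
    ['OPEN' => 12, 'CLOSE' => 13],
    ['OPEN' => 14, 'CLOSE' => 15],
];

for ($n = count($rows); $n > 0; $n--) {
    // Sum each column of the remaining rows entirely in PHP: no queries.
    $openSum  = array_sum(array_column($rows, 'OPEN'));
    $closeSum = array_sum(array_column($rows, 'CLOSE'));
    printf("n=%d open_avg=%.2f close_avg=%.2f\n", $n, $openSum / $n, $closeSum / $n);
    array_pop($rows); // drop one row before the next iteration
}
```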


On a site I maintain, I run an extremely complicated linear-regression loop on 37,000+ table entries. Each record contains 140 fields, all of which are processed and updated each night. The process takes about 30 minutes at most. I think you need to step back, look at what you need computed, and learn about using JOINs as benanamen mentioned. You should move the entire process into MySQL. Pulling data out into PHP and then reprocessing it back in with many, many queries just defeats the purpose of using queries! I would start by listing what the process does and creating one more sophisticated query for it.


Looking at your code itself, I have many, many questions.
First, you pull out only 8000 items. How many fields are in each item? (1, 5, 3000?)
Next, you loop from 200 down to 0 and run numerous queries on values.
So you process $n as 200, then 199, then 198? Why do you do this loop?
It also means your sums are not what you think: an aggregate query like SELECT SUM(...) returns a single row, so the LIMIT $n applies to that one result row, not to the rows being summed; each query sums every matching row regardless of $n.
This does not make sense to me at all. You are trying to create a sort of "sliding scale" of items, but it is not logical.
Are you attempting to create a linear regression based on the field values? That can be done in one small query. It does not take much math at all and is just a few lines of code, and it would run in 2 seconds on 8000 rows of data…
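For example, a least-squares slope and intercept can be computed in a single aggregate query. The column choices below are guesses based on the posted queries (x = ID, y = CLOSE for one SC_CODE), so treat this as a sketch of the technique, not the OP's exact computation:

```sql
-- slope = (n*SUM(xy) - SUM(x)*SUM(y)) / (n*SUM(x^2) - SUM(x)^2)
-- intercept = (SUM(y) - slope*SUM(x)) / n
SELECT
    (COUNT(*) * SUM(ID * CLOSE) - SUM(ID) * SUM(CLOSE))
        / (COUNT(*) * SUM(ID * ID) - SUM(ID) * SUM(ID)) AS slope,
    (SUM(CLOSE)
        - (COUNT(*) * SUM(ID * CLOSE) - SUM(ID) * SUM(CLOSE))
            / (COUNT(*) * SUM(ID * ID) - SUM(ID) * SUM(ID)) * SUM(ID))
        / COUNT(*) AS intercept
FROM scrip
WHERE SC_CODE = 'ABC';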
Perhaps you should explain your data structure and what you are attempting to pull out of it…