I am attempting to read a large file, about 350 MB, and load it into a database (it is a directory application). The load program is a commercial application that is known to work with very large files. The load is failing, so I'm in debug mode.
I wrote a script that simply counts the records with fgetcsv, which is what the load program uses. Somewhere between 15,000 and 30,000 records, the counters being displayed go crazy: they drop from 30,000 to 700, continue counting, jump to 32,000, count some more, drop back to a lower number, then keep counting like that for a very long time. I clocked one run at over 3 minutes (I got past the timeout issues), and there are no PHP warnings.
Now the interesting part: if I insert an echo "x"; statement into the read loop, the script prints the list of x's on the screen, the counters work as expected, and the time drops to about 30 seconds. Same file, and no other changes to the script.
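For reference, here is a minimal sketch of what my counting script does. This is a reconstruction for illustration, not the exact script: the filename and the count_records() function name are placeholders, and the progress interval is arbitrary.

```php
<?php
// Hypothetical reconstruction of the record-counting test script.
// count_records() and the progress interval are illustrative choices.
function count_records(string $path): int
{
    $fh = fopen($path, 'r');
    if ($fh === false) {
        return -1; // could not open the file
    }
    $count = 0;
    while (($row = fgetcsv($fh)) !== false) {
        $count++;
        if ($count % 1000 === 0) {
            // Progress counter displayed to the browser; adding extra
            // output inside this loop (the echo "x";) is what changed
            // the behaviour I observed.
            echo "Read $count records\n";
        }
    }
    fclose($fh);
    return $count;
}
```

The display of the counter is the only output the script produces until the final total, which is why I suspect the difference the extra echo makes has something to do with how output reaches the browser.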
This problem showed up in the load program and also appears in my test scripts. Does anyone have a clue why this is happening?