How to code my website to download files from a file-sharing site

I have two large files that I want to make available for users to download. When trying to download from my web host, the download never starts and I get a “504 Gateway Timeout”. How would I revise my code to serve the download from a file-sharing site instead? I tried the following code but get “This page isn’t working. foxclone is currently unable to handle this request.”

Appreciate any help you can provide.

<?php

function mydloader($l_filename=NULL) {

    $php_scripts = '../../php/';
    require $php_scripts . 'PDO_Connection_Select.php';
    require $php_scripts . 'GetUserIpAddr.php';

    if (isset($l_filename)) {

        if (substr($l_filename, 0, 10) == 'foxclone_s') {
            // Hand the download off to the file-sharing site with a redirect
            $url_path = 'https://www.mediafire.com/file/tyqr4tzthiftuy1/foxclone_std-50-06.iso/file';
            header("Location: $url_path");
        } elseif (substr($l_filename, 0, 10) == 'foxclone_e') {
            $url_path = 'https://www.mediafire.com/file/xnovs5ltdj2z5wr/foxclone_edge-50-07.iso/file';
            header("Location: $url_path");
        } else {
            header('Content-Type: application/octet-stream');
            header('Content-Transfer-Encoding: Binary');
            header('Content-Disposition: attachment');
            header('Pragma: no-cache');
            header('Expires: 0');
            readfile($l_filename);
        }

        $ip = GetUserIpAddr();
        if (!$pdo = PDOConnect("foxclone_data")) {
            echo "unable to connect";
            exit;
        }

        // Use a prepared statement so $ip is never interpolated into the SQL
        $test = $pdo->prepare("SELECT lookup.id FROM lookup WHERE inet_aton(?) >= lookup.ipstart AND inet_aton(?) <= lookup.ipend");
        $test->execute([$ip, $ip]);
        $ref = intval($test->fetchColumn());

        $ext = pathinfo($l_filename, PATHINFO_EXTENSION);
        $stmt = $pdo->prepare("INSERT INTO download (address, filename, ip_address, lookup_id) VALUES (?, ?, inet_aton(?), ?)");
        $stmt->execute([$ip, $ext, $ip, $ref]);

    }
        
    else {
        echo "isset failed";
        }  
}
mydloader($_GET["f"]);
exit;


Well, there are two ways to download files from your website. One is to just place a link on the page pointing to the file, loosely like <a href="yoursite/yourfolder/yourfilename">, and let the user download it by clicking on your anchor link.

Or, use passthru to force a download from a button. Or from a posted form request. Loosely, something like this will force a download to start without the user pressing an anchor:

        header('Content-Type: application/download');
        header('Content-Disposition: attachment; filename=' . $filename);
        header('Content-Length: ' . filesize($filename));
        $fp = fopen($filename, 'rb');   // binary-safe open
        fpassthru($fp);
        fclose($fp);

I have found that some servers have issues with the header formatting. But, this version makes it a download with an attached file. Works fine on one of my sites. Hope this helps!
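
If memory becomes the bottleneck with very large files, an alternative to fpassthru()/readfile() is to stream the file in fixed-size chunks so PHP never holds the whole file at once. This is a minimal sketch, not the exact code from this thread; the function name and chunk size are arbitrary, and it assumes $filename is a readable local path:

```php
<?php
// Sketch: send a file to the client in small chunks instead of loading
// it entirely into memory. Call this after the download headers are sent.
function stream_file_in_chunks(string $filename, int $chunkSize = 8192): int
{
    $sent = 0;
    $fp = fopen($filename, 'rb');      // binary-safe read
    if ($fp === false) {
        return 0;
    }
    while (!feof($fp)) {
        $chunk = fread($fp, $chunkSize);
        if ($chunk === false) {
            break;
        }
        echo $chunk;                   // hand this piece to the client
        $sent += strlen($chunk);
        flush();                       // push it out rather than buffer it
    }
    fclose($fp);
    return $sent;                      // total bytes streamed
}
```

The point of the loop is that peak memory stays around $chunkSize regardless of how big the file is, which is exactly what a fixed memory_limit needs.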

Thanks for the reply. I’m still getting the “504 Gateway Timeout” using your code. That’s the reason I want to try to use a file sharing site to redirect the download from. The files that are giving me problems are 636MB and 870MB .iso files.

I tried several ‘file chunking’ schemes and none of them worked.

Well, you didn’t mention files of that size. You can not download files that large without telling the site to remove its timeouts. Basically, PHP times out in various ways to prevent locked-code problems. Also, depending on your server, the entire file may need to be held in the server’s memory. Therefore, you need to fix the timeouts and memory size before running your page. Loosely, something like this might help:

set_time_limit(0);
ini_set('memory_limit', '2048M');

The first line tells the PHP script to not time out if the download takes a long time.
The second sets the server’s memory size for PHP to 2gigs which should cover the program and data.
Not sure this will work, but, it might solve it. Try it and let us know. The 504 error is described as:

The HyperText Transfer Protocol (HTTP) 504 Gateway Timeout server error response code indicates that the server, while acting as a gateway or proxy, did not get a response in time from the upstream server that it needed in order to complete the request.

This says that one or more of the servers or proxies is timing out. It could be the first server or the second. Therefore, you must apply the timeout fix on one or more servers if you are using cross-server accesses.
Hope this helps!

@ErnieAlex, my website now gives me a different error after adding that to the download script. The error is now "Error 503 - Service Unavailable
The resource you requested is currently unavailable. Typically this is a temporary condition. Please contact the web site owner for further assistance."

Do you have any other ideas?

Well, that could be many different causes. One is that your server has a memory limit and you set it too high. You might want to try it again but with 900megs instead of 2gigs. If your file is 870megs, 900 should cover the code, html and file.

Also, for further details on your error, check your server logs. Is this a dedicated server? If so, you can alter the base PHP settings to set them higher memory use. If it is a shared server, it might not allow as much memory and probably does not allow 2 gigs.

I checked the logs on the server and here’s what I’m seeing:
[19-Mar-2022 15:57:29 UTC] PHP Fatal error: Allowed memory size of 734003200 bytes exhausted (tried to allocate 636485632 bytes) in /home/foxclo98/public_html/download/mydloader.php on line 22
[19-Mar-2022 15:57:29 UTC] PHP Fatal error: Unknown: Cannot use output buffering in output buffering display handlers in Unknown on line 0

It wouldn’t work at all with 900MB allocated. I got the same error, so I tried 700MB. What is the second problem about output buffering?

Got the same errors with "ini_set('memory_limit', '800M');"

This is a shared server. That is why I wanted help on serving them from a file sharing site.

Well, the first message says it ran out of space at 734megs. So the 900mb limit should work.
The second error says your server is not set to output buffer. Is this a dedicated server or shared server?
And, is this a Word Press site? WP has other issues with large files. If it is WP, you can set it this way.

define('WP_MEMORY_LIMIT', '1024M');
define('WP_MAX_MEMORY_LIMIT', '2048M');

Also, if it is a dedicated server and you access the php.ini file, you can turn on output buffering with:
output_buffering = On
Note that php.ini takes On/Off or a plain byte count for this setting, e.g. output_buffering = 4096.

This is a shared server. When I had it set to 900MB, I got the “Allowed memory size of 943718400 bytes exhausted (tried to allocate 636485632 bytes) in /home/foxclo98/public_html/download/mydloader.php on line 22”. Not WP site. I do have access to php.ini thru Cpanel. Output buffering is set to On.

Well, that means you need more than 900 megs to let it work, so try 2048 megs.
Or just use anchors…

I don’t want my users to have to click twice, once from the calling script and once in this script. How can I use anchors without having to click on a button?

with ini_set('memory_limit', '2048'); I get a zero-byte file instead of the "Error 503 - Service Unavailable
The resource you requested is currently unavailable. Typically this is a temporary condition. Please contact the web site owner for further assistance."

I am not sure really. Perhaps you should test it first with a small file, like 1 meg file. And, see if that works.
If so, then it is a size issue. If not, then it is something else in your server. Normally, there are extra details
in the server logs. Does it indicate what line is failing? Try the small file and let us know what happens!

It downloads a 50MB file without problems.

The error indicates line 22, which is the readfile(). I guess this boils down to a memory issue on my shared server.

I just tried an HTML <a href ... download> link and got a zero-byte file. That tells me that I have to find a way to download the files from a file-sharing site.
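
For reference, handing the download off to the file-sharing site can be done with a plain HTTP redirect, so the browser fetches the file directly from MediaFire and the shared host’s memory limit never comes into play. This is a hedged sketch (the function name is mine; the URLs and the ?f= parameter are from the opening post), not the thread’s final solution:

```php
<?php
// Sketch: map the requested filename prefix to its file-sharing URL,
// then redirect instead of streaming the file through this server.
function resolve_download_url(string $f): ?string
{
    $targets = [
        'foxclone_s' => 'https://www.mediafire.com/file/tyqr4tzthiftuy1/foxclone_std-50-06.iso/file',
        'foxclone_e' => 'https://www.mediafire.com/file/xnovs5ltdj2z5wr/foxclone_edge-50-07.iso/file',
    ];
    return $targets[substr($f, 0, 10)] ?? null;
}

$url = resolve_download_url($_GET['f'] ?? '');
if ($url !== null) {
    header('Location: ' . $url, true, 302);  // must run before any output
    exit;
}
```

The user still only clicks once on the original page; the redirect happens automatically.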

Not exactly, but, at least we are getting somewhere finally! This could mean a memory issue or it might be an HTTP issue. There are limits when sending items to the web browser. All of these can be an issue:
memory_limit, upload_max_size, post_max_size, upload_max_filesize, max_execution_time, max_input_time

memory_limit = 900M
upload_max_size = 900M
post_max_size = 900M
upload_max_filesize = 900M
max_execution_time = 300
max_input_time = 1000

Any of these might help fix the problem. I would start with the upload_max_filesize first.
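
On a shared host it is also worth confirming which of these values actually took effect, since hosts often cap or ignore ini_set() and php.ini overrides. A quick sketch using ini_get() (the list of names is my selection from the settings above):

```php
<?php
// Sketch: print the effective values of the settings discussed above.
// ini_get() returns the live value as a string, or false for unknown names.
$settings = [
    'memory_limit',
    'post_max_size',
    'upload_max_filesize',
    'max_execution_time',
    'max_input_time',
    'output_buffering',
];
foreach ($settings as $name) {
    printf("%-20s => %s\n", $name, var_export(ini_get($name), true));
}
```

Comparing this output against what you set in cPanel shows immediately whether the host is silently overriding you.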

Made the suggested changes - no change. Still getting a zero byte file.

Well, it works with a 50m file. So, test with larger ones and see where it fails. Then, adjust each of these one at a time, not all of them. You can also run Google Chrome Inspector and view the network status as you download the file and see where it is failing.

I got in touch with my web host and they made some adjustments on their end. Things seem to be working correctly, but I won’t know for sure until we get clear skies here. I’m on satellite and we’re having thunderstorms in the area; my download speeds are below 50kb/sec.

Oh, great news! Hope it works well for you soon!

It works great! Thanks for trying to help.

Nice! Glad you solved it!
