
Large file download script


SGBoise

Hello,
I'm using a script that allows my clients to download files from my store. The problem I'm running into is that the download stops at around 5 minutes. I'm guessing it's hitting the max_execution_time limit set by the web host.
Below is the code that I'm currently using. Can anyone suggest a different method to allow people to download large files?
One idea would be to redirect the user to the actual file once I've validated that they have access to it, but that would mean giving them direct access to the folder, which I don't want to do.
PHP:
function readfile_chunked($file_name, $retbytes = true) {
    // Let the script run past max_execution_time for long downloads
    set_time_limit(0);

    // Read and send the file in 1 MB chunks to keep memory usage low
    $chunksize = 1 * (1024 * 1024);
    $cnt = 0;

    $handle = fopen($file_name, 'rb');
    if ($handle === false) {
        return false;
    }

    while (!feof($handle)) {
        $buffer = fread($handle, $chunksize);
        echo $buffer;
        // Flush each chunk out to the client right away
        ob_flush();
        flush();
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }

    $status = fclose($handle);

    // Return the byte count if requested, otherwise the fclose() status
    if ($retbytes && $status) {
        return $cnt;
    }
    return $status;
}
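
(For reference, a call site for a function like this would normally send the download headers before streaming the file. The path, filename, and header values below are placeholders, not taken from this post.)
PHP:
// Hypothetical example of serving a file with readfile_chunked();
// the path and filename are placeholders only.
$file = '/path/outside/webroot/archive.zip';

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="archive.zip"');
header('Content-Length: ' . filesize($file));

// Stream the file in chunks and capture the number of bytes sent
$bytes_sent = readfile_chunked($file);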
Thanks in advance.
 

Luc

There are a few ways of handling this...

1. Simply increase the max execution time of your PHP scripts (max_execution_time in php.ini) so that they can run for more than 5 minutes.

2. Create a temporary directory that's web accessible. Then LINK (ln -s) the file the user is trying to download to a temporary file with a random filename (ex: ln -s SOURCEFILE /www/site/tmp/RANDOMFILE.zip), and in PHP send header("Location: URLTOTMPFILE"). That sends a redirect header to the client, and Apache then handles the actual download.

Since option #2 uses file linking, the file doesn't take up any extra space on your server, and it doesn't reveal the true path and filename of the actual file, only a temporary name (which you can delete from cron every XX days). A rough sketch of the idea is below.
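
A minimal sketch of option #2, assuming the temp directory, URL, and random-name scheme shown here (none of these names come from the thread):
PHP:
// Sketch of the symlink + redirect approach. The paths, URL, and
// naming scheme are assumptions for illustration only.
function serve_via_symlink($source_file) {
    $tmp_dir = '/www/site/tmp';              // web-accessible temp directory
    $tmp_url = 'https://example.com/tmp';    // URL that maps to $tmp_dir

    // Random, unguessable filename; keep the original extension
    $random = bin2hex(random_bytes(16)) . '.' . pathinfo($source_file, PATHINFO_EXTENSION);

    // Create the symbolic link (PHP's equivalent of ln -s)
    if (!symlink($source_file, $tmp_dir . '/' . $random)) {
        die('Could not create download link');
    }

    // Redirect the client; Apache then serves the actual file
    header('Location: ' . $tmp_url . '/' . $random);
    exit;
}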

Good luck,
Luc L.
 

SGBoise

Luc, you're a genius. I tried #1 already, but #2 is a great idea. I'll have to try that.

Thanks.


Luc

No problemo bro. If you need sample code or something, shoot me a PM. I use that second method all the time.

Just make sure you read up on the ln command (file linking) so you know how it works, but it should be fairly simple. A rough cleanup sketch is below.
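
A rough sketch of the cron-driven cleanup mentioned earlier, assuming a web-accessible tmp directory and a one-day cutoff (both are assumptions, not from this thread):
PHP:
// Cleanup script for the temp download directory, suitable for a
// daily cron job. The directory path and age cutoff are placeholders.
$tmp_dir = '/www/site/tmp';
$max_age = 24 * 60 * 60; // delete links older than one day

foreach (glob($tmp_dir . '/*') as $path) {
    // Only touch symbolic links, and only ones past the age cutoff
    if (is_link($path) && (time() - lstat($path)['mtime']) > $max_age) {
        unlink($path);
    }
}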

Good luck!
Luc
 