8

I'm having the following problem with my VPS server.

I have a long-running PHP script that sends big files to the browser. It does something like this:

<?php
header("Content-type: application/octet-stream");
readfile("really-big-file.zip");
exit();
?>

This basically reads the file from the server's file system and sends it to the browser. I can't just use direct links (and let Apache serve the file) because there is business logic in the application that needs to be applied.

The problem is that while such a download is running, the site doesn't respond to other requests.

3
  • Not that this is the problem, but when serving large files you should always call set_time_limit(0);. It shouldn't make any difference for you at the moment, but it will head off potential problems you may experience if you move this at some point to a <shudder> Windows platform. Commented Jan 17, 2012 at 11:47
  • 1
    How have you discovered the issue? Are you testing this by making multiple requests from the same machine? And are you using sessions? Commented Jan 17, 2012 at 11:53
  • @DaveRandom I noticed the problem when I tried to download multiple files (they were queued for download). I'm using sessions -- just tried, and it looks like this restriction doesn't affect other sessions. Thanks for your thoughts -- I'll investigate further now. Commented Jan 17, 2012 at 12:07

6 Answers

32

The problem you are experiencing is related to the fact that you are using sessions. When a script has a running session, it locks the session file to prevent concurrent writes which may corrupt the session data. This means that multiple requests from the same client, using the same session ID, will not be executed concurrently; they will be queued and can only execute one at a time.

Multiple users will not experience this issue, as they will use different session IDs. This does not mean that you don't have a problem, because you may conceivably want to access the site whilst a file is downloading, or set multiple files downloading at once.

The solution is actually very simple: call session_write_close() before you start to output the file. This will close the session file, release the lock and allow further concurrent requests to execute.
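
A minimal sketch of how that looks in a script like the one in the question (the file name and the business-logic comments are placeholders, not part of the original answer):

<?php
session_start();

// Apply whatever business logic needs the session here
// (permission checks, logging, etc.).

// Release the session lock so other requests from the same
// browser are not queued behind this download.
session_write_close();

header("Content-Type: application/octet-stream");
header('Content-Disposition: attachment; filename="really-big-file.zip"');
readfile("really-big-file.zip");
exit();
?>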


2 Comments

Thanks, that was the problem.
What a life saver, wish I had stumbled on this years ago.
2

Your server setup is probably not the only place you should be checking.

Try doing a request from your browser as usual and then do another from some other client.

Either use wget from the same machine or another browser on a different machine.

Comments

1

In what way doesn't the server respond to other requests? Is it "Waiting for example.com..." or does it give an error of any kind?

I do something similar, but I serve the file in chunks. That gives the file system a break while the client accepts and downloads each chunk, and it is far less demanding on the file system and the server than offering up the entire file at once.

EDIT: While this isn't the answer to the question as asked, the asker asked about reading a file in chunks. Here's the function that I use; supply it the full path to the file. A usage sketch follows the function below.

function readfile_chunked($file_path, $retbytes = true)
{
    $buffer = '';
    $cnt = 0;
    $chunksize = 1 * (1024 * 1024); // 1 = 1MB chunk size

    $handle = fopen($file_path, 'rb');
    if ($handle === false) {
        return false;
    }

    while (!feof($handle)) {
        // Read one chunk, send it, and flush it out to the client
        // so the whole file is never held in memory at once.
        $buffer = fread($handle, $chunksize);
        echo $buffer;
        ob_flush();
        flush();
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }

    $status = fclose($handle);
    if ($retbytes && $status) {
        return $cnt; // return num. bytes delivered like readfile() does
    }
    return $status;
}
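
A hedged usage sketch for the function above (the headers, path and file name are placeholders; the session_write_close() call is only needed if sessions are in play, as described in the accepted answer):

<?php
session_write_close(); // release the session lock before the long-running output

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="really-big-file.zip"');
readfile_chunked('/path/to/really-big-file.zip'); // placeholder path
exit();
?>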

3 Comments

The new requests are waiting for the download to complete. I'm not quite sure how I can implement the chunked download in PHP, though.
Hi, Emil M. I've added the function I use to read the file in chunks.
I've just -1'd your answer so let me explain. readfile doesn't read the file contents into memory and then write it out to stdout. No, it calls the internal function _php_stream_passthru() which is a C loop similar to yours, except it doesn't do the flushes since the output_buffering INI setting will achieve this anyway. And zlib.output_compression etc. should be false for outputting already compressed content. Your suggestion is just adding complexity for no benefit.
0

I have tried different approaches (reading and sending the files in small chunks [see the comments on readfile in the PHP docs], using PEAR's HTTP_Download), but I always ran into performance problems when the files got big.

There is an Apache module, mod_xsendfile, that lets you run your business logic in PHP and then delegate the actual download to Apache. The downloaded file does not have to be publicly accessible. I think this is the most elegant solution for the problem.

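A minimal sketch of that approach, assuming mod_xsendfile is installed and enabled (XSendFile On) for the relevant directory; the paths and file name are placeholders:

<?php
// Business logic / access checks go here.

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="really-big-file.zip"');
// mod_xsendfile intercepts this header and lets Apache stream the
// file itself, so the PHP process finishes immediately.
header('X-Sendfile: /var/files/really-big-file.zip'); // placeholder path
exit();
?>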

Comments

0

The same happens to me, and I'm not using sessions. session.auto_start is set to 0. My example script only runs "sleep(5)", and adding "session_write_close()" at the beginning doesn't solve the problem.

Comments

0

Check your httpd.conf file. Maybe you have "KeepAlive On", and that is why your second request hangs until the first one completes. In general, your PHP script should not make visitors wait for a long time. If you need to download something big, do it in a separate internal request that the user has no direct control over. Until it's done, return some "executing" status to the end user, and when it's done, process the actual results.
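
A rough sketch of the polling side of that idea; status.php, the flag file and the download URL are hypothetical names for illustration, not part of the original answer:

<?php
// status.php - polled by the client while a background job prepares the file.
$flagFile = '/tmp/big-download.ready'; // hypothetical flag written by the background job

header('Content-Type: application/json');

if (file_exists($flagFile)) {
    echo json_encode(['status' => 'done', 'url' => '/download.php']);
} else {
    echo json_encode(['status' => 'executing']);
}
?>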

Comments
