Solving Concurrent Request Blocking in PHP

Published on June 3, 2013 by Bo Andersen

If you have tried to make several concurrent AJAX requests (e.g. with jQuery) to a PHP script, then you may have experienced that the requests were not really handled concurrently. Instead, the requests were completed sequentially and thus blocking each other. The HTTP connection limit in the browser is probably not the cause, as most browsers have a limit of at least six concurrent HTTP connections.

The more likely explanation is that your script is either writing or reading session data. By default, PHP stores session data in files on the file system. The session file is written when the script terminates, and to prevent concurrent writes, PHP holds an exclusive lock on the session file for the duration of the request. This means that at any given time, only one script (i.e. process or thread) may operate on the session data. As a result, other concurrent requests simply wait for the lock to be released before their execution can proceed – one at a time. This causes the execution to be sequential rather than parallel as expected.

Luckily, PHP provides the session_write_close() function, which can be used to resolve this. What this function does is write the session data to the file and release the lock, which then lets other requests access it. Therefore, as soon as all the necessary data has been saved to the session, this function should be invoked. At that point, the request will stop blocking other concurrent requests for the same script. It will still be possible to read session data after the function call, but changes to it will no longer be persisted.
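To make the behavior concrete, here is a minimal sketch (the session key and value are just placeholder names):

```php
<?php
session_start();                 // acquires an exclusive lock on the session file

$_SESSION['user_id'] = 42;       // will be persisted when the session is closed

session_write_close();           // writes the data to disk and releases the lock

echo $_SESSION['user_id'];       // reading still works; the data stays in memory

$_SESSION['user_id'] = 99;       // NOT persisted – the session is already closed
```

Note that the last assignment changes the in-memory array only; on the next request, $_SESSION['user_id'] will still be 42.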

The code below is an example of a scenario where the session_write_close function can be used to increase the throughput of the script.


session_start();

$_SESSION['some'] = 'value';
$i = 0;

while ($i < 10) {
    sleep(1); // Simulate time-consuming work for roughly ten seconds
    $i++;
}

While the above example is purely theoretical, imagine that five concurrent AJAX requests are made to the script. Each execution lasts for roughly ten seconds. This means that each request will block the subsequent request by approximately ten seconds. This only happens because session data is accessed, and thus a lock is obtained on the session file. As you can imagine, this blocking is not necessary, because the session data is only changed before the time-consuming part of the script. As such, one can close the session immediately after changing the data, because then the request will no longer block other requests. This approach is reflected in the code snippet below.


session_start();

$_SESSION['some'] = 'value';
session_write_close(); // From here on out, concurrent requests are no longer blocked
$i = 0;

while ($i < 10) {
    sleep(1); // Simulate time-consuming work for roughly ten seconds
    $i++;
}

You will still be able to read session data after closing the session, so you only have to be cautious about when you are changing session data. After introducing this simple function call, you should notice that your requests are now executed in parallel and are no longer blocking each other (after the function call). This can potentially improve the throughput of your application greatly, depending on the use case.
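If you do need to save changes to the session later in the same request, the session can be reopened by calling session_start() again, which reacquires the lock. A minimal sketch of this pattern (the 'state' key is just an illustrative name):

```php
<?php
session_start();
$_SESSION['state'] = 'started';
session_write_close();           // release the lock before the slow part

sleep(10);                       // time-consuming work; no lock is held here,
                                 // so other requests are not blocked

session_start();                 // reacquire the lock in order to write again
$_SESSION['state'] = 'finished'; // this change will be persisted
session_write_close();
```

Keep in mind that reopening the session blocks other requests again for as long as it remains open, so the same rule applies: close it as soon as the writes are done.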

Bo Andersen

About the Author

I am a back-end web developer with a passion for open source technologies. I have been a PHP developer for many years, and also have experience with Java and Spring Framework. I currently work full time as a lead developer. Apart from that, I also spend time on making online courses, so be sure to check those out!

13 comments on »Solving Concurrent Request Blocking in PHP«

  1. Logan

    Thanks for posting! We ended up using this to decouple the user session from a long-running process, once the process had gotten everything it needed from the session, allowing the user to continue to make non-blocking requests to the server while the process executed in the background. Amazing how simple a solution it ended up being to a rather complex problem!

    • Thank you for your comment, and I am glad that you found it helpful. :-)

  2. Wilfried

    Thanks for posting, you have my eternal gratitude

  3. Hirbod

    Today, I want to celebrate. I had one of these “ahaaaaa” moments after more than 8 years of experience with PHP. I really don’t know why I’ve never read anything about that.

    My application (works with Angular.js and makes a lot of async calls to the API) loads up to 20 times faster now. I really want to say: I love you!

    • That’s awesome to hear. I’m very happy to hear that it helped you out with your application. :-)

  4. akshay

    What if our PHP script is writing to some txt file and receiving huge numbers of requests concurrently?
    Will blocking occur in that case?

  5. Xplouder

    Great post!
    What about DB concurrency? Like changing the same row in the same request.
    I mean, doing this trick we might have other problems with, for example, the DB, right? If so, any idea how to handle it?
    Thank you!

  6. Bernard

    Thanks for this Post. Solved my problem.

  7. javad madani

    thanks, very useful tip

  8. Rahul Ahiray

    I have been facing this problem for months where a single request blocks every other request that is made. Adding a single line to the code solved my problem. Thanks a lot!

  9. Glenn

    Thanks, couldn’t find this on StackOverflow but your blog post is exactly what I needed!

  10. Kishore

    WOW just WOW. In my case I really have to block the concurrent requests. After reading your blog and trying the code myself with a timer in hand to check the requests getting executed, I was amazed.

    Great Post.
