Eric Hodel (drbrain) wrote,

mod_fastcgi and Max Processes

So, I wrote up the following little ruby-fcgi script:

#!/usr/local/bin/ruby

FCGI_PURE_RUBY = false # use the C extension rather than the pure-Ruby fallback
require 'fcgi'

count = 0

FCGI.each_cgi do |cgi|
  sleep 60 if count < 3 # stall the first three requests for a full minute
  cgi.out { "hi: #{count}, #{Time.now}" }
  count += 1
end


Then I configured the server with only one FCGI process and hit it seven times in a row, waiting for each response to come back before sending the next request. Here are the results:

Mon Jan 17 16:23:07 PST 2005 500 <!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
Mon Jan 17 16:23:37 PST 2005 500 <!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
Mon Jan 17 16:24:07 PST 2005 500 <!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
Mon Jan 17 16:24:37 PST 2005 500 <!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
Mon Jan 17 16:25:07 PST 2005 500 <!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
Mon Jan 17 16:25:37 PST 2005 200 hi: 5, Mon Jan 17 16:25:38 PST 2005
Mon Jan 17 16:25:37 PST 2005 200 hi: 6, Mon Jan 17 16:25:38 PST 2005


The <!DOCTYPE lines are Apache 500 pages from the script timing out after 30 seconds without writing anything. As you can see, there are five 500 errors, not three: requests 4 and 5 never slept, but they timed out anyway while queued behind the three sleeping requests. When you hit the max process limit, your requests just queue up, sending your server into a death-spiral of doom.

Fortunately, though, so long as the requests stay queued, the FCGI process still runs each of them to completion, even the ones Apache has already given up on. That's why the first successful response says "hi: 5" rather than "hi: 3": all five timed-out requests were still processed, and each bumped the counter.
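To convince myself of the timing, here's a toy simulation of the setup above: one worker, a 30-second Apache timeout, and a client that fires the next request as soon as it gets a response (or a 500). The timeout semantics are my assumptions about mod_fastcgi's behavior, namely that the 30-second clock starts when the request is queued and that the worker keeps processing requests whose connections Apache has already dropped, but they reproduce the observed results exactly:

```ruby
# Per-request handler time: the first three sleep 60s, the rest return at once.
WORK = [60, 60, 60, 0, 0, 0, 0]
TIMEOUT = 30 # Apache gives up after 30s with no output

worker_free_at = 0 # when the single FCGI process is next idle
send_time = 0      # client sends the next request when the previous response arrives
results = []

WORK.each_with_index do |cost, i|
  start  = [worker_free_at, send_time].max # may sit in the queue first
  finish = start + cost
  worker_free_at = finish
  if finish - send_time > TIMEOUT
    # Apache 500s the client, but the worker still finishes the request
    results << [i, 500]
    send_time += TIMEOUT
  else
    results << [i, 200]
    send_time = finish
  end
end

results.each { |i, code| puts "request #{i}: #{code}" }
```

Running this prints five 500s followed by two 200s, with the failures landing 30 seconds apart, the same shape as the log above.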