As most people involved with Web development know, Ruby on Rails 2.0 was released on December 7. There was some discussion about the release on Slashdot, where I noticed some interesting comments regarding scalability in the age of Ruby on Rails.
I found one comment in particular worthy of further examination. Its author described an online game he or she had implemented using Ruby on Rails, stating: “It’s running of [sic] a single dedicated server and it seems to be handling a lot of requests just fine. During peak periods, we’ve got multiple requests per second and I’ve never had any complaints about the performance.”
Given the massive amount of processing power offered by even a low-end consumer PC today, never mind a server-oriented system, there’s absolutely no reason why a Web site should not be able to handle “multiple requests per second”.
About a decade ago, I developed a number of Perl CGI scripts for one Web site. At peak times, they were getting upwards of 50,000 hits per hour. That’s about 14 hits per second. But the interesting thing is, they were serving their site off of two high-end SPARCstation 20 systems. For those unfamiliar with such systems, in terms of processing power they’re comparable to a typical PC from 2000 or 2001. One of the SPARCstations ran the database server; the other ran the Web server and the CGI scripts. During the time I was affiliated with them, scalability was never an issue. And this was with only limited use of caching, distributed processing, and the other scalability techniques that have since become more important.
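For anyone who wants to check the arithmetic, the conversion from hourly hits to requests per second is a one-liner (shown here in Ruby, since that’s the language under discussion; the figures are the ones quoted above):

```ruby
# Back-of-the-envelope conversion: hourly hit count to requests per second.
hits_per_hour = 50_000
hits_per_second = hits_per_hour / 3600.0
puts hits_per_second.round(1)  # => 13.9
```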
If a scripting language like Perl was suitable for developing highly utilized Web sites a decade ago, on hardware that was already a few years old at the time, there’s no reason why Ruby on Rails shouldn’t be able to handle “multiple requests per second” on modern hardware.