Is Node.cs a cure for cancer?


This is mainly a tongue-in-cheek post, in reply to this guy. I decided to take his scenario and try it using my Node.cs “framework”. Here is the code:

 

using System.Threading.Tasks;
using System.Web;

public class Fibonaci : AbstractAsyncHandler
{
    protected override Task ProcessRequestAsync(HttpContext context)
    {
        // Offload the compute-bound work to the thread pool so the
        // request-handling thread is not blocked while we calculate.
        return Task.Factory.StartNew(() =>
        {
            context.Response.ContentType = "text/plain";
            context.Response.Write(Fibonacci(40).ToString());
        });
    }

    // Naive recursive Fibonacci (with Fibonacci(0) == Fibonacci(1) == 1),
    // deliberately exponential so it simulates a CPU-heavy workload.
    private static int Fibonacci(int n)
    {
        if (n < 2)
            return 1;

        return Fibonacci(n - 2) + Fibonacci(n - 1);
    }
}
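As a sanity check on the number the handler writes out, the same sequence (with the first two terms both equal to 1, as in the code above) can be computed iteratively; a minimal Python sketch:

```python
def fibonacci(n):
    # Same convention as the handler: fibonacci(0) == fibonacci(1) == 1.
    a, b = 1, 1
    for _ in range(n):
        a, b = b, a + b
    return a

print(fibonacci(40))  # -> 165580141, the value the handler returns
```

The iterative version is linear, of course; the handler intentionally keeps the exponential recursion so there is real CPU work to benchmark.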

We start by just measuring how long it takes to serve a single request:

$ time curl http://localhost/Fibonaci.ashx
165580141
real    0m2.763s
user    0m0.000s
sys     0m0.031s

That is 2.7 seconds for a highly compute-bound operation. Now, let us see what happens when we use Apache Benchmark to test things a little further:

ab.exe -n 10 -c 5 http://localhost/Fibonaci.ashx

(That makes a total of ten requests, with a maximum of 5 concurrent ones.)

And this gives us:

Requests per second:    0.91 [#/sec] (mean)
Time per request:       5502.314 [ms] (mean)
Time per request:       1100.463 [ms] (mean, across all concurrent requests)
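The two “Time per request” lines are consistent with each other and with the throughput figure: the per-request mean divided by the concurrency level gives the mean across all concurrent requests, and the inverse of that gives the requests per second. A quick Python check of ab's arithmetic, using the numbers above:

```python
concurrency = 5
mean_per_request_ms = 5502.314

# Mean across all concurrent requests = per-request mean / concurrency.
across_all_ms = mean_per_request_ms / concurrency
print(round(across_all_ms, 3))         # -> 1100.463

# Throughput is the inverse of that, converted from ms to seconds.
print(round(1000 / across_all_ms, 2))  # -> 0.91
```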

Not bad, considering that the best node.js was able to do (on a different machine and hardware configuration) was 0.17 requests per second.

Just for fun, I decided to try it with a hundred requests, 25 of them concurrent.

Requests per second:    0.97 [#/sec] (mean)
Time per request:       25901.481 [ms] (mean)
Time per request:       1036.059 [ms] (mean, across all concurrent requests)
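The same relationship holds for the larger run: 25901.481 ms divided across 25 concurrent requests gives the 1036.059 ms figure, and inverting that yields roughly 0.97 requests per second. Sketched in Python:

```python
across_all_ms = 25901.481 / 25
print(round(across_all_ms, 3))         # -> 1036.059
print(round(1000 / across_all_ms, 2))  # -> 0.97
```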

Not bad at all.