Ayende @ Rahien


Is Node.cs a cure for cancer?

This is mainly a tongue-in-cheek post, in reply to this guy. I decided to take his scenario and try it using my Node.cs “framework”. Here is the code:

 

public class Fibonaci : AbstractAsyncHandler
{
    protected override Task ProcessRequestAsync(HttpContext context)
    {
        // Queue the compute-bound work on a thread-pool thread so the
        // request thread is released while the calculation runs.
        return Task.Factory.StartNew(() =>
        {
            context.Response.ContentType = "text/plain";
            context.Response.Write(Fibonacci(40).ToString());
        });
    }

    private static int Fibonacci(int n)
    {
        if (n < 2)
            return 1;
        
        return Fibonacci(n - 2) + Fibonacci(n - 1);
    }
}
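As an aside on why a single request takes seconds: the doubly-recursive definition does an exponential number of calls. A quick count in JavaScript (the counting harness is mine, for illustration; the recursion mirrors the C# version above):

```javascript
// Count how many calls the naive doubly-recursive Fibonacci makes.
let calls = 0;

function fib(n) {
  calls++;
  return n < 2 ? 1 : fib(n - 2) + fib(n - 1);
}

const result = fib(25);
console.log(result); // 121393
console.log(calls);  // 242785 calls just for n = 25; roughly doubles for each +1 to n
```

By n = 40 the call count is in the billions, which is where the seconds go.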

We start by just measuring how long it takes to serve a single request:

$ time curl http://localhost/Fibonaci.ashx
165580141
real    0m2.763s
user    0m0.000s
sys     0m0.031s

That is 2.7 seconds for a highly compute-bound operation. Now, let us see what happens when we use ApacheBench to test things a little further:

ab.exe -n 10 -c 5 http://localhost/Fibonaci.ashx

(Make a total of ten requests, maximum of 5 concurrent ones)

And this gives us:

Requests per second:    0.91 [#/sec] (mean)
Time per request:       5502.314 [ms] (mean)
Time per request:       1100.463 [ms] (mean, across all concurrent requests)
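The two "time per request" figures are related by the concurrency level: with -c 5, the per-request mean divided by five gives the across-all-concurrent-requests figure, and the inverse of that gives the throughput. A quick check of the arithmetic:

```javascript
// Relating ab's reported numbers for -n 10 -c 5.
const concurrency = 5;
const meanPerRequest = 5502.314; // ms, as reported by ab

// Mean across all concurrent requests = per-request mean / concurrency:
const acrossConcurrent = meanPerRequest / concurrency;
console.log(acrossConcurrent.toFixed(3)); // 1100.463 ms

// Throughput is the inverse of that figure:
console.log((1000 / acrossConcurrent).toFixed(2)); // 0.91 requests/sec
```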

Not bad, considering that the best node.js could manage (on a different machine and hardware configuration) was 0.17 requests per second.

Just for fun, I decided to try it with a hundred requests, with 25 of them concurrent.

Requests per second:    0.97 [#/sec] (mean)
Time per request:       25901.481 [ms] (mean)
Time per request:       1036.059 [ms] (mean, across all concurrent requests)

Not bad at all.

Comments

Ferret Chere, 10/05/2011 10:46 AM

Call me naive, but isn't this completely pointless/meaningless when your benchmarks are being run on a completely different machine from the "node.js is cancer" writer's?

Merouane Atig, 10/05/2011 10:55 AM

It seems to me that you arrive at the same conclusion as him in his new post http://teddziuba.com/2011/10/straight-talk-on-event-loops.html: use threads!

Khalid Abuhakmeh, 10/05/2011 11:20 AM

That "Node.js is Cancer" article is hilarious. I understand what he's saying, but the Node.js community is still fledgling. He pretends that the people behind UNIX got it right the first time, which I'm sure they didn't. Any issue that Node.js has will probably be solved in time as that community grows. Let's not forget that Node.js is just a tool in your developer toolbox. It isn't meant to be the silver bullet for all your werewolf-killing needs.

Still... very funny. If you haven't read it, it's worth reading.

Alexei K, 10/05/2011 12:52 PM

That whole thing was so hilarious. The responses to the original troll article are even worse than the original troll. Here is a distilled summary, for those who missed it:

http://www.unlimitednovelty.com/2011/10/nodejs-has-jumped-shark.html

Also, I've learned a new word: "roflscale". It is now my mission to find a reason to use it in the workplace.

Nican, 10/05/2011 02:37 PM

This is hilarious.

https://github.com/glenjamin/node-fib

So far, I have counted NodeJS, Python, PHP, C#, Ruby, Haskell making a Fibonacci server.

Uriel Katz, 10/05/2011 02:52 PM

The problem is that people don't understand why they need node.js and why it works very well for certain cases (file upload, chatting). Using event loops without threads for a CPU-intensive workload is stupid and mostly useless.

BUT if your workload is mostly IO-bound (like 99.999% of webapps are), then using event-loop-based IO is very good, and necessary in some cases, like chat and file serving/uploading with many concurrent users.

At work (Binfire.com) I designed (in Python with gevent) a file upload/download service that proxies cloud files, and it could handle a DDoS attack from 200 IPs (apart from normal traffic) trying to download a 200MB file (the case of many concurrent long-lived connections), using 45MB of memory.

If it were written with one thread per client it would use more memory (a 1MB thread stack instead of a 4KB microthread stack) and more CPU (real context switches instead of low-overhead getcontext/setcontext).

So the lesson here is quite old: use the right tool for the job!

tobi, 10/05/2011 03:12 PM

I understand neither Ted's post nor yours. Both of you take the worst possible scenario for node.js and use it to justify ... something.

The reason why node.js is total crap is different: your code gets pulled inside out. You cannot even add logging later on without refactoring the whole call tree, because logging does IO, so it needs to be async.

node.js is a very special-purpose lib for chats and file transfers, like Uriel said. I would just use an async action method/HTTP handler with ASP.NET for that. It is really embarrassingly useless for production.
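tobi's "pulled inside out" point can be sketched like this: once one step in a call chain becomes asynchronous, every caller above it has to change shape too. (The function names here are made up for illustration.)

```javascript
// Synchronous version: callers just use the return value.
function totalSync(items) {
  return items.reduce((a, b) => a + b, 0);
}

// Add async logging and the signature changes: the result now arrives
// via a callback, so every caller up the tree must be rewritten too.
function totalAsync(items, log, callback) {
  const sum = items.reduce((a, b) => a + b, 0);
  log(`sum=${sum}`, (err) => callback(err, sum));
}

// Usage, with a stand-in async logger:
totalAsync([1, 2, 3], (msg, cb) => process.nextTick(cb, null), (err, sum) => {
  console.log(sum); // 6
});
```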

Demis Bellot, 10/05/2011 06:11 PM

Ted's troll post is annoying; he focuses on known event-loop design limitations and uses them to discredit the entire technology.

Here's the node.js response post, showing how to handle high-CPU-load tasks like video encoding in node.js: http://blog.nodejs.org/2011/10/04/an-easy-way-to-build-scalable-network-programs/

Rafal, 10/05/2011 07:27 PM

Thanks for linking Ted's blog, it's great.

Felice Pollano, 10/10/2011 10:58 AM

Just for the citation: Fibonacci has two 'c' ;)

Comments have been closed on this topic.