Hacker News

As a counterpoint, I built a relatively large system using Node (this was a couple of months ago, to be clear) and had issues with several modules. mysql: cannot handle binary BLOB columns (this is only now being fixed in an alpha release of the library), and parses large responses very slowly. aws*: many half-built, half-broken libraries; nothing met our modest needs. request (an HTTP client library): several issues found.

Node.js has a great community that is writing many great modules, no doubt. But the community is very young, and almost by definition many of those modules are immature.

This can be very enjoyable from an engineering perspective (you get to write and hack on things that you would not otherwise), but can also slow down the process of building things since you do end up having to reinvent the wheel at times.



In our case, it isn't a large system. Our Node production environment serves a smallish set of heavily used API/Ajax endpoints. With that, we've been able to eliminate more than a few web servers that otherwise had to load the entire Apache/PHP stack just to serve a 20-100 byte response.

With your response in mind: I don't see Node being mature enough (yet) to solely support an end-to-end large-scale web property. But for serving API content backed by Memcache/MySQL, it is blazing fast with a minimal footprint. On our stack, we run Node right on our existing Apache web servers (behind haproxy).



