If you’re even moderately involved in the JavaScript world these days (and you probably are if you’re reading this blog), you would have to be fast asleep not to have noticed the hype and celebration surrounding the poster child for server-side JavaScript: Node.js.

I regularly follow the chatter on the interwebs, and I’m amazed and thrilled at just how much gravity Node.js has accumulated in terms of developer excitement and actual project contributions. In fact, in some ways, the Node.js ecosystem of companion projects is even more awesome than Node.js itself! It’s a fantastic example of how the community was desperate for something (we didn’t even know exactly what) and how quickly we rallied around it when we finally found it. It’s plainly obvious that server-side JavaScript is an idea whose time has come, and Node.js, in many ways, will take us there.

This post is an attempt to put my little niche spin on what Node.js could mean for someone wanting to tackle re-architecting the middle-end of their web application.

What is Node.js?

In the broadest terms, Node.js is an application server platform. More precisely, it’s a server-side JavaScript execution environment (roughly similar to something like Narwhal) wrapped around the V8 JavaScript engine. But it’s a very special type of environment compared to other options in this space: Node.js is completely asynchronous. Everything you do in Node.js, from network calls to file i/o, you do through asynchronous-friendly APIs.

But even more important is that Node.js is specially designed to operate as an independent and fully-functional network server. What do I mean by this? Node.js’ flagship capability, and indeed how most people use it, is its ability to “listen” for incoming requests on a particular network port (like port 80 for web traffic) and service those requests much as a web server such as Apache or IIS would. In other words, Node.js at its most optimal is a drop-in replacement for your current web server. And it wraps in a fully capable application server (using JavaScript) automatically. Cool, huh?!

Because Node.js is asynchronous, it doesn’t operate under the covers the way other web servers do. Instead, it operates in an “evented processing loop” where it is simultaneously and asynchronously listening for incoming connections, firing off processing to handle each connection, and then “listening” for those processing contexts to finish so it can hand the results back to the requesting connection stream. The result is that in many use-cases, Node.js is able to achieve mind-blowing amounts of parallel processing and throughput compared to more standard web servers like Apache.

Bottom line: Node.js’ core competency is to take a server-side JavaScript application server environment and wrap it cleanly around a super-efficient asynchronous network server. In most respects, this is simply the most efficient server-side JavaScript environment you’re likely to ever find, and it allows JavaScript to compete head-to-head with even optimized, compiled binary alternatives.

Is it for me?

Up until now, everything I’ve spoken about and written regarding middle-end architecture and server-side JavaScript has been conspicuously silent on the topic of Node.js. There is good reason for that, but I only want to touch briefly on it here, by means of comparison. The next post will dive into this much more thoroughly.

The utter awesomeness that is Node.js comes with a price. For most developers who are hacking and tinkering with new ideas all the time, this price is mere “pocket change”, and that’s probably the biggest reason why the Node.js community has grown so quickly and so broadly. But there is a “silent majority” lurking out there for whom the Node.js price may not be quite so trivial. What is this price? Infrastructure.

Thus far, my focused efforts have been on finding the lowest possible barrier-of-entry into the server-side JavaScript world. By barrier-of-entry, I mean the least amount of footprint/impact on existing infrastructure/architecture and on existing maintenance and support/IT staff. The current fruit of that labor is the humble BikechainJS server-side JavaScript project.

I shied away from presenting my middle-end ideas in the context of Node.js because there are many who cannot necessarily proceed under the guise of replacing their top-level web server (along with all its associated dependencies, modules, configurations, etc) with an entirely new (and fundamentally paradigm-shifting) solution like Node.js. No doubt we’d mostly all agree that it would be exciting and probably even more efficient, but the slow-to-change momentum of existing applications, teams, infrastructure, maintenance, reliability, and IT support staff has a noticeably chilling effect on the hyper-excited server-side JavaScript movement.

It was my goal that something like BikechainJS, with its synchronous, per-request paradigm, could squeeze much more nimbly into existing application infrastructure, even at the cost of giving up Node.js’ performance wins.

But Node.js is just so damn awesome!

That’s absolutely true. And I’ve come to believe that the awesomeness of Node.js does not have to be mutually exclusive of the middle-end architectural ideas I’m advocating for, nor does it have to sit out of reach from so many existing web applications, dev teams, and web shops.

Node.js can (and perhaps should!) be the magic key to unlocking the full potential of your application’s middle-end.

What if we can have our cake and eat it, too? What if we can find a clean way to plug Node.js into the existing infrastructure of our web applications, and at the same time give it the power to revolutionize our middle-end tasks? We’d get exponentially better performance and revolutionary better code architecture. That idea is just so full of win it’s hard to type without going nuts!

Augment, not replace

My biggest mental sticking point all along with Node.js has been the (im)practicality of asking an existing application to simply swap out its entire web server tier for Node.js. I even explored the idea of running Node.js in a more limited, synchronous, per-request (CGI-like) context, but quickly found that was like trying to teach a bird to swim.

Then it hit me. The best way Node.js revolutionizes the middle-end of your existing/legacy web application is if you build your middle-end on Node.js and insert it wholesale into the stack, in between the browser and your existing server.

In this respect, your middle-end Node.js layer becomes a “proxy” (or “web balancer”) server of sorts, sitting in front of your existing web server. All you have to do is bring up a Node.js VM/server instance (even cloud-based!) and direct all your primary traffic to that instance first. Then, you build out your middle-end architecture, doing templating, URL routing, data validation, and all the other tasks, as necessary, in your Node.js server-side JavaScript, and finally, you hook Node.js up to ferrying requests back over to your existing application server.

In this blind-proxy model, you start off with a dumb pass-thru of all your application’s requests, and then one-by-one you can inject some intermediate middle-end logic using the server-side JavaScript. For instance, as I talked about before, you can start doing data-validation of inbound data fields using your Node.js-driven JavaScript. And then you can move on to evolving your templating into a true middle-end task in your JavaScript. And so on.
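A minimal sketch of one such injected piece: a dual-use validation rule in plain JavaScript, with no DOM and no Node-specific APIs, so the identical file can run in the browser and in the Node.js middle-end. The field names and rules are made up for illustration:

```javascript
// Dual-use validation: no DOM, no Node APIs, just plain JavaScript,
// so the same code runs client-side and in the middle-end.
// The field names and rules here are invented for illustration.
function validateSignup(fields) {
  var errors = [];
  if (!fields.email || !/^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(fields.email)) {
    errors.push('email');
  }
  if (!fields.username || fields.username.length < 3) {
    errors.push('username');
  }
  return errors;
}
```

The browser uses it for instant feedback; the middle-end re-runs the very same rules as the authoritative check before anything reaches the back-end.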

Win, win, win

The benefits of this approach are hard to overstate. First and foremost, you will see an insane jump in request/response performance simply by letting Node.js manage your application’s front-line web server handling. But equally important, you gain invaluable flexibility to start converting your thick back-end into a well-crafted middle-end/back-end approach. And you don’t have to change very much of your existing infrastructure at all.

This is what I like to call a “middle-win” scenario! Node.js really rocks the middle-end.

This entry was written by getify, posted on Thursday, July 15, 2010 at 04:07 pm, filed under JavaScript, UI Architecture.

14 Responses to “Why Node.js rocks the middle-end”

  • Julián Landerreche says:

    Coins have begun to drop in the back of my head. This all sounds (and probably is) very interesting, really. It’s a shame that I suck at real programming, be it JS (front-end or middle-end) or some “real” server-side language.

    But I will stay tuned to these new ideas and concepts, and will try to set up a server to play with this.

    This may be a dumb question: you say that one key advantage of this would be reusing, in the middle-end layer, the same JS that has been written for the front-end layer. That really sounds too good to be real, but I believe you! Then, the question is: would it be possible to run server-side jQuery on top of the Node.js/V8 combo? That would make it really easy for us non-programmer code monkeys to begin playing with stuff like this…

  • getify says:

    The Yahoo team has got YUI running on top of Node.js… I don’t think it would be that difficult to do something similar with jQuery. I’m not personally a big fan of the approach of emulating a DOM on the server side, only to have that DOM serialized down to a string to send over the wire and then be reinterpreted as another DOM. That kind of thing seems kind of inefficient to me. But I know a lot of people like that.

    But as far as running the same code in both browser and server, this is actually really easy. For instance, simple code that takes input data and runs validation rules against it and returns true or false… such a function can be written in naked JavaScript and that same code can run identically in both environments. There’s actually a lot of such middle-end code that can be dual-used. IMHO the biggest win is having a JavaScript based templating engine, so you can run the exact same templates in both places. I have built HandlebarJS for exactly that purpose.

  • mde says:

    Geddy is a web framework for Node.js built around this idea of full-stack JavaScript: http://geddyjs.org/ Model and validation code runs on both client and server, and templating is simple EJS that works equally well on both sides too.

  • Arthur Blake says:

    Nice post (although I really hate the term middle-end. It’s so oxy-moronic…) Is node.js stable and secure enough yet to be the front end proxy? I’d hate to see a node crash or hack effectively bring down the rest of my website. I was actually thinking of doing it in a side-by-side approach. Have Apache or other legacy web servers handle the bulk of legacy requests, and then have node.js on the side as another server with its own IP address, handling a gradual “sprinkling in” of high performance ajax tasks that are added to the application over time. Of course, how this would actually be accomplished depends entirely on how your specific web app architecture is laid out. This could bring up additional challenges with cross-domain (or cross-host) scripting that would be required to make it all work, but it’s doable. There are so many ways to put this together it’s not even funny. It will be interesting to see what eventually emerges as best practices in this exciting space.

  • getify says:

    Thanks for your comment! But why do you hate “middle-end” so much? What label (if any) do you think is appropriate for all these web app specific tasks?

  • Arthur Blake says:

    I just tweeted you about this! Mainly because of the obvious contradiction: if it’s in the middle, it’s not on an end!!! One thing I hadn’t considered, though, is maybe people are using this term in a half-joking or humorous manner… Or maybe you are using “end” as in “means to an end”?

  • Arthur Blake says:

    I didn’t answer your question. I would say the “node” part of nodejs was a very thoughtful, simple and insightful term to start with. Everything’s a node in an event-based distributed system. The old model of n-tier stacking doesn’t really apply anymore.

    I don’t really have a term, but something that uses terminology like this would be better, IMHO.

  • getify says:

    Everything is a “node” in a Node.js (aka, event-based) system. And it’s an exciting thought to realize that we may get there someday.

    But I still think there’s a lot of standard architecture web apps out there… that is, there’s clearly a front-end and a back-end. I’ve seen people draw that line in different places, and I’ve seen others semantically argue things like “end-to-end”. But no matter what you label it, the stuff that’s “in between” usually gets ignored/forgotten, and I’m trying to call attention to it and help convince devs that these tasks need their own architecture. Without a label, it’s much harder to get any attention to it.

  • MK says:

    Once again, great article.

    You talked about an alternative to the classical MVC pattern called CVC (Client-View-Controller) to allow simple implementation in current web app architectures. The whole idea, as far as I understand it, would be to talk to a back-end black-box API (preferably a RESTful JSON API) for pure business logic. Let’s imagine that we could easily turn our MVC back-end architecture to respond with views that return only a plain JSON response (as you have brilliantly suggested with the WordPress templating system in CMS); how could we use this approach in a node layer? Would it be an additional MVC stack that uses some HTTP client to talk to the back-end? I’m currently working hard trying to figure out how to bring those wonderful ideas into existing applications that don’t necessarily offer as clean and robust an extension playground as WordPress and other CMSes might do.

    Anyway, please keep writing such good stuff around this genuinely exciting concept.

  • getify says:

    @MK –
    Thanks, I’m glad the articles have been helpful and thought-provoking.

    Yes, CVC is my alternative proposal to supersede MVC, as I think it stresses the more important parts of the architectural pattern for what is most needed in today’s complex web apps.

    The key idea is that your back-end application is, as you say, a “black box”, which means I don’t care what happens inside it. The reality is that it will probably be some sort of MVC (or MVC-like) implementation, and so you could turn the V-views into JSON serializers similar to what I suggested for WordPress in the CMS/Middle-end post.

    How Node.js might be useful layered with that approach would be to have your middle-end code written in JavaScript in a Node.js web server layer, and have that code be a “client” of your back-end black-box. Basically, that would amount to sending and receiving JSON from the back-end, which your Node.js middle-end could easily create and consume.

    Inside the Node.js middle-end layer, you could choose a variety of different approaches. The typical approach would be to try and find some pre-existing framework for Node.js, but I don’t really advocate turn-key frameworks for these tasks, as I think they’re typically what gets us into trouble by hiding away from us all the details we’re trying so hard to get to.

    Instead, I’d recommend an exceedingly simple set of modular code (each module being its own “UI Controller” in the CVC pattern), one module for each task. So, you’d have a simple module for handling your URL routing, another for data validation, another for templating, etc.
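One hypothetical sketch of such a module, for the URL-routing task (the route-table shape and handler signature are invented for illustration):

```javascript
// A hypothetical "UI Controller" module for URL routing, in the
// spirit of one-simple-module-per-task. The route-table shape and
// handler signature are invented for illustration.
var routes = [];

function addRoute(pattern, handler) {
  routes.push({ pattern: pattern, handler: handler });
}

function dispatch(url) {
  for (var i = 0; i < routes.length; i++) {
    var match = routes[i].pattern.exec(url);
    if (match) return routes[i].handler(match);
  }
  return null; // no route matched; fall through to the pass-thru
}
```

A URL like /user/42 then dispatches to whichever handler’s pattern matches first, and the validation and templating modules stay just as small.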

    I don’t think you’d need a full-blown MVC inside your middle-end. But your own personal tastes and preferences could dictate how simple or complex you wanted to architect that code. I can just say that I’ve written proof-of-concept middle-end code in such modules (not in Node.js, but should be very similar) and the code is extremely simple and straightforward — there’s no need in my mind for something too overblown.

  • Adam says:

    Really good stuff in here (both article and comments.)
    Any idea how to secure all this? I will need to run external written code in JS on server side, and would like to use node.js.

    Could I securely limit the type of modules available? prevent external code from touching internal written one?

    How would I handle multiple external code instances?

  • getify says:

    Those are great questions, Adam. I’m not sure I have great answers to them, though.

    Who are you trying to “securely limit” the type of modules from? Malicious modules bringing in tag-alongs? Or other developers?

    The CommonJS standard for modules is that they are completely sandboxed, meaning they cannot modify the global namespace on their own. That is a pretty effective means of keeping modules from interfering accidentally or intentionally.

    Not sure what you mean by “multiple external code instances”?

  • Adam says:

    @getify
    Well… I would like to allow untrusted code to run on my node.js server. That code would come from users entering code on a website (competing for best algorithm).

    I thought of requiring each piece of code to be “wrapped” in a Module form. However, if that code could use node.js/CommonJS functionality, it is not really secure, right?

    I found the node-sandbox module, any experience with that?

  • Matt D says:

    @Adam and untrusted code
    I would run a separate Node.js instance in an OpenVZ container, and forward all user code to it. You can manage the memory consumption and processing power designated to user-submitted code this way as well, keeping it from affecting the Node.js environment running your site.
