Node.js for Asynchronous Linux Scripting

In other programs I’ve written, I’ve enjoyed the asynchronous aspects of node.js using promises.

I would like to employ this same programming style (using node.js) for Linux scripting. In other words, I’d like the ability to execute multiple Linux commands simultaneously, and then, after those commands are complete, have the node.js script execute another group of commands, and so on (without blocking).

I came across an article that shows how to perform synchronous Linux commands using node.js, but I have yet to find a similar tutorial that covers the management of multiple asynchronous Linux commands using node.js.

Is this currently possible? If so, could you direct me to some specific resources that could help me get started with this goal?

I’m not sure if I’m right, but I think you are looking for exec and spawn. Please see the related API documentation; there are examples for both commands there.

exec and spawn

exec is the “simple” version of spawn. The former uses a single callback to report back to the user when the command is completed, and only when it’s completely finished or failed.

var exec = require('child_process').exec;

var child = exec('cat *.js bad_file | wc -l',
  function (error, stdout, stderr) {
    console.log('stdout: ' + stdout);
    console.log('stderr: ' + stderr);
    if (error !== null) {
      console.log('exec error: ' + error);
    }
  });
So basically the supplied callback is only called when everything that was written to stdout/stderr is completely available. Only after the process terminates (with success or failure) is the callback called, and you can act upon it. If the command failed, error is truthy.
spawn is different, because you can listen on stdout/stderr events. First you “spawn” a process. The spawn function returns a reference to the child process.

var spawn = require('child_process').spawn;
var ls = spawn('ls', ['-lh', '/usr']);

ls here is the child process you’ve spawned. It has two properties (of importance right now), stdout and stderr, which are event emitters. They emit the data event: when something is written to either stream, the callbacks registered on the data event are called.

ls.stdout.on('data', function (data) {
  console.log('stdout: ' + data);
});

ls.stderr.on('data', function (data) {
  console.log('stderr: ' + data);
});

There are other important events of course (check the documentation for the most up-to-date and relevant information).

ls.on('close', function (code) {
  console.log('child process exited with code ' + code);
});

You would use spawn when you want to capture output on stdout, for example, while the process is still running. A good example would be spawning an ffmpeg encoding task which takes minutes to finish. You could listen on stderr (because ffmpeg writes progress information to stderr instead of stdout) to parse “progress” information.
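To make that concrete, here is a sketch of such a progress parser. The parseProgress helper and the file names are hypothetical; it assumes ffmpeg is installed and prints its usual time=HH:MM:SS.ss field to stderr.

```javascript
// Extract the "time=HH:MM:SS.ss" field from a chunk of ffmpeg stderr
// and convert it to seconds; returns null if no time field is present.
function parseProgress(chunk) {
  var m = /time=(\d+):(\d+):(\d+(?:\.\d+)?)/.exec(String(chunk));
  if (!m) return null;
  return Number(m[1]) * 3600 + Number(m[2]) * 60 + Number(m[3]);
}

// Usage sketch (input/output file names are placeholders):
// var spawn = require("child_process").spawn;
// var ffmpeg = spawn("ffmpeg", ["-i", "in.avi", "out.mp4"]);
// ffmpeg.stderr.on("data", function (data) {
//   var seconds = parseProgress(data);
//   if (seconds !== null) console.log("encoded " + seconds + "s so far");
// });
```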

Going line-by-line with carrier

There is a nice additional library you can use together with spawn. It’s called carrier. It helps reading “lines” from the stdout/stderr of spawned processes. It’s useful because the data parameter passed to the callbacks doesn’t necessarily contain “complete” lines separated by \n; carrier helps with that. (However, it won’t help you capture ffmpeg’s progress on stderr, because ffmpeg writes no newlines in this case, only carriage returns; the line is basically always rewritten.)

You would use it like this:

var spawn = require("child_process").spawn;
var carry = require("carrier").carry;

var child = spawn("command");
carry(child.stdout, function(line) {
  console.log("stdout", line);
});
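If you would rather not pull in a dependency, the same line-buffering idea behind carrier can be sketched by hand: accumulate chunks and only emit complete lines. The lineReader helper name is made up for illustration.

```javascript
// Returns a "data" handler that buffers chunks and invokes onLine
// once for every complete line (everything before the last "\n").
function lineReader(onLine) {
  var buffered = "";
  return function (chunk) {
    buffered += chunk;                 // Buffers coerce to strings here
    var lines = buffered.split("\n");
    buffered = lines.pop();            // keep the trailing partial line
    lines.forEach(onLine);
  };
}

// Usage with a spawned child (see above):
// child.stdout.on("data", lineReader(function (line) {
//   console.log("stdout line:", line);
// }));
```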

Promises and Deferreds

If you would like to use a promise/deferred style approach, then you could do something like the following using Q (which is used by AngularJS), or at least something very similar (see the link for a full tutorial on promises).

spawn returns an Emitter object, which is not a promise, so you have to wrap the call to spawn (see Using Deferreds):
var q = require("q");
var spawn = require("child_process").spawn;

var ls = function() {
  var deferred = q.defer();
  var ls = spawn("ls", ["-lh", "/usr"]);
  ls.stdout.on("data", function(data) {
    deferred.notify({stdout: true, data: data});
  });
  ls.stderr.on("data", function(data) {
    deferred.notify({stderr: true, data: data});
  });
  ls.on("close", function(code) {
    if (code === 0) {
      deferred.resolve();
    } else {
      deferred.reject(code);
    }
  });
  return deferred.promise;
};

By executing ls(), a promise is now returned, which you would use like any other promise. When it gets resolved completely, the first callback is called. If an error occurs (the process exits with a non-zero exit code), the error handler is called. While the command progresses, the third (notify) callback is called.

ls().then(function() {
  console.log("child process exited successfully");
}, function(err) {
  console.log("child process exited with code " + err);
}, function(args) {
  if (args.stdout) {
    console.log("stdout: " + args.data);
  } else {
    console.log("stderr: " + args.data);
  }
});
When something gets written to stderr, you could call reject immediately; however, that is a design decision. Going back to the ffmpeg example, this wouldn’t do you any good, because ffmpeg writes general information to stderr. It could work with other commands, though.

I think you’ll get it :)

Examples are taken from Node.js’s documentation, because they are well understood.
