From 154ac8a659a668037ee3885de857d56d87b6afe3 Mon Sep 17 00:00:00 2001 From: Michael Keller Date: Wed, 11 Nov 2015 15:27:44 +0100 Subject: [PATCH 1/2] doc: added notes on callback behaviour of process.stdout, process.stderr The documentation of stream.write, process.stdout and process.stderr now makes clear that ending Node.js via process.exit might result in data loss despite the callback being called. --- doc/api/process.markdown | 5 +++++ doc/api/stream.markdown | 4 ++++ 2 files changed, 9 insertions(+) diff --git a/doc/api/process.markdown b/doc/api/process.markdown index 0c32799a66b6c6..bb52a9569c84e1 100644 --- a/doc/api/process.markdown +++ b/doc/api/process.markdown @@ -286,6 +286,11 @@ event and that writes can block when output is redirected to a file (although disks are fast and operating systems normally employ write-back caching so it should be a very rare occurrence indeed.) +Note that a callback function on `stream.write` might be called before all +data is flushed completly. The only way to ensure that all data to +`process.stderr` and `process.stdout` is written and flushed is to let +Node.js end itself. + To check if Node.js is being run in a TTY context, read the `isTTY` property on `process.stderr`, `process.stdout`, or `process.stdin`: diff --git a/doc/api/stream.markdown b/doc/api/stream.markdown index 4dfc36e09e6648..fe3f099c3f14a7 100644 --- a/doc/api/stream.markdown +++ b/doc/api/stream.markdown @@ -566,6 +566,10 @@ even if it returns `false`. However, writes will be buffered in memory, so it is best not to do this excessively. Instead, wait for the `drain` event before writing more data. +Note that on `process.stdout` and `process.stderr` the callback might +be called before all data has been fully handled. This might result in +data loss if Node.js is ended via `process.exit`. 
+ #### Event: 'drain' If a [`writable.write(chunk)`][] call returns false, then the `drain` From c4987453e596b980a6a57b30646348da567a39a3 Mon Sep 17 00:00:00 2001 From: Michael Keller Date: Sat, 14 Nov 2015 12:21:50 +0100 Subject: [PATCH 2/2] doc: notes on process.stdout,.stderr when used with process.exit The documentation of stream.write, process.stdout and process.stderr now makes clear that ending Node.js via process.exit might result in data loss despite the callback being called. The documentation of process.exit now makes clear that Node.js will end as fast as possible ignoring outstanding writes. --- doc/api/process.markdown | 11 +++++++---- 1 file changed, 7 insertions(+), 4 deletions(-) diff --git a/doc/api/process.markdown b/doc/api/process.markdown index bb52a9569c84e1..f61e6e0f8696b2 100644 --- a/doc/api/process.markdown +++ b/doc/api/process.markdown @@ -286,10 +286,9 @@ event and that writes can block when output is redirected to a file (although disks are fast and operating systems normally employ write-back caching so it should be a very rare occurrence indeed.) -Note that a callback function on `stream.write` might be called before all -data is flushed completly. The only way to ensure that all data to -`process.stderr` and `process.stdout` is written and flushed is to let -Node.js end itself. +Note that on `process.stdout` and `process.stderr`, the callback passed to +`stream.write` might be called before all written data is flushed completely. +This can result in data loss if Node.js is ended prematurely using `process.exit`. To check if Node.js is being run in a TTY context, read the `isTTY` property on `process.stderr`, `process.stdout`, or `process.stdin`: @@ -465,6 +464,10 @@ To exit with a 'failure' code: The shell that executed Node.js should see the exit code as 1. +Note that Node.js will shut down as fast as possible. 
Consumers of `process.stdout` +and `process.stderr` might not get all data even when the `stream.write` callback +suggests otherwise. + ## process.exitCode
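A minimal sketch of the behaviour these patches document (illustrative only, not part of the patch series; the 1 MiB size is an arbitrary choice to make buffering visible): when stdout is a pipe, the `stream.write` callback can fire while data is still queued between Node.js and its consumer, so forcing an exit inside the callback may truncate what the consumer receives, whereas letting the process end itself flushes everything.

```javascript
// Assumes stdout is a pipe, e.g. `node script.js | cat`.

// Risky pattern: the callback may run before the data has actually
// reached the consumer, so exiting here can drop buffered output.
// process.stdout.write('x'.repeat(1024 * 1024), () => process.exit(0));

// Safe pattern: write without forcing an exit and let Node.js end
// itself once the event loop drains; stdio is then flushed completely.
process.stdout.write('x'.repeat(1024 * 1024));
```

Running the safe variant through a pipe always yields the full 1 MiB; the commented-out risky variant may or may not, which is exactly the nondeterminism the added documentation warns about.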