
Commit 871a60c

vsemozhetbyt authored and targos committed
doc: fix nits in stream.md
* Unify periods and upper case in comments.
* Add missing parentheses for methods.
* Add missing backticks.
* Fix sorting position of `writable.writableFinished` section.
* Replace a one-letter variable with a more readable one.
* `catch(console.log)` -> `catch(console.error)`.
* Document missing `emitClose` option in `new stream.Readable()` section
  mentioned in https://nodejs.org/api/stream.html#stream_event_close_1 and
  https://nodejs.org/api/stream.html#stream_readable_destroy_error
  copying from the `new stream.Writable()` section.

Refs: https://github.com/nodejs/node/blob/36fdf1aa6c87ccfaebabb8f9c8004baab0549b0b/lib/_stream_readable.js#L121
PR-URL: #28591
Reviewed-By: Luigi Pinca <luigipinca@gmail.com>
Reviewed-By: Ruben Bridgewater <ruben@bridgewater.de>
Reviewed-By: Rich Trott <rtrott@gmail.com>
Reviewed-By: Matteo Collina <matteo.collina@gmail.com>
Reviewed-By: Trivikram Kamat <trivikr.dev@gmail.com>
1 parent 0380a55 commit 871a60c

1 file changed: +57 −55 lines

doc/api/stream.md

@@ -115,20 +115,20 @@ that implements an HTTP server:
 const http = require('http');
 
 const server = http.createServer((req, res) => {
-  // `req` is an http.IncomingMessage, which is a Readable Stream
-  // `res` is an http.ServerResponse, which is a Writable Stream
+  // `req` is an http.IncomingMessage, which is a Readable Stream.
+  // `res` is an http.ServerResponse, which is a Writable Stream.
 
   let body = '';
   // Get the data as utf8 strings.
   // If an encoding is not set, Buffer objects will be received.
   req.setEncoding('utf8');
 
-  // Readable streams emit 'data' events once a listener is added
+  // Readable streams emit 'data' events once a listener is added.
   req.on('data', (chunk) => {
     body += chunk;
   });
 
-  // The 'end' event indicates that the entire body has been received
+  // The 'end' event indicates that the entire body has been received.
   req.on('end', () => {
     try {
       const data = JSON.parse(body);
@@ -250,7 +250,7 @@ function writeOneMillionTimes(writer, data, encoding, callback) {
     do {
       i--;
       if (i === 0) {
-        // last time!
+        // Last time!
         writer.write(data, encoding, callback);
       } else {
         // See if we should continue, or wait.
@@ -259,8 +259,8 @@ function writeOneMillionTimes(writer, data, encoding, callback) {
       }
     } while (i > 0 && ok);
     if (i > 0) {
-      // had to stop early!
-      // write some more once it drains
+      // Had to stop early!
+      // Write some more once it drains.
       writer.once('drain', write);
     }
   }
@@ -410,7 +410,7 @@ Calling the [`stream.write()`][stream-write] method after calling
 [`stream.end()`][stream-end] will raise an error.
 
 ```js
-// Write 'hello, ' and then end with 'world!'
+// Write 'hello, ' and then end with 'world!'.
 const fs = require('fs');
 const file = fs.createWriteStream('example.txt');
 file.write('hello, ');
@@ -480,6 +480,15 @@ added: v11.4.0
 
 Is `true` if it is safe to call [`writable.write()`][stream-write].
 
+##### writable.writableFinished
+<!-- YAML
+added: v12.6.0
+-->
+
+* {boolean}
+
+Is `true` if after the [`'finish'`][] event has been emitted.
+
 ##### writable.writableHighWaterMark
 <!-- YAML
 added: v9.3.0
@@ -499,16 +508,6 @@ This property contains the number of bytes (or objects) in the queue
 ready to be written. The value provides introspection data regarding
 the status of the `highWaterMark`.
 
-##### writable.writableFinished
-<!-- YAML
-added: v12.6.0
--->
-
-* {boolean}
-
-Is `true` if all data has been flushed to the underlying system. After
-the [`'finish'`][] event has been emitted.
-
 ##### writable.writableObjectMode
 <!-- YAML
 added: v12.3.0
@@ -694,11 +693,11 @@ const writable = new Writable();
 
 pass.pipe(writable);
 pass.unpipe(writable);
-// readableFlowing is now false
+// readableFlowing is now false.
 
 pass.on('data', (chunk) => { console.log(chunk.toString()); });
-pass.write('ok'); // Will not emit 'data'
-pass.resume(); // Must be called to make stream emit 'data'
+pass.write('ok'); // Will not emit 'data'.
+pass.resume(); // Must be called to make stream emit 'data'.
 ```
 
 While `readable.readableFlowing` is `false`, data may be accumulating
@@ -841,7 +840,7 @@ cause some amount of data to be read into an internal buffer.
 ```javascript
 const readable = getReadableStreamSomehow();
 readable.on('readable', function() {
-  // There is some data to read now
+  // There is some data to read now.
   let data;
 
   while (data = this.read()) {
@@ -986,7 +985,7 @@ named `file.txt`:
 const fs = require('fs');
 const readable = getReadableStreamSomehow();
 const writable = fs.createWriteStream('file.txt');
-// All the data from readable goes into 'file.txt'
+// All the data from readable goes into 'file.txt'.
 readable.pipe(writable);
 ```
 It is possible to attach multiple `Writable` streams to a single `Readable`
@@ -1061,7 +1060,7 @@ readable.on('readable', () => {
 
 The `while` loop is necessary when processing data with
 `readable.read()`. Only after `readable.read()` returns `null`,
-[`'readable'`]() will be emitted.
+[`'readable'`][] will be emitted.
 
 A `Readable` stream in object mode will always return a single item from
 a call to [`readable.read(size)`][stream-read], regardless of the value of the
@@ -1192,7 +1191,7 @@ const fs = require('fs');
 const readable = getReadableStreamSomehow();
 const writable = fs.createWriteStream('file.txt');
 // All the data from readable goes into 'file.txt',
-// but only for the first second
+// but only for the first second.
 readable.pipe(writable);
 setTimeout(() => {
   console.log('Stop writing to file.txt.');
@@ -1231,9 +1230,9 @@ use of a [`Transform`][] stream instead. See the [API for Stream Implementers][]
 section for more information.
 
 ```js
-// Pull off a header delimited by \n\n
-// use unshift() if we get too much
-// Call the callback with (error, header, stream)
+// Pull off a header delimited by \n\n.
+// Use unshift() if we get too much.
+// Call the callback with (error, header, stream).
 const { StringDecoder } = require('string_decoder');
 function parseHeader(stream, callback) {
   stream.on('error', callback);
@@ -1245,13 +1244,13 @@ function parseHeader(stream, callback) {
     while (null !== (chunk = stream.read())) {
       const str = decoder.write(chunk);
       if (str.match(/\n\n/)) {
-        // Found the header boundary
+        // Found the header boundary.
         const split = str.split(/\n\n/);
         header += split.shift();
         const remaining = split.join('\n\n');
         const buf = Buffer.from(remaining, 'utf8');
         stream.removeListener('error', callback);
-        // Remove the 'readable' listener before unshifting
+        // Remove the 'readable' listener before unshifting.
         stream.removeListener('readable', onReadable);
         if (buf.length)
           stream.unshift(buf);
@@ -1323,13 +1322,13 @@ const fs = require('fs');
 async function print(readable) {
   readable.setEncoding('utf8');
   let data = '';
-  for await (const k of readable) {
-    data += k;
+  for await (const chunk of readable) {
+    data += chunk;
   }
   console.log(data);
 }
 
-print(fs.createReadStream('file')).catch(console.log);
+print(fs.createReadStream('file')).catch(console.error);
 ```
 
 If the loop terminates with a `break` or a `throw`, the stream will be
@@ -1425,7 +1424,7 @@ finished(rs, (err) => {
   }
 });
 
-rs.resume(); // drain the stream
+rs.resume(); // Drain the stream.
 ```
 
 Especially useful in error handling scenarios where a stream is destroyed
@@ -1445,7 +1444,7 @@ async function run() {
 }
 
 run().catch(console.error);
-rs.resume(); // drain the stream
+rs.resume(); // Drain the stream.
 ```
 
 ### stream.pipeline(...streams, callback)
@@ -1508,6 +1507,7 @@ run().catch(console.error);
 * `options` {Object} Options provided to `new stream.Readable([options])`.
   By default, `Readable.from()` will set `options.objectMode` to `true`, unless
   this is explicitly opted out by setting `options.objectMode` to `false`.
+* Returns: {stream.Readable}
 
 A utility method for creating Readable Streams out of iterators.
 
@@ -1555,10 +1555,10 @@ on the type of stream being created, as detailed in the chart below:
 
 | Use-case | Class | Method(s) to implement |
 | -------- | ----- | ---------------------- |
-| Reading only | [`Readable`] | <code>[_read][stream-_read]</code> |
-| Writing only | [`Writable`] | <code>[_write][stream-_write]</code>, <code>[_writev][stream-_writev]</code>, <code>[_final][stream-_final]</code> |
-| Reading and writing | [`Duplex`] | <code>[_read][stream-_read]</code>, <code>[_write][stream-_write]</code>, <code>[_writev][stream-_writev]</code>, <code>[_final][stream-_final]</code> |
-| Operate on written data, then read the result | [`Transform`] | <code>[_transform][stream-_transform]</code>, <code>[_flush][stream-_flush]</code>, <code>[_final][stream-_final]</code> |
+| Reading only | [`Readable`] | <code>[_read()][stream-_read]</code> |
+| Writing only | [`Writable`] | <code>[_write()][stream-_write]</code>, <code>[_writev()][stream-_writev]</code>, <code>[_final()][stream-_final]</code> |
+| Reading and writing | [`Duplex`] | <code>[_read()][stream-_read]</code>, <code>[_write()][stream-_write]</code>, <code>[_writev()][stream-_writev]</code>, <code>[_final()][stream-_final]</code> |
+| Operate on written data, then read the result | [`Transform`] | <code>[_transform()][stream-_transform]</code>, <code>[_flush()][stream-_flush]</code>, <code>[_final()][stream-_final]</code> |
 
 The implementation code for a stream should *never* call the "public" methods
 of a stream that are intended for use by consumers (as described in the
@@ -1643,7 +1643,7 @@ const { Writable } = require('stream');
 
 class MyWritable extends Writable {
   constructor(options) {
-    // Calls the stream.Writable() constructor
+    // Calls the stream.Writable() constructor.
     super(options);
     // ...
   }
@@ -1886,6 +1886,8 @@ changes:
 * `objectMode` {boolean} Whether this stream should behave
   as a stream of objects. Meaning that [`stream.read(n)`][stream-read] returns
   a single value instead of a `Buffer` of size `n`. **Default:** `false`.
+* `emitClose` {boolean} Whether or not the stream should emit `'close'`
+  after it has been destroyed. **Default:** `true`.
 * `read` {Function} Implementation for the [`stream._read()`][stream-_read]
   method.
 * `destroy` {Function} Implementation for the
@@ -1899,7 +1901,7 @@ const { Readable } = require('stream');
 
 class MyReadable extends Readable {
   constructor(options) {
-    // Calls the stream.Readable(options) constructor
+    // Calls the stream.Readable(options) constructor.
     super(options);
     // ...
   }
@@ -2026,18 +2028,18 @@ class SourceWrapper extends Readable {
 
     // Every time there's data, push it into the internal buffer.
     this._source.ondata = (chunk) => {
-      // If push() returns false, then stop reading from source
+      // If push() returns false, then stop reading from source.
       if (!this.push(chunk))
         this._source.readStop();
     };
 
-    // When the source ends, push the EOF-signaling `null` chunk
+    // When the source ends, push the EOF-signaling `null` chunk.
     this._source.onend = () => {
       this.push(null);
     };
   }
-  // _read will be called when the stream wants to pull more data in
-  // the advisory size argument is ignored in this case.
+  // _read() will be called when the stream wants to pull more data in.
+  // The advisory size argument is ignored in this case.
   _read(size) {
     this._source.readStart();
   }
@@ -2070,7 +2072,7 @@ const myReadable = new Readable({
       process.nextTick(() => this.emit('error', err));
       return;
     }
-    // do some work
+    // Do some work.
   }
 });
 ```
@@ -2208,7 +2210,7 @@ class MyDuplex extends Duplex {
   }
 
   _write(chunk, encoding, callback) {
-    // The underlying source only deals with strings
+    // The underlying source only deals with strings.
     if (Buffer.isBuffer(chunk))
       chunk = chunk.toString();
     this[kSource].writeSomeData(chunk);
@@ -2241,12 +2243,12 @@ the `Readable` side.
 ```js
 const { Transform } = require('stream');
 
-// All Transform streams are also Duplex Streams
+// All Transform streams are also Duplex Streams.
 const myTransform = new Transform({
   writableObjectMode: true,
 
   transform(chunk, encoding, callback) {
-    // Coerce the chunk to a number if necessary
+    // Coerce the chunk to a number if necessary.
     chunk |= 0;
 
     // Transform the chunk into something else.
@@ -2385,7 +2387,7 @@ user programs.
   [`stream.write()`][stream-write].
 * `encoding` {string} If the chunk is a string, then this is the
   encoding type. If chunk is a buffer, then this is the special
-  value - 'buffer', ignore it in this case.
+  value - `'buffer'`, ignore it in this case.
 * `callback` {Function} A callback function (optionally with an error
   argument and data) to be called after the supplied `chunk` has been
   processed.
@@ -2493,12 +2495,12 @@ const writeable = fs.createWriteStream('./file');
 
 (async function() {
   for await (const chunk of iterator) {
-    // Handle backpressure on write
+    // Handle backpressure on write().
     if (!writeable.write(chunk))
       await once(writeable, 'drain');
   }
   writeable.end();
-  // Ensure completion without errors
+  // Ensure completion without errors.
   await once(writeable, 'finish');
 })();
 ```
@@ -2517,7 +2519,7 @@ const writeable = fs.createWriteStream('./file');
 (async function() {
   const readable = Readable.from(iterator);
   readable.pipe(writeable);
-  // Ensure completion without errors
+  // Ensure completion without errors.
   await once(writeable, 'finish');
 })();
 ```
@@ -2560,7 +2562,7 @@ For example, consider the following code:
 // WARNING! BROKEN!
 net.createServer((socket) => {
 
-  // We add an 'end' listener, but never consume the data
+  // We add an 'end' listener, but never consume the data.
   socket.on('end', () => {
     // It will never get here.
     socket.end('The message was received but was not processed.\n');
@@ -2576,7 +2578,7 @@ The workaround in this situation is to call the
 [`stream.resume()`][stream-resume] method to begin the flow of data:
 
 ```js
-// Workaround
+// Workaround.
 net.createServer((socket) => {
   socket.on('end', () => {
     socket.end('The message was received but was not processed.\n');
