Since v17.0.0, v16.12.0 - Check message.destroyed from stream.Readable.
Readonly closed
Is true after 'close' has been emitted.
The message.complete property will be true if a complete HTTP message has been received and successfully parsed.
This property is particularly useful as a means of determining if a client or server fully transmitted a message before a connection was terminated:
const http = require('node:http');

const req = http.request({
  host: '127.0.0.1',
  port: 8080,
  method: 'POST',
}, (res) => {
  res.resume();
  res.on('end', () => {
    if (!res.complete)
      console.error(
        'The connection was terminated while the message was still being sent');
  });
});
req.end();
Alias for message.socket.
Is true after readable.destroy() has been called.
Readonly errored
Returns error if the stream has been destroyed with an error.
The request/response headers object.
Key-value pairs of header names and values. Header names are lower-cased.
// Prints something like:
//
// { 'user-agent': 'curl/7.22.0',
// host: '127.0.0.1:8000',
// accept: '*' }
console.log(request.headers);
Duplicates in raw headers are handled in the following ways, depending on the header name:
Duplicates of age, authorization, content-length, content-type, etag, expires, from, host, if-modified-since, if-unmodified-since, last-modified, location, max-forwards, proxy-authorization, referer, retry-after, server, or user-agent are discarded. To allow duplicate values of the headers listed above to be joined, use the option joinDuplicateHeaders in request and createServer. See RFC 9110 Section 5.3 for more information.
set-cookie is always an array. Duplicates are added to the array.
For duplicate cookie headers, the values are joined together with '; '.
For all other headers, the values are joined together with ', '.
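For example, a server can opt in to joining duplicates of the headers listed above (a minimal sketch, assuming a Node.js version that supports the joinDuplicateHeaders option; the port and header are illustrative):
const http = require('node:http');

// With joinDuplicateHeaders enabled, repeated headers such as Accept-Language
// are joined with ', ' instead of the later values being discarded.
const server = http.createServer({ joinDuplicateHeaders: true }, (req, res) => {
  console.log(req.headers['accept-language']); // e.g. 'en, fr'
  res.end('ok');
});
server.listen(8000);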
Similar to message.headers, but there is no join logic and the values are always arrays of strings, even for headers received just once.
// Prints something like:
//
// { 'user-agent': ['curl/7.22.0'],
// host: ['127.0.0.1:8000'],
// accept: ['*'] }
console.log(request.headersDistinct);
In the case of a server request, the HTTP version sent by the client. In the case of a client response, the HTTP version of the connected-to server. Probably either '1.1' or '1.0'.
Also message.httpVersionMajor is the first integer and message.httpVersionMinor is the second.
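A small illustrative sketch (the logged values assume an HTTP/1.1 client; the port is a placeholder):
const http = require('node:http');

const server = http.createServer((req, res) => {
  console.log(req.httpVersion);      // e.g. '1.1'
  console.log(req.httpVersionMajor); // e.g. 1
  console.log(req.httpVersionMinor); // e.g. 1
  res.end();
});
server.listen(8000);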
Optional method
Only valid for a request obtained from Server.
The request method as a string. Read only. Examples: 'GET', 'DELETE'.
The raw request/response headers list exactly as they were received.
The keys and values are in the same list. It is not a list of tuples. So, the even-numbered offsets are key values, and the odd-numbered offsets are the associated values.
Header names are not lowercased, and duplicates are not merged.
// Prints something like:
//
// [ 'user-agent',
// 'this is invalid because there can be only one',
// 'User-Agent',
// 'curl/7.22.0',
// 'Host',
// '127.0.0.1:8000',
// 'ACCEPT',
// '*' ]
console.log(request.rawHeaders);
The raw request/response trailer keys and values exactly as they were received. Only populated at the 'end' event.
Is true if it is safe to call read, which means the stream has not been destroyed or emitted 'error' or 'end'.
Readonly Experimental readableAborted
Returns whether the stream was destroyed or errored before emitting 'end'.
Readonly Experimental readableDidRead
Returns whether 'data' has been emitted.
Readonly readableEncoding
Getter for the property encoding of a given Readable stream. The encoding property can be set using the setEncoding method.
Readonly readableEnded
Becomes true when the 'end' event is emitted.
Readonly readableFlowing
This property reflects the current state of a Readable stream as described in the Three states section.
Readonly readableHighWaterMark
Returns the value of highWaterMark passed when creating this Readable.
Readonly readableLength
This property contains the number of bytes (or objects) in the queue ready to be read. The value provides introspection data regarding the status of the highWaterMark.
Readonly readableObjectMode
Getter for the property objectMode of a given Readable stream.
The net.Socket object associated with the connection.
With HTTPS support, use request.socket.getPeerCertificate() to obtain the client's authentication details.
This property is guaranteed to be an instance of the net.Socket class, a subclass of stream.Duplex, unless the user specified a socket type other than net.Socket or internally nulled.
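For example, the peer's address and port can be read from the socket (a minimal sketch; the port is illustrative):
const http = require('node:http');

http.createServer((req, res) => {
  const ip = req.socket.remoteAddress;
  const port = req.socket.remotePort;
  res.end(`Your IP address is ${ip} and your source port is ${port}.`);
}).listen(3000);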
Optional statusCode
Only valid for a response obtained from ClientRequest.
The 3-digit HTTP response status code, e.g. 404.
Optional statusMessage
Only valid for a response obtained from ClientRequest.
The HTTP response status message (reason phrase), e.g. OK or Internal Server Error.
The request/response trailers object. Only populated at the 'end' event.
Similar to message.trailers, but there is no join logic and the values are always arrays of strings, even for headers received just once. Only populated at the 'end' event.
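Because trailers arrive after the body, they can only be inspected once 'end' has fired; a rough sketch (the trailer name shown is illustrative):
const http = require('node:http');

const server = http.createServer((req, res) => {
  req.resume(); // the body must be consumed before the trailers are available
  req.on('end', () => {
    console.log(req.trailers);         // e.g. { 'content-md5': '...' }
    console.log(req.trailersDistinct); // same keys, values as arrays of strings
    res.end();
  });
});
server.listen(8000);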
Optional url
Only valid for a request obtained from Server.
Request URL string. This contains only the URL that is present in the actual HTTP request. Take the following request:
GET /status?name=ryan HTTP/1.1
Accept: text/plain
To parse the URL into its parts:
new URL(`http://${process.env.HOST ?? 'localhost'}${request.url}`);
When request.url is '/status?name=ryan' and process.env.HOST is undefined:
$ node
> new URL(`http://${process.env.HOST ?? 'localhost'}${request.url}`);
URL {
href: 'http://localhost/status?name=ryan',
origin: 'http://localhost',
protocol: 'http:',
username: '',
password: '',
host: 'localhost',
hostname: 'localhost',
port: '',
pathname: '/status',
search: '?name=ryan',
searchParams: URLSearchParams { 'name' => 'ryan' },
hash: ''
}
Ensure that you set process.env.HOST to the server's host name, or consider replacing this part entirely. If using req.headers.host, ensure proper validation is used, as clients may specify a custom Host header.
Static captureRejections: boolean
Change the default captureRejections option on all new EventEmitter objects.
Static Readonly captureRejectionSymbol: Symbol.for('nodejs.rejection')
See how to write a custom rejection handler.
Static defaultMaxListeners
By default, a maximum of 10 listeners can be registered for any single event. This limit can be changed for individual EventEmitter instances using the emitter.setMaxListeners(n) method. To change the default for all EventEmitter instances, the events.defaultMaxListeners property can be used. If this value is not a positive number, a RangeError is thrown.
Take caution when setting the events.defaultMaxListeners because the change affects all EventEmitter instances, including those created before the change is made. However, calling emitter.setMaxListeners(n) still has precedence over events.defaultMaxListeners.
This is not a hard limit. The EventEmitter instance will allow more listeners to be added but will output a trace warning to stderr indicating that a "possible EventEmitter memory leak" has been detected. For any single EventEmitter, the emitter.getMaxListeners() and emitter.setMaxListeners() methods can be used to temporarily avoid this warning:
import { EventEmitter } from 'node:events';
const emitter = new EventEmitter();
emitter.setMaxListeners(emitter.getMaxListeners() + 1);
emitter.once('event', () => {
// do stuff
emitter.setMaxListeners(Math.max(emitter.getMaxListeners() - 1, 0));
});
The --trace-warnings command-line flag can be used to display the stack trace for such warnings.
The emitted warning can be inspected with process.on('warning') and will have the additional emitter, type, and count properties, referring to the event emitter instance, the event's name, and the number of attached listeners, respectively. Its name property is set to 'MaxListenersExceededWarning'.
Static Readonly errorMonitor
This symbol shall be used to install a listener for only monitoring 'error' events. Listeners installed using this symbol are called before the regular 'error' listeners are called.
Installing a listener using this symbol does not change the behavior once an 'error' event is emitted. Therefore, the process will still crash if no regular 'error' listener is installed.
Optional
_construct
Optional [captureRejectionSymbol]
Event emitter. The defined events on documents including: close, data, end, error, pause, readable, and resume.
This method returns a new stream with chunks of the underlying stream paired with a counter in the form [index, chunk]. The first index value is 0 and it increases by 1 for each chunk produced.
Optional
options:
Pick<ArrayOptions,
"signal">
a stream of indexed pairs.
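Assuming this describes the experimental asIndexedPairs() helper, a short sketch (top-level await requires an ES module):
import { Readable } from 'node:stream';

const pairs = await Readable.from(['a', 'b', 'c']).asIndexedPairs().toArray();
console.log(pairs); // [ [ 0, 'a' ], [ 1, 'b' ], [ 2, 'c' ] ]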
This method returns a new stream with the first limit chunks dropped from the start.
the number of chunks to drop from the readable.
Optional
options:
Pick<ArrayOptions,
"signal">
a stream with limit chunks dropped from the start.
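For example, using the experimental stream helpers (the values are illustrative):
import { Readable } from 'node:stream';

console.log(await Readable.from([1, 2, 3, 4]).drop(2).toArray()); // [ 3, 4 ]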
Synchronously calls each of the listeners registered for the event named eventName, in the order they were registered, passing the supplied arguments to each.
Returns true if the event had listeners, false otherwise.
import { EventEmitter } from 'node:events';
const myEmitter = new EventEmitter();
// First listener
myEmitter.on('event', function firstListener() {
console.log('Helloooo! first listener');
});
// Second listener
myEmitter.on('event', function secondListener(arg1, arg2) {
console.log(`event with parameters ${arg1}, ${arg2} in second listener`);
});
// Third listener
myEmitter.on('event', function thirdListener(...args) {
const parameters = args.join(', ');
console.log(`event with parameters ${parameters} in third listener`);
});
console.log(myEmitter.listeners('event'));
myEmitter.emit('event', 1, 2, 3, 4, 5);
// Prints:
// [
// [Function: firstListener],
// [Function: secondListener],
// [Function: thirdListener]
// ]
// Helloooo! first listener
// event with parameters 1, 2 in second listener
// event with parameters 1, 2, 3, 4, 5 in third listener
Returns an array listing the events for which the emitter has registered listeners. The values in the array are strings or Symbols.
import { EventEmitter } from 'node:events';
const myEE = new EventEmitter();
myEE.on('foo', () => {});
myEE.on('bar', () => {});
const sym = Symbol('symbol');
myEE.on(sym, () => {});
console.log(myEE.eventNames());
// Prints: [ 'foo', 'bar', Symbol(symbol) ]
This method is similar to Array.prototype.every and calls fn on each chunk in the stream to check if all awaited return values are truthy for fn. Once an fn call on a chunk's awaited return value is falsy, the stream is destroyed and the promise is fulfilled with false. If all of the fn calls on the chunks return a truthy value, the promise is fulfilled with true.
a function to call on each chunk of the stream. Async or not.
Optional
options:
ArrayOptions
a promise evaluating to true if fn returned a truthy value for every one of the chunks.
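For instance, with a synchronous predicate:
import { Readable } from 'node:stream';

console.log(await Readable.from([1, 2, 3, 4]).every((x) => x > 2)); // false
console.log(await Readable.from([1, 2, 3, 4]).every((x) => x > 0)); // true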
This method allows filtering the stream. For each chunk in the stream the fn function will be called and if it returns a truthy value, the chunk will be passed to the result stream. If the fn function returns a promise, that promise will be awaited.
a function to filter chunks from the stream. Async or not.
Optional
options:
ArrayOptions
a stream filtered with the predicate fn.
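A short sketch with an illustrative predicate:
import { Readable } from 'node:stream';

console.log(await Readable.from([1, 2, 3, 4]).filter((x) => x > 2).toArray()); // [ 3, 4 ]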
This method is similar to Array.prototype.find and calls fn on each chunk in the stream to find a chunk with a truthy value for fn. Once an fn call's awaited return value is truthy, the stream is destroyed and the promise is fulfilled with the value for which fn returned a truthy value. If all of the fn calls on the chunks return a falsy value, the promise is fulfilled with undefined.
a function to call on each chunk of the stream. Async or not.
Optional
options:
ArrayOptions
a promise evaluating to the first chunk for which
fn evaluated with a truthy value, or
undefined
if no element was found.
This method is similar to Array.prototype.find and calls fn on each chunk in the stream to find a chunk with a truthy value for fn. Once an fn call's awaited return value is truthy, the stream is destroyed and the promise is fulfilled with the value for which fn returned a truthy value. If all of the fn calls on the chunks return a falsy value, the promise is fulfilled with undefined.
a function to call on each chunk of the stream. Async or not.
Optional
options:
ArrayOptions
a promise evaluating to the first chunk for which
fn evaluated with a truthy value, or
undefined
if no element was found.
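For example, covering both the found and not-found cases:
import { Readable } from 'node:stream';

console.log(await Readable.from([1, 2, 3, 4]).find((x) => x > 2));  // 3
console.log(await Readable.from([1, 2, 3, 4]).find((x) => x > 10)); // undefined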
This method returns a new stream by applying the given callback to each chunk of the stream and then flattening the result.
It is possible to return a stream or another iterable or async iterable from fn and the result streams will be merged (flattened) into the returned stream.
a function to map over every chunk in the stream. May be async. May be a stream or generator.
Optional
options:
ArrayOptions
a stream flat-mapped with the function fn.
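For instance, with a synchronous mapper that returns an array per chunk:
import { Readable } from 'node:stream';

console.log(await Readable.from([1, 2, 3]).flatMap((x) => [x, x * 10]).toArray());
// [ 1, 10, 2, 20, 3, 30 ]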
This method allows iterating a stream. For each chunk in the stream the fn function will be called. If the fn function returns a promise, that promise will be awaited.
This method is different from for await...of loops in that it can optionally process chunks concurrently. In addition, a forEach iteration can only be stopped by having passed a signal option and aborting the related AbortController, while for await...of can be stopped with break or return. In either case the stream will be destroyed.
This method is different from listening to the 'data' event in that it uses the readable event in the underlying machinery and can limit the number of concurrent fn calls.
a function to call on each chunk of the stream. Async or not.
Optional
options:
ArrayOptions
a promise for when the stream has finished.
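A short sketch; the returned promise settles once every chunk has been processed:
import { Readable } from 'node:stream';

await Readable.from([1, 2, 3]).forEach((x) => console.log(x)); // 1, 2, 3
console.log('stream has finished');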
Returns the current max listener value for the
EventEmitter
which is either set by
emitter.setMaxListeners(n)
or defaults to
defaultMaxListeners.
The readable.isPaused()
method returns the
current operating state of the Readable
. This
is used primarily by the mechanism that underlies the
readable.pipe()
method. In most typical
cases, there will be no reason to use this method
directly.
const readable = new stream.Readable();
readable.isPaused(); // === false
readable.pause();
readable.isPaused(); // === true
readable.resume();
readable.isPaused(); // === false
The iterator created by this method gives users the option
to cancel the destruction of the stream if the
for await...of
loop is exited by
return
, break
, or
throw
, or if the iterator should destroy the
stream if the stream emitted an error during iteration.
Optional
options:
{
destroyOnReturn?:
boolean
}
Optional
destroyOnReturn?: boolean
When set to false, calling return on the async iterator, or exiting a for await...of iteration using a break, return, or throw will not destroy the stream. Default: true.
Returns the number of listeners listening for the event
named eventName
. If listener
is
provided, it will return how many times the listener is
found in the list of the listeners of the event.
The name of the event being listened for
Optional
listener:
Function
The event handler function
Returns a copy of the array of listeners for the event
named eventName
.
server.on('connection', (stream) => {
console.log('someone connected!');
});
console.log(util.inspect(server.listeners('connection')));
// Prints: [ [Function] ]
This method allows mapping over the stream. The fn function will be called for every chunk in the stream. If the fn function returns a promise, that promise will be awaited before being passed to the result stream.
a function to map over every chunk in the stream. Async or not.
Optional
options:
ArrayOptions
a stream mapped with the function fn.
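For example, with a synchronous mapper:
import { Readable } from 'node:stream';

console.log(await Readable.from([1, 2, 3, 4]).map((x) => x * 2).toArray()); // [ 2, 4, 6, 8 ]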
Alias for emitter.removeListener().
Adds the listener function to the end of the listeners array for the event named eventName. No checks are made to see if the listener has already been added. Multiple calls passing the same combination of eventName and listener will result in the listener being added, and called, multiple times.
server.on('connection', (stream) => {
console.log('someone connected!');
});
Returns a reference to the EventEmitter
, so
that calls can be chained.
By default, event listeners are invoked in the order they
are added. The
emitter.prependListener()
method can be used
as an alternative to add the event listener to the
beginning of the listeners array.
import { EventEmitter } from 'node:events';
const myEE = new EventEmitter();
myEE.on('foo', () => console.log('a'));
myEE.prependListener('foo', () => console.log('b'));
myEE.emit('foo');
// Prints:
// b
// a
The callback function
Adds a one-time listener function for the event named eventName. The next time eventName is triggered, this listener is removed and then invoked.
server.once('connection', (stream) => {
console.log('Ah, we have our first user!');
});
Returns a reference to the EventEmitter
, so
that calls can be chained.
By default, event listeners are invoked in the order they
are added. The
emitter.prependOnceListener()
method can be
used as an alternative to add the event listener to the
beginning of the listeners array.
import { EventEmitter } from 'node:events';
const myEE = new EventEmitter();
myEE.once('foo', () => console.log('a'));
myEE.prependOnceListener('foo', () => console.log('b'));
myEE.emit('foo');
// Prints:
// b
// a
The callback function
The readable.pause()
method will cause a
stream in flowing mode to stop emitting
'data'
events, switching out of flowing mode.
Any data that becomes available will remain in the
internal buffer.
const readable = getReadableStreamSomehow();
readable.on('data', (chunk) => {
console.log(`Received ${chunk.length} bytes of data.`);
readable.pause();
console.log('There will be no additional data for 1 second.');
setTimeout(() => {
console.log('Now data will start flowing again.');
readable.resume();
}, 1000);
});
The readable.pause()
method has no effect if
there is a 'readable'
event listener.
Adds the listener function to the beginning of the listeners array for the event named eventName. No checks are made to see if the listener has already been added. Multiple calls passing the same combination of eventName and listener will result in the listener being added, and called, multiple times.
server.prependListener('connection', (stream) => {
console.log('someone connected!');
});
Returns a reference to the EventEmitter
, so
that calls can be chained.
The callback function
Adds a one-time listener function for the event named eventName to the beginning of the listeners array. The next time eventName is triggered, this listener is removed, and then invoked.
server.prependOnceListener('connection', (stream) => {
console.log('Ah, we have our first user!');
});
Returns a reference to the EventEmitter
, so
that calls can be chained.
The callback function
Returns a copy of the array of listeners for the event
named eventName
, including any wrappers (such
as those created by .once()
).
import { EventEmitter } from 'node:events';
const emitter = new EventEmitter();
emitter.once('log', () => console.log('log once'));
// Returns a new Array with a function `onceWrapper` which has a property
// `listener` which contains the original listener bound above
const listeners = emitter.rawListeners('log');
const logFnWrapper = listeners[0];
// Logs "log once" to the console and does not unbind the `once` event
logFnWrapper.listener();
// Logs "log once" to the console and removes the listener
logFnWrapper();
emitter.on('log', () => console.log('log persistently'));
// Will return a new Array with a single function bound by `.on()` above
const newListeners = emitter.rawListeners('log');
// Logs "log persistently" twice
newListeners[0]();
emitter.emit('log');
The readable.read()
method reads data out of
the internal buffer and returns it. If no data is
available to be read, null
is returned. By
default, the data is returned as a
Buffer
object unless an encoding has been
specified using the
readable.setEncoding()
method or the stream
is operating in object mode.
The optional size
argument specifies a
specific number of bytes to read. If
size
bytes are not available to be read,
null
will be returned unless the
stream has ended, in which case all of the data remaining
in the internal buffer will be returned.
If the size
argument is not specified, all of
the data contained in the internal buffer will be
returned.
The size
argument must be less than or equal
to 1 GiB.
The readable.read()
method should only be
called on Readable
streams operating in
paused mode. In flowing mode,
readable.read()
is called automatically until
the internal buffer is fully drained.
const readable = getReadableStreamSomehow();
// 'readable' may be triggered multiple times as data is buffered in
readable.on('readable', () => {
let chunk;
console.log('Stream is readable (new data received in buffer)');
// Use a loop to make sure we read all currently available data
while (null !== (chunk = readable.read())) {
console.log(`Read ${chunk.length} bytes of data...`);
}
});
// 'end' will be triggered once when there is no more data available
readable.on('end', () => {
console.log('Reached end of stream.');
});
Each call to readable.read()
returns a chunk
of data, or null
. The chunks are not
concatenated. A while
loop is necessary to
consume all data currently in the buffer. When reading a
large file .read()
may return
null
, having consumed all buffered content so
far, but there is still more data to come not yet
buffered. In this case a new 'readable'
event
will be emitted when there is more data in the buffer.
Finally the 'end'
event will be emitted when
there is no more data to come.
Therefore to read a file's whole contents from a
readable
, it is necessary to collect chunks
across multiple 'readable'
events:
const chunks = [];
readable.on('readable', () => {
let chunk;
while (null !== (chunk = readable.read())) {
chunks.push(chunk);
}
});
readable.on('end', () => {
const content = chunks.join('');
});
A Readable
stream in object mode will always
return a single item from a call to
readable.read(size)
, regardless of the value
of the size
argument.
If the readable.read()
method returns a chunk
of data, a 'data'
event will also be emitted.
Calling
read
after the 'end'
event has been emitted will
return null
. No runtime error will be raised.
Optional
size:
number
Optional argument to specify how much data to read.
This method calls fn on each chunk of the stream in order, passing it the result from the calculation on the previous element. It returns a promise for the final value of the reduction.
If no initial value is supplied the first chunk
of the stream is used as the initial value. If the stream
is empty, the promise is rejected with a
TypeError
with the
ERR_INVALID_ARGS
code property.
The reducer function iterates the stream
element-by-element which means that there is no
concurrency parameter or parallelism. To perform
a reduce concurrently, you can extract the async function
to readable.map
method.
a reducer function to call over every chunk in the stream. Async or not.
Optional
initial:
undefined
the initial value to use in the reduction.
Optional
options:
Pick<ArrayOptions,
"signal">
a promise for the final value of the reduction.
This method calls fn on each chunk of the stream in order, passing it the result from the calculation on the previous element. It returns a promise for the final value of the reduction.
If no initial value is supplied the first chunk
of the stream is used as the initial value. If the stream
is empty, the promise is rejected with a
TypeError
with the
ERR_INVALID_ARGS
code property.
The reducer function iterates the stream
element-by-element which means that there is no
concurrency parameter or parallelism. To perform
a reduce concurrently, you can extract the async function
to readable.map
method.
a promise for the final value of the reduction.
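For instance, summing the chunks with an initial value of 0:
import { Readable } from 'node:stream';

const sum = await Readable.from([1, 2, 3, 4]).reduce((acc, x) => acc + x, 0);
console.log(sum); // 10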
Removes all listeners, or those of the specified
eventName
.
It is bad practice to remove listeners added elsewhere in
the code, particularly when the
EventEmitter
instance was created by some
other component or module (e.g. sockets or file streams).
Returns a reference to the EventEmitter
, so
that calls can be chained.
Optional
eventName:
string
|
symbol
Removes the specified listener
from the
listener array for the event named eventName
.
const callback = (stream) => {
console.log('someone connected!');
};
server.on('connection', callback);
// ...
server.removeListener('connection', callback);
removeListener()
will remove, at most, one
instance of a listener from the listener array. If any
single listener has been added multiple times to the
listener array for the specified eventName
,
then removeListener()
must be called multiple
times to remove each instance.
Once an event is emitted, all listeners attached to it at the time of emitting are called in order. This implies that any removeListener() or removeAllListeners() calls after emitting and before the last listener finishes execution will not remove them from emit() in progress. Subsequent events behave as expected.
import { EventEmitter } from 'node:events';
class MyEmitter extends EventEmitter {}
const myEmitter = new MyEmitter();
const callbackA = () => {
console.log('A');
myEmitter.removeListener('event', callbackB);
};
const callbackB = () => {
console.log('B');
};
myEmitter.on('event', callbackA);
myEmitter.on('event', callbackB);
// callbackA removes listener callbackB but it will still be called.
// Internal listener array at time of emit [callbackA, callbackB]
myEmitter.emit('event');
// Prints:
// A
// B
// callbackB is now removed.
// Internal listener array [callbackA]
myEmitter.emit('event');
// Prints:
// A
Because listeners are managed using an internal array,
calling this will change the position indices of any
listener registered after the listener being
removed. This will not impact the order in which listeners
are called, but it means that any copies of the listener
array as returned by the
emitter.listeners()
method will need to be
recreated.
When a single function has been added as a handler
multiple times for a single event (as in the example
below), removeListener()
will remove the most
recently added instance. In the example the
once('ping')
listener is removed:
import { EventEmitter } from 'node:events';
const ee = new EventEmitter();
function pong() {
console.log('pong');
}
ee.on('ping', pong);
ee.once('ping', pong);
ee.removeListener('ping', pong);
ee.emit('ping');
ee.emit('ping');
Returns a reference to the EventEmitter
, so
that calls can be chained.
The readable.resume()
method causes an
explicitly paused Readable
stream to resume
emitting 'data'
events, switching the stream
into flowing mode.
The readable.resume()
method can be used to
fully consume the data from a stream without actually
processing any of that data:
getReadableStreamSomehow()
.resume()
.on('end', () => {
console.log('Reached the end, but did not read anything.');
});
The readable.resume()
method has no effect if
there is a 'readable'
event listener.
The readable.setEncoding()
method sets the
character encoding for data read from the
Readable
stream.
By default, no encoding is assigned and stream data will
be returned as Buffer
objects. Setting an
encoding causes the stream data to be returned as strings
of the specified encoding rather than as
Buffer
objects. For instance, calling
readable.setEncoding('utf8')
will cause the
output data to be interpreted as UTF-8 data, and passed as
strings. Calling
readable.setEncoding('hex')
will cause the
data to be encoded in hexadecimal string format.
The Readable
stream will properly handle
multi-byte characters delivered through the stream that
would otherwise become improperly decoded if simply pulled
from the stream as Buffer
objects.
const assert = require('node:assert');
const readable = getReadableStreamSomehow();
readable.setEncoding('utf8');
readable.on('data', (chunk) => {
  assert.equal(typeof chunk, 'string');
  console.log('Got %d characters of string data:', chunk.length);
});
The encoding to use.
By default EventEmitter
s will print a warning
if more than 10
listeners are added for a
particular event. This is a useful default that helps
finding memory leaks. The
emitter.setMaxListeners()
method allows the
limit to be modified for this specific
EventEmitter
instance. The value can be set
to Infinity
(or 0
) to indicate
an unlimited number of listeners.
Returns a reference to the EventEmitter
, so
that calls can be chained.
This method is similar to Array.prototype.some and calls fn on each chunk in the stream until the awaited return value is true (or any truthy value). Once an fn call on a chunk's awaited return value is truthy, the stream is destroyed and the promise is fulfilled with true. If none of the fn calls on the chunks return a truthy value, the promise is fulfilled with false.
a function to call on each chunk of the stream. Async or not.
Optional
options:
ArrayOptions
a promise evaluating to true if fn returned a truthy value for at least one of the chunks.
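For example, with a synchronous predicate:
import { Readable } from 'node:stream';

console.log(await Readable.from([1, 2, 3, 4]).some((x) => x > 2));  // true
console.log(await Readable.from([1, 2, 3, 4]).some((x) => x > 10)); // false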
This method returns a new stream with the first limit chunks.
the number of chunks to take from the readable.
Optional
options:
Pick<ArrayOptions,
"signal">
a stream with limit chunks taken.
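A short sketch:
import { Readable } from 'node:stream';

console.log(await Readable.from([1, 2, 3, 4]).take(2).toArray()); // [ 1, 2 ]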
This method allows easily obtaining the contents of a stream.
As this method reads the entire stream into memory, it negates the benefits of streams. It's intended for interoperability and convenience, not as the primary way to consume streams.
Optional
options:
Pick<ArrayOptions,
"signal">
a promise containing an array with the contents of the stream.
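For example:
import { Readable } from 'node:stream';

console.log(await Readable.from([1, 2, 3, 4]).toArray()); // [ 1, 2, 3, 4 ]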
The readable.unpipe()
method detaches a
Writable
stream previously attached using the
pipe
method.
If the destination
is not specified, then
all pipes are detached.
If the destination
is specified, but no pipe
is set up for it, then the method does nothing.
const fs = require('node:fs');
const readable = getReadableStreamSomehow();
const writable = fs.createWriteStream('file.txt');
// All the data from readable goes into 'file.txt',
// but only for the first second.
readable.pipe(writable);
setTimeout(() => {
console.log('Stop writing to file.txt.');
readable.unpipe(writable);
console.log('Manually close the file stream.');
writable.end();
}, 1000);
Optional
destination:
WritableStream
Optional specific stream to unpipe
Passing chunk
as null
signals
the end of the stream (EOF) and behaves the same as
readable.push(null)
, after which no more data
can be written. The EOF signal is put at the end of the
buffer and any buffered data will still be flushed.
The readable.unshift()
method pushes a chunk
of data back into the internal buffer. This is useful in
certain situations where a stream is being consumed by
code that needs to "un-consume" some amount of
data that it has optimistically pulled out of the source,
so that the data can be passed on to some other party.
The stream.unshift(chunk)
method cannot be
called after the 'end'
event has been emitted
or a runtime error will be thrown.
Developers using stream.unshift()
often
should consider switching to use of a
Transform
stream instead. See the
API for stream implementers
section for more
information.
// Pull off a header delimited by \n\n.
// Use unshift() if we get too much.
// Call the callback with (error, header, stream).
const { StringDecoder } = require('node:string_decoder');
function parseHeader(stream, callback) {
  stream.on('error', callback);
  stream.on('readable', onReadable);
  const decoder = new StringDecoder('utf8');
  let header = '';
  function onReadable() {
    let chunk;
    while (null !== (chunk = stream.read())) {
      const str = decoder.write(chunk);
      if (str.includes('\n\n')) {
        // Found the header boundary.
        const split = str.split(/\n\n/);
        header += split.shift();
        const remaining = split.join('\n\n');
        const buf = Buffer.from(remaining, 'utf8');
        stream.removeListener('error', callback);
        // Remove the 'readable' listener before unshifting.
        stream.removeListener('readable', onReadable);
        if (buf.length)
          stream.unshift(buf);
        // Now the body of the message can be read from the stream.
        callback(null, header, stream);
        return;
      }
      // Still reading the header.
      header += str;
    }
  }
}
Unlike
push, stream.unshift(chunk)
will not end the
reading process by resetting the internal reading state of
the stream. This can cause unexpected results if
readable.unshift()
is called during a read
(i.e. from within a
_read
implementation on a custom stream). Following the call to
readable.unshift()
with an immediate
push
will reset the reading state appropriately, however it is
best to simply avoid calling
readable.unshift()
while in the process of
performing a read.
Chunk of data to unshift onto the read queue. For
streams not operating in object mode,
chunk
must be a {string}, {Buffer},
{TypedArray}, {DataView} or null
. For
object mode streams, chunk
may be any
JavaScript value.
Optional
encoding:
BufferEncoding
Encoding of string chunks. Must be a valid
Buffer
encoding, such as
'utf8'
or 'ascii'
.
Prior to Node.js 0.10, streams did not implement the
entire node:stream
module API as it is
currently defined. (See Compatibility
for
more information.)
When using an older Node.js library that emits
'data'
events and has a
pause
method that is advisory only, the
readable.wrap()
method can be used to create
a Readable
stream that uses the old stream as
its data source.
It will rarely be necessary to use
readable.wrap()
but the method has been
provided as a convenience for interacting with older
Node.js applications and libraries.
const { OldReader } = require('./old-api-module.js');
const { Readable } = require('node:stream');
const oreader = new OldReader();
const myReader = new Readable().wrap(oreader);
myReader.on('readable', () => {
myReader.read(); // etc.
});
An "old style" readable stream
Static Experimental addAbortListener
Listens once to the abort
event on the
provided signal
.
Listening to the abort
event on abort signals
is unsafe and may lead to resource leaks since another
third party with the signal can call
e.stopImmediatePropagation()
. Unfortunately
Node.js cannot change this since it would violate the web
standard. Additionally, the original API makes it easy to
forget to remove listeners.
This API allows safely using AbortSignal
s in
Node.js APIs by solving these two issues by listening to
the event such that
stopImmediatePropagation
does not prevent the
listener from running.
Returns a disposable so that it may be unsubscribed from more easily.
import { addAbortListener } from 'node:events';
function example(signal) {
let disposable;
try {
signal.addEventListener('abort', (e) => e.stopImmediatePropagation());
disposable = addAbortListener(signal, (e) => {
// Do something when signal is aborted.
});
} finally {
disposable?.[Symbol.dispose]();
}
}
Disposable that removes the abort
listener.
Static
from
A utility method for creating Readable Streams out of iterators.
Object implementing the
Symbol.asyncIterator
or
Symbol.iterator
iterable protocol.
Emits an 'error' event if a null value is passed.
Optional
options:
ReadableOptions
Options provided to new stream.Readable([options]). By default, Readable.from() will set options.objectMode to true, unless this is explicitly opted out by setting options.objectMode to false.
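For instance, a Readable can be built from an async generator (a common pattern; the generator here is illustrative):
import { Readable } from 'node:stream';

async function * generate() {
  yield 'hello';
  yield 'streams';
}

const readable = Readable.from(generate());
readable.on('data', (chunk) => {
  console.log(chunk);
});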
Static Experimental fromWeb
A utility method for creating a Readable from a web ReadableStream.
Optional
options:
Pick<ReadableOptions,
"signal"
|
"encoding"
|
"highWaterMark"
|
"objectMode">
Static getEventListeners
Returns a copy of the array of listeners for the event
named eventName
.
For EventEmitter
s this behaves exactly the
same as calling .listeners
on the emitter.
For EventTarget
s this is the only way to get
the event listeners for the event target. This is useful
for debugging and diagnostic purposes.
import { getEventListeners, EventEmitter } from 'node:events';
{
const ee = new EventEmitter();
const listener = () => console.log('Events are fun');
ee.on('foo', listener);
console.log(getEventListeners(ee, 'foo')); // [ [Function: listener] ]
}
{
const et = new EventTarget();
const listener = () => console.log('Events are fun');
et.addEventListener('foo', listener);
console.log(getEventListeners(et, 'foo')); // [ [Function: listener] ]
}
Static getMaxListeners
Returns the currently set max amount of listeners.
For EventEmitter
s this behaves exactly the
same as calling .getMaxListeners
on the
emitter.
For EventTarget
s this is the only way to get
the max event listeners for the event target. If the
number of event handlers on a single EventTarget exceeds
the max set, the EventTarget will print a warning.
import { getMaxListeners, setMaxListeners, EventEmitter } from 'node:events';
{
const ee = new EventEmitter();
console.log(getMaxListeners(ee)); // 10
setMaxListeners(11, ee);
console.log(getMaxListeners(ee)); // 11
}
{
const et = new EventTarget();
console.log(getMaxListeners(et)); // 10
setMaxListeners(11, et);
console.log(getMaxListeners(et)); // 11
}
Static isDisturbed
Static listenerCount
A class method that returns the number of listeners for the given eventName registered on the given emitter.
import { EventEmitter, listenerCount } from 'node:events';
const myEmitter = new EventEmitter();
myEmitter.on('event', () => {});
myEmitter.on('event', () => {});
console.log(listenerCount(myEmitter, 'event'));
// Prints: 2
The emitter to query
The event name
Static
on
import { on, EventEmitter } from 'node:events';
import process from 'node:process';
const ee = new EventEmitter();
// Emit later on
process.nextTick(() => {
ee.emit('foo', 'bar');
ee.emit('foo', 42);
});
for await (const event of on(ee, 'foo')) {
// The execution of this inner block is synchronous and it
// processes one event at a time (even with await). Do not use
// if concurrent execution is required.
console.log(event); // prints ['bar'] [42]
}
// Unreachable here
Returns an AsyncIterator
that iterates
eventName
events. It will throw if the
EventEmitter
emits 'error'
. It
removes all listeners when exiting the loop. The
value
returned by each iteration is an array
composed of the emitted event arguments.
An AbortSignal
can be used to cancel waiting
on events:
import { on, EventEmitter } from 'node:events';
import process from 'node:process';
const ac = new AbortController();
(async () => {
const ee = new EventEmitter();
// Emit later on
process.nextTick(() => {
ee.emit('foo', 'bar');
ee.emit('foo', 42);
});
for await (const event of on(ee, 'foo', { signal: ac.signal })) {
// The execution of this inner block is synchronous and it
// processes one event at a time (even with await). Do not use
// if concurrent execution is required.
console.log(event); // prints ['bar'] [42]
}
// Unreachable here
})();
process.nextTick(() => ac.abort());
Use the close
option to specify an array of
event names that will end the iteration:
import { on, EventEmitter } from 'node:events';
import process from 'node:process';
const ee = new EventEmitter();
// Emit later on
process.nextTick(() => {
ee.emit('foo', 'bar');
ee.emit('foo', 42);
ee.emit('close');
});
for await (const event of on(ee, 'foo', { close: ['close'] })) {
console.log(event); // prints ['bar'] [42]
}
// the loop will exit after 'close' is emitted
console.log('done'); // prints 'done'
Optional
options:
StaticEventEmitterIteratorOptions
An AsyncIterator
that iterates
eventName
events emitted by the
emitter
import { on, EventEmitter } from 'node:events';
import process from 'node:process';
const ee = new EventEmitter();
// Emit later on
process.nextTick(() => {
ee.emit('foo', 'bar');
ee.emit('foo', 42);
});
for await (const event of on(ee, 'foo')) {
// The execution of this inner block is synchronous and it
// processes one event at a time (even with await). Do not use
// if concurrent execution is required.
console.log(event); // prints ['bar'] [42]
}
// Unreachable here
Returns an AsyncIterator
that iterates
eventName
events. It will throw if the
EventEmitter
emits 'error'
. It
removes all listeners when exiting the loop. The
value
returned by each iteration is an array
composed of the emitted event arguments.
An AbortSignal
can be used to cancel waiting
on events:
import { on, EventEmitter } from 'node:events';
import process from 'node:process';
const ac = new AbortController();
(async () => {
const ee = new EventEmitter();
// Emit later on
process.nextTick(() => {
ee.emit('foo', 'bar');
ee.emit('foo', 42);
});
for await (const event of on(ee, 'foo', { signal: ac.signal })) {
// The execution of this inner block is synchronous and it
// processes one event at a time (even with await). Do not use
// if concurrent execution is required.
console.log(event); // prints ['bar'] [42]
}
// Unreachable here
})();
process.nextTick(() => ac.abort());
Use the close
option to specify an array of
event names that will end the iteration:
import { on, EventEmitter } from 'node:events';
import process from 'node:process';
const ee = new EventEmitter();
// Emit later on
process.nextTick(() => {
ee.emit('foo', 'bar');
ee.emit('foo', 42);
ee.emit('close');
});
for await (const event of on(ee, 'foo', { close: ['close'] })) {
console.log(event); // prints ['bar'] [42]
}
// the loop will exit after 'close' is emitted
console.log('done'); // prints 'done'
Optional
options:
StaticEventEmitterIteratorOptions
An AsyncIterator
that iterates
eventName
events emitted by the
emitter
Static
once
Creates a Promise
that is fulfilled when the
EventEmitter
emits the given event or that is
rejected if the EventEmitter
emits
'error'
while waiting. The
Promise
will resolve with an array of all the
arguments emitted to the given event.
This method is intentionally generic and works with the web platform EventTarget interface, which has no special 'error' event semantics and does not listen to the 'error' event.
import { once, EventEmitter } from 'node:events';
import process from 'node:process';
const ee = new EventEmitter();
process.nextTick(() => {
ee.emit('myevent', 42);
});
const [value] = await once(ee, 'myevent');
console.log(value);
const err = new Error('kaboom');
process.nextTick(() => {
ee.emit('error', err);
});
try {
await once(ee, 'myevent');
} catch (err) {
console.error('error happened', err);
}
The special handling of the 'error'
event is
only used when events.once()
is used to wait
for another event. If events.once()
is used
to wait for the 'error'
event itself, then it
is treated as any other kind of event without special
handling:
import { EventEmitter, once } from 'node:events';
const ee = new EventEmitter();
once(ee, 'error')
.then(([err]) => console.log('ok', err.message))
.catch((err) => console.error('error', err.message));
ee.emit('error', new Error('boom'));
// Prints: ok boom
An AbortSignal
can be used to cancel waiting
for the event:
import { EventEmitter, once } from 'node:events';
const ee = new EventEmitter();
const ac = new AbortController();
async function foo(emitter, event, signal) {
try {
await once(emitter, event, { signal });
console.log('event emitted!');
} catch (error) {
if (error.name === 'AbortError') {
console.error('Waiting for the event was canceled!');
} else {
console.error('There was an error', error.message);
}
}
}
foo(ee, 'foo', ac.signal);
ac.abort(); // Abort waiting for the event
ee.emit('foo'); // Prints: Waiting for the event was canceled!
Optional
options:
StaticEventEmitterOptions
Creates a Promise
that is fulfilled when the
EventEmitter
emits the given event or that is
rejected if the EventEmitter
emits
'error'
while waiting. The
Promise
will resolve with an array of all the
arguments emitted to the given event.
This method is intentionally generic and works with the web platform EventTarget interface, which has no special 'error' event semantics and does not listen to the 'error' event.
import { once, EventEmitter } from 'node:events';
import process from 'node:process';
const ee = new EventEmitter();
process.nextTick(() => {
ee.emit('myevent', 42);
});
const [value] = await once(ee, 'myevent');
console.log(value);
const err = new Error('kaboom');
process.nextTick(() => {
ee.emit('error', err);
});
try {
await once(ee, 'myevent');
} catch (err) {
console.error('error happened', err);
}
The special handling of the 'error'
event is
only used when events.once()
is used to wait
for another event. If events.once()
is used
to wait for the 'error'
event itself, then it
is treated as any other kind of event without special
handling:
import { EventEmitter, once } from 'node:events';
const ee = new EventEmitter();
once(ee, 'error')
.then(([err]) => console.log('ok', err.message))
.catch((err) => console.error('error', err.message));
ee.emit('error', new Error('boom'));
// Prints: ok boom
An AbortSignal
can be used to cancel waiting
for the event:
import { EventEmitter, once } from 'node:events';
const ee = new EventEmitter();
const ac = new AbortController();
async function foo(emitter, event, signal) {
try {
await once(emitter, event, { signal });
console.log('event emitted!');
} catch (error) {
if (error.name === 'AbortError') {
console.error('Waiting for the event was canceled!');
} else {
console.error('There was an error', error.message);
}
}
}
foo(ee, 'foo', ac.signal);
ac.abort(); // Abort waiting for the event
ee.emit('foo'); // Prints: Waiting for the event was canceled!
Optional
options:
StaticEventEmitterOptions
Static setMaxListeners
import { setMaxListeners, EventEmitter } from 'node:events';
const target = new EventTarget();
const emitter = new EventEmitter();
setMaxListeners(5, target, emitter);
Optional
n:
number
A non-negative number. The maximum number of
listeners per EventTarget
event.
Static Experimental toWeb
A utility method for creating a web ReadableStream from a Readable.
The message.aborted property will be true if the request has been aborted.