Private Optional #accept
Private Optional #authorization
Private Optional #body
Private Optional #content_
Private Optional #content_
Private Optional #cookies
Private Optional #forwarded
Private Optional #id
Private Optional #if_
Private Optional #query
Private Optional #range
Readonly closed
Is true after 'close' has been emitted.
v18.0.0
The message.complete property will be true if a complete HTTP message has been received and successfully parsed.
This property is particularly useful as a means of determining if a client or server fully transmitted a message before a connection was terminated:
const http = require('node:http');
const req = http.request({
  host: '127.0.0.1',
  port: 8080,
  method: 'POST',
}, (res) => {
  res.resume();
  res.on('end', () => {
    if (!res.complete)
      console.error(
        'The connection was terminated while the message was still being sent');
  });
});
v0.3.0
Alias for message.socket.
v0.1.90
Since v16.0.0 - Use socket.
Is true after readable.destroy() has been called.
v8.0.0
Readonly errored
Returns error if the stream has been destroyed with an error.
v18.0.0
The request/response headers object.
Key-value pairs of header names and values. Header names are lower-cased.
// Prints something like:
//
// { 'user-agent': 'curl/7.22.0',
//   host: '127.0.0.1:8000',
//   accept: '*/*' }
console.log(request.headers);
Duplicates in raw headers are handled in the following ways, depending on the header name:
Duplicates of age, authorization, content-length, content-type, etag, expires, from, host, if-modified-since, if-unmodified-since, last-modified, location, max-forwards, proxy-authorization, referer, retry-after, server, or user-agent are discarded. To allow duplicate values of the headers listed above to be joined, use the option joinDuplicateHeaders in request and createServer (a minimal sketch follows this list). See RFC 9110 Section 5.3 for more information.
set-cookie is always an array. Duplicates are added to the array.
For duplicate cookie headers, the values are joined together with '; '.
For all other headers, the values are joined together with ', '.
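As a rough illustration of the joinDuplicateHeaders option mentioned above, a minimal sketch (the port number and the choice of the From header are arbitrary for this example):
const http = require('node:http');

// Sketch: with joinDuplicateHeaders enabled, duplicates of the headers listed
// above (for example 'from') are joined with ', ' instead of being discarded.
const server = http.createServer({ joinDuplicateHeaders: true }, (req, res) => {
  res.end(req.headers.from ?? 'no From header');
});
server.listen(8000); // port chosen arbitrarily for this sketch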
v0.1.5
Similar to message.headers, but there is no join logic and the values are always arrays of strings, even for headers received just once.
// Prints something like:
//
// { 'user-agent': ['curl/7.22.0'],
//   host: ['127.0.0.1:8000'],
//   accept: ['*/*'] }
console.log(request.headersDistinct);
v18.3.0, v16.17.0
In the case of a server request, the HTTP version sent by the client. In the case of a client response, the HTTP version of the connected-to server. Probably either '1.1' or '1.0'.
Also message.httpVersionMajor is the first integer and message.httpVersionMinor is the second.
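For illustration, a minimal server sketch (the handler names and the port are arbitrary choices for this example):
const http = require('node:http');

http.createServer((request, response) => {
  // For an HTTP/1.1 request this logs: 1.1 1 1
  console.log(request.httpVersion,
              request.httpVersionMajor,
              request.httpVersionMinor);
  response.end();
}).listen(8000); // port chosen arbitrarily for this sketch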
v0.1.1
Optional method
Only valid for request obtained from Server.
The request method as a string. Read only. Examples: 'GET', 'DELETE'.
v0.1.1
The raw request/response headers list exactly as they were received.
The keys and values are in the same list. It is not a list of tuples. So, the even-numbered offsets are key values, and the odd-numbered offsets are the associated values.
Header names are not lowercased, and duplicates are not merged.
// Prints something like:
//
// [ 'user-agent',
//   'this is invalid because there can be only one',
//   'User-Agent',
//   'curl/7.22.0',
//   'Host',
//   '127.0.0.1:8000',
//   'ACCEPT',
//   '*/*' ]
console.log(request.rawHeaders);
v0.11.6
The raw request/response trailer keys and values exactly as they were received. Only populated at the 'end' event.
v0.11.6
Is true if it is safe to call readable.read(), which means the stream has not been destroyed or emitted 'error' or 'end'.
v11.4.0
Readonly Experimental readableAborted
Returns whether the stream was destroyed or errored before emitting 'end'.
v16.8.0
Readonly Experimental readableDidRead
Returns whether 'data' has been emitted.
v16.7.0, v14.18.0
Readonly readableEncoding
Getter for the property encoding of a given Readable stream. The encoding property can be set using the readable.setEncoding() method.
v12.7.0
Readonly readableEnded
Becomes true when the 'end' event is emitted.
v12.9.0
Readonly readableFlowing
This property reflects the current state of a Readable stream as described in the Three states section.
v9.4.0
Readonly readableHighWaterMark
Returns the value of highWaterMark passed when creating this Readable.
v9.3.0
Readonly readableLength
This property contains the number of bytes (or objects) in the queue ready to be read. The value provides introspection data regarding the status of the highWaterMark.
v9.4.0
Readonly readableObjectMode
Getter for the property objectMode of a given Readable stream.
v12.3.0
The net.Socket object associated with the connection.
With HTTPS support, use request.socket.getPeerCertificate() to obtain the client's authentication details.
This property is guaranteed to be an instance of the net.Socket class, a subclass of stream.Duplex, unless the user specified a socket type other than net.Socket or internally nulled.
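For illustration, a minimal HTTPS sketch reading details from request.socket (the key/cert file names, the port, and the requestCert option are assumptions for this example):
const https = require('node:https');
const fs = require('node:fs');

// Assumes key/cert files exist and that the client presents a certificate.
const server = https.createServer({
  key: fs.readFileSync('server-key.pem'),
  cert: fs.readFileSync('server-cert.pem'),
  requestCert: true,
}, (req, res) => {
  console.log('Remote address:', req.socket.remoteAddress);
  console.log('Client certificate subject:', req.socket.getPeerCertificate().subject);
  res.end();
});
server.listen(8443); // port chosen arbitrarily for this sketch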
v0.3.0
Optional statusCode
Only valid for response obtained from ClientRequest.
The 3-digit HTTP response status code. E.g. 404.
v0.1.1
Optional statusMessage
Only valid for response obtained from ClientRequest.
The HTTP response status message (reason phrase). E.g. OK or Internal Server Error.
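For illustration, a minimal client sketch reading both statusCode and statusMessage (the URL is an arbitrary placeholder):
const http = require('node:http');

http.get('http://localhost:8000/', (res) => {
  // Prints, e.g., "200 OK" or "404 Not Found".
  console.log(`${res.statusCode} ${res.statusMessage}`);
  res.resume();
});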
v0.11.10
The request/response trailers object. Only populated at the 'end' event.
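A minimal sketch of reading message.trailers once 'end' has been emitted (the trailer name and port are arbitrary examples):
const http = require('node:http');

const server = http.createServer((req, res) => {
  req.on('data', () => {});
  req.on('end', () => {
    // Trailers sent by the client are only available here, after 'end'.
    console.log(req.trailers['content-md5']);
    res.end();
  });
});
server.listen(8000); // port chosen arbitrarily for this sketch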
v0.3.0
Similar to message.trailers, but there is no join logic and the values are always arrays of strings, even for headers received just once. Only populated at the 'end' event.
v18.3.0, v16.17.0
Optional url
Only valid for request obtained from Server.
Request URL string. This contains only the URL that is present in the actual HTTP request. Take the following request:
GET /status?name=ryan HTTP/1.1
Accept: text/plain
To parse the URL into its parts:
new URL(request.url, `http://${request.headers.host}`);
When request.url is '/status?name=ryan' and request.headers.host is 'localhost:3000':
$ node
> new URL(request.url, `http://${request.headers.host}`)
URL {
  href: 'http://localhost:3000/status?name=ryan',
  origin: 'http://localhost:3000',
  protocol: 'http:',
  username: '',
  password: '',
  host: 'localhost:3000',
  hostname: 'localhost',
  port: '3000',
  pathname: '/status',
  search: '?name=ryan',
  searchParams: URLSearchParams { 'name' => 'ryan' },
  hash: ''
}
v0.1.90
Static Readonly captureRejectionSymbol
Value: Symbol.for('nodejs.rejection')
See how to write a custom rejection handler.
v13.4.0, v12.16.0
Static captureRejections
Value: boolean
Change the default captureRejections option on all new EventEmitter objects.
v13.4.0, v12.16.0
Static defaultMaxListeners
By default, a maximum of 10 listeners can be registered for any single event. This limit can be changed for individual EventEmitter instances using the emitter.setMaxListeners(n) method. To change the default for all EventEmitter instances, the events.defaultMaxListeners property can be used. If this value is not a positive number, a RangeError is thrown.
Take caution when setting the events.defaultMaxListeners because the change affects all EventEmitter instances, including those created before the change is made. However, calling emitter.setMaxListeners(n) still has precedence over events.defaultMaxListeners.
This is not a hard limit. The EventEmitter instance will allow more listeners to be added but will output a trace warning to stderr indicating that a "possible EventEmitter memory leak" has been detected. For any single EventEmitter, the emitter.getMaxListeners() and emitter.setMaxListeners() methods can be used to temporarily avoid this warning:
import { EventEmitter } from 'node:events';
const emitter = new EventEmitter();
emitter.setMaxListeners(emitter.getMaxListeners() + 1);
emitter.once('event', () => {
  // do stuff
  emitter.setMaxListeners(Math.max(emitter.getMaxListeners() - 1, 0));
});
The --trace-warnings command-line flag can be used to display the stack trace for such warnings.
The emitted warning can be inspected with process.on('warning') and will have the additional emitter, type, and count properties, referring to the event emitter instance, the event's name, and the number of attached listeners, respectively. Its name property is set to 'MaxListenersExceededWarning'.
v0.11.2
Static Readonly errorMonitor
This symbol shall be used to install a listener for only monitoring 'error' events. Listeners installed using this symbol are called before the regular 'error' listeners are called.
Installing a listener using this symbol does not change the behavior once an 'error' event is emitted. Therefore, the process will still crash if no regular 'error' listener is installed.
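A minimal sketch of monitoring 'error' events without consuming them (the emitter and the error are arbitrary examples):
import { EventEmitter, errorMonitor } from 'node:events';

const ee = new EventEmitter();
ee.on(errorMonitor, (err) => {
  // Runs before regular 'error' listeners; observation only.
  console.log('monitored:', err.message);
});
ee.on('error', (err) => {
  console.error('handled:', err.message);
});
ee.emit('error', new Error('boom'));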
v13.6.0, v12.17.0
Optional [captureRejectionSymbol]
Rest ...args: AnyRest
Optional
_construct
Event emitter The defined events on documents including:
Rest
...args:
any[]
This method returns a new stream with chunks of the underlying stream paired with a counter in the form [index, chunk]. The first index value is 0 and it increases by 1 for each chunk produced.
Optional
options:
Pick<ArrayOptions, "signal">
a stream of indexed pairs.
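For illustration, a minimal sketch using Readable.from with arbitrary sample data:
const { Readable } = require('node:stream');

(async () => {
  const pairs = await Readable.from(['a', 'b', 'c']).asIndexedPairs().toArray();
  console.log(pairs); // [ [ 0, 'a' ], [ 1, 'b' ], [ 2, 'c' ] ]
})();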
v17.5.0
This method returns a new stream with the first limit chunks dropped from the start.
the number of chunks to drop from the readable.
Optional
options:
Pick<ArrayOptions, "signal">
a stream with limit chunks dropped from the start.
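A minimal sketch with arbitrary sample data:
const { Readable } = require('node:stream');

(async () => {
  // Drops the first two chunks and collects the rest.
  console.log(await Readable.from([1, 2, 3, 4]).drop(2).toArray()); // [ 3, 4 ]
})();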
v17.5.0
Rest
...args:
any[]
Returns an array listing the events for which the emitter
has registered listeners. The values in the array are
strings or Symbol
s.
import { EventEmitter } from 'node:events';
const myEE = new EventEmitter();
myEE.on('foo', () => {});
myEE.on('bar', () => {});
const sym = Symbol('symbol');
myEE.on(sym, () => {});
console.log(myEE.eventNames());
// Prints: [ 'foo', 'bar', Symbol(symbol) ]
v6.0.0
This method is similar to Array.prototype.every and calls fn on each chunk in the stream to check whether all awaited return values are truthy for fn. Once an fn call on a chunk's awaited return value is falsy, the stream is destroyed and the promise is fulfilled with false. If all of the fn calls on the chunks return a truthy value, the promise is fulfilled with true.
a function to call on each chunk of the stream. Async or not.
Optional
options:
Pick<ArrayOptions, "signal">
Optional
options:
ArrayOptions
a promise evaluating to true if fn returned a truthy value for every one of the chunks.
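A minimal sketch with arbitrary sample data:
const { Readable } = require('node:stream');

(async () => {
  const allEven = await Readable.from([2, 4, 6]).every((n) => n % 2 === 0);
  console.log(allEven); // true
})();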
v17.5.0
This method allows filtering the stream. For each chunk in the stream the fn function will be called and if it returns a truthy value, the chunk will be passed to the result stream. If the fn function returns a promise, that promise will be awaited.
a function to filter chunks from the stream. Async or not.
Optional
options:
Pick<ArrayOptions, "signal">
Optional
options:
ArrayOptions
a stream filtered with the predicate fn.
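A minimal sketch with arbitrary sample data:
const { Readable } = require('node:stream');

(async () => {
  const odd = await Readable.from([1, 2, 3, 4]).filter((n) => n % 2 === 1).toArray();
  console.log(odd); // [ 1, 3 ]
})();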
v17.4.0, v16.14.0
This method is similar to Array.prototype.find and calls fn on each chunk in the stream to find a chunk with a truthy value for fn. Once an fn call's awaited return value is truthy, the stream is destroyed and the promise is fulfilled with the value for which fn returned a truthy value. If all of the fn calls on the chunks return a falsy value, the promise is fulfilled with undefined.
a promise evaluating to the first chunk for which fn evaluated with a truthy value, or undefined if no element was found.
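A minimal sketch with arbitrary sample data:
const { Readable } = require('node:stream');

(async () => {
  const firstBig = await Readable.from([1, 5, 10]).find((n) => n > 4);
  console.log(firstBig); // 5
})();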
v17.5.0
Optional
options:
Pick<ArrayOptions, "signal">
Optional
options:
ArrayOptions
This method returns a new stream by applying the given callback to each chunk of the stream and then flattening the result.
It is possible to return a stream or another iterable or async iterable from fn and the result streams will be merged (flattened) into the returned stream.
a function to map over every chunk in the stream. May be async. May be a stream or generator.
Optional
options:
Pick<ArrayOptions, "signal">
Optional
options:
ArrayOptions
a stream flat-mapped with the function fn.
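A minimal sketch with arbitrary sample data, returning a plain iterable from fn:
const { Readable } = require('node:stream');

(async () => {
  // Each chunk is expanded into two values, then flattened into one stream.
  const flat = await Readable.from([1, 2]).flatMap((n) => [n, n * 10]).toArray();
  console.log(flat); // [ 1, 10, 2, 20 ]
})();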
v17.5.0
This method allows iterating a stream. For each chunk in the stream the fn function will be called. If the fn function returns a promise, that promise will be awaited.
This method is different from for await...of loops in that it can optionally process chunks concurrently. In addition, a forEach iteration can only be stopped by having passed a signal option and aborting the related AbortController, while for await...of can be stopped with break or return. In either case the stream will be destroyed.
This method is different from listening to the 'data' event in that it uses the 'readable' event in the underlying machinery and can limit the number of concurrent fn calls.
a function to call on each chunk of the stream. Async or not.
Optional
options:
Pick<ArrayOptions, "signal">
Optional
options:
ArrayOptions
a promise for when the stream has finished.
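A minimal sketch with arbitrary sample data; the concurrency value of 2 is an arbitrary choice for this example:
const { Readable } = require('node:stream');

(async () => {
  // Up to two fn calls may run at the same time.
  await Readable.from([1, 2, 3, 4]).forEach(async (n) => {
    console.log(n);
  }, { concurrency: 2 });
  console.log('done');
})();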
v17.5.0
Returns the current max listener value for the EventEmitter which is either set by emitter.setMaxListeners(n) or defaults to defaultMaxListeners.
v1.0.0
The readable.isPaused() method returns the current operating state of the Readable. This is used primarily by the mechanism that underlies the readable.pipe() method. In most typical cases, there will be no reason to use this method directly.
const readable = new stream.Readable();
readable.isPaused(); // === false
readable.pause();
readable.isPaused(); // === true
readable.resume();
readable.isPaused(); // === false
v0.11.14
The iterator created by this method gives users the option to cancel the destruction of the stream if the for await...of loop is exited by return, break, or throw, or if the iterator should destroy the stream if the stream emitted an error during iteration.
Optional options: { destroyOnReturn?: boolean }
Optional destroyOnReturn
When set to false, calling return on the async iterator, or exiting a for await...of iteration using a break, return, or throw will not destroy the stream. Default: true.
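A minimal sketch of the destroyOnReturn behavior with arbitrary sample data:
const { Readable } = require('node:stream');

(async () => {
  const readable = Readable.from([1, 2, 3]);
  for await (const chunk of readable.iterator({ destroyOnReturn: false })) {
    console.log(chunk); // 1
    break; // exiting the loop does not destroy the stream here
  }
  console.log(readable.destroyed); // false
})();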
v16.3.0
Returns the number of listeners listening for the event named eventName. If listener is provided, it will return how many times the listener is found in the list of the listeners of the event.
The name of the event being listened for
Optional
listener:
Function
The event handler function
v3.2.0
Returns a copy of the array of listeners for the event named
eventName
.
server.on('connection', (stream) => {
console.log('someone connected!');
});
console.log(util.inspect(server.listeners('connection')));
// Prints: [ [Function] ]
v0.1.26
This method allows mapping over the stream. The fn function will be called for every chunk in the stream. If the fn function returns a promise, that promise will be awaited before being passed to the result stream.
a function to map over every chunk in the stream. Async or not.
Optional
options:
Pick<ArrayOptions, "signal">
Optional
options:
ArrayOptions
a stream mapped with the function fn.
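A minimal sketch with arbitrary sample data:
const { Readable } = require('node:stream');

(async () => {
  const doubled = await Readable.from([1, 2, 3]).map((n) => n * 2).toArray();
  console.log(doubled); // [ 2, 4, 6 ]
})();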
v17.4.0, v16.14.0
Alias for emitter.removeListener()
.
Rest
...args:
any[]
v10.0.0
Rest
...args:
any[]
Rest
...args:
any[]
The readable.pause() method will cause a stream in flowing mode to stop emitting 'data' events, switching out of flowing mode. Any data that becomes available will remain in the internal buffer.
const readable = getReadableStreamSomehow();
readable.on('data', (chunk) => {
  console.log(`Received ${chunk.length} bytes of data.`);
  readable.pause();
  console.log('There will be no additional data for 1 second.');
  setTimeout(() => {
    console.log('Now data will start flowing again.');
    readable.resume();
  }, 1000);
});
The readable.pause() method has no effect if there is a 'readable' event listener.
v0.9.4
Rest
...args:
any[]
Rest
...args:
any[]
Returns a copy of the array of listeners for the event named eventName, including any wrappers (such as those created by .once()).
import { EventEmitter } from 'node:events';
const emitter = new EventEmitter();
emitter.once('log', () => console.log('log once'));
// Returns a new Array with a function `onceWrapper` which has a property
// `listener` which contains the original listener bound above
const listeners = emitter.rawListeners('log');
const logFnWrapper = listeners[0];
// Logs "log once" to the console and does not unbind the `once` event
logFnWrapper.listener();
// Logs "log once" to the console and removes the listener
logFnWrapper();
emitter.on('log', () => console.log('log persistently'));
// Will return a new Array with a single function bound by `.on()` above
const newListeners = emitter.rawListeners('log');
// Logs "log persistently" twice
newListeners[0]();
emitter.emit('log');
v9.4.0
The readable.read() method reads data out of the internal buffer and returns it. If no data is available to be read, null is returned. By default, the data is returned as a Buffer object unless an encoding has been specified using the readable.setEncoding() method or the stream is operating in object mode.
The optional size argument specifies a specific number of bytes to read. If size bytes are not available to be read, null will be returned unless the stream has ended, in which case all of the data remaining in the internal buffer will be returned.
If the size argument is not specified, all of the data contained in the internal buffer will be returned. The size argument must be less than or equal to 1 GiB.
The readable.read() method should only be called on Readable streams operating in paused mode. In flowing mode, readable.read() is called automatically until the internal buffer is fully drained.
const readable = getReadableStreamSomehow();
// 'readable' may be triggered multiple times as data is buffered in
readable.on('readable', () => {
  let chunk;
  console.log('Stream is readable (new data received in buffer)');
  // Use a loop to make sure we read all currently available data
  while (null !== (chunk = readable.read())) {
    console.log(`Read ${chunk.length} bytes of data...`);
  }
});
// 'end' will be triggered once when there is no more data available
readable.on('end', () => {
  console.log('Reached end of stream.');
});
Each call to readable.read() returns a chunk of data, or null. The chunks are not concatenated. A while loop is necessary to consume all data currently in the buffer. When reading a large file .read() may return null, having consumed all buffered content so far, but there is still more data to come not yet buffered. In this case a new 'readable' event will be emitted when there is more data in the buffer. Finally the 'end' event will be emitted when there is no more data to come.
Therefore to read a file's whole contents from a readable, it is necessary to collect chunks across multiple 'readable' events:
const chunks = [];
readable.on('readable', () => {
  let chunk;
  while (null !== (chunk = readable.read())) {
    chunks.push(chunk);
  }
});
readable.on('end', () => {
  const content = chunks.join('');
});
A Readable stream in object mode will always return a single item from a call to readable.read(size), regardless of the value of the size argument.
If the readable.read() method returns a chunk of data, a 'data' event will also be emitted.
Calling read after the 'end' event has been emitted will return null. No runtime error will be raised.
Optional
size:
number
Optional argument to specify how much data to read.
v0.9.4
This method calls fn on each chunk of the stream in order, passing it the result from the calculation on the previous element. It returns a promise for the final value of the reduction.
If no initial value is supplied, the first chunk of the stream is used as the initial value. If the stream is empty, the promise is rejected with a TypeError with the ERR_INVALID_ARGS code property.
The reducer function iterates the stream element-by-element, which means that there is no concurrency parameter or parallelism. To perform a reduce concurrently, you can extract the async function to the readable.map method.
a reducer function to call over every chunk in the stream. Async or not.
Optional
initial:
undefined
the initial value to use in the reduction.
Optional
options:
Pick<ArrayOptions, "signal">
a promise for the final value of the reduction.
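A minimal sketch with arbitrary sample data and an initial value of 0:
const { Readable } = require('node:stream');

(async () => {
  const sum = await Readable.from([1, 2, 3, 4]).reduce((acc, n) => acc + n, 0);
  console.log(sum); // 10
})();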
v17.5.0
Removes all listeners, or those of the specified eventName.
It is bad practice to remove listeners added elsewhere in the code, particularly when the EventEmitter instance was created by some other component or module (e.g. sockets or file streams).
Returns a reference to the EventEmitter, so that calls can be chained.
Optional
eventName:
string | symbol
v0.1.26
Rest
...args:
any[]
The readable.resume() method causes an explicitly paused Readable stream to resume emitting 'data' events, switching the stream into flowing mode.
The readable.resume() method can be used to fully consume the data from a stream without actually processing any of that data:
getReadableStreamSomehow()
.resume()
.on('end', () => {
console.log('Reached the end, but did not read anything.');
});
The readable.resume() method has no effect if there is a 'readable' event listener.
v0.9.4
The readable.setEncoding() method sets the character encoding for data read from the Readable stream.
By default, no encoding is assigned and stream data will be returned as Buffer objects. Setting an encoding causes the stream data to be returned as strings of the specified encoding rather than as Buffer objects. For instance, calling readable.setEncoding('utf8') will cause the output data to be interpreted as UTF-8 data, and passed as strings. Calling readable.setEncoding('hex') will cause the data to be encoded in hexadecimal string format.
The Readable stream will properly handle multi-byte characters delivered through the stream that would otherwise become improperly decoded if simply pulled from the stream as Buffer objects.
const readable = getReadableStreamSomehow();
readable.setEncoding('utf8');
readable.on('data', (chunk) => {
assert.equal(typeof chunk, 'string');
console.log('Got %d characters of string data:', chunk.length);
});
The encoding to use.
v0.9.4
By default EventEmitters will print a warning if more than 10 listeners are added for a particular event. This is a useful default that helps finding memory leaks. The emitter.setMaxListeners() method allows the limit to be modified for this specific EventEmitter instance. The value can be set to Infinity (or 0) to indicate an unlimited number of listeners.
Returns a reference to the EventEmitter, so that calls can be chained.
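A minimal sketch; the event name and the limit of 20 are arbitrary choices for this example:
import { EventEmitter } from 'node:events';

const emitter = new EventEmitter();
emitter.setMaxListeners(20); // raise the limit for this instance only
for (let i = 0; i < 15; i++) {
  emitter.on('job', () => {}); // no MaxListenersExceededWarning is printed
}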
v0.3.5
This method is similar to Array.prototype.some and calls fn on each chunk in the stream until the awaited return value is true (or any truthy value). Once an fn call on a chunk's awaited return value is truthy, the stream is destroyed and the promise is fulfilled with true. If none of the fn calls on the chunks return a truthy value, the promise is fulfilled with false.
a function to call on each chunk of the stream. Async or not.
Optional
options:
Pick<ArrayOptions, "signal">
Optional
options:
ArrayOptions
a promise evaluating to true if fn returned a truthy value for at least one of the chunks.
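A minimal sketch with arbitrary sample data:
const { Readable } = require('node:stream');

(async () => {
  const hasNegative = await Readable.from([1, -2, 3]).some((n) => n < 0);
  console.log(hasNegative); // true
})();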
v17.5.0
This method allows easily obtaining the contents of a stream.
As this method reads the entire stream into memory, it negates the benefits of streams. It's intended for interoperability and convenience, not as the primary way to consume streams.
Optional
options:
Pick<ArrayOptions, "signal">
a promise containing an array with the contents of the stream.
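A minimal sketch with arbitrary sample data:
const { Readable } = require('node:stream');

(async () => {
  console.log(await Readable.from(['a', 'b', 'c']).toArray()); // [ 'a', 'b', 'c' ]
})();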
v17.5.0
The readable.unpipe() method detaches a Writable stream previously attached using the pipe method.
If the destination is not specified, then all pipes are detached.
If the destination is specified, but no pipe is set up for it, then the method does nothing.
const fs = require('node:fs');
const readable = getReadableStreamSomehow();
const writable = fs.createWriteStream('file.txt');
// All the data from readable goes into 'file.txt',
// but only for the first second.
readable.pipe(writable);
setTimeout(() => {
console.log('Stop writing to file.txt.');
readable.unpipe(writable);
console.log('Manually close the file stream.');
writable.end();
}, 1000);
Optional
destination:
WritableStream
Optional specific stream to unpipe
v0.9.4
Passing chunk as null signals the end of the stream (EOF) and behaves the same as readable.push(null), after which no more data can be written. The EOF signal is put at the end of the buffer and any buffered data will still be flushed.
The readable.unshift() method pushes a chunk of data back into the internal buffer. This is useful in certain situations where a stream is being consumed by code that needs to "un-consume" some amount of data that it has optimistically pulled out of the source, so that the data can be passed on to some other party.
The stream.unshift(chunk) method cannot be called after the 'end' event has been emitted or a runtime error will be thrown.
Developers using stream.unshift() often should consider switching to use of a Transform stream instead. See the API for stream implementers section for more information.
// Pull off a header delimited by \n\n.
// Use unshift() if we get too much.
// Call the callback with (error, header, stream).
const { StringDecoder } = require('node:string_decoder');
function parseHeader(stream, callback) {
  stream.on('error', callback);
  stream.on('readable', onReadable);
  const decoder = new StringDecoder('utf8');
  let header = '';
  function onReadable() {
    let chunk;
    while (null !== (chunk = stream.read())) {
      const str = decoder.write(chunk);
      if (str.includes('\n\n')) {
        // Found the header boundary.
        const split = str.split(/\n\n/);
        header += split.shift();
        const remaining = split.join('\n\n');
        const buf = Buffer.from(remaining, 'utf8');
        stream.removeListener('error', callback);
        // Remove the 'readable' listener before unshifting.
        stream.removeListener('readable', onReadable);
        if (buf.length)
          stream.unshift(buf);
        // Now the body of the message can be read from the stream.
        callback(null, header, stream);
        return;
      }
      // Still reading the header.
      header += str;
    }
  }
}
Unlike push, stream.unshift(chunk) will not end the reading process by resetting the internal reading state of the stream. This can cause unexpected results if readable.unshift() is called during a read (i.e. from within a _read implementation on a custom stream). Following the call to readable.unshift() with an immediate push will reset the reading state appropriately, however it is best to simply avoid calling readable.unshift() while in the process of performing a read.
Chunk of data to unshift onto the read queue. For streams not operating in object mode, chunk must be a string, Buffer, Uint8Array, or null. For object mode streams, chunk may be any JavaScript value.
Optional
encoding:
BufferEncoding
Encoding of string chunks. Must be a valid
Buffer
encoding, such as
'utf8'
or
'ascii'
.
v0.9.11
Prior to Node.js 0.10, streams did not implement the entire node:stream module API as it is currently defined. (See Compatibility for more information.)
When using an older Node.js library that emits 'data' events and has a pause method that is advisory only, the readable.wrap() method can be used to create a Readable stream that uses the old stream as its data source.
It will rarely be necessary to use readable.wrap() but the method has been provided as a convenience for interacting with older Node.js applications and libraries.
const { OldReader } = require('./old-api-module.js');
const { Readable } = require('node:stream');
const oreader = new OldReader();
const myReader = new Readable().wrap(oreader);
myReader.on('readable', () => {
  myReader.read(); // etc.
});
An "old style" readable stream
v0.9.4
Static Experimental addAbortListener
Listens once to the abort event on the provided signal.
Listening to the abort event on abort signals is unsafe and may lead to resource leaks since another third party with the signal can call e.stopImmediatePropagation(). Unfortunately Node.js cannot change this since it would violate the web standard. Additionally, the original API makes it easy to forget to remove listeners.
This API allows safely using AbortSignals in Node.js APIs by solving these two issues by listening to the event such that stopImmediatePropagation does not prevent the listener from running.
Returns a disposable so that it may be unsubscribed from more easily.
import { addAbortListener } from 'node:events';
function example(signal) {
  let disposable;
  try {
    signal.addEventListener('abort', (e) => e.stopImmediatePropagation());
    disposable = addAbortListener(signal, (e) => {
      // Do something when signal is aborted.
    });
  } finally {
    disposable?.[Symbol.dispose]();
  }
}
Disposable that removes the abort listener.
v20.5.0
Static from
Static fromWeb
Static getEventListeners
Returns a copy of the array of listeners for the event named
eventName
.
For EventEmitters this behaves exactly the same as calling .listeners on the emitter.
For EventTargets this is the only way to get the event listeners for the event target. This is useful for debugging and diagnostic purposes.
import { getEventListeners, EventEmitter } from 'node:events';
{
const ee = new EventEmitter();
const listener = () => console.log('Events are fun');
ee.on('foo', listener);
console.log(getEventListeners(ee, 'foo')); // [ [Function: listener] ]
}
{
const et = new EventTarget();
const listener = () => console.log('Events are fun');
et.addEventListener('foo', listener);
console.log(getEventListeners(et, 'foo')); // [ [Function: listener] ]
}
v15.2.0, v14.17.0
Static getMaxListeners
Returns the currently set max amount of listeners.
For EventEmitters this behaves exactly the same as calling .getMaxListeners on the emitter.
For EventTargets this is the only way to get the max event listeners for the event target. If the number of event handlers on a single EventTarget exceeds the max set, the EventTarget will print a warning.
import { getMaxListeners, setMaxListeners, EventEmitter } from 'node:events';
{
const ee = new EventEmitter();
console.log(getMaxListeners(ee)); // 10
setMaxListeners(11, ee);
console.log(getMaxListeners(ee)); // 11
}
{
const et = new EventTarget();
console.log(getMaxListeners(et)); // 10
setMaxListeners(11, et);
console.log(getMaxListeners(et)); // 11
}
v19.9.0
Static isDisturbed
Static listenerCount
A class method that returns the number of listeners for the given eventName registered on the given emitter.
import { EventEmitter, listenerCount } from 'node:events';
const myEmitter = new EventEmitter();
myEmitter.on('event', () => {});
myEmitter.on('event', () => {});
console.log(listenerCount(myEmitter, 'event'));
// Prints: 2
The emitter to query
The event name
v0.9.12
Since v3.2.0 - Use listenerCount
instead.
Static
on
import { on, EventEmitter } from 'node:events';
import process from 'node:process';
const ee = new EventEmitter();
// Emit later on
process.nextTick(() => {
ee.emit('foo', 'bar');
ee.emit('foo', 42);
});
for await (const event of on(ee, 'foo')) {
// The execution of this inner block is synchronous and it
// processes one event at a time (even with await). Do not use
// if concurrent execution is required.
console.log(event); // prints ['bar'] [42]
}
// Unreachable here
Returns an AsyncIterator that iterates eventName events. It will throw if the EventEmitter emits 'error'. It removes all listeners when exiting the loop. The value returned by each iteration is an array composed of the emitted event arguments.
An AbortSignal
can be used to cancel waiting on
events:
import { on, EventEmitter } from 'node:events';
import process from 'node:process';
const ac = new AbortController();
(async () => {
const ee = new EventEmitter();
// Emit later on
process.nextTick(() => {
ee.emit('foo', 'bar');
ee.emit('foo', 42);
});
for await (const event of on(ee, 'foo', { signal: ac.signal })) {
// The execution of this inner block is synchronous and it
// processes one event at a time (even with await). Do not use
// if concurrent execution is required.
console.log(event); // prints ['bar'] [42]
}
// Unreachable here
})();
process.nextTick(() => ac.abort());
The name of the event being listened for
Optional
options:
StaticEventEmitterOptions
An AsyncIterator that iterates eventName events emitted by the emitter
v13.6.0, v12.16.0
Static
once
Creates a Promise that is fulfilled when the EventEmitter emits the given event or that is rejected if the EventEmitter emits 'error' while waiting. The Promise will resolve with an array of all the arguments emitted to the given event.
This method is intentionally generic and works with the web platform EventTarget interface, which has no special 'error' event semantics and does not listen to the 'error' event.
import { once, EventEmitter } from 'node:events';
import process from 'node:process';
const ee = new EventEmitter();
process.nextTick(() => {
ee.emit('myevent', 42);
});
const [value] = await once(ee, 'myevent');
console.log(value);
const err = new Error('kaboom');
process.nextTick(() => {
ee.emit('error', err);
});
try {
await once(ee, 'myevent');
} catch (err) {
console.error('error happened', err);
}
The special handling of the 'error' event is only used when events.once() is used to wait for another event. If events.once() is used to wait for the 'error' event itself, then it is treated as any other kind of event without special handling:
import { EventEmitter, once } from 'node:events';
const ee = new EventEmitter();
once(ee, 'error')
.then(([err]) => console.log('ok', err.message))
.catch((err) => console.error('error', err.message));
ee.emit('error', new Error('boom'));
// Prints: ok boom
An AbortSignal
can be used to cancel waiting
for the event:
import { EventEmitter, once } from 'node:events';
const ee = new EventEmitter();
const ac = new AbortController();
async function foo(emitter, event, signal) {
try {
await once(emitter, event, { signal });
console.log('event emitted!');
} catch (error) {
if (error.name === 'AbortError') {
console.error('Waiting for the event was canceled!');
} else {
console.error('There was an error', error.message);
}
}
}
foo(ee, 'foo', ac.signal);
ac.abort(); // Abort waiting for the event
ee.emit('foo'); // Prints: Waiting for the event was canceled!
Optional
options:
StaticEventEmitterOptions
v11.13.0, v10.16.0
Optional
options:
StaticEventEmitterOptions
Static setMaxListeners
import { setMaxListeners, EventEmitter } from 'node:events';
const target = new EventTarget();
const emitter = new EventEmitter();
setMaxListeners(5, target, emitter);
Optional
n:
number
A non-negative number. The maximum number of listeners
per EventTarget
event.
Rest
...eventTargets:
(EventEmitter<DefaultEventMap> | _DOMEventTarget)[]
v15.4.0
Static toWeb
The message.aborted property will be true if the request has been aborted.