Add async iterator support#301

Draft
danmactough wants to merge 1 commit into master from modern-stream
Conversation

@danmactough
Owner

Summary

  • Adds Symbol.asyncIterator to FeedParser.prototype so instances can be consumed directly with for await...of
  • Updates ESLint config to allow ES2018 syntax (async function*, for await...of)
  • Adds TypeScript declaration for the new method
  • Adds a passing test in test/examples.js

How it works

A Node.js Readable stream and an async iterator are two different ways of consuming sequential data. The stream world is push-based and event-driven — data arrives and fires events. The async iterator world is pull-based — a consumer asks "give me the next item" and waits.

readable-stream v2 only knows about events. To use for await...of, we need to bridge between them.

Why for await...of needs Symbol.asyncIterator

When JavaScript sees:

for await (const item of feedparser) { ... }

it looks for feedparser[Symbol.asyncIterator](). That method must return an object with a next() method that returns a Promise resolving to { value, done }. An async generator function (async function*) automatically produces exactly such an object; the language handles the plumbing.
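As a minimal sketch of that protocol (the `counter` object here is hypothetical, not FeedParser), any object whose Symbol.asyncIterator method is an async generator can be consumed with for await...of:

```javascript
// A minimal illustration: an object whose Symbol.asyncIterator method
// is an async generator satisfies the async iteration protocol.
const counter = {
  async *[Symbol.asyncIterator]() {
    for (let i = 1; i <= 3; i++) {
      yield i; // each yield becomes one { value, done: false } result
    }
  }
};

async function collect() {
  const seen = [];
  for await (const n of counter) { // calls counter[Symbol.asyncIterator]()
    seen.push(n);
  }
  return seen; // [1, 2, 3]
}
```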

The bridge: how the async generator works

Step 1 — Drain what's already buffered:

while ((item = this.read()) !== null) {
  yield item;
}

The stream buffers items internally. read() returns them synchronously if they're already there. yield hands each one back to the for await consumer.

Step 2 — Wait when the buffer is empty:

await new Promise(function (r) { resolve = r; });

When read() returns null, there's nothing buffered yet. We create a Promise and store its resolver in the outer resolve variable — then await it. The generator is now suspended, releasing the event loop.

Step 3 — Events wake it up:

function onReadable() {
  if (resolve) { resolve(); resolve = null; }
}

When the stream has more data, it fires readable. Our handler calls the stored resolver, fulfilling the Promise and waking the suspended generator, which loops back to read() and drains again. The same pattern handles end (sets ended = true, wakes the generator, which breaks the loop) and error (stores the error, wakes the generator, which throws it).
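The Step 2 / Step 3 handoff can be seen in isolation with a small sketch; `waiter` is a hypothetical stand-in for the generator, not code from the PR:

```javascript
// The stored-resolver handoff in isolation: one side suspends on a
// Promise, the other side fulfills it later via the shared variable.
let resolve = null;

async function waiter() {
  // Step 2: create a Promise, stash its resolver, and suspend on it.
  await new Promise(function (r) { resolve = r; });
  return 'woken';
}

const pending = waiter(); // runs up to the await, then suspends

// Step 3: an event handler later calls the stored resolver, waking it.
if (resolve) { resolve(); resolve = null; }
```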

Step 4 — Cleanup:

} finally {
  this.removeListener('readable', onReadable);
  // ...
}

If the consumer breaks early (break inside for await), the language calls the generator's return() method, which runs the finally block before the generator finishes. Without this cleanup, the event listeners would leak: the stream would hold references to handlers that call a stale resolver.

Why not upgrade readable-stream?

readable-stream v3+ implements this same bridge internally on every Readable. But upgrading would change the stream implementation for everyone using this library — that's a behavioral change we don't control. Implementing it directly on FeedParser.prototype adds the bridge on top of the existing, stable stream behavior, touching nothing underneath.

Usage

var FeedParser = require('feedparser');
var fs = require('fs');

var feedparser = new FeedParser();
fs.createReadStream('./feed.xml').pipe(feedparser);

(async function () {
  for await (const item of feedparser) {
    console.log(item.title);
  }
})();

(for await is only valid inside an async function, and CommonJS has no top-level await, so the loop is wrapped in an async IIFE.)

🤖 Generated with Claude Code

FeedParser instances can now be consumed directly with for await...of.
The implementation bridges the push-based stream event model to the
pull-based async iteration protocol using an async generator, without
changing the underlying readable-stream v2 implementation.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
