
How to create streams from a string in Node.js?

javascript
stream-engineering
node-js
stream-handling
by Anton Shumikhin · Feb 16, 2025
TLDR

Here's the quick solution to convert a string to a stream in Node.js with Readable.from:

```javascript
const { Readable } = require('stream');

const myStream = Readable.from('Hello, World!');

myStream.on('data', (chunk) => {
  console.log(chunk.toString()); // Outputs: Hello, World!
  // Yeah, I know, we're living in a world run by JavaScript, wild right?
});
```

This single expression harnesses the power of Readable.from to morph a string into a stream.

For Node.js versions before 12.x, where Readable.from isn't available, you can replicate the above with a homemade recipe using the Readable class:

```javascript
const { Readable } = require('stream');

const stringToStream = (str) => {
  const stream = new Readable({
    read() {} // Yes, it does nothing. Sitting idle is also an art.
  });
  stream.push(str);
  stream.push(null); // We're calling it a day. Time to go home.
  return stream;
};

const myStream = stringToStream('Hello, World!');
myStream.on('data', (chunk) => {
  console.log(chunk.toString()); // And, there you have it!
});
```

You will need to implement the `_read` method (supplied here as the `read` option in the constructor). Even if it's a no-op, it satisfies the `Readable` contract.

Making it work and doing it right

The simpler method with PassThrough

Here's an alternative – using PassThrough streams for a simpler solution:

```javascript
const { PassThrough } = require('stream');

const stringToStream = (str) => {
  const passThrough = new PassThrough();
  passThrough.write(str);
  passThrough.end();
  return passThrough;
};

const myStream = stringToStream('Hello, World!');

// Yes, you can still pipe it
myStream.pipe(process.stdout); // See, I told ya!
```

A PassThrough stream is a Transform that passes data through unchanged, which makes it a simple way to get a readable stream when you don't need any transformation logic.

Handling stream data with the data event

When working with streams formed from strings, listening to the 'data' event is crucial:

```javascript
myStream.on('data', (chunk) => {
  process.stdout.write(chunk.toString()); // Chunk by chunk, just like how I eat my favorite cookie.
});
```

The 'data' event handler ensures you process each chunk of your string stream as it arrives.

Pipe it to optimize data flow

To keep data flowing smoothly, go for stream piping:

```javascript
myStream.pipe(someWritableStream); // You don't need to stand in line, there's a pipe for you.
```

Piping improves data flow, perfect for writing to files, network responses, or interfacing with other streams.

Look before you pass

Before passing your newly created stream to libraries like ya-csv, make sure it's in flowing mode:

```javascript
myStream.resume(); // Off pause and play. Kinda reminds you of your old cassette player, doesn't it?
```

This ensures the library actually receives the stream's data instead of waiting on a paused stream.

Handle with care

Wake up a sleeping stream

Paused? No worries, you can always wake up a paused stream:

```javascript
// What happens when your stream decides to take a nap?
myStream.on('pause', () => {
  console.log('Stream is paused, resuming...');
  myStream.resume(); // Hey Stream, wake up! We have work to do.
});
```

Resuming ensures the data journeys on, and your processing doesn't stand still.

Sign off in style

When preparing a stream using the Readable class, don't forget to sign off:

```javascript
stream.push('Your data here');
stream.push(null); // Okay folks, that's all we have for today.
```

The `push(null)` call acts as the EOF (End-Of-File) signal, informing consumers that the stream is done.