
Retrieve data from a ReadableStream object?

javascript
fetch-api
readablestream
stream-parsing
Anton Shumikhin · Jan 4, 2025
TLDR

Get data from a ReadableStream using getReader() and repeated read() calls:

```javascript
const reader = stream.getReader();
const chunks = [];
const decoder = new TextDecoder();

reader.read().then(function processData({ done, value }) {
  if (done) {
    console.log('Stream closed, all chunks received', decoder.decode(concat(chunks)));
    return;
  }
  chunks.push(value); // Chunks! 'Natural habitat' of stream busters.
  reader.read().then(processData);
});

// Quick concat helper
function concat(chunks) {
  return chunks.reduce(
    (acc, chunk) => new Uint8Array([...acc, ...chunk]),
    new Uint8Array()
  );
}
```

This code calls read() repeatedly, collects the data chunks, then decodes and logs the entire text.

Utilize Fetch API with ReadableStream

Fetch from APIs with fetch(), then parse JSON data in style with response.json() or grab text data with response.text(). To flex your binary data muscles, use the ever-powerful response.arrayBuffer().

```javascript
fetch('https://api.example.com/data')
  .then(response => response.json()) // "JSON, assemble!"
  .then(data => console.log(data))
  .catch(error => console.error('Oops, watch your step!', error));

// Non-JSON data
fetch('https://api.example.com/data')
  .then(response => response.text()) // "I'm a textual being."
  .then(text => console.log(text))
  .catch(error => console.error('Oops, watch your step!', error));

// Binary data fun
fetch('https://api.example.com/binary-data')
  .then(response => response.arrayBuffer()) // "Binary, roll out!"
  .then(buff => console.log(new Uint8Array(buff)))
  .catch(error => console.error('Oops, watch your step!', error));
```

Psst... all calls to response.json(), response.text(), and response.arrayBuffer() automatically concatenate and decode the stream for you. Handy, right?
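One catch: a response body is a one-shot stream, so it can only be consumed once. If you need the same body in two formats, call clone() before reading. A minimal sketch, using a locally constructed Response to stand in for a fetch() result:

```javascript
// A Response body is a one-shot stream: clone() before consuming it twice.
// `new Response(...)` stands in here for what fetch() would return.
const response = new Response('{"greeting": "hello"}');
const copy = response.clone(); // clone before either body is read

Promise.all([response.json(), copy.text()])
  .then(([data, rawText]) => {
    console.log('Parsed:', data.greeting); // "hello"
    console.log('Raw body:', rawText);     // '{"greeting": "hello"}'
  })
  .catch(error => console.error('Body already consumed?', error));
```

Reading the same response twice without clone() throws a TypeError, because the underlying stream is already locked and drained.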

Manual ReadableStream parsing for the brave

Manual parsing of a ReadableStream is as thrilling as a blockbuster movie. Use getReader() and a stream-reading loop to act out every chunk's drama. TextDecoder is your best buddy here: use it to decode the chunks as they arrive and concatenate the results.

```javascript
async function streamToString(stream) {
  const reader = stream.getReader();
  const decoder = new TextDecoder(); // Decoder - the unsung hero
  let result = '';
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    result += decoder.decode(value, { stream: true });
  }
  result += decoder.decode(); // A final toast to the stream
  return result;
}

// Casting time! If data is JSON
streamToString(yourReadableStream)
  .then(str => JSON.parse(str))
  .then(data => console.log('And... Action!', data))
  .catch(error => console.error('Cut, cut, cut!', error));
```

Taming errors and the less-traveled path

Catch potential bugs in action using catch() (always implement error handling, no exceptions!). Swap in a different decoding approach or roll a custom parser to support non-standard data types or encodings.

```javascript
fetch('https://api.example.com/data')
  .then(response => {
    if (!response.ok) {
      throw new Error('Eagle down, I repeat, eagle is down!');
    }
    return response.json();
  })
  .catch(error => {
    // Catch and release...
    console.error('Caught one, boss:', error);
  });
```

Node.js follows a different stream flow: it provides the stream module and the newer stream/consumers API to soak up entire streams with minimal ceremony.

For scenario-specific data transformations, reach for the TransformStream class or pipeline streams with helper libraries.
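As a sketch of the first option, a TransformStream rewrites chunks as they pass through, and pipeThrough() wires it between a readable source and your reader (Web Streams API, available in browsers and Node.js 18+):

```javascript
// A TransformStream that upper-cases each text chunk in flight.
const upperCaser = new TransformStream({
  transform(chunk, controller) {
    controller.enqueue(chunk.toUpperCase());
  }
});

// A small source stream that pushes two chunks through the transform.
const source = new ReadableStream({
  start(controller) {
    controller.enqueue('hello ');
    controller.enqueue('stream');
    controller.close();
  }
});

async function run() {
  const reader = source.pipeThrough(upperCaser).getReader();
  let result = '';
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    result += value;
  }
  console.log(result); // "HELLO STREAM"
}

run();
```

The same pattern works on fetch() responses: response.body.pipeThrough(yourTransform) gives you a transformed ReadableStream to read as usual.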