CSVTransform for Node.js


Node.js Stream API

The main module exported by the package is a native Node.js Transform stream. Transform streams implement both the Readable and Writable interfaces.

This is the recommended approach if you need maximum power and flexibility. It ensures scalability by treating your data as a stream from the source to the destination, and it supports all the available options.

The signature is const stream = transform([records], [options], handler, [callback]).

Both a readable and writable stream

In the stream example, records in the form of arrays are sent through the write function, and the transformed records are obtained within the "readable" event by calling the read function.

import { transform } from 'stream-transform';
import assert from 'assert';

const output = [];
// Initialize the transformer
const transformer = transform(function(data){
  data.push(data.shift());
  return data;
});
// Use the readable stream api to consume transformed records
transformer.on('readable', function(){
  let row; while((row = transformer.read()) !== null){
    output.push(row);
  }
});
// Catch any error
transformer.on('error', function(err){
  console.error(err.message);
});
// When finished, validate the records with the expected value
transformer.on('finish', function(){
  assert.deepEqual(output, [
    [ '2', '3', '4', '1' ],
    [ 'b', 'c', 'd', 'a' ]
  ]);
});
// Write records to the stream
transformer.write(['1','2','3','4']);
transformer.write(['a','b','c','d']);
// Close the writable stream
transformer.end();

This example is available with the command node samples/api.stream.js.