Stream API

The main module exported by this package implements a native Node.js Transform stream. Transform streams implement both the Readable and Writable interfaces.

This is the recommended approach when you need maximum power and flexibility. It ensures scalability by treating your data as a stream from the source to the destination.

The signature is const stream = parse([options]).

Both a readable and writable stream

In the stream example, CSV data is sent through the write function and the resulting records are retrieved inside the readable event handler by calling the read function.

const parse = require('csv-parse')
const assert = require('assert')
const output = []
// Create the parser
const parser = parse({
  delimiter: ':'
})
// Use the readable stream api
parser.on('readable', function(){
  let record
  while (record = parser.read()) {
    output.push(record)
  }
})
// Catch any error
parser.on('error', function(err){
  console.error(err.message)
})
// When we are done, test that the parsed output matches what is expected
parser.on('end', function(){
  assert.deepEqual(
    output,
    [
      [ 'root','x','0','0','root','/root','/bin/bash' ],
      [ 'someone','x','1022','1022','','/home/someone','/bin/bash' ]
    ]
  )
})
// Write data to the stream
parser.write('root:x:0:0:root:/root:/bin/bash\n')
parser.write('someone:x:1022:1022::/home/someone:/bin/bash\n')
// Close the readable stream
parser.end()

Using the pipe function

One useful function of the Stream API is pipe, which connects multiple streams together. You may use this function to connect a stream.Readable source to a stream.Writable destination.

The pipe example generates CSV records, parses them, transforms each record and prints the result to the standard output.

This example is available with the command node samples/api.pipe.js.

const parse = require('csv-parse')
const generate = require('csv-generate')
const transform = require('stream-transform')

// Generate 20 records of CSV data
const generator = generate({
  length: 20
})
// Create the parser
const parser = parse({
  delimiter: ':'
})
// Transform each record, handling up to 5 records in parallel
const transformer = transform(function(record, callback){
  setTimeout(function(){
    callback(null, record.join(' ')+'\n')
  }, 500)
}, {
  parallel: 5
})
// Connect the streams: generator -> parser -> transformer -> stdout
generator.pipe(parser).pipe(transformer).pipe(process.stdout)