CSVParse for Node.js


Node.js Stream API

The main module exported by this package implements a native Node.js transform stream. Transform streams implement both the Readable and Writable interfaces.

This is the recommended approach when you need maximum power and flexibility. The stream API might not be the most pleasant API to use, but it ensures scalability by treating your data as a stream from the source to the destination.

The signature is const stream = parse([options]).

Both a readable and writable stream

In the following example, CSV data is sent to the parser through the write function, and the parsed records are obtained inside the readable event handler by calling the read function.

import assert from 'node:assert';
import { parse } from 'csv-parse';

const records = [];
// Initialize the parser
const parser = parse({
  delimiter: ':'
});
// Use the readable stream API to consume records
parser.on('readable', function(){
  let record;
  while ((record = parser.read()) !== null) {
    records.push(record);
  }
});
// Catch any error
parser.on('error', function(err){
  console.error(err.message);
});
// Test that the parsed records match the expected records
parser.on('end', function(){
  assert.deepStrictEqual(
    records,
    [
      [ 'root','x','0','0','root','/root','/bin/bash' ],
      [ 'someone','x','1022','1022','','/home/someone','/bin/bash' ]
    ]
  );
});
// Write data to the stream
parser.write('root:x:0:0:root:/root:/bin/bash\n');
parser.write('someone:x:1022:1022::/home/someone:/bin/bash\n');
// Close the writable side of the stream
parser.end();

Using the pipe function

The stream API is extensive, and connecting multiple streams together can be a complex task for newcomers. The pipe function, part of the stream API, does just that: it connects a readable stream to a writable one. The pipe recipe explains its usage and provides an example.
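As a minimal sketch, here is what piping a file into the parser might look like. The input.csv file name, the colon delimiter (reused from the example above), and the console logging are assumptions made for this illustration, not requirements of the library.

import fs from 'node:fs';
import { parse } from 'csv-parse';

// Pipe the file content into the parser
fs.createReadStream('input.csv') // assumed file name
  .pipe(parse({ delimiter: ':' }))
  // Each data event carries one parsed record
  .on('data', (record) => {
    console.log(record);
  })
  // Catch any error
  .on('error', (err) => {
    console.error(err.message);
  })
  // The end event fires once every record has been emitted
  .on('end', () => {
    console.log('Done parsing.');
  });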

About

The Node.js CSV project is an open source product hosted on GitHub and developed by Adaltas.