Commit
feat(csv): new api and examples section
wdavidw committed Jan 9, 2024
1 parent 7321894 commit 93a47e0
Showing 10 changed files with 113 additions and 42 deletions.
2 changes: 1 addition & 1 deletion src/md/parse/examples/async_iterator.md
@@ -1,7 +1,7 @@
---
title: Async iterator
description: CSV Parse - how to use ES6 async iterator to traverse your records.
keywords: ['csv', 'parse', 'parser', 'recipe', 'async', 'iterator', 'stream', 'pipe', 'read', 'promise']
keywords: ['csv', 'parse', 'parser', 'example', 'recipe', 'async', 'iterator', 'stream', 'pipe', 'read', 'promise']
---

# Async iterator
11 changes: 8 additions & 3 deletions src/md/parse/examples/file_interaction.md
@@ -1,18 +1,23 @@
---
title: File system interaction
description: Read and write UTF-8 CSV files
keywords: ['csv', 'parse', 'parser', 'recipe', 'file', 'fs', 'read', 'write', 'utf8', 'utf-8', 'bom']
keywords: ['csv', 'parse', 'parser', 'example', 'recipe', 'file', 'fs', 'read', 'write', 'utf8', 'utf-8', 'bom']
---

# File system interaction

This recipe illustrates how to read and write to a UTF-8 file with a byte order mark (BOM).
This page provides two recipes to illustrate how:

- Using the `sync` API to read and write to a UTF-8 file with a byte order mark (BOM)
- Using the `sync` API to read an alternate encoding

The native Node.js File System module named `fs` is used to read the content of a file. The parser doesn't provide any file access method; that is not its responsibility. Using the native `fs` module in conjunction with `csv-parse` is easy and natural.

You must first choose the right API. This package exposes multiple APIs, all backed by the same parsing algorithm and supporting the same options. Choosing one API over another is beyond the scope of this page; the topic is documented in the [API section](/parse/api/).

The easiest way is using the sync API. You read the file and get its content. You then inject this content into the parser and get the result as an array of records. Records may be printed to the console and written to a file, one JSON record per line. The [final code](https://github.com/adaltas/node-csv/blob/master/packages/csv-parse/samples/recipe.file.js) looks like:
## Using the `sync` API

The easiest way is using the [sync API](/parse/api/sync/). You read the file and get its content. Then, you inject this content into the parser and get the result as an array of records. Records may be printed to the console and written to a file, one JSON record per line. The [`bom` option](/parse/options/bom/) detects and removes the BOM if one is present in the data source. The [final code](https://github.com/adaltas/node-csv/blob/master/packages/csv-parse/samples/recipe.file.js) looks like:

`embed:packages/csv-parse/samples/recipe.file.js`
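
The embedded sample is not reproduced here. As a rough sketch of the same approach (read, parse, print, and write one JSON record per line), where the file names and the `columns` option are illustrative assumptions rather than part of the original sample:

```js
import fs from 'node:fs';
import { parse } from 'csv-parse/sync';

// Read the raw file content; the `bom` option strips a leading BOM if present
const data = fs.readFileSync('input.csv', 'utf8'); // hypothetical file name
const records = parse(data, { bom: true, columns: true });

// Print each record, then write them all to a file, one JSON record per line
for (const record of records) console.log(record);
fs.writeFileSync(
  'output.jsonl', // hypothetical file name
  records.map((record) => JSON.stringify(record)).join('\n')
);
```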

2 changes: 1 addition & 1 deletion src/md/parse/examples/promises.md
@@ -1,7 +1,7 @@
---
title: Promises usage
description: CSV Parse - how to use promises with the latest Nodejs Stream API.
keywords: ['csv', 'parse', 'parser', 'recipe', 'promises', 'stream', 'pipe', 'read', 'async']
keywords: ['csv', 'parse', 'parser', 'example', 'recipe', 'promises', 'stream', 'pipe', 'read', 'async']
---

# Promises usage
4 changes: 2 additions & 2 deletions src/md/parse/examples/stream_pipe.md
@@ -1,7 +1,7 @@
---
title: Stream pipe
description: CSV Parse - stream, callback and sync APIs
keywords: ['csv', 'parse', 'parser', 'recipe', 'stream', 'sync', 'pipe', 'read', 'write']
description: CSV Parse - learn how to leverage the Node.js stream pipe API with CSV
keywords: ['csv', 'parse', 'parser', 'example', 'recipe', 'stream', 'sync', 'pipe', 'read', 'write']
---

# Using pipe to connect multiple streams
24 changes: 24 additions & 0 deletions src/md/project/api.md
@@ -0,0 +1,24 @@
---
title: API
description: CSV - stream, callback and sync APIs
keywords: ['csv', 'parse', 'parser', 'api', 'callback', 'stream', 'sync', 'promise']
sort: 3
---

# CSV API

There are multiple APIs and styles available, each with its own advantages and disadvantages. Under the hood, they are all based on the same implementation.

* [Sync API](/parse/api/sync/)
The sync API provides simplicity, readability and convenience. Like the callback API, it is meant for small datasets which fit in memory.
* [Stream API](/parse/api/stream/)
The stream API might not be the most pleasant API to use but is scalable.
* [Callback API](/parse/api/callback/)
The callback API buffers all the data emitted by the stream API into a single object which is passed to a user-provided function. Passing a function is easier than implementing the stream event handlers, but it implies that the whole dataset must fit into the available memory, and it will only be available after the last record has been processed. This is usually not recommended; use the sync API instead. The three flavors are contrasted in the sketch below.
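
As a quick, non-authoritative sketch contrasting the three flavors on the same input (the input string is illustrative):

```js
import { parse } from 'csv';
import { parse as parseSync } from 'csv/sync';

const input = 'a,b\n1,2\n3,4\n'; // illustrative input

// Sync: the whole result is returned at once
const records = parseSync(input);
console.log(records);

// Callback: the buffered result is passed to a user-provided function
parse(input, (err, records) => {
  if (err) throw err;
  console.log(records);
});

// Stream: records are consumed one by one, as they are parsed
const parser = parse();
parser.on('readable', () => {
  let record;
  while ((record = parser.read()) !== null) console.log(record);
});
parser.write(input);
parser.end();
```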

For additional usages and examples, you may refer to:

* [the parser API pages](/parse/api/),
* [the stringifier API pages](/stringify/api/),
* [the "samples" folder](https://github.com/adaltas/node-csv/tree/master/packages/csv/samples),
* [the "test" folder](https://github.com/adaltas/node-csv/tree/master/packages/csv/test).
12 changes: 12 additions & 0 deletions src/md/project/api/callback.md
@@ -0,0 +1,12 @@
---
title: Callback
description: CSV - learn how to use the callback API
keywords: ['csv', 'parse', 'parser', 'example', 'recipe', 'stream', 'async', 'pipe', 'read', 'write']
sort: 3.2
---

# Callback API

Also available in the `csv` module is the callback API. The whole dataset is available in the second callback argument. Thus, it will not scale with large datasets. The [callback example](https://github.com/adaltas/node-csv/blob/master/packages/csv/samples/callback.js) initializes each CSV function sequentially, with the output of one feeding the input of the next. Note that, for the sake of clarity, the example doesn't deal with error management. It is already enough spaghetti code.

`embed:packages/csv/samples/callback.js`
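
A condensed sketch of that sequential chaining, with illustrative options and, like the original sample, no error management:

```js
import { generate, parse, transform, stringify } from 'csv';

// Each function passes its output to the next one through its callback
generate({ seed: 1, columns: 2, length: 5 }, (err, data) => {
  parse(data, (err, records) => {
    transform(records, (record) => record.map((value) => value.toUpperCase()), (err, output) => {
      stringify(output, (err, str) => {
        process.stdout.write(str);
      });
    });
  });
});
```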
24 changes: 24 additions & 0 deletions src/md/project/api/stream.md
@@ -0,0 +1,24 @@
---
title: Stream
description: CSV - learn how to leverage the Node.js stream pipe API with CSV
keywords: ['csv', 'parse', 'parser', 'example', 'recipe', 'stream', 'async', 'pipe', 'read', 'write']
sort: 3.1
---

# Node.js stream API

The Node.js stream API is scalable and offers the greatest control over the data flow.

## Using the pipe API

Pipes in Node.js are a native functionality provided by the [stream API](https://nodejs.org/api/stream.html). They behave just like Unix pipes, where the output of one process, here a stream reader, is redirected as the input of the following process, here a stream writer.

The [pipe example](https://github.com/adaltas/node-csv/blob/master/packages/csv/samples/pipe.js) is quite readable while remaining scalable:

`embed:packages/csv/samples/pipe.js`
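
A minimal sketch of such a pipeline, with illustrative options:

```js
import { generate, parse, transform, stringify } from 'csv';

// The output of each stream feeds the input of the next one, like a Unix pipe
generate({ seed: 1, columns: 2, length: 5 })
  .pipe(parse())
  .pipe(transform((record) => record.map((value) => value.toUpperCase())))
  .pipe(stringify())
  .pipe(process.stdout);
```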

## Using the native stream functions

The native stream functions provide flexibility but come at the cost of being more verbose and harder to write. Data is consumed inside the `readable` event with the `stream.read` function. It is then written by calling the `stream.write` function. The [stream example](https://github.com/adaltas/node-csv/blob/master/packages/csv/samples/stream.js) illustrates how to initialize each package and how to plug them together.

`embed:packages/csv/samples/stream.js`
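
A hedged sketch of the event-based style, reduced to two stages with an illustrative input:

```js
import { parse, stringify } from 'csv';

const parser = parse();
const stringifier = stringify();

// Consume parsed records inside the `readable` event, then write them onward
parser.on('readable', () => {
  let record;
  while ((record = parser.read()) !== null) stringifier.write(record);
});
parser.on('end', () => stringifier.end());

stringifier.on('readable', () => {
  let row;
  while ((row = stringifier.read()) !== null) process.stdout.write(row);
});

parser.write('a,b\n1,2\n'); // illustrative input
parser.end();
```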
16 changes: 16 additions & 0 deletions src/md/project/api/sync.md
@@ -0,0 +1,16 @@
---
title: Sync
description: CSV - learn how to use the sync API
keywords: ['csv', 'parse', 'parser', 'example', 'recipe', 'stream', 'async', 'pipe', 'read', 'write']
sort: 3.3
---

# Sync API

The sync API behaves like a [pure function](https://en.wikipedia.org/wiki/Pure_function). For a given input, it always produces the same output.

Because of its simplicity, this is the recommended approach if you don't need scalability and if your dataset fits in memory.

The module to import is `csv/sync`. The [sync example](https://github.com/adaltas/node-csv/blob/master/packages/csv/samples/sync.js) illustrates its usage.

`embed:packages/csv/samples/sync.js`
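
A minimal sketch of the sync flavor, with an illustrative input and options:

```js
import { parse, stringify } from 'csv/sync';

// The same input always yields the same output
const records = parse('a,b\n1,2\n', { columns: true });
// records: [{ a: '1', b: '2' }]
const output = stringify(records, { header: true });
// output: 'a,b\n1,2\n'
```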
47 changes: 12 additions & 35 deletions src/md/project/examples.md
@@ -6,38 +6,15 @@ sort: 5

# CSV Examples

## Introduction

This package offers different API flavors. Every example is available on [GitHub](https://github.com/adaltas/node-csv/tree/master/samples).

## Using the stream API

The Node.js stream API is scalable and offers the greatest control over the data flow. It comes at the cost of being more verbose and harder to write. Data is consumed inside the `readable` event with the `stream.read` function. It is then written by calling the `stream.write` function. The [stream example](https://github.com/adaltas/node-csv/blob/master/packages/csv/samples/stream.js) illustrates how to initialize each package and how to plug them together.

`embed:packages/csv/samples/stream.js`

## Using the pipe API

Piping in Node.js is part of the stream API and behaves just like Unix pipes, where the output of one process, here a function, is redirected as the input of the following process. A [pipe example](https://github.com/adaltas/node-csv/blob/master/packages/csv/samples/pipe.funny.js) is provided with an unconventional syntax:

`embed:packages/csv/samples/pipe.funny.js`

A [more conventional pipe example](https://github.com/adaltas/node-csv/blob/master/packages/csv/samples/pipe.js) is:

`embed:packages/csv/samples/pipe.js`

## Using the callback API

Also available in the `csv` module is the callback API. The whole dataset is available in the second callback argument. Thus, it will not scale with large datasets. The [callback example](https://github.com/adaltas/node-csv/blob/master/packages/csv/samples/callback.js) initializes each CSV function sequentially, with the output of one feeding the input of the next. Note that, for the sake of clarity, the example doesn't deal with error management. It is already enough spaghetti code.

`embed:packages/csv/samples/callback.js`

## Using the sync API

The sync API behaves like a [pure function](https://en.wikipedia.org/wiki/Pure_function). For a given input, it always produces the same output.

Because of its simplicity, this is the recommended approach if you don't need scalability and if your dataset fits in memory.

The module to import is `csv/sync`. The [sync example](https://github.com/adaltas/node-csv/blob/master/packages/csv/samples/sync.js) illustrates its usage.

`embed:packages/csv/samples/sync.js`
Learn how to use the `csv` package with multiple examples.

* [File interaction](/project/examples/file_interaction/)
Read and write CSV content into a file.

Additionally, the [API](/parse/api/) and [options](/parse/options/) documentation comes with numerous examples.

Also, you may refer to the examples of each sub-project:

* [the parser examples](/parse/examples/),
* [the stringifier examples](/stringify/examples/),
* [the generator examples](/generate/examples/).
13 changes: 13 additions & 0 deletions src/md/project/examples/file_interaction.md
@@ -0,0 +1,13 @@
---
title: File system interaction
description: Read and write files with pipes
keywords: ['csv', 'example', 'recipe', 'file', 'fs', 'read', 'write']
---

# File system interaction

The native Node.js File System module named `fs` is used to read and write the content of a file.

This [file system recipe](https://github.com/adaltas/node-csv/blob/master/packages/csv/samples/example.fs.js) illustrates how to use the pipe function.

`embed:packages/csv/samples/example.fs.js`
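
A minimal sketch of piping between files, where the file names and options are illustrative assumptions:

```js
import fs from 'node:fs';
import { parse, stringify } from 'csv';

// Stream the source file through the parser and the stringifier into a new file
fs.createReadStream('input.csv') // hypothetical file name
  .pipe(parse({ columns: true }))
  .pipe(stringify({ header: true }))
  .pipe(fs.createWriteStream('output.csv')); // hypothetical file name
```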
