# Transports
Pino transports can be used for both transmitting and transforming log output.
The way Pino generates logs:
1. Reduces the impact of logging on an application to the absolute minimum.
2. Gives greater flexibility in how logs are processed and stored.
It is recommended that any log transformation or transmission is performed either
in a separate thread or a separate process.
Before Pino v7 transports would ideally operate in a separate process - these are
now referred to as [Legacy Transports](#legacy-transports).
From Pino v7 and upwards transports can also operate inside a [Worker Thread][worker-thread]
and can be used or configured via the options object passed to `pino` on initialization.
In this case the transports always operate asynchronously, and logs are
flushed as quickly as possible.
[worker-thread]: https://nodejs.org/dist/latest-v14.x/docs/api/worker_threads.html
## v7+ Transports
A transport is a module that exports a default function that returns a writable stream:
```js
import { createWriteStream } from 'node:fs'

export default (options) => {
  return createWriteStream(options.destination)
}
```
Let's imagine the above defines our "transport" as the file `my-transport.mjs`
(ESM files are supported even if the project is written in CJS).
We would set up our transport by creating a transport stream with `pino.transport`
and passing it to the `pino` function:
```js
const pino = require('pino')
const transport = pino.transport({
  target: '/absolute/path/to/my-transport.mjs'
})
pino(transport)
```
The transport code will be executed in a separate worker thread. The main thread
will write logs to the worker thread, which will write them to the stream returned
from the function exported from the transport file/module.
The exported function can also be async. An async function lets us throw early
if the destination stream could not be opened. For example:
```js
import fs from 'node:fs'
import { once } from 'node:events'

export default async (options) => {
  const stream = fs.createWriteStream(options.destination)
  await once(stream, 'open')
  return stream
}
```
While initializing the stream we can use `await` to perform asynchronous
operations; in this case, we wait for the write stream's `open` event.
Let's imagine the above was published to npm with the module name `some-file-transport`.
The `options.destination` value can be set when creating the transport stream with `pino.transport` like so:
```js
const pino = require('pino')
const transport = pino.transport({
  target: 'some-file-transport',
  options: { destination: '/dev/null' }
})
pino(transport)
```
Note here we've specified a module by package rather than by relative path. The options object we provide
is serialized and injected into the transport worker thread, then passed to the module's exported function.
This means that the options object can only contain types that are supported by the
[Structured Clone Algorithm][sca] which is used to (de)serialize objects between threads.
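For instance, a plain-data options object clones fine, while anything carrying a function does not. Node's global `structuredClone` (v17+) applies the same algorithm, so it can serve as a quick check; the snippet below is an illustrative sketch, not part of pino's API:

```javascript
// Illustration: transport options must survive structured cloning.
// structuredClone applies the same algorithm used to copy the options
// object into the worker thread.
const cloned = structuredClone({ destination: '/dev/null', append: false })

// Values the algorithm cannot clone (functions, sockets, etc.) are rejected.
let rejected = false
try {
  structuredClone({ format: (line) => line.toUpperCase() })
} catch (err) {
  rejected = true // DataCloneError
}
```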
What if we wanted to use both transports, but send only error logs to `my-transport.mjs` while
sending all logs to `some-file-transport`? We can use the `pino.transport` function's `level` option:
```js
const pino = require('pino')
const transport = pino.transport({
  targets: [
    { target: '/absolute/path/to/my-transport.mjs', level: 'error' },
    { target: 'some-file-transport', options: { destination: '/dev/null' } }
  ]
})
pino(transport)
```
If we're using custom levels, they should be passed in when using more than one transport.
```js
const pino = require('pino')
const transport = pino.transport({
  targets: [
    { target: '/absolute/path/to/my-transport.mjs', level: 'error' },
    { target: 'some-file-transport', options: { destination: '/dev/null' } }
  ],
  levels: { foo: 35 }
})
pino(transport)
```
It is also possible to use the `dedupe` option to send logs only to the stream with the highest matching level.
```js
const pino = require('pino')
const transport = pino.transport({
  targets: [
    { target: '/absolute/path/to/my-transport.mjs', level: 'error' },
    { target: 'some-file-transport', options: { destination: '/dev/null' } }
  ],
  dedupe: true
})
pino(transport)
```
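Conceptually, `dedupe` routes each log line only to the eligible target(s) with the highest level. The following is a simplified sketch of that routing rule, not pino's actual implementation; the default level for a target without one is assumed here to be the lowest:

```javascript
// Simplified illustration of dedupe routing (not pino's actual code):
// with dedupe enabled, a log line is delivered only to the target(s)
// with the highest level that is still at or below the line's level.
const levels = { trace: 10, debug: 20, info: 30, warn: 40, error: 50, fatal: 60 }

function routeDedupe (targets, lineLevel) {
  const eligible = targets.filter(t => levels[t.level || 'trace'] <= lineLevel)
  if (eligible.length === 0) return []
  const max = Math.max(...eligible.map(t => levels[t.level || 'trace']))
  return eligible.filter(t => levels[t.level || 'trace'] === max)
}

// An error line reaches only my-transport.mjs; an info line falls
// through to some-file-transport.
const targets = [
  { target: '/absolute/path/to/my-transport.mjs', level: 'error' },
  { target: 'some-file-transport' } // assumed to default to the lowest level
]
```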
For more details on `pino.transport` see the [API docs for `pino.transport`][pino-transport].
[pino-transport]: /docs/api.md#pino-transport
[sca]: https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API/Structured_clone_algorithm
<a id="writing"></a>
### Writing a Transport
The module [pino-abstract-transport](https://github.com/pinojs/pino-abstract-transport) provides
a simple utility to parse each line. Its usage is highly recommended.
You can see an example using an async iterator with ESM:
```js
import build from 'pino-abstract-transport'
import SonicBoom from 'sonic-boom'
import { once } from 'node:events'

export default async function (opts) {
  // SonicBoom is necessary to avoid loops with the main thread.
  // It is the same as pino.destination().
  const destination = new SonicBoom({ dest: opts.destination || 1, sync: false })
  await once(destination, 'ready')

  return build(async function (source) {
    for await (const obj of source) {
      const toDrain = !destination.write(obj.msg.toUpperCase() + '\n')
      // This block will handle backpressure
      if (toDrain) {
        await once(destination, 'drain')
      }
    }
  }, {
    async close (err) {
      destination.end()
      await once(destination, 'close')
    }
  })
}
```
or using Node.js streams and CommonJS:
```js
'use strict'

const build = require('pino-abstract-transport')
const SonicBoom = require('sonic-boom')

module.exports = function (opts) {
  const destination = new SonicBoom({ dest: opts.destination || 1, sync: false })

  return build(function (source) {
    source.pipe(destination)
  }, {
    close (err, cb) {
      destination.end()
      destination.on('close', cb.bind(null, err))
    }
  })
}
```
(It is possible to use the async iterators with CommonJS and streams with ESM.)
To consume async iterators in batches, consider using the [hwp](https://github.com/mcollina/hwp) library.
The `close()` function is needed to make sure that the stream is closed and flushed when its
callback is called or the returned promise resolves. Otherwise, log lines will be lost.
### Writing to a custom transport & stdout
If you want to use a custom transport and also output the log entries with default processing to STDOUT, you can use the `'pino/file'` transport configured with `destination: 1`:
```js
const transports = [
  {
    target: 'pino/file',
    options: { destination: 1 } // this writes to STDOUT
  },
  {
    target: 'my-custom-transport',
    options: { someParameter: true }
  }
]
const logger = pino(pino.transport({ targets: transports }))
```
### Creating a transport pipeline
As an example, the following transport returns a `Transform` stream:
```js
import build from 'pino-abstract-transport'
import { pipeline, Transform } from 'node:stream'

export default async function (options) {
  return build(function (source) {
    const myTransportStream = new Transform({
      // Make sure autoDestroy is set,
      // this is needed in Node v12 or when using the
      // readable-stream module.
      autoDestroy: true,
      objectMode: true,
      transform (chunk, enc, cb) {
        // modify the payload somehow
        chunk.service = 'pino'
        // stringify the payload again
        this.push(`${JSON.stringify(chunk)}\n`)
        cb()
      }
    })
    pipeline(source, myTransportStream, () => {})
    return myTransportStream
  }, {
    // This is needed to be able to pipeline transports.
    enablePipelining: true
  })
}
```
Then you can pipeline them with:
```js
import pino from 'pino'

const logger = pino({
  transport: {
    pipeline: [{
      target: './my-transform.js'
    }, {
      // Use target: 'pino/file' with STDOUT descriptor 1 to write
      // logs without any change.
      target: 'pino/file',
      options: { destination: 1 }
    }]
  }
})

logger.info('hello world')
```
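The transform step in the pipeline above just tags the already-parsed log object and re-serializes it as an NDJSON line. In isolation, that step amounts to the following illustrative helper (not part of any API):

```javascript
// Standalone version of the transform step above (illustrative only):
// the source yields parsed log objects, so the step just adds a
// property and re-serializes the object as an NDJSON line.
function tagChunk (chunk, service) {
  chunk.service = service
  return `${JSON.stringify(chunk)}\n`
}
```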
__NOTE: a pipeline has no "default" destination; it must end with a
terminating target, i.e. a `Writable` stream.__
### TypeScript compatibility
Pino provides basic support for transports written in TypeScript.
Ideally, they should be transpiled to ensure maximum compatibility, but sometimes
you might want to use tools such as TS-Node, to execute your TypeScript
code without having to go through an explicit transpilation step.
You can use your TypeScript code without explicit transpilation, but there are
some known caveats:
- For "pure" TypeScript code, ES imports are still not supported (ES imports are
supported once the code is transpiled).
- Only TS-Node is supported for now; there's no TSM support.
- Running transport TypeScript code with TS-Node seems to be problematic on
Windows systems; there's no official support for that yet.
### Notable transports
#### `pino/file`
The `pino/file` transport routes logs to a file (or file descriptor).
The `options.destination` property may be set to specify the desired file destination.
```js
const pino = require('pino')
const transport = pino.transport({
  target: 'pino/file',
  options: { destination: '/path/to/file' }
})
pino(transport)
```
By default, the `pino/file` transport assumes the directory of the destination file exists. If it does not exist, the transport will throw an error when it attempts to open the file for writing. The `mkdir` option may be set to `true` to configure the transport to create the directory, if it does not exist, before opening the file for writing.
```js
const pino = require('pino')
const transport = pino.transport({
  target: 'pino/file',
  options: { destination: '/path/to/file', mkdir: true }
})
pino(transport)
```
By default, the `pino/file` transport appends to the destination file if it exists. The `append` option may be set to `false` to configure the transport to truncate the file upon opening it for writing.
```js
const pino = require('pino')
const transport = pino.transport({
  target: 'pino/file',
  options: { destination: '/path/to/file', append: false }
})
pino(transport)
```
The `options.destination` property may also be a number to represent a file descriptor. Typically this would be `1` to write to STDOUT or `2` to write to STDERR. If `options.destination` is not set, it defaults to `1` which means logs will be written to STDOUT. If `options.destination` is a string integer, e.g. `'1'`, it will be coerced to a number and used as a file descriptor. If this is not desired, provide a full path, e.g. `/tmp/1`.
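A tiny sketch of the coercion rule described above (a hypothetical helper, not pino's actual source):

```javascript
// Hypothetical sketch of the destination coercion (not pino's source):
// string integers become file descriptors, anything else is a path.
function resolveDestination (destination = 1) {
  if (typeof destination === 'string' && /^\d+$/.test(destination)) {
    return Number(destination) // '1' becomes file descriptor 1
  }
  return destination // a path such as '/tmp/1' is left untouched
}
```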
The difference between using the `pino/file` transport builtin and using `pino.destination` is that `pino.destination` runs in the main thread, whereas `pino/file` sets up `pino.destination` in a worker thread.
#### `pino-pretty`
The [`pino-pretty`][pino-pretty] transport prettifies logs.
By default the `pino-pretty` builtin logs to STDOUT.
The `options.destination` property may be set to log pretty logs to a file descriptor or file. The following would send the prettified logs to STDERR:
```js
const pino = require('pino')
const transport = pino.transport({
  target: 'pino-pretty',
  options: { destination: 2 } // 2 is STDERR, 1 is STDOUT
})
pino(transport)
```
### Asynchronous startup
The new transports boot asynchronously and calling `process.exit()` before the transport
starts will cause logs to not be delivered.
```js
const pino = require('pino')
const transport = pino.transport({
  targets: [
    { target: '/absolute/path/to/my-transport.mjs', level: 'error' },
    { target: 'some-file-transport', options: { destination: '/dev/null' } }
  ]
})
const logger = pino(transport)

logger.info('hello')

// If logs are printed before the transport is ready when process.exit(0)
// is called, they will be lost.
transport.on('ready', function () {
  process.exit(0)
})
```
## Legacy Transports
A legacy Pino "transport" is a supplementary tool that consumes Pino logs.
Consider the following example for creating a transport:
```js
const { pipeline, Writable } = require('node:stream')
const split = require('split2')

const myTransportStream = new Writable({
  write (chunk, enc, cb) {
    // apply a transform and send to STDOUT
    console.log(chunk.toString().toUpperCase())
    cb()
  }
})

pipeline(process.stdin, split(JSON.parse), myTransportStream, () => {})
```
The above defines our "transport" as the file `my-transport-process.js`.
Logs can now be consumed using shell piping:
```sh
node my-app-which-logs-stuff-to-stdout.js | node my-transport-process.js
```
Ideally, a transport should consume logs in a separate process from the application.
Using transports in the same process causes unnecessary load and slows down
Node's single-threaded event loop.
## Known Transports
PRs to this document are welcome for any new transports!
### Pino v7+ Compatible
+ [@logtail/pino](#@logtail/pino)
+ [pino-elasticsearch](#pino-elasticsearch)
+ [pino-pretty](#pino-pretty)
+ [pino-loki](#pino-loki)
+ [pino-seq-transport](#pino-seq-transport)
+ [pino-sentry-transport](#pino-sentry-transport)
+ [pino-airbrake-transport](#pino-airbrake-transport)
+ [pino-datadog-transport](#pino-datadog-transport)
+ [pino-slack-webhook](#pino-slack-webhook)
+ [pino-axiom](#pino-axiom)
+ [pino-opentelemetry-transport](#pino-opentelemetry-transport)
+ [@axiomhq/pino](#@axiomhq/pino)
+ [pino-discord-webhook](#pino-discord-webhook)
+ [pino-logfmt](#pino-logfmt)
+ [pino-telegram-webhook](#pino-telegram-webhook)
+ [pino-yc-transport](#pino-yc-transport)
+ [@macfja/pino-fingers-crossed](#macfja-pino-fingers-crossed)
+ [pino-hana](#pino-hana)
### Legacy
+ [pino-applicationinsights](#pino-applicationinsights)
+ [pino-azuretable](#pino-azuretable)
+ [pino-cloudwatch](#pino-cloudwatch)
+ [pino-couch](#pino-couch)
+ [pino-datadog](#pino-datadog)
+ [pino-gelf](#pino-gelf)
+ [pino-http-send](#pino-http-send)
+ [pino-kafka](#pino-kafka)
+ [pino-logdna](#pino-logdna)
+ [pino-logflare](#pino-logflare)
+ [pino-loki](#pino-loki)
+ [pino-mq](#pino-mq)
+ [pino-mysql](#pino-mysql)
+ [pino-papertrail](#pino-papertrail)
+ [pino-pg](#pino-pg)
+ [pino-redis](#pino-redis)
+ [pino-sentry](#pino-sentry)
+ [pino-seq](#pino-seq)
+ [pino-socket](#pino-socket)
+ [pino-stackdriver](#pino-stackdriver)
+ [pino-syslog](#pino-syslog)
+ [pino-websocket](#pino-websocket)
<a id="@logtail/pino"></a>
### @logtail/pino
The [@logtail/pino](https://www.npmjs.com/package/@logtail/pino) NPM package is a transport that forwards logs to [Logtail](https://logtail.com) by [Better Stack](https://betterstack.com).
[Quick start guide ⇗](https://betterstack.com/docs/logs/javascript/pino)
<a id="pino-applicationinsights"></a>
### pino-applicationinsights
The [pino-applicationinsights](https://www.npmjs.com/package/pino-applicationinsights) module is a transport that will forward logs to [Azure Application Insights](https://docs.microsoft.com/en-us/azure/azure-monitor/app/app-insights-overview).
Given an application `foo` that logs via pino, you would use `pino-applicationinsights` like so:
``` sh
$ node foo | pino-applicationinsights --key blablabla
```
For full documentation of command line switches read [README](https://github.com/ovhemert/pino-applicationinsights#readme)
<a id="pino-azuretable"></a>
### pino-azuretable
The [pino-azuretable](https://www.npmjs.com/package/pino-azuretable) module is a transport that will forward logs to the [Azure Table Storage](https://azure.microsoft.com/en-us/services/storage/tables/).
Given an application `foo` that logs via pino, you would use `pino-azuretable` like so:
``` sh
$ node foo | pino-azuretable --account storageaccount --key blablabla
```
For full documentation of command line switches read [README](https://github.com/ovhemert/pino-azuretable#readme)
<a id="pino-cloudwatch"></a>
### pino-cloudwatch
[pino-cloudwatch][pino-cloudwatch] is a transport that buffers and forwards logs to [Amazon CloudWatch][].
```sh
$ node app.js | pino-cloudwatch --group my-log-group
```
[pino-cloudwatch]: https://github.com/dbhowell/pino-cloudwatch
[Amazon CloudWatch]: https://aws.amazon.com/cloudwatch/
<a id="pino-couch"></a>
### pino-couch
[pino-couch][pino-couch] uploads each log line as a [CouchDB][CouchDB] document.
```sh
$ node app.js | pino-couch -U https://couch-server -d mylogs
```
[pino-couch]: https://github.com/IBM/pino-couch
[CouchDB]: https://couchdb.apache.org
<a id="pino-datadog"></a>
### pino-datadog
The [pino-datadog](https://www.npmjs.com/package/pino-datadog) module is a transport that will forward logs to [DataDog](https://www.datadoghq.com/) through its API.
Given an application `foo` that logs via pino, you would use `pino-datadog` like so:
``` sh
$ node foo | pino-datadog --key blablabla
```
For full documentation of command line switches read [README](https://github.com/ovhemert/pino-datadog#readme)
<a id="pino-elasticsearch"></a>
### pino-elasticsearch
[pino-elasticsearch][pino-elasticsearch] uploads the log lines in bulk
to [Elasticsearch][elasticsearch], to be displayed in [Kibana][kibana].
It is extremely simple to use and set up:
```sh
$ node app.js | pino-elasticsearch
```
Assuming Elasticsearch is running on localhost.
To connect to an external Elasticsearch instance (recommended for production):
* Check that `network.host` is defined in the `elasticsearch.yml` configuration file. See [Elasticsearch Network Settings documentation](https://www.elastic.co/guide/en/elasticsearch/reference/current/modules-network.html#common-network-settings) for more details.
* Launch:
```sh
$ node app.js | pino-elasticsearch --node http://192.168.1.42:9200
```
Assuming Elasticsearch is running on `192.168.1.42`.
To connect to AWS Elasticsearch:
```sh
$ node app.js | pino-elasticsearch --node https://es-url.us-east-1.es.amazonaws.com --es-version 6
```
Then [create an index pattern](https://www.elastic.co/guide/en/kibana/current/setup.html) on `'pino'` (the default index key for `pino-elasticsearch`) on the Kibana instance.
[pino-elasticsearch]: https://github.com/pinojs/pino-elasticsearch
[elasticsearch]: https://www.elastic.co/products/elasticsearch
[kibana]: https://www.elastic.co/products/kibana
<a id="pino-gelf"></a>
### pino-gelf
Pino GELF ([pino-gelf]) is a transport for the Pino logger. Pino GELF receives Pino logs from stdin and transforms them into [GELF format][gelf] before sending them to a remote [Graylog server][graylog] via UDP.
```sh
$ node your-app.js | pino-gelf log
```
[pino-gelf]: https://github.com/pinojs/pino-gelf
[gelf]: https://docs.graylog.org/en/2.1/pages/gelf.html
[graylog]: https://www.graylog.org/
<a id="pino-http-send"></a>
### pino-http-send
[pino-http-send](https://npmjs.com/package/pino-http-send) is a configurable and low overhead
transport that will batch logs and send to a specified URL.
```console
$ node app.js | pino-http-send -u http://localhost:8080/logs
```
<a id="pino-kafka"></a>
### pino-kafka
[pino-kafka](https://github.com/ayZagen/pino-kafka) transport to send logs to [Apache Kafka](https://kafka.apache.org/).
```sh
$ node index.js | pino-kafka -b 10.10.10.5:9200 -d mytopic
```
<a id="pino-logdna"></a>
### pino-logdna
[pino-logdna](https://github.com/logdna/pino-logdna) transport to send logs to [LogDNA](https://logdna.com).
```sh
$ node index.js | pino-logdna --key YOUR_INGESTION_KEY
```
Tags and other metadata can be included using the available command line options. See the [pino-logdna README](https://github.com/logdna/pino-logdna#options) for a full list.
<a id="pino-logflare"></a>
### pino-logflare
[pino-logflare](https://github.com/Logflare/pino-logflare) transport to send logs to a [Logflare](https://logflare.app) `source`.
```sh
$ node index.js | pino-logflare --key YOUR_KEY --source YOUR_SOURCE
```
<a id="pino-mq"></a>
### pino-mq
The `pino-mq` transport will take all messages received on `process.stdin` and send them over a message bus using JSON serialization.
This is useful for:
* moving backpressure from application to broker
* moving the pressure of transforming messages to another component
```
node app.js | pino-mq -u "amqp://guest:guest@localhost/" -q "pino-logs"
```
Alternatively, a configuration file can be used:
```
node app.js | pino-mq -c pino-mq.json
```
A base configuration file can be initialized with:
```
pino-mq -g
```
For full documentation of command line switches and configuration see [the `pino-mq` README](https://github.com/itavy/pino-mq#readme)
<a id="pino-loki"></a>
### pino-loki
pino-loki is a transport that forwards logs to [Grafana Loki](https://grafana.com/oss/loki/).
It can be used as a CLI in a separate process or in a dedicated worker:
CLI:
```console
node app.js | pino-loki --hostname localhost:3100 --labels='{ "application": "my-application"}' --user my-username --password my-password
```
Worker:
```js
const pino = require('pino')
const transport = pino.transport({
  target: 'pino-loki',
  options: { host: 'localhost:3100' }
})
pino(transport)
```
For full documentation and configuration, see the [README](https://github.com/Julien-R44/pino-loki).
<a id="pino-papertrail"></a>
### pino-papertrail
pino-papertrail is a transport that forwards logs to the [papertrail](https://papertrailapp.com) log service through a UDPv4 socket.
Given an application `foo` that logs via pino, and a papertrail destination that collects logs on port UDP `12345` on address `bar.papertrailapp.com`, you would use `pino-papertrail`
like so:
```
node yourapp.js | pino-papertrail --host bar.papertrailapp.com --port 12345 --appname foo
```
For full documentation of command line switches read the [README](https://github.com/ovhemert/pino-papertrail#readme)
<a id="pino-pg"></a>
### pino-pg
[pino-pg](https://www.npmjs.com/package/pino-pg) stores logs into PostgreSQL.
Full documentation in the [README](https://github.com/Xstoudi/pino-pg).
<a id="pino-mysql"></a>
### pino-mysql
[pino-mysql][pino-mysql] loads pino logs into [MySQL][MySQL] and [MariaDB][MariaDB].
```sh
$ node app.js | pino-mysql -c db-configuration.json
```
`pino-mysql` can extract and save log fields into corresponding database fields
and/or save the entire log stream as a [JSON Data Type][JSONDT].
For full documentation and command line switches read the [README][pino-mysql].
[pino-mysql]: https://www.npmjs.com/package/pino-mysql
[MySQL]: https://www.mysql.com/
[MariaDB]: https://mariadb.org/
[JSONDT]: https://dev.mysql.com/doc/refman/8.0/en/json.html
<a id="pino-redis"></a>
### pino-redis
[pino-redis][pino-redis] loads pino logs into [Redis][Redis].
```sh
$ node app.js | pino-redis -U redis://username:password@localhost:6379
```
[pino-redis]: https://github.com/buianhthang/pino-redis
[Redis]: https://redis.io/
<a id="pino-sentry"></a>
### pino-sentry
[pino-sentry][pino-sentry] loads pino logs into [Sentry][Sentry].
```sh
$ node app.js | pino-sentry --dsn=https://******@sentry.io/12345
```
For full documentation of command line switches see the [pino-sentry README](https://github.com/aandrewww/pino-sentry/blob/master/README.md).
[pino-sentry]: https://www.npmjs.com/package/pino-sentry
[Sentry]: https://sentry.io/
<a id="pino-seq"></a>
### pino-seq
[pino-seq][pino-seq] supports both out-of-process and in-process log forwarding to [Seq][Seq].
```sh
$ node app.js | pino-seq --serverUrl http://localhost:5341 --apiKey 1234567890 --property applicationName=MyNodeApp
```
[pino-seq]: https://www.npmjs.com/package/pino-seq
[Seq]: https://datalust.co/seq
<a id="pino-seq-transport"></a>
### pino-seq-transport
[pino-seq-transport][pino-seq-transport] is a Pino v7+ compatible transport to forward log events to [Seq][Seq]
from a dedicated worker:
```js
const pino = require('pino')
const transport = pino.transport({
  target: '@autotelic/pino-seq-transport',
  options: { serverUrl: 'http://localhost:5341' }
})
pino(transport)
```
[pino-seq-transport]: https://github.com/autotelic/pino-seq-transport
[Seq]: https://datalust.co/seq
<a id="pino-sentry-transport"></a>
### pino-sentry-transport
[pino-sentry-transport][pino-sentry-transport] is a Pino v7+ compatible transport to forward log events to [Sentry][Sentry]
from a dedicated worker:
```js
const pino = require('pino')
const transport = pino.transport({
  target: 'pino-sentry-transport',
  options: {
    sentry: {
      dsn: 'https://******@sentry.io/12345'
    }
  }
})
pino(transport)
```
[pino-sentry-transport]: https://github.com/tomer-yechiel/pino-sentry-transport
[Sentry]: https://sentry.io/
<a id="pino-airbrake-transport"></a>
### pino-airbrake-transport
[pino-airbrake-transport][pino-airbrake-transport] is a Pino v7+ compatible transport to forward log events to [Airbrake][Airbrake]
from a dedicated worker:
```js
const pino = require('pino')
const transport = pino.transport({
  target: 'pino-airbrake-transport',
  options: {
    airbrake: {
      projectId: 1,
      projectKey: 'REPLACE_ME',
      environment: 'production',
      // additional options for airbrake
      performanceStats: false
    }
  },
  level: 'error' // minimum log level that should be sent to airbrake
})
pino(transport)
```
[pino-airbrake-transport]: https://github.com/enricodeleo/pino-airbrake-transport
[Airbrake]: https://airbrake.io/
<a id="pino-socket"></a>
### pino-socket
[pino-socket][pino-socket] is a transport that will forward logs to an IPv4
UDP or TCP socket.
As an example, use `socat` to fake a listener:
```sh
$ socat -v udp4-recvfrom:6000,fork exec:'/bin/cat'
```
Then run an application that uses `pino` for logging:
```sh
$ node app.js | pino-socket -p 6000
```
Logs from the application should be observed on both consoles.
[pino-socket]: https://www.npmjs.com/package/pino-socket
<a id="pino-datadog-transport"></a>
### pino-datadog-transport
[pino-datadog-transport][pino-datadog-transport] is a Pino v7+ compatible transport to forward log events to [Datadog][Datadog]
from a dedicated worker:
```js
const pino = require('pino')
const transport = pino.transport({
  target: 'pino-datadog-transport',
  options: {
    ddClientConf: {
      authMethods: {
        apiKeyAuth: '<your datadog API key>'
      }
    }
  },
  level: 'error' // minimum log level that should be sent to datadog
})
pino(transport)
```
[pino-datadog-transport]: https://github.com/theogravity/pino-datadog-transport
[Datadog]: https://www.datadoghq.com/
#### Logstash
The [pino-socket][pino-socket] module can also be used to upload logs to
[Logstash][logstash] via:
```
$ node app.js | pino-socket -a 127.0.0.1 -p 5000 -m tcp
```
Assuming logstash is running on the same host and configured as
follows:
```
input {
  tcp {
    port => 5000
  }
}

filter {
  json {
    source => "message"
  }
}

output {
  elasticsearch {
    hosts => "127.0.0.1:9200"
  }
}
```
See <https://www.elastic.co/guide/en/kibana/current/setup.html> to learn
how to set up [Kibana][kibana].
For Docker users, see
<https://github.com/deviantony/docker-elk> to set up an ELK stack.
<a id="pino-stackdriver"></a>
### pino-stackdriver
The [pino-stackdriver](https://www.npmjs.com/package/pino-stackdriver) module is a transport that will forward logs to the [Google Stackdriver](https://cloud.google.com/logging/) log service through its API.
Given an application `foo` that logs via pino, a stackdriver log project `bar`, and credentials in the file `/credentials.json`, you would use `pino-stackdriver`
like so:
``` sh
$ node foo | pino-stackdriver --project bar --credentials /credentials.json
```
For full documentation of command line switches read [README](https://github.com/ovhemert/pino-stackdriver#readme)
<a id="pino-syslog"></a>
### pino-syslog
[pino-syslog][pino-syslog] is a transforming transport that converts
`pino` NDJSON logs to [RFC3164][rfc3164] compatible log messages. The `pino-syslog` module does not
forward the logs anywhere; it merely re-writes the messages to `stdout`. When
used in combination with `pino-socket`, the log messages can be relayed to a syslog server:
```sh
$ node app.js | pino-syslog | pino-socket -a syslog.example.com
```
Example output for the "hello world" log:
```
<134>Apr 1 16:44:58 MacBook-Pro-3 none[94473]: {"pid":94473,"hostname":"MacBook-Pro-3","level":30,"msg":"hello world","time":1459529098958}
```
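The `<134>` prefix is the RFC 3164 priority value, computed as `facility * 8 + severity`; decoding it back is straightforward:

```javascript
// RFC 3164: priority = facility * 8 + severity.
// 134 decodes to facility 16 (local0) and severity 6 (informational).
function decodePriority (pri) {
  return { facility: Math.floor(pri / 8), severity: pri % 8 }
}
```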
[pino-syslog]: https://www.npmjs.com/package/pino-syslog
[rfc3164]: https://tools.ietf.org/html/rfc3164
[logstash]: https://www.elastic.co/products/logstash
<a id="pino-websocket"></a>
### pino-websocket
[pino-websocket](https://www.npmjs.com/package/@abeai/pino-websocket) is a transport that will forward each log line to a websocket server.
```sh
$ node app.js | pino-websocket -a my-websocket-server.example.com -p 3004
```
For full documentation of command line switches read the [README](https://github.com/abeai/pino-websocket#readme).
<a id="pino-slack-webhook"></a>
### pino-slack-webhook
[pino-slack-webhook][pino-slack-webhook] is a Pino v7+ compatible transport to forward log events to [Slack][Slack]
from a dedicated worker:
```js
const pino = require('pino')
const transport = pino.transport({
  target: '@youngkiu/pino-slack-webhook',
  options: {
    webhookUrl: 'https://hooks.slack.com/services/xxx/xxx/xxx',
    channel: '#pino-log',
    username: 'webhookbot',
    icon_emoji: ':ghost:'
  }
})
pino(transport)
```
[pino-slack-webhook]: https://github.com/youngkiu/pino-slack-webhook
[Slack]: https://slack.com/
[pino-pretty]: https://github.com/pinojs/pino-pretty
<a id="pino-axiom"></a>
### pino-axiom
[pino-axiom](https://www.npmjs.com/package/pino-axiom) is a transport that will forward logs to [Axiom](https://axiom.co).
```javascript
const pino = require('pino')
const transport = pino.transport({
  target: 'pino-axiom',
  options: {
    orgId: 'YOUR-ORG-ID',
    token: 'YOUR-TOKEN',
    dataset: 'YOUR-DATASET'
  }
})
pino(transport)
```
<a id="pino-opentelemetry-transport"></a>
### pino-opentelemetry-transport
[pino-opentelemetry-transport](https://www.npmjs.com/package/pino-opentelemetry-transport) is a transport that will forward logs to an [OpenTelemetry log collector](https://opentelemetry.io/docs/collector/) using [OpenTelemetry JS instrumentation](https://opentelemetry.io/docs/instrumentation/js/).
```javascript
const pino = require('pino')
const transport = pino.transport({
  target: 'pino-opentelemetry-transport',
  options: {
    resourceAttributes: {
      'service.name': 'test-service',
      'service.version': '1.0.0'
    }