This application simulates the delivery of log messages to IBM Event Streams using Apache Kafka. It is intended to be used with the solution tutorial "Big data log analytics with Streaming Analytics and SQL".
```
Usage: index --file <file> --parser <name> (--messages | --csv) --broker-list <brokers> --api-key <secret> --topic <name> --rate [speed]

Options:
  -f, --file [file]               Log file to create messages from
  -p, --parser [parser]           File parser
  -m, --messages                  Stream log messages to Event Streams
  -c, --csv                       Stream log messages to a CSV file
  -b, --broker-list [brokerList]  Event Streams broker list (multiple brokers comma separated)
  -k, --api-key [apiKey]          Event Streams API key
  -t, --topic [topic]             Event Streams topic
  -r, --rate [rate]               Adjusts the message send rate
  -h, --help                      Output usage information
```
- Install Node.js.
- Run `npm install`.
- Run `npm run build`.
- Follow the example commands below.
Convert an Apache web server log file to CSV:

```sh
node dist/index.js --file /Users/ibmcloud/Downloads/NASA_access_log_Jul95 --parser httpd --csv --out-file /Users/ibmcloud/Downloads/NASA_access_log_Jul95.csv
```
Stream an Apache web server log file to Event Streams:

```sh
node dist/index.js --file /Users/vanstaub/Downloads/NASA_access_log_Jul95 --parser httpd --messages --broker-list "kafka02-prod02.messagehub.services.us-south.bluemix.net:9093" --api-key 0ErVFpnxvRqdfsSDDWQjymc1sdfDF7iRfGsvSv3cp2OOlJ4m --topic webserver --rate 100
```
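For context, the streaming side behaves like a standard Kafka producer. The sketch below is illustrative only (it is not this project's code) and uses the `kafkajs` client; it assumes Event Streams accepts SASL/PLAIN credentials over TLS with the username `token` and the API key as the password, which you should verify against your service credentials.

```ts
// Illustrative sketch: producing one log line to an Event Streams topic.
// Assumptions (verify against your service credentials): SASL/PLAIN over TLS,
// username "token", password = the Event Streams API key.
import { Kafka } from "kafkajs";

async function sendLine(line: string): Promise<void> {
  const kafka = new Kafka({
    clientId: "log-simulator",
    brokers: ["kafka02-prod02.messagehub.services.us-south.bluemix.net:9093"],
    ssl: true,
    sasl: { mechanism: "plain", username: "token", password: process.env.API_KEY! },
  });

  const producer = kafka.producer();
  await producer.connect();
  // Each parsed log line becomes one Kafka message on the topic.
  await producer.send({ topic: "webserver", messages: [{ value: line }] });
  await producer.disconnect();
}
```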
NASA's sample HTTP web server log (Jul 01 to Jul 31, ASCII format, 20.7 MB gzip compressed) can be used to get started.
To read custom log files, do the following.

- Create a new `/parser/my-parser.ts` file that implements `Parser` (see the sketch after this list).
- Add the new parser as a named export in `Parsers.ts`.
- Update `getParser()` in `index.ts` to define a new alias for your parser.
- Run `npm run build`.
- Run `node dist/index.js --parser my-new-parser ...`.
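As a starting point, here is a minimal sketch of what `my-parser.ts` might look like. The shape of the `Parser` interface shown here is an assumption for illustration; the `parseLine` method and the field names are hypothetical, so match the real interface defined in this repository.

```ts
// Hypothetical sketch of /parser/my-parser.ts. The Parser interface and its
// method name are assumptions for illustration; use the real interface
// defined in this repository.
interface Parser {
  // Maps one raw log line to a message object, or null to skip the line.
  parseLine(line: string): Record<string, string> | null;
}

export class MyParser implements Parser {
  // Example: parse lines of the form "2019-01-01T00:00:00Z INFO message text".
  private readonly pattern = /^(\S+)\s+(\S+)\s+(.*)$/;

  parseLine(line: string): Record<string, string> | null {
    const match = this.pattern.exec(line);
    if (!match) {
      return null; // Skip lines that do not match the expected format.
    }
    return { timestamp: match[1], level: match[2], message: match[3] };
  }
}
```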