
jeetendersinghc/flume-1


CISC 525 Apache Flume Project

Running a Flume agent

source spooldir - channel file - sink logger

mkdir /tmp/spooldir

flume-ng agent --conf-file spool-to-logger.properties --name agent1 --conf $FLUME_HOME/conf -Dflume.root.logger=INFO,console
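The actual spool-to-logger.properties ships with the repo; as a rough sketch (agent and component names are assumptions, chosen to match the --name agent1 flag above), a spooldir → file → logger configuration looks like this:

```properties
# Hypothetical sketch of spool-to-logger.properties -- the repo's file may differ.
agent1.sources = source1
agent1.channels = channel1
agent1.sinks = sink1

# spooldir source: ingests files dropped into /tmp/spooldir
agent1.sources.source1.type = spooldir
agent1.sources.source1.spoolDir = /tmp/spooldir
agent1.sources.source1.channels = channel1

# file channel: buffers events durably on local disk
agent1.channels.channel1.type = file

# logger sink: writes each event to the agent's log at INFO level
agent1.sinks.sink1.type = logger
agent1.sinks.sink1.channel = channel1
```

The file channel survives an agent restart, which is why it is paired here with a source that must not lose ingested files.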

Wait for the agent to start, then run:

echo "hello, world!" > /tmp/spooldir/file1.txt
echo "hello, USA!" > /tmp/spooldir/file2.txt
echo "hello, Pennsylvania, USA!" > /tmp/spooldir/file3.txt
echo "hello, Harrisburg, PA, USA!" > /tmp/spooldir/file4.txt
echo "hello, Harrisburg University of Science and Technologies, PA, USA!" > /tmp/spooldir/file5.txt

After each of the commands above, verify that the event appears in the agent's log output.

source exec with tail -F ... - channel memory - sink hdfs

flume-ng agent --conf-file tail-in-memory-hdfs.conf --name agent1 --conf $FLUME_HOME/conf -Dflume.root.logger=INFO,console
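The repo provides tail-in-memory-hdfs.conf; a minimal sketch of an exec → memory → hdfs configuration, assuming the same component names as above, might look like:

```properties
# Hypothetical sketch of tail-in-memory-hdfs.conf -- the repo's file may differ.
agent1.sources = source1
agent1.channels = channel1
agent1.sinks = sink1

# exec source: run tail -F on the namenode log and turn each line into an event
agent1.sources.source1.type = exec
agent1.sources.source1.command = tail -F /usr/local/hadoop/logs/hadoop-student-namenode-student-VirtualBox.log
agent1.sources.source1.channels = channel1

# memory channel: fast, but buffered events are lost if the agent dies
agent1.channels.channel1.type = memory

# hdfs sink: write events under /tmp/system.log as plain text
agent1.sinks.sink1.type = hdfs
agent1.sinks.sink1.hdfs.path = /tmp/system.log
agent1.sinks.sink1.hdfs.fileType = DataStream
agent1.sinks.sink1.channel = channel1
```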

Watch the log file with your own tail -F command:

tail -F /usr/local/hadoop/logs/hadoop-student-namenode-student-VirtualBox.log

When new output appears in that log, verify that the same text shows up in one of the files Flume has written to HDFS under the /tmp/system.log directory.

Capturing ActiveMQ logs into HDFS

flume-ng agent --conf-file activemq-memory-hdfs.conf --name activemq_agent --conf $FLUME_HOME/conf -Dflume.root.logger=INFO,console
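The shape of activemq-memory-hdfs.conf is not shown in this README; the sketch below assumes an exec source tailing the broker's log file (the activemq.log path is a guess), with the agent name matching the --name activemq_agent flag and the HDFS path matching the verification command that follows:

```properties
# Hypothetical sketch of activemq-memory-hdfs.conf -- source type and log path are assumptions.
activemq_agent.sources = source1
activemq_agent.channels = channel1
activemq_agent.sinks = sink1

# exec source: tail the ActiveMQ broker log (adjust the path to your install)
activemq_agent.sources.source1.type = exec
activemq_agent.sources.source1.command = tail -F /opt/activemq/data/activemq.log
activemq_agent.sources.source1.channels = channel1

# memory channel between the broker log and HDFS
activemq_agent.channels.channel1.type = memory

# hdfs sink: write events under /tmp/activemq.log as plain text
activemq_agent.sinks.sink1.type = hdfs
activemq_agent.sinks.sink1.hdfs.path = /tmp/activemq.log
activemq_agent.sinks.sink1.hdfs.fileType = DataStream
activemq_agent.sinks.sink1.channel = channel1
```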

hdfs dfs -ls /tmp/activemq.log | awk '{print $8}' | while read f; do [ -n "$f" ] && hdfs dfs -cat "$f" | grep -i hello && echo "$f"; done

Capturing ActiveMQ logs and forwarding them to a custom sink

flume-ng agent --classpath ../custom_sink/target/custom_sink-jar-with-dependencies.jar --conf-file activemq-memory-custom-sink.conf --name activemq_agent --conf $FLUME_HOME/conf -Dflume.root.logger=DEBUG,console
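With a custom sink, --classpath puts the sink's jar on the agent's classpath, and the configuration points the sink's type at the implementing class. A sketch of the relevant part of activemq-memory-custom-sink.conf (the fully qualified class name below is hypothetical; use the actual class from the custom_sink module):

```properties
# Hypothetical sink stanza from activemq-memory-custom-sink.conf.
# For a custom sink, "type" is the fully qualified class name of a
# class implementing org.apache.flume.Sink -- the name below is made up.
activemq_agent.sinks = sink1
activemq_agent.sinks.sink1.type = com.example.flume.CustomSink
activemq_agent.sinks.sink1.channel = channel1
```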
