Hadoop and HBase Toy Cluster

This is a toy cluster with Hadoop and HBase nodes, created for experiments.

Hadoop MapReduce

It includes the following services:

Run

Just execute:

$ ./start.sh

The current state of all nodes is saved in the data directory.

The logs directory is created automatically and contains the logs of all services.

Data can be shared between the host machine and the cluster nodes through the shared directory.
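
For example, a file dropped into the shared directory can be picked up from inside a node and loaded into HDFS. The commands below are only a sketch: they assume the nodes run as Docker containers, that the directory is mounted at /shared inside them, and that one container is called namenode; none of these names are taken from this repository.

# On the host: put a file into the shared directory
$ cp my-data.txt shared/

# Inside a node (hypothetical container name and mount point): load it into HDFS
$ docker exec namenode hdfs dfs -put /shared/my-data.txt /tmp/my-data.txt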

To stop:

$ ./stop.sh

Note: If you want clean-up all working data just execute $ ./clean.sh script.

Web UIs
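
Which addresses the Web UIs are reachable on depends on how the node ports are mapped to the host. Assuming the nodes run as Docker containers (an assumption, so check the start script for how the cluster is actually brought up), the current port mappings can be listed with:

$ docker ps --format '{{.Names}}\t{{.Ports}}'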

Run code examples on cluster

  1. Go to the examples folder.

  2. Deploy examples:

    $ ./deploy.sh

    Note: This script builds the examples and installs them into the local Maven cache (~/.m2/repository). Then it downloads them from there and pushes them to the cluster.

  3. Run one of the examples:

    $ ./mapreduce.sh

    or

    $ ./hbase.sh
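
These wrapper scripts take care of submitting the examples to the cluster. If you want to submit a MapReduce job by hand from a node instead, it would look roughly like the sketch below; the container name, jar path, main class, and HDFS paths are all hypothetical and not taken from this repository.

    # Hypothetical manual submission of a word-count job from inside a node
    $ docker exec namenode hadoop jar /shared/examples.jar WordCount /tmp/wc_input /tmp/wc_output

    # Inspect the reducer output
    $ docker exec namenode hdfs dfs -cat /tmp/wc_output/part-r-00000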

Run code examples in Azkaban

  1. Go to the examples folder.

  2. Deploy examples:

    $ ./deploy.sh

    Note: This script builds the examples and installs them into the local Maven cache (~/.m2/repository). Then it downloads them from there and pushes them to the cluster.

  3. Upload input data:

    $ ./wc_input.sh

  4. Go to the azkaban directory and create a ZIP archive (see the job-file sketch after this list):

    $ ./azkaban-arch.sh

    It will generate azkaban-jobs.zip.

  5. Go to the Azkaban web UI and log in (username/password: azkaban/azkaban).

  6. Click the "Create Project" button.

  7. Enter a "Name" and "Description", then click the "Create Project" button.

  8. Click "Upload" and select the generated ZIP file, then click the "Upload" button.

  9. Click "Execute Flow". Update the "Flow Parameters" if needed (all available parameters can be seen in the shared.properties file).

  10. Click the "Execute" button, then press the "Continue" button.

    Note: To see the job logs, click "Job List" and follow the "Details" link in the last column of the table.

  11. Download output data:

    $ ./wc_output.sh
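
For reference, the uploaded archive is just a ZIP of Azkaban .job property files. A job that runs a Hadoop command looks roughly like the sketch below; the file name, parameter names, and command are hypothetical, so check the files in the azkaban directory for the real definitions.

    $ cat wordcount.job
    type=command
    command=hadoop jar ${examples.jar} WordCount ${wc.input} ${wc.output}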

License

Distributed under the MIT License.
