[INTERNAL] [BUILD] Build apache kafka docker image using github actions (#29)

When a `kafka-*` tag is created, a GitHub Action is triggered
to build and push a new `adobe/kafka` Docker image version.
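
For example, pushing a tag that follows the `kafka-<scala_version>-<kafka_version>` convention (a hypothetical tag name shown here) is what triggers the new workflow:

```sh
# Hypothetical example tag; any tag matching kafka-* triggers the build
git tag kafka-2.13-2.6.2
git push origin kafka-2.13-2.6.2
```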
amuraru committed Jul 2, 2021
1 parent 9854012 commit c0c6a1f
Showing 5 changed files with 152 additions and 0 deletions.
1 change: 1 addition & 0 deletions .github/workflows/build-push-docker.yml
@@ -14,6 +14,7 @@ jobs:
        id: vars
        run: echo ::set-output name=tag::${GITHUB_REF:10}
      - uses: docker/build-push-action@v1
        if: ${{ !startsWith( steps.vars.outputs.tag, 'kafka-' ) }}
        with:
          dockerfile: Dockerfile
          build_args: VERSION=${{ steps.vars.outputs.tag }},GIT_SHA=${{ github.sha }}
44 changes: 44 additions & 0 deletions .github/workflows/build-push-kafka-docker.yml
@@ -0,0 +1,44 @@
name: build-publish-kafka-docker-image

on:
  push:
    tags:
      - 'kafka-*'

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Prepare
        id: prep
        run: |
          DOCKER_IMAGE=adobe/kafka
          VERSION=latest
          if [[ $GITHUB_REF == refs/tags/kafka-* ]]; then
            VERSION=${GITHUB_REF#refs/tags/kafka-}
          elif [[ $GITHUB_REF == refs/heads/* ]]; then
            VERSION=$(echo ${GITHUB_REF#refs/heads/} | sed -r 's#/+#-#g')
          fi
          TAGS="${DOCKER_IMAGE}:${VERSION}"
          echo ::set-output name=version::${VERSION}
          echo ::set-output name=tags::${TAGS}
          echo ::set-output name=created::$(date -u +'%Y-%m-%dT%H:%M:%SZ')
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v1
      - name: Login to DockerHub
        if: startsWith(github.ref, 'refs/tags/')
        uses: docker/login-action@v1
        with:
          username: ${{ secrets.DOCKER_USERNAME }}
          password: ${{ secrets.DOCKER_PASSWORD }}
      - name: Build and push
        uses: docker/build-push-action@v2
        with:
          context: docker/kafka
          push: ${{ startsWith(github.ref, 'refs/tags/') }}
          tags: ${{ steps.prep.outputs.tags }}
          labels: |
            org.opencontainers.image.source=${{ github.event.repository.html_url }}
            org.opencontainers.image.created=${{ steps.prep.outputs.created }}
            org.opencontainers.image.revision=${{ github.sha }}
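
As a reference for what the `Prepare` step computes, here is a minimal shell sketch using a hypothetical tag ref; the parameter expansions mirror the step above:

```sh
# Simulate the Prepare step for a hypothetical tag ref
GITHUB_REF=refs/tags/kafka-2.13-2.6.2
DOCKER_IMAGE=adobe/kafka
VERSION=${GITHUB_REF#refs/tags/kafka-}   # -> 2.13-2.6.2
TAGS="${DOCKER_IMAGE}:${VERSION}"        # -> adobe/kafka:2.13-2.6.2
echo "version=${VERSION} tags=${TAGS}"
```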
44 changes: 44 additions & 0 deletions docker/kafka/Dockerfile
@@ -0,0 +1,44 @@
FROM alpine:latest AS kafka_dist

ARG scala_version=2.13
ARG kafka_version=2.6.2
ARG kafka_distro_base_url=https://downloads.apache.org/kafka

ENV kafka_distro=kafka_$scala_version-$kafka_version.tgz
ENV kafka_distro_asc=$kafka_distro.asc

RUN apk add --no-cache gnupg

WORKDIR /var/tmp

RUN wget -q $kafka_distro_base_url/$kafka_version/$kafka_distro
RUN wget -q $kafka_distro_base_url/$kafka_version/$kafka_distro_asc
RUN wget -q $kafka_distro_base_url/KEYS

RUN gpg --import KEYS
RUN gpg --verify $kafka_distro_asc $kafka_distro

RUN tar -xzf $kafka_distro
RUN rm -r kafka_$scala_version-$kafka_version/bin/windows


FROM azul/zulu-openjdk:16.0.1

ARG scala_version=2.13
ARG kafka_version=2.6.2

ENV KAFKA_VERSION=$kafka_version \
SCALA_VERSION=$scala_version \
KAFKA_HOME=/opt/kafka

ENV PATH=${PATH}:${KAFKA_HOME}/bin

RUN mkdir ${KAFKA_HOME} && apt-get update && apt-get install curl -y && apt-get clean

COPY --from=kafka_dist /var/tmp/kafka_$scala_version-$kafka_version ${KAFKA_HOME}
COPY opt/kafka/config/log4j.properties ${KAFKA_HOME}/config/log4j.properties


RUN chmod a+x ${KAFKA_HOME}/bin/*.sh

CMD ["kafka-server-start.sh"]
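
For local testing outside CI, a build along these lines should work; this is a sketch that assumes it is run from the repository root and reuses the Dockerfile's default build args:

```sh
# Hypothetical local build of the image from the docker/kafka context
docker build \
  --build-arg scala_version=2.13 \
  --build-arg kafka_version=2.6.2 \
  -t adobe/kafka:2.13-2.6.2 \
  docker/kafka
```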
13 changes: 13 additions & 0 deletions docker/kafka/README.md
@@ -0,0 +1,13 @@
# Kafka docker image

`adobe/kafka` docker image build configuration.

A new `kafka-*` tag created in this repo triggers an image build and push to the [adobe/kafka](https://hub.docker.com/r/adobe/kafka/tags?page=1&ordering=last_updated) Docker Hub repo.

Tags should follow the pattern `kafka-<scala_version>-<kafka_version>`, e.g. `kafka-2.13-2.6.2`.
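
A published image can be pulled and started roughly as follows (a sketch; `kafka-server-start.sh` needs a broker config, and the stock `server.properties` shipped with the distribution expects ZooKeeper at `localhost:2181`):

```sh
docker pull adobe/kafka:2.13-2.6.2
docker run --rm adobe/kafka:2.13-2.6.2 \
  kafka-server-start.sh /opt/kafka/config/server.properties
```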

# Upstream base

This is based on [wurstmeister/kafka-docker](https://github.com/wurstmeister/kafka-docker) with the following additions:
1. Use `openjdk 16` in order to support container-based resource monitoring
2. Use a custom `log4j.properties` so that all Kafka logs go to stdout only
50 changes: 50 additions & 0 deletions docker/kafka/opt/kafka/config/log4j.properties
@@ -0,0 +1,50 @@
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# Unspecified loggers and loggers with additivity=true output to stdout
# Note that INFO only applies to unspecified loggers; the log level of the child logger is used otherwise
log4j.rootLogger=INFO, stdout

log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=[%d] %p %m (%c)%n

# Change the line below to adjust ZK client logging
log4j.logger.org.apache.zookeeper=INFO

# Change the two lines below to adjust the general broker logging level (output to stdout)
log4j.logger.kafka=INFO
log4j.logger.org.apache.kafka=INFO

# Change to DEBUG or TRACE to enable request logging
log4j.logger.kafka.request.logger=WARN

# Uncomment the lines below and change log4j.logger.kafka.network.RequestChannel$ to TRACE for additional output
# related to the handling of requests
#log4j.logger.kafka.network.Processor=TRACE, stdout
#log4j.logger.kafka.server.KafkaApis=TRACE, stdout
#
log4j.logger.kafka.network.RequestChannel$=WARN

log4j.logger.kafka.controller=DEBUG


log4j.logger.kafka.log.LogCleaner=INFO


log4j.logger.state.change.logger=INFO

# Access denials are logged at INFO level, change to DEBUG to also log allowed accesses
log4j.logger.kafka.authorizer.logger=INFO
