Cloud Build Log Browser is a web app that displays Cloud Build logs that get truncated in the default log viewer. The application and its supporting architecture let you view large Cloud Build logs per job and per step.
Cloud Build cannot display the logs of build jobs that have more than 9000 log lines. The raw logs can still be viewed, but without any formatting. This becomes an issue if you have multiple steps running in parallel that generate a lot of log lines. One good example is SAP Deployment Automation, which deploys SAP products on GCP with Ansible and produces that many log lines.
The code in this repo creates a log forwarder that pushes Cloud Build logs into a Pub/Sub topic. A Dataflow job takes the logs from the Pub/Sub subscription, parses them, and pushes them into BigQuery. Users can then view the logs via the web app (Cloud Build Log Browser), which is deployed on App Engine.
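To preview the kind of entries the forwarder exports, you can read the Cloud Build logs directly from Cloud Logging. The exact sink filter is defined in the Terraform code; filtering on resource.type="build" below is an assumption used for illustration:
gcloud logging read 'resource.type="build"' --project PROJECT_ID --limit 5 --format json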
You will need:
- A GCP project with a VPC and a subnet. Choose the project that is running the Cloud Build jobs
- A Firebase project
- terraform installed
- nodejs installed (v16.17.0 at the time this doc was written)
In order to deploy the terraform resources, your user or service account will need the following roles:
- roles/iam.serviceAccountAdmin
- roles/resourcemanager.projectIamAdmin
- roles/bigquery.admin
- roles/logging.admin
- roles/pubsub.admin
- roles/dataflow.admin
- roles/storage.admin
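If the account is missing any of these roles, a project owner can grant them with gcloud, one role per command. For example (the member e-mail is a placeholder):
gcloud projects add-iam-policy-binding PROJECT_ID --member="user:deployer@example.com" --role="roles/dataflow.admin"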
- clone this repo
- cd terraform
- cp terraform.tfvars.example terraform.tfvars
- Set the variables in terraform.tfvars
- terraform init
- terraform apply
- Get the gcloud command that will start the dataflow job by running
terraform -chdir=terraform/ output -raw dataflow_command
- Run the gcloud command from the above step.
Example:
gcloud dataflow jobs run transform-log-to-bq \
--project my-awesome-project-01 \
--gcs-location gs://dataflow-templates-us-central1/latest/PubSub_Subscription_to_BigQuery \
--region us-central1 \
--max-workers 3 \
--num-workers 1 \
--service-account-email dataflow-log-to-bq-sa@my-awesome-project-01.iam.gserviceaccount.com \
--staging-location gs://my-awesome-project-01-dataflow-transform-log-to-bq/temp/ \
--subnetwork https://www.googleapis.com/compute/v1/projects/my-awesome-project-01/regions/us-central1/subnetworks/dataflow \
--network dataflow \
--disable-public-ips \
--worker-machine-type n1-standard-2 \
--enable-streaming-engine \
--additional-experiments enable_streaming_engine \
--parameters inputSubscription=projects/my-awesome-project-01/subscriptions/cloud-build-log-sub,javascriptTextTransformGcsPath=gs://my-awesome-project-01-dataflow-transform-log-to-bq/main.js,javascriptTextTransformFunctionName=transform,outputTableSpec=my-awesome-project-01:cloud_build_logs.logs
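Once the job is submitted, you can confirm that it is running and that rows are arriving in BigQuery. For example, assuming the project, region, dataset, and table from the command above:
gcloud dataflow jobs list --project my-awesome-project-01 --region us-central1 --status active
bq query --project_id my-awesome-project-01 --use_legacy_sql=false 'SELECT COUNT(*) FROM `my-awesome-project-01.cloud_build_logs.logs`'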
This procedure will build the frontend
- cp .env.local.example .env.local
- Set the variables in the .env.local file
- You can see how to set up Firebase in Firebase_setup.md
- The value for REACT_APP_BACKEND_URL should be https://logbrowserfronted-dot-PROJECT_ID.uc.r.appspot.com, where PROJECT_ID is the value of the project variable in terraform.tfvars
- npm install
- npm run build
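A minimal .env.local could look like the following line; the backend URL follows the pattern described above. The Firebase-related variables (their names and values) come from your Firebase project as described in Firebase_setup.md and are not shown here:
REACT_APP_BACKEND_URL=https://logbrowserfronted-dot-my-awesome-project-01.uc.r.appspot.com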
In the following steps, PROJECT_ID should be the value of the project variable in terraform.tfvars.
- terraform -chdir=terraform/ output -raw backend_yaml > backend/backend.yaml
- gcloud app deploy backend/backend.yaml --project PROJECT_ID
- gcloud app deploy fronted.yaml --project PROJECT_ID
- gcloud app deploy backend/dispatch.yaml --project PROJECT_ID
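After the three deployments finish, you can list the App Engine services to confirm they were created, for example:
gcloud app services list --project PROJECT_ID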
- Parsing of the logs in the Dataflow job is described in main.js
- User authentication is done using Firebase. The Firebase setup is described in Firebase_setup.md
- Backend code that queries BQ data is in main.py
- User authorization is done in main.py. Only users that have at least one of the roles listed in the ALLOWED_ROLES global variable will be able to see the logs (see the sketch after this list).
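The authorization check could look roughly like the sketch below. This is only an illustration of the idea, assuming the user's roles are checked against the project's IAM policy; the function name, role values, and Resource Manager client usage are assumptions, not the repo's actual code, which lives in backend/main.py.

```python
# Hypothetical sketch of the ALLOWED_ROLES authorization check.
# The real implementation lives in backend/main.py.
from google.cloud import resourcemanager_v3

# Illustrative values only; the actual roles are defined in backend/main.py.
ALLOWED_ROLES = ["roles/cloudbuild.builds.viewer", "roles/cloudbuild.builds.editor"]

def user_is_allowed(email: str, project_id: str) -> bool:
    """Return True if the user holds at least one of ALLOWED_ROLES on the project."""
    client = resourcemanager_v3.ProjectsClient()
    policy = client.get_iam_policy(request={"resource": f"projects/{project_id}"})
    member = f"user:{email}"
    return any(
        binding.role in ALLOWED_ROLES and member in binding.members
        for binding in policy.bindings
    )
```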
If you want to ingest old logs into BigQuery so that you can view them in the app, please have a look at the Ingesting old cloud build logs document.
Frontend:
- Run npm install
- Set REACT_APP_BACKEND_URL in .env.local to http://127.0.0.1:8080
- npm start
Backend:
- Set the following environment variables
APP_LB_DEBUG=true
GOOGLE_CLOUD_PROJECT=$(terraform -chdir=terraform/ output -raw project_id)
DATASET_ID=$(terraform -chdir=terraform/ output -raw dataset_id)
TABLE_ID=$(terraform -chdir=terraform/ output -raw table_id)
- Create a virtual environment (see the example after this list)
- pip install -r backend/requirements.txt
- python backend/main.py
- This will start a Flask app on http://127.0.0.1:8080
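For example, the virtual environment for the steps above can be created like this (assuming a POSIX shell; the directory name venv is arbitrary):
python3 -m venv venv
source venv/bin/activate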