Repository with example references for GitHub Actions metrics visualizations.

Note: The content in this repository is for demonstration purposes only and should not be used directly in a production environment.
This directory contains the necessary configuration and code to set up a Grafana dashboard with OpenSearch as the datasource. It's designed to collect and visualize data from webhooks using OpenSearch and Grafana.
- `.env` and `.env.example`: Environment variable files. Copy `.env.example` to `.env` and update the values according to your OpenSearch setup.
- `docker-compose.yml`: Docker Compose file to spin up the OpenSearch cluster, Grafana, and the webhook collector service.
- `data/grafana/provisioning`: Contains Grafana provisioning files for datasources and dashboards, allowing Grafana to automatically load the OpenSearch datasource and predefined dashboards on startup.
  - `dashboards/default.yaml`: Configuration for dashboard provisioning.
  - `datasources/default.yaml`: Configuration for datasource provisioning, including the OpenSearch datasource.
- Uses the `./webhook-collector` service to collect GitHub webhooks and store them in OpenSearch, configured with environment variables in the `./grafana-opensearch/.env` file.
- Ensure Docker, Docker Compose, Node.js, and npm are installed on your system.
- Node.js and npm are only required if you plan to run the webhook collector service outside of Docker or make development changes.
- Copy `grafana-opensearch/.env.example` to `grafana-opensearch/.env` and update the environment variables to match your OpenSearch setup.
- Environment variables:
  - `OPENSEARCH_HOST`: The hostname of the OpenSearch cluster.
  - `OPENSEARCH_PROTOCOL`: The protocol (`http` or `https`) to use.
  - `OPENSEARCH_PORT`: The port on which the OpenSearch cluster is accessible.
  - `OPENSEARCH_USERNAME`: The username for OpenSearch authentication.
  - `OPENSEARCH_PASSWORD`: The password for OpenSearch authentication.
  - `TARGET_TYPE`: The backend type to use for storing the data. Currently, only `opensearch` is supported.
  - `SEED_DATA`: `true` or `false` to seed the OpenSearch database with initial sample data.
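For reference, a minimal `grafana-opensearch/.env` might look like the sketch below. The values are placeholders for a local docker-compose setup (the hostname assumes an OpenSearch service name in `docker-compose.yml`), so adjust them to match your environment; `.env.example` remains the authoritative template.

```
OPENSEARCH_HOST=opensearch        # placeholder: your OpenSearch host or service name
OPENSEARCH_PROTOCOL=https
OPENSEARCH_PORT=9200
OPENSEARCH_USERNAME=admin
OPENSEARCH_PASSWORD=<your-opensearch-password>
TARGET_TYPE=opensearch
SEED_DATA=true
```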
- From the `grafana-opensearch` directory, run `docker-compose up` to start the OpenSearch cluster, Grafana, and the webhook collector service (see the example commands after this list).
  - This will automatically run the `seed` container once to seed the OpenSearch database with initial data.
- Access Grafana at `http://localhost:13000` (default credentials are admin/admin, but it's recommended to change these).
- To seed the OpenSearch database with more data, the `seed` container can be run manually with `docker-compose run seed` as many times as needed.
- To collect real data, set up GitHub webhooks for `workflow_run` and `workflow_job` events that point to the webhook collector service.
  - The webhook collector service listens on port 3000 by default. Tools such as smee.io can be used to forward GitHub webhooks to the service.
  - The webhook collector service will automatically index these events into OpenSearch for visualization in Grafana.
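Putting the steps above together, a typical local session might look like the following. This is a sketch: the smee.io channel URL is a placeholder you create yourself, and the forwarding command assumes the `smee-client` npm package.

```sh
cd grafana-opensearch
cp .env.example .env      # edit .env to match your OpenSearch setup
docker-compose up         # starts OpenSearch, Grafana, and the webhook collector (and runs seed once)
docker-compose run seed   # optional: seed additional sample data

# Optionally forward GitHub webhooks to the local collector (placeholder channel URL):
npx smee-client --url https://smee.io/<your-channel> --target http://localhost:3000/
```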
For more detailed instructions and configuration options, refer to the individual README files within each subdirectory.
This Node.js application serves as a bridge between GitHub webhooks and backend data stores. Different data store backends are configured as part of the application and selected with the `TARGET_TYPE` environment variable.
Supported backends include:
- `opensearch`: Indexes incoming webhook data into an OpenSearch cluster.
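For illustration, backend selection based on `TARGET_TYPE` could be sketched roughly as follows; the module path and error handling here are assumptions for the example, not the repo's actual code.

```js
// Illustrative sketch only: pick a backend module based on TARGET_TYPE.
const targetType = process.env.TARGET_TYPE || 'opensearch';

let target;
if (targetType === 'opensearch') {
  // Hypothetical wiring: the OpenSearch backend module in this repo is src/opensearch.js.
  target = require('./opensearch');
} else {
  throw new Error(`Unsupported TARGET_TYPE: ${targetType}`);
}
```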
- `webhook-collector`: A simple Node.js application that collects GitHub webhooks and stores them in OpenSearch.
  - `src/index.js`: The main application file.
  - `src/opensearch.js`: The OpenSearch backend module.
  - `package.json`: Defines the project dependencies and scripts.
  - `src/seed-data.js`: A script to seed the OpenSearch database with initial data.
  - `.gitignore`: Git ignore file for Node.js projects.
  - `dist/`: Contains the compiled JavaScript files, used by containerized deployments from docker-compose.
- Server Initialization: An Express server is initialized and configured to listen for incoming HTTP requests on port 3000.
- Route Handling:
  - Route data is configured in the target module, which is selected based on the `TARGET_TYPE` environment variable.
  - A GET route at the root (`/`) simply returns a server status message.
  - A POST route at the root (`/`) receives GitHub webhook payloads.
- Webhook Processing:
  - For `workflow_run` and `workflow_job` events, it calculates the duration of the event. For `workflow_job` events, it also calculates the queue duration and the duration of each step within the job.
  - Adds a timestamp to the event data.
  - Stores the event data in the configured backend.
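The calculations above live in the application code; as a rough sketch of what they involve for a `workflow_job` event (field names follow the public GitHub webhook payload, and the helper below is illustrative rather than the repo's actual function):

```js
// Illustrative sketch of duration / queue-time calculations for a workflow_job event.
// Field names follow the public GitHub webhook payload; the repo's actual logic may differ.
function enrichWorkflowJob(payload) {
  const job = payload.workflow_job;
  const createdAt = new Date(job.created_at);
  const startedAt = new Date(job.started_at);
  const completedAt = new Date(job.completed_at);

  return {
    ...payload,
    timestamp: new Date().toISOString(),       // indexing timestamp
    queue_duration_ms: startedAt - createdAt,  // time spent waiting for a runner
    duration_ms: completedAt - startedAt,      // time spent executing the job
    steps: (job.steps || []).map((step) => ({
      ...step,
      duration_ms: new Date(step.completed_at) - new Date(step.started_at),
    })),
  };
}
```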
- Dependencies: Requires Node.js, npm, and access to a backend data store.
- Environment Variables: Set the following environment variables to configure the application:
  - `TARGET_TYPE`: The backend type to use for storing the data. Currently, only `opensearch` is supported.
  - OpenSearch options:
    - `OPENSEARCH_HOST`: The hostname of the OpenSearch cluster.
    - `OPENSEARCH_PROTOCOL`: The protocol (`http` or `https`) to use.
    - `OPENSEARCH_PORT`: The port on which the OpenSearch cluster is accessible.
    - `OPENSEARCH_USERNAME`: The username for OpenSearch authentication.
    - `OPENSEARCH_PASSWORD`: The password for OpenSearch authentication.
- Running the Server: Execute `npm run dev` to start the server locally for development. Ensure that the environment variables are set before starting the server.
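For example, with a locally reachable OpenSearch cluster (hostname and credentials below are placeholders):

```sh
export TARGET_TYPE=opensearch
export OPENSEARCH_HOST=localhost
export OPENSEARCH_PROTOCOL=https
export OPENSEARCH_PORT=9200
export OPENSEARCH_USERNAME=admin
export OPENSEARCH_PASSWORD=<your-opensearch-password>
npm run dev
```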
- The webhook-collector server does not implement authentication for incoming webhook requests.
- The OpenSearch client is configured with `rejectUnauthorized: false` for SSL, which is only suitable for development purposes.
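For context, that corresponds to a client configuration along these lines; this is a sketch using the official `@opensearch-project/opensearch` client, and a production deployment should verify the cluster certificate instead of disabling verification.

```js
const { Client } = require('@opensearch-project/opensearch');

// Development-only: rejectUnauthorized: false disables TLS certificate verification.
// In production, verify the cluster's certificate (or supply a trusted CA) instead.
const client = new Client({
  node: `${process.env.OPENSEARCH_PROTOCOL}://${process.env.OPENSEARCH_HOST}:${process.env.OPENSEARCH_PORT}`,
  auth: {
    username: process.env.OPENSEARCH_USERNAME,
    password: process.env.OPENSEARCH_PASSWORD,
  },
  ssl: { rejectUnauthorized: false },
});
```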