Add testing strategy for Infrastructure Manager #133

Merged Feb 22, 2024

File: docs/contributor/testing-strategy.md (94 additions, 0 deletions)
# Testing Strategy for Infrastructure Manager

## Introduction
This testing strategy describes how the Framefrog team tests Kyma Infrastructure Manager. It outlines the approach and methodologies used for testing all layers of this product to ensure stability, reliability, and correctness.


## Testing Methodology

We investigate the product by separating it into layers:

1. Code

Includes the technical frameworks (e.g. Kubebuilder) and custom Golang code.

2. Business Features

Combines the code into features that are consumed by our customers.

3. Product Integration

Verifies how our product is integrated into the technical landscape, how it interacts with third-party systems, and how it is accessed by customers or remote systems.

For each layer, there is a dedicated testing approach used:

1. **Unit Testing for Code:** Writing and executing tests for individual functions, methods, and components to verify their behavior and correctness in isolation.
2. **Integration Testing for Business Features:** Validating the integration and interaction between different components, modules, and services in the project.
3. **End-to-End Testing:** Testing the application as a whole in a production-like environment, mimicking real-world scenarios to ensure the entire system functions correctly, performs well, and is secure.


## Testing Approach

### Unit Testing
1. Identify critical functions, methods, and components that require testing.
2. Write unit tests using the GoTest, Ginkgo, and Gomega frameworks.
3. Ensure tests cover various scenarios, edge cases, and possible failure scenarios. We try to verify business relevant logic with at least 65% code coverage.
4. Test for both positive and negative inputs to validate the expected behavior.
5. Mock external dependencies and use stubs or fakes to isolate the unit under test.
6. Run unit tests periodically during development and before each PR to prevent regressions.
7. Unit tests must execute as fast as possible to minimize roundtrip times. Long-running tests should be excluded from frequently executed test runs and be triggered periodically (e.g. 4 times a day).
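The unit-testing steps above (table-driven cases, positive and negative inputs, no external dependencies) can be sketched as a small runnable Go program. The `ProvisionRequest` type and `validate` function are hypothetical stand-ins chosen for illustration, not the project's real API types:

```go
package main

import (
	"errors"
	"fmt"
)

// ProvisionRequest is a hypothetical input struct used only for this sketch;
// the real Infrastructure Manager types live in the project's API packages.
type ProvisionRequest struct {
	ClusterName string
	NodeCount   int
}

// validate checks a single request in isolation, with no external dependencies.
func validate(r ProvisionRequest) error {
	if r.ClusterName == "" {
		return errors.New("cluster name must not be empty")
	}
	if r.NodeCount < 1 {
		return errors.New("node count must be at least 1")
	}
	return nil
}

func main() {
	// Table-driven cases cover both positive and negative inputs (steps 3 and 4).
	cases := []struct {
		name    string
		in      ProvisionRequest
		wantErr bool
	}{
		{"valid request", ProvisionRequest{"kyma-dev", 3}, false},
		{"missing name", ProvisionRequest{"", 3}, true},
		{"zero nodes", ProvisionRequest{"kyma-dev", 0}, true},
	}
	for _, c := range cases {
		err := validate(c.in)
		if (err != nil) != c.wantErr {
			fmt.Printf("FAIL %s: got err=%v, wantErr=%v\n", c.name, err, c.wantErr)
			return
		}
	}
	fmt.Println("all unit cases passed")
}
```

In the real suite the same table-driven shape runs under `go test`; a program form is used here only so the sketch is self-contained.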

### Integration Testing
1. The PO and the team create a registry of implemented business features and define a suitable test scenario for each feature.
2. Create a separate test suite for integration testing.
3. Each test scenario is implemented in a separate test case. Use the Kubebuilder Test Framework and others to create test cases that interact with the Kubernetes cluster.
4. Test the interaction and integration of your custom resources, controllers, and other components with the Kubernetes API.
5. Ensure test cases cover various aspects such as resource creation, updating, deletion, and handling of edge cases.
6. Validate the correctness of event handling, reconciliation, and other control logic.
7. Integration tests must execute fast to minimize roundtrip times and run for each PR. Long-running tests should be excluded from frequently executed test runs and be triggered periodically (e.g. 4 times a day).
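The create/delete reconciliation checks described in steps 4 and 5 can be sketched in plain Go. In the real suite the Kubernetes API role is played by envtest's API server; here a hypothetical in-memory `fakeClient` stands in for it, so the sketch is only an illustration of the test shape:

```go
package main

import (
	"errors"
	"fmt"
)

// client is a narrow, hypothetical view of the Kubernetes API used by a
// controller; envtest provides the real thing in the integration suite.
type client interface {
	Get(name string) (map[string]string, error)
	Create(name string, spec map[string]string) error
	Delete(name string) error
}

// fakeClient is an in-memory stub standing in for the cluster.
type fakeClient struct {
	objects map[string]map[string]string
}

func (f *fakeClient) Get(name string) (map[string]string, error) {
	o, ok := f.objects[name]
	if !ok {
		return nil, errors.New("not found")
	}
	return o, nil
}

func (f *fakeClient) Create(name string, spec map[string]string) error {
	f.objects[name] = spec
	return nil
}

func (f *fakeClient) Delete(name string) error {
	delete(f.objects, name)
	return nil
}

// reconcile ensures the desired object exists: a toy stand-in for the
// control logic exercised by the integration suite.
func reconcile(c client, name string, desired map[string]string) error {
	if _, err := c.Get(name); err != nil {
		return c.Create(name, desired)
	}
	return nil
}

func main() {
	c := &fakeClient{objects: map[string]map[string]string{}}
	// Creation path: the object is missing, so reconcile must create it.
	_ = reconcile(c, "runtime-1", map[string]string{"state": "ready"})
	got, err := c.Get("runtime-1")
	fmt.Println(err == nil, got["state"])
	// Deletion path: after Delete, Get must fail again.
	_ = c.Delete("runtime-1")
	_, err = c.Get("runtime-1")
	fmt.Println(err != nil)
}
```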

### End-to-End Testing
1. Use a mainstream Kubernetes management tool (e.g. [Helm](https://helm.sh/) or [Kustomize](https://kustomize.io/)) to create, deploy, and manage test clusters and environments that closely resemble the productive execution context.
2. For short-living Kubernetes clusters, use k3d or other lightweight Kubernetes cluster providers.
3. Run regularly, but at least once per release, a performance test that measures product KPIs to indicate KPI violations or performance differences between release candidates.

|Testing Approach|Per Commit|Per PR|Per Release|In intervals|
|--|--|--|--|--|
|Unit Testing|X|X||Only long running tests daily|
|Integration Testing||X||Only long running tests daily|
|End-to-End Testing|||X|Daily|

### Testing Tools and Frameworks
Use the following tools and frameworks to implement the above-mentioned testing levels:

- **GoTest**: For unit testing of Golang code.
- **Kubebuilder Test Framework and EnvTest**: For creating and executing integration tests that interact with the Kubernetes API.
- **Ginkgo and Gomega**: For writing and executing unit tests with a BDD-style syntax and assertions.
- **k3d**: For creating short-living and lightweight Kubernetes clusters running within a Docker context.
- **Helm**: For deploying and managing test clusters and environments for end-to-end testing.
- **k6**: For performance and stress testing.

|Framework|Unit Testing|Integration Testing|End-to-End Testing|
|--|--|--|--|
|GoTest| X | | |
|Kubebuilder Test Framework| X | X | |
|EnvTest| X | X | |
|Ginkgo| X | X | |
|Gomega| X | X | |
|k3d| | | X |
|Helm| | | X |
|k6| | | X |


## Test Automation

The following CI/CD jobs are part of the development cycle and execute quality-assurance-related steps:

> **NOTE:** Jobs marked with `pull_request` are triggered with each pull request. Jobs marked with `push` are executed after the merge.

- `golangci-lint / lint (pull_request/push)` - Is responsible for linting and static code analysis. It's configured [here](https://github.com/kyma-project/infrastructure-manager/blob/main/.golangci.yaml) and [here](https://github.com/kyma-project/infrastructure-manager/blob/main/.github/workflows/golangci-lint.yaml).
- `PR Markdown Link Check / markdown-link-check (pull_request)` - Checks if there are no broken links in the pull request `.md` files. It's configured [here](https://github.com/kyma-project/infrastructure-manager/blob/main/mlc.config.json).
- `Run unit tests / validate (pull_request/push)` - Executes basic create/update/delete functional tests of the reconciliation logic. It's configured [here](https://github.com/kyma-project/infrastructure-manager/blob/main/.github/workflows/run-tests.yaml).
- `Run vuln check / test (pull_request/push)` - Runs [govulncheck](https://pkg.go.dev/golang.org/x/vuln/cmd/govulncheck) on the code to detect known vulnerabilities. It's configured [here](https://github.com/kyma-project/infrastructure-manager/blob/main/.github/workflows/run-vuln-check.yaml).
- `pull-infrastructure-manager-build` - Triggered with each pull request. It builds the Docker image and pushes it to the registry. It's configured [here](https://github.com/kyma-project/test-infra/blob/a3c2a07da4ba42e468f69cf42f1960d7bfcc3fff/prow/jobs/kyma-project/infrastructure-manager/infrastructure-manager.yaml).
- `main-infrastructure-manager-build` - Triggered after the merge. Rebuilds the image and pushes it to the registry. It's configured [here](https://github.com/kyma-project/test-infra/blob/a3c2a07da4ba42e468f69cf42f1960d7bfcc3fff/prow/jobs/kyma-project/infrastructure-manager/infrastructure-manager.yaml).