Running devfile integration tests on kubernetes cluster #3233

62 changes: 58 additions & 4 deletions docs/dev/development.adoc
@@ -178,15 +178,21 @@ NOTE: Refer https://github.com/golang/go/wiki/LearnTesting for Go best practices

=== Integration and e2e tests

*Prerequisites for OpenShift cluster:*

* A `minishift` or OpenShift environment with Service Catalog enabled:
+
----
$ MINISHIFT_ENABLE_EXPERIMENTAL=y minishift start --extra-clusterup-flags "--enable=*,service-catalog,automation-service-broker,template-service-broker"
----

*Prerequisites for Kubernetes cluster:*

* A `kubernetes` environment set up with a single node cluster:
+
For a single node `kubernetes` cluster, install link:https://kubernetes.io/docs/tasks/tools/install-minikube/[`Minikube`]; a minimal start command is sketched after this list.
* `odo`, `oc` and `kubectl` binaries in `$PATH`.
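
For reference, a minimal sketch of bringing up a single node cluster with Minikube; the exact driver, Kubernetes version and resource flags are assumptions that depend on your environment:

----
$ minikube start
----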

*Integration tests:*

@@ -331,7 +337,7 @@ There are some test environment variables that help to get more control over the

* UNIT_TEST_ARGS: The environment variable UNIT_TEST_ARGS is used to control which test flags are enabled along with `go test`. For example, to enable verbosity, export or set the variable as `UNIT_TEST_ARGS=-v`, as sketched below.
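
A sketch of a verbose unit test run, assuming the unit tests are invoked through a `make test` style target:

----
$ UNIT_TEST_ARGS=-v make test
----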

*Running integration tests on OpenShift:*

By default, tests are run against the `odo` binary placed in the `$PATH`, which is created by the `make` command. Integration tests can be run in two ways: parallel and sequential. To control the parallel run, use the environment variable `TEST_EXEC_NODES`. For example, the component tests can be run as follows:
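
A sketch of the default parallel run (on two ginkgo test nodes); the sequential variant shown further below sets `TEST_EXEC_NODES=1`:

----
$ make test-cmd-cmp
----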

@@ -353,7 +359,7 @@ Run component command integration tests
----
$ TEST_EXEC_NODES=1 make test-cmd-cmp
----

NOTE: To see the number of available integration test files for validation, press `tab` just after writing `make test-cmd-`. However, there is a test file `generic_test.go` which handles certain test specs easily and can run them in parallel by calling `make test-generic`. By calling `make test-integration`, the whole suite runs all the specs in parallel on two ginkgo test nodes, except `service` and `link`, irrespective of the service catalog status in the cluster. However, `make test-integration-service-catalog` runs all the specs of the service and link tests in parallel on a cluster having the service catalog enabled. `make test-odo-login-e2e` doesn't honour the environment variable `TEST_EXEC_NODES`, so by default it runs the login and logout command integration test suite on a single ginkgo test node sequentially to avoid race conditions in a parallel run.
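
For reference, the suite-level targets mentioned above are invoked directly, for example:

----
$ make test-generic
$ make test-integration
$ make test-integration-service-catalog
$ make test-odo-login-e2e
----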

*Running integration tests on Kubernetes:*

By default, the link:https://github.com/openshift/odo/tree/master/tests/integration/devfile[integration tests] for the devfile feature, which is in experimental mode, run against a `kubernetes` cluster. For more information on experimental mode, please read the link:https://github.com/openshift/odo/blob/master/docs/dev/experimental-mode.adoc[`odo experimental mode`] document.

The tests are run against the `odo` binary placed in the `$PATH`, which is created by the `make` command. Integration tests can be run in two ways: parallel and sequential. To control the parallel run, use the environment variable `TEST_EXEC_NODES`. For example, the devfile tests can be run as follows:

* To run the tests on a Kubernetes cluster:
+
Set the `KUBERNETES` environment variable
+
----
$ export KUBERNETES=true
----

+
Enable the experimental mode
+
----
$ export ODO_EXPERIMENTAL=true
----
+
OR
+
----
$ odo preference set Experimental true -f
----

* To run the tests in parallel on a test cluster (by default the tests run in parallel on two ginkgo test nodes):

+
Run catalog command integration tests
+
----
$ make test-cmd-devfile-catalog
----

* To run the catalog command integration tests sequentially or on a single ginkgo test node:
+
Run catalog command integration tests
+
----
$ TEST_EXEC_NODES=1 make test-cmd-devfile-catalog
----
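
Putting the steps above together, a typical end-to-end invocation against a Minikube cluster might look like the following sketch; it assumes `kubectl` is already pointing at the running cluster:

----
$ export KUBERNETES=true
$ export ODO_EXPERIMENTAL=true
$ make test-cmd-devfile-catalog
----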

NOTE: To see the number of available integration test files for validation, press the `Tab` key just after writing `make test-cmd-devfile-`. By calling `make test-integration-devfile`, the suite will run all test specs in parallel on two ginkgo test nodes.

*Running e2e tests:*
