diff --git a/en/deploy-on-alibaba-cloud.md b/en/deploy-on-alibaba-cloud.md
index 1ce5ef9d799..52357390f63 100644
--- a/en/deploy-on-alibaba-cloud.md
+++ b/en/deploy-on-alibaba-cloud.md
@@ -79,7 +79,7 @@ All the instances except ACK mandatory workers are deployed across availability
     cd tidb-operator/deploy/aliyun
     ```
 
-    You can create or modify `terraform.tfvars` to set the values of the variables, and configure the cluster to fit your needs. You can view the configurable variables and their descriptions in `variables.tf`. The following is an example of how to configure the ACK cluster name, the TiDB cluster name and the number of PD, TiKV, and TiDB nodes.
+    You can create or modify `terraform.tfvars` to set the values of the variables, and configure the cluster to fit your needs. You can view the configurable variables and their descriptions in `variables.tf`. The following is an example of how to configure the ACK cluster name, the TiDB cluster name, the TiDB Operator version, and the number of PD, TiKV, and TiDB nodes.
 
     ```
     cluster_name = "testack"
@@ -87,8 +87,13 @@ All the instances except ACK mandatory workers are deployed across availability
     tikv_count = 3
     tidb_count = 2
     pd_count = 3
+    operator_version = "v1.1.0-rc.1"
     ```
 
+    > **Note:**
+    >
+    > Check the `operator_version` in the `variables.tf` file for the default TiDB Operator version of the current scripts. If the default version is not your desired one, configure `operator_version` in `terraform.tfvars`.
+
     After the configuration, execute the following commands to initialize and deploy the cluster:
 
     {{< copyable "shell-regular" >}}
diff --git a/en/deploy-on-aws-eks.md b/en/deploy-on-aws-eks.md
index e6b0033e7f5..a0813fb2856 100644
--- a/en/deploy-on-aws-eks.md
+++ b/en/deploy-on-aws-eks.md
@@ -94,7 +94,7 @@ The default setup creates a new VPC and a `t2.micro` instance as the bastion mac
 
 You can create or modify `terraform.tfvars` to set the value of variables and configure the cluster as needed.
 See the variables that can be set and their descriptions in `variables.tf`.
-The following is an example of how to configure the EKS cluster name, the TiDB cluster name, and the number of PD, TiKV and TiDB nodes:
+The following is an example of how to configure the EKS cluster name, the TiDB cluster name, the TiDB Operator version, and the number of PD, TiKV and TiDB nodes:
 
 ```
 default_cluster_pd_count = 3
@@ -102,8 +102,13 @@ default_cluster_tikv_count = 3
 default_cluster_tidb_count = 2
 default_cluster_name = "tidb"
 eks_name = "my-cluster"
+operator_version = "v1.1.0-rc.1"
 ```
 
+> **Note:**
+>
+> Check the `operator_version` in the `variables.tf` file for the default TiDB Operator version of the current scripts. If the default version is not your desired one, configure `operator_version` in `terraform.tfvars`.
+
 After configuration, execute the `terraform` command to initialize and deploy the cluster:
 
 {{< copyable "shell-regular" >}}
diff --git a/en/deploy-on-gcp-gke.md b/en/deploy-on-gcp-gke.md
index 6acb229063c..c64e885bb72 100644
--- a/en/deploy-on-gcp-gke.md
+++ b/en/deploy-on-gcp-gke.md
@@ -133,6 +133,7 @@ This section describes how to deploy a TiDB cluster.
 
 > **Note:**
 >
+> * Check the `tidb_operator_version` in the `variables.tf` file for the default TiDB Operator version of the current scripts. If the default version is not your desired one, configure `tidb_operator_version` in `terraform.tfvars`.
 > * The Regional cluster is created by default. In this scenario, the specified number of nodes are created in each one of the three Availability Zones (AZ). For example, if you configure `pd_count = 1`, three nodes are actually created for PD. You can specify the Availability Zones by configuring `node_locations`, or create the Zonal cluster by configuring `location`. See the example in `examples/` for details.
 > * The number of worker nodes to create depends on the number of Availability Zones in the specified Region. Most Regions have three zones, but `us-central1` has four zones. See [Regions and Zones](https://cloud.google.com/compute/docs/regions-zones/) for more information. See the [Customize](#customize) section to learn how to customize node pools in a regional cluster.
 