
update aks sdk to 2021-08-01 #13465

Merged

Conversation

ms-henglu (Contributor)

No description provided.

@ms-henglu force-pushed the ticket-1878174-update-aks-sdk-2021-08-01 branch from a430d82 to 42911d2 on September 23, 2021 02:51
@katbyte (Collaborator) left a comment

Looks like we have some test failures:

------- Stdout: -------
=== RUN   TestAccDataSourceKubernetesCluster_autoscalingWithAvailabilityZones
=== PAUSE TestAccDataSourceKubernetesCluster_autoscalingWithAvailabilityZones
=== CONT  TestAccDataSourceKubernetesCluster_autoscalingWithAvailabilityZones
    testcase.go:88: Step 1/1 error: Error running apply: exit status 1
        
        Error: creating Managed Kubernetes Cluster "acctestaks211013213333946649" (Resource Group "acctestRG-aks-211013213333946649"): containerservice.ManagedClustersClient#CreateOrUpdate: Failure sending request: StatusCode=400 -- Original Error: Code="AgentPoolK8sVersionNotSupported" Message="Version 1.18.19 is not supported in this region. Please use [az aks get-versions] command to get the supported version list in this region. For more information, please check https://aka.ms/supported-version-list"
        
          with azurerm_kubernetes_cluster.test,
          on terraform_plugin_test.tf line 12, in resource "azurerm_kubernetes_cluster" "test":
          12: resource "azurerm_kubernetes_cluster" "test" {
        
--- FAIL: TestAccDataSourceKubernetesCluster_autoscalingWithAvailabilityZones (65.69s)
FAIL
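
For context on the first failure: the configuration hard-codes Kubernetes version 1.18.19, which the test region no longer supports. A minimal, illustrative sketch of tracking a supported version instead of pinning one, via the azurerm_kubernetes_service_versions data source (names are made up; this is not the provider's actual acceptance-test configuration):

# Illustrative only - not the acceptance-test config from this PR.
data "azurerm_kubernetes_service_versions" "current" {
  location = "West Europe"
}

resource "azurerm_resource_group" "example" {
  name     = "example-aks-rg"
  location = "West Europe"
}

resource "azurerm_kubernetes_cluster" "example" {
  name                = "example-aks"
  location            = azurerm_resource_group.example.location
  resource_group_name = azurerm_resource_group.example.name
  dns_prefix          = "exampleaks"

  # Follow a version the region currently supports rather than a fixed
  # value such as 1.18.19.
  kubernetes_version = data.azurerm_kubernetes_service_versions.current.latest_version

  default_node_pool {
    name       = "default"
    node_count = 1
    vm_size    = "Standard_DS2_v2"
  }

  identity {
    type = "SystemAssigned"
  }
}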

------- Stdout: -------
=== RUN   TestAccKubernetesCluster_privateClusterOnWithPrivateDNSZone
=== PAUSE TestAccKubernetesCluster_privateClusterOnWithPrivateDNSZone
=== CONT  TestAccKubernetesCluster_privateClusterOnWithPrivateDNSZone
    testcase.go:88: Step 1/2 error: Error running apply: exit status 1
        
        Error: creating Managed Kubernetes Cluster "acctestaks211013220747049085" (Resource Group "acctestRG-aks-211013220747049085"): containerservice.ManagedClustersClient#CreateOrUpdate: Failure sending request: StatusCode=400 -- Original Error: Code="CustomPrivateDNSZoneMissingPermissionError" Message="Service principal or user assigned identity must be given permission to read and write to custom private dns zone /subscriptions/*******/resourceGroups/acctestRG-aks-211013220747049085/providers/Microsoft.Network/privateDnsZones/privatelink.westeurope.azmk8s.io. Check access result not allowed for action Microsoft.Network/privateDnsZones/read."
        
          with azurerm_kubernetes_cluster.test,
          on terraform_plugin_test.tf line 28, in resource "azurerm_kubernetes_cluster" "test":
          28: resource "azurerm_kubernetes_cluster" "test" {
        
--- FAIL: TestAccKubernetesCluster_privateClusterOnWithPrivateDNSZone (221.61s)
FAIL
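
On the second failure, the error message spells out the requirement: the identity used by the cluster must be able to read and write the custom private DNS zone before the cluster is created. A rough, illustrative sketch of that wiring (made-up names; identity schema as of the 2.x provider line):

# Illustrative only - grant the cluster identity rights on the zone first.
resource "azurerm_resource_group" "example" {
  name     = "example-aks-rg"
  location = "West Europe"
}

resource "azurerm_private_dns_zone" "example" {
  name                = "privatelink.westeurope.azmk8s.io"
  resource_group_name = azurerm_resource_group.example.name
}

resource "azurerm_user_assigned_identity" "example" {
  name                = "example-aks-identity"
  location            = azurerm_resource_group.example.location
  resource_group_name = azurerm_resource_group.example.name
}

resource "azurerm_role_assignment" "dns" {
  scope                = azurerm_private_dns_zone.example.id
  role_definition_name = "Private DNS Zone Contributor"
  principal_id         = azurerm_user_assigned_identity.example.principal_id
}

resource "azurerm_kubernetes_cluster" "example" {
  name                    = "example-aks"
  location                = azurerm_resource_group.example.location
  resource_group_name     = azurerm_resource_group.example.name
  dns_prefix              = "exampleaks"
  private_cluster_enabled = true
  private_dns_zone_id     = azurerm_private_dns_zone.example.id

  default_node_pool {
    name       = "default"
    node_count = 1
    vm_size    = "Standard_DS2_v2"
  }

  identity {
    type                      = "UserAssigned"
    user_assigned_identity_id = azurerm_user_assigned_identity.example.id
  }

  # Make sure the role assignment exists before AKS validates zone access.
  depends_on = [azurerm_role_assignment.dns]
}

The explicit depends_on ensures the role assignment is in place before the cluster create call checks zone permissions.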

------- Stdout: -------
=== RUN   TestAccKubernetesClusterNodePool_manualScaleMultiplePoolsUpdate
=== PAUSE TestAccKubernetesClusterNodePool_manualScaleMultiplePoolsUpdate
=== CONT  TestAccKubernetesClusterNodePool_manualScaleMultiplePoolsUpdate
    testing_new.go:70: Error running post-test destroy, there may be dangling resources: exit status 1
        
        Error: waiting for the deletion of Node Pool "second" (Managed Kubernetes Cluster "acctestaks211013230148694453" / Resource Group "acctestRG-aks-211013230148694453"): Code="DeleteVMSSAgentPoolFailed" Message="autorest/azure: Service returned an error. Status=429 Code=\"OperationNotAllowed\" Message=\"The server rejected the request because too many requests have been received for this subscription.\" Details=[{\"code\":\"TooManyRequests\",\"message\":\"{\\\"operationGroup\\\":\\\"HighCostGetVMScaleSet30Min\\\",\\\"startTime\\\":\\\"2021-10-13T23:14:02.4305965+00:00\\\",\\\"endTime\\\":\\\"2021-10-13T23:15:00+00:00\\\",\\\"allowedRequestCount\\\":1058,\\\"measuredRequestCount\\\":1145}\",\"target\":\"HighCostGetVMScaleSet30Min\"}] InnerError={\"internalErrorCode\":\"TooManyRequestsReceived\"}"
        
--- FAIL: TestAccKubernetesClusterNodePool_manualScaleMultiplePoolsUpdate (794.14s)
FAIL

@jackofallops (Member)

@ms-henglu - A rebase should resolve the conflicts and the version issue. We bumped the versions used in the tests in main recently to deal with this.

@ms-henglu force-pushed the ticket-1878174-update-aks-sdk-2021-08-01 branch from 42911d2 to 37c370c on November 3, 2021 03:56
@katbyte added this to the v2.84.0 milestone on Nov 3, 2021
@katbyte (Collaborator) left a comment

Thanks @ms-henglu - LGTM 🏗️

@katbyte merged commit e1af1e7 into hashicorp:main on Nov 3, 2021
katbyte added a commit that referenced this pull request on Nov 3, 2021
github-actions bot commented on Nov 5, 2021

This functionality has been released in v2.84.0 of the Terraform Provider. Please see the Terraform documentation on provider versioning or reach out if you need any assistance upgrading.
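
A minimal, illustrative version constraint that picks up this release might look like:

terraform {
  required_providers {
    azurerm = {
      source  = "hashicorp/azurerm"
      version = ">= 2.84.0"
    }
  }
}

provider "azurerm" {
  features {}
}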

For further feature requests or bug reports with this functionality, please create a new GitHub issue following the template. Thank you!

github-actions bot commented on Dec 6, 2021

I'm going to lock this pull request because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active contributions.
If you have found a problem that seems related to this change, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.

github-actions bot locked as resolved and limited conversation to collaborators on Dec 6, 2021