This repository has been archived by the owner on Jun 19, 2022. It is now read-only.

Making naming of PubSub resources consistent and more readable #1207

Merged
merged 13 commits into google:master on Jun 4, 2020

Conversation

@nachocano (Member) commented Jun 4, 2020

Fixes #1155

Proposed Changes

  • Using the same convention as the broker: cre-<owner_type>_<namespace>_<name>_<uid>.
  • Moving things around to avoid duplication.
  • Changing the PullSubscription's receive adapter name. The old name was also hard to debug. This is also a breaking change, and we should probably provide a post-install script in OSS to delete the old deployments.
  • Adding UTs.
  • The CloudAuditLogsSource sink name size limit (100) is different from that of regular Pub/Sub resources (255). Setting the generated name to that maximum length, as the longer names were causing problems in Prow, where we use very long namespaces and names.

Release Note

We have changed the naming convention we use for GCP resources in order to make it easier for users to understand which component created them (e.g., a particular Source or a Channel). The convention is `cre-<owner_type>_<namespace>_<name>_<uid>`. For example, if a Source `mysource` in the namespace `default` with UID `47163a` creates a Pub/Sub subscription, then the subscription will be called `cre-src_default_mysource_47163a`.
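
For illustration, a minimal Go sketch of how a name following this convention could be assembled. The package, function name, and parameters below are assumptions for the example, not necessarily the helpers this PR adds in pkg/utils/naming/names.go:

```go
package naming

import "fmt"

// GenerateResourceName is a hypothetical helper that follows the convention
// cre-<owner_type>_<namespace>_<name>_<uid> from the release note. The actual
// implementation in this PR may differ in name, signature, and validation.
func GenerateResourceName(ownerType, namespace, name, uid string) string {
	return fmt.Sprintf("cre-%s_%s_%s_%s", ownerType, namespace, name, uid)
}
```

With this sketch, `GenerateResourceName("src", "default", "mysource", "47163a")` yields `cre-src_default_mysource_47163a`, matching the example above.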

@knative-prow-robot (Contributor)

[APPROVALNOTIFIER] This PR is APPROVED

This pull-request has been approved by: nachocano

The full list of commands accepted by this bot can be found here.

The pull request process is described here.

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@nachocano (Member Author)

/assign @Harwayne
fyi @zargarpur

@googlebot added the cla: yes (override cla status due to multiple authors bug) label on Jun 4, 2020
@nachocano (Member Author) commented Jun 4, 2020 via email

@nachocano (Member Author)

/retest

@nachocano (Member Author)

/retest

@nachocano changed the title from "[WIP] Making naming of PubSub resources consistent and more readable" to "Making naming of PubSub resources consistent and more readable" on Jun 4, 2020
@nachocano (Member Author)

@Harwayne this is ready for review...

@Harwayne (Contributor) left a comment

Make a P1 issue that blocks the 0.16 release to have the controllers clean up the old resources.

```diff
@@ -38,5 +39,6 @@ func GenerateTopicResourceName(s *v1alpha1.CloudAuditLogsSource) string {
 // GenerateSinkName generates a Stackdriver sink resource name for an
 // CloudAuditLogsSource.
 func GenerateSinkName(s *v1alpha1.CloudAuditLogsSource) string {
 	return fmt.Sprintf("sink-%s", string(s.UID))
 // This name should be different than the one provided by GenerateTopicName.
```
Contributor:

Add to this comment saying why it has to be different.

Member Author:

I may have been wrong. The problem might have been just about the length. Let me try again with cre-src... If it doesn't work, then I'll switch back to cre-cal and add the comment.
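
To make the length discussion concrete: the two limits in play are 100 characters for Stackdriver sink names (per the PR description) and 255 for Pub/Sub topics and subscriptions. Below is a rough sketch of how a generated name could be capped at such a maximum; the function and the truncation strategy are illustrative assumptions, not the code this PR actually adds:

```go
package naming

import "fmt"

// truncatedName is an illustrative sketch: it builds <prefix>_<namespace>_<name>_<uid>
// and, if the result exceeds maxLen, trims the middle while keeping the prefix
// and the UID suffix so the name stays recognizable and unique.
func truncatedName(prefix, namespace, name, uid string, maxLen int) string {
	full := fmt.Sprintf("%s_%s_%s_%s", prefix, namespace, name, uid)
	if len(full) <= maxLen {
		return full
	}
	suffix := "_" + uid
	keep := maxLen - len(suffix)
	if keep < 0 {
		// Degenerate case: the UID alone exceeds the limit.
		keep = 0
	}
	return full[:keep] + suffix
}
```

With maxLen set to 100 for a CloudAuditLogsSource sink, even the very long namespaces and names used in Prow would produce a name within the Stackdriver limit.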

@nachocano (Member Author)

Created an issue to track the proper removal of resources... #1217

@knative-metrics-robot

The following is the coverage report on the affected files.
Say /test pull-google-knative-gcp-go-coverage to re-run this coverage report

| File | Old Coverage | New Coverage | Delta |
| --- | --- | --- | --- |
| pkg/reconciler/events/auditlogs/resources/names.go | Do not exist | 100.0% | |
| pkg/reconciler/events/scheduler/resources/names.go | Do not exist | 100.0% | |
| pkg/reconciler/events/storage/resources/names.go | Do not exist | 100.0% | |
| pkg/reconciler/intevents/pullsubscription/resources/labels.go | 0.0% | 50.0% | 50.0 |
| pkg/reconciler/messaging/channel/resources/names.go | 83.3% | 100.0% | 16.7 |
| pkg/utils/naming/names.go | Do not exist | 100.0% | |

@nachocano (Member Author)

/test pull-google-knative-gcp-wi-tests

@Harwayne (Contributor) left a comment

/lgtm

@nachocano (Member Author)

seeing

```
2020/06/04 19:53:37 Unexpected error running "go build": signal: killed
2020/06/04 19:53:37 Unexpected error running "go build": signal: killed
2020/06/04 19:53:37 error processing import paths in "/home/prow/go/src/github.com/google/knative-gcp/test/test_images/auditlogs_receiver/pod.yaml": error resolving image references: exit status 1
```

@Harwayne (Contributor) commented Jun 4, 2020

> seeing
>
> ```
> 2020/06/04 19:53:37 Unexpected error running "go build": signal: killed
> 2020/06/04 19:53:37 Unexpected error running "go build": signal: killed
> 2020/06/04 19:53:37 error processing import paths in "/home/prow/go/src/github.com/google/knative-gcp/test/test_images/auditlogs_receiver/pod.yaml": error resolving image references: exit status 1
> ```

I was seeing the same thing in #1156.

@nachocano (Member Author)

> I was seeing the same thing in #1156.

ok, yeah, might be something weird with Prow. Hope the retest will pass.
@chizhg this might be worth looking at if you have any cycles left.

@nachocano (Member Author)

another one

```
2020/06/04 20:52:00 Unexpected error running "go build": signal: killed
2020/06/04 20:52:00 error processing import paths in "/home/prow/go/src/github.com/google/knative-gcp/test/test_images/pubsub_target/pod.yaml": error resolving image references: exit status 1
```

/retest

@chizhg (Member) commented Jun 4, 2020

> ok, yeah, might be something weird with Prow. Hope the retest will pass.
> @chizhg this might be worth looking at if you have any cycles left.

Seems to be the OOM issue we had for build tests? We haven't seen the integration tests take this much memory for other repos so far, but I think making changes to the knative-gcp integration test Prow jobs (similar to what was done in knative/test-infra#2146) would fix this issue.

@nachocano (Member Author)

> Seems to be the OOM issue we had for build tests? We haven't seen the integration tests take this much memory for other repos so far, but I think making changes to the knative-gcp integration test Prow jobs (similar to what was done in knative/test-infra#2146) would fix this issue.

@capri-xiyue I need to step out for a few hours. Can you help me out and increase those limits so that we can get this PR in? If you don't have cycles, no worries, I'll do it once I'm back.

@capri-xiyue (Contributor) commented Jun 4, 2020

@nacho Submitted this PR, knative/test-infra#2166, to increase the memory limit of the Prow e2e test jobs.

@capri-xiyue (Contributor)

/retest

@chizhg (Member) commented Jun 4, 2020

/test pull-google-knative-gcp-integration-tests
/test pull-google-knative-gcp-wi-tests

@capri-xiyue (Contributor)

/test pull-google-knative-gcp-integration-tests

@capri-xiyue (Contributor)

/retest

@knative-test-reporter-robot

The following jobs failed:

| Test name | Triggers | Retries |
| --- | --- | --- |
| pull-google-knative-gcp-integration-tests | /test pull-google-knative-gcp-integration-tests | 3/3 |

Job pull-google-knative-gcp-integration-tests expended all 3 retries without success.

@capri-xiyue (Contributor)

/retest

@knative-prow-robot merged commit 391e078 into google:master on Jun 4, 2020
@nachocano (Member Author)

YESSSSSSSSSSSSSSS!!!! FINALLYYYY

Thanks @capri-xiyue and @Harwayne

Labels: approved, cla: yes (override cla status due to multiple authors bug), lgtm, size/XXL

Successfully merging this pull request may close these issues:

  • PullSubscription names should be more "readable"

8 participants