Make code more readable by enforcing `max-nested-blocks = 3` with `pylint` #1018

Merged
nfx merged 2 commits into main from pylint/nested-blocks on Mar 6, 2024

Conversation

nfx (Collaborator) commented on Mar 6, 2024

No logic changes, just for readability and to spare code reviewer's sanity.
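
For context: pylint's refactoring checker reports `too-many-nested-blocks` (R1702) whenever block nesting inside a function exceeds `max-nested-blocks`, so lowering the limit to 3 forces flatter control flow. Below is an illustrative sketch of the kind of rewrite this encourages: guard clauses instead of stacked `if`s. The `Table`/`count_s3_tables` names are invented for the example and are not code from this PR, and the exact config file where the limit is set (e.g. `pyproject.toml`) is not shown in this thread.

```python
from dataclasses import dataclass


@dataclass
class Table:
    location: str | None = None


# Before: four nested blocks inside the function trip R1702
# once max-nested-blocks = 3 is enforced.
def count_s3_tables_nested(tables: list[Table | None]) -> int:
    count = 0
    for table in tables:                                 # block 1
        if table is not None:                            # block 2
            if table.location:                           # block 3
                if table.location.startswith("s3://"):   # block 4 -> R1702
                    count += 1
    return count


# After: guard clauses keep nesting at two blocks with identical behaviour.
def count_s3_tables(tables: list[Table | None]) -> int:
    count = 0
    for table in tables:                                 # block 1
        if table is None or not table.location:
            continue
        if table.location.startswith("s3://"):           # block 2
            count += 1
    return count


if __name__ == "__main__":
    sample = [Table("s3://bucket/a"), Table(), None, Table("dbfs:/mnt/b")]
    assert count_s3_tables_nested(sample) == count_s3_tables(sample) == 1
```

The refactored form keeps the happy path at a single indentation level, which is exactly what makes review easier.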

nfx requested review from a team and andrascsillag-db on March 6, 2024 at 22:12

codecov bot commented Mar 6, 2024

Codecov Report

Attention: Patch coverage is 81.17647% with 32 lines in your changes missing coverage. Please review.

Project coverage is 88.43%. Comparing base (6075fec) to head (c256672).
Report is 1 commit behind head on main.

| Files | Patch % | Lines |
| --- | --- | --- |
| ...rc/databricks/labs/ucx/hive_metastore/locations.py | 68.75% | 4 Missing and 6 partials ⚠️ |
| ...atabricks/labs/ucx/hive_metastore/table_migrate.py | 74.07% | 3 Missing and 4 partials ⚠️ |
| src/databricks/labs/ucx/assessment/jobs.py | 86.84% | 1 Missing and 4 partials ⚠️ |
| src/databricks/labs/ucx/hive_metastore/udfs.py | 66.66% | 3 Missing and 1 partial ⚠️ |
| src/databricks/labs/ucx/account.py | 86.95% | 1 Missing and 2 partials ⚠️ |
| src/databricks/labs/ucx/framework/dashboards.py | 70.00% | 3 Missing ⚠️ |
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #1018      +/-   ##
==========================================
- Coverage   88.45%   88.43%   -0.03%     
==========================================
  Files          47       47              
  Lines        6126     6157      +31     
  Branches     1100     1102       +2     
==========================================
+ Hits         5419     5445      +26     
- Misses        470      474       +4     
- Partials      237      238       +1     

☔ View full report in Codecov by Sentry.


github-actions bot commented Mar 6, 2024

❌ 108/109 passed, 1 failed, 14 skipped, 1h34m55s total

❌ test_running_real_assessment_job: databricks.labs.blueprint.parallel.ManyError: Detected 4 failures: Unknown: assess_global_init_scripts: run failed with error message (7m47.963s)
databricks.labs.blueprint.parallel.ManyError: Detected 4 failures: Unknown: assess_global_init_scripts: run failed with error message
 Could not reach driver of cluster DATABRICKS_CLUSTER_ID., Unknown: assess_jobs: run failed with error message
 Could not reach driver of cluster DATABRICKS_CLUSTER_ID., Unknown: assess_pipelines: run failed with error message
 Could not reach driver of cluster DATABRICKS_CLUSTER_ID., Unknown: workspace_listing: run failed with error message
 Could not reach driver of cluster DATABRICKS_CLUSTER_ID.
22:18 DEBUG [databricks.labs.ucx.framework.crawlers] [api][execute] CREATE SCHEMA hive_metastore.ucx_ssenx
22:18 INFO [databricks.labs.ucx.mixins.fixtures] Schema hive_metastore.ucx_ssenx: https://DATABRICKS_HOST/explore/data/hive_metastore/ucx_ssenx
22:18 DEBUG [databricks.labs.ucx.mixins.fixtures] added schema fixture: SchemaInfo(catalog_name='hive_metastore', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='hive_metastore.ucx_ssenx', metastore_id=None, name='ucx_ssenx', owner=None, properties=None, storage_location=None, storage_root=None, updated_at=None, updated_by=None)
[gw9] linux -- Python 3.10.13 /home/runner/work/ucx/ucx/.venv/bin/python
22:18 DEBUG [databricks.labs.ucx.framework.crawlers] [api][execute] CREATE SCHEMA hive_metastore.ucx_ssenx
22:18 INFO [databricks.labs.ucx.mixins.fixtures] Schema hive_metastore.ucx_ssenx: https://DATABRICKS_HOST/explore/data/hive_metastore/ucx_ssenx
22:18 DEBUG [databricks.labs.ucx.mixins.fixtures] added schema fixture: SchemaInfo(catalog_name='hive_metastore', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='hive_metastore.ucx_ssenx', metastore_id=None, name='ucx_ssenx', owner=None, properties=None, storage_location=None, storage_root=None, updated_at=None, updated_by=None)
22:19 DEBUG [databricks.labs.ucx.mixins.fixtures] added workspace user fixture: User(active=True, display_name='sdk-t8rn@example.com', emails=[ComplexValue(display=None, primary=True, ref=None, type='work', value='sdk-t8rn@example.com')], entitlements=[], external_id=None, groups=[], id='2957468156724548', name=Name(family_name=None, given_name='sdk-t8rn@example.com'), roles=[], schemas=[<UserSchema.URN_IETF_PARAMS_SCIM_SCHEMAS_CORE_2_0_USER: 'urn:ietf:params:scim:schemas:core:2.0:User'>, <UserSchema.URN_IETF_PARAMS_SCIM_SCHEMAS_EXTENSION_WORKSPACE_2_0_USER: 'urn:ietf:params:scim:schemas:extension:workspace:2.0:User'>], user_name='sdk-t8rn@example.com')
22:19 INFO [databricks.labs.ucx.mixins.fixtures] Workspace group ucx_dsKa: https://DATABRICKS_HOST#setting/accounts/groups/746253014595047
22:19 DEBUG [databricks.labs.ucx.mixins.fixtures] added workspace group fixture: Group(display_name='ucx_dsKa', entitlements=[ComplexValue(display=None, primary=None, ref=None, type=None, value='allow-cluster-create')], external_id=None, groups=[], id='746253014595047', members=[ComplexValue(display='sdk-t8rn@example.com', primary=None, ref='Users/2957468156724548', type=None, value='2957468156724548')], meta=ResourceMeta(resource_type='WorkspaceGroup'), roles=[], schemas=[<GroupSchema.URN_IETF_PARAMS_SCIM_SCHEMAS_CORE_2_0_GROUP: 'urn:ietf:params:scim:schemas:core:2.0:Group'>])
22:19 INFO [databricks.labs.ucx.mixins.fixtures] Account group ucx_dsKa: https://accounts.CLOUD_ENVdatabricks.net/users/groups/820589149262652/members
22:19 DEBUG [databricks.labs.ucx.mixins.fixtures] added account group fixture: Group(display_name='ucx_dsKa', entitlements=[], external_id=None, groups=[], id='820589149262652', members=[ComplexValue(display='sdk-t8rn@example.com', primary=None, ref='Users/2957468156724548', type=None, value='2957468156724548')], meta=None, roles=[], schemas=[<GroupSchema.URN_IETF_PARAMS_SCIM_SCHEMAS_CORE_2_0_GROUP: 'urn:ietf:params:scim:schemas:core:2.0:Group'>])
22:19 INFO [databricks.labs.ucx.mixins.fixtures] Cluster policy: https://DATABRICKS_HOST#setting/clusters/cluster-policies/view/001E57F1763E3C24
22:19 DEBUG [databricks.labs.ucx.mixins.fixtures] added cluster policy fixture: CreatePolicyResponse(policy_id='001E57F1763E3C24')
22:19 DEBUG [databricks.labs.ucx.mixins.fixtures] added cluster_policy permissions fixture: 001E57F1763E3C24 [group_name admins CAN_USE] -> [group_name ucx_dsKa CAN_USE]
22:19 DEBUG [databricks.labs.ucx.install] Cannot find previous installation: Path (/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.fVxJ/config.yml) doesn't exist.
22:19 INFO [databricks.labs.ucx.install] Please answer a couple of questions to configure Unity Catalog migration
22:19 INFO [databricks.labs.ucx.installer.hms_lineage] HMS Lineage feature creates one system table named system.hms_to_uc_migration.table_access and helps in your migration process from HMS to UC by allowing you to programmatically query HMS lineage data.
22:19 INFO [databricks.labs.ucx.install] Creating UCX cluster policy.
22:19 INFO [databricks.labs.ucx.install] Installing UCX v0.14.1+1620240306221908
22:19 INFO [databricks.labs.ucx.install] Creating dashboards...
22:19 INFO [databricks.labs.ucx.install] Fetching warehouse_id from a config
22:19 INFO [databricks.labs.ucx.framework.crawlers] Ensuring ucx_ssenx database exists
22:19 DEBUG [databricks.labs.ucx.framework.crawlers] [api][execute] CREATE SCHEMA IF NOT EXISTS hive_metastore.ucx_ssenx
22:19 DEBUG [databricks.labs.ucx.framework.dashboards] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment...
22:19 DEBUG [databricks.labs.ucx.install] Creating jobs from tasks in main
22:19 DEBUG [databricks.labs.ucx.framework.dashboards] Reading dashboard folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/main...
22:19 INFO [databricks.labs.ucx.framework.dashboards] Creating dashboard [FVXJ] UCX  Assessment (Main)...
22:19 INFO [databricks.labs.ucx.framework.crawlers] Ensuring ucx_ssenx.CLOUD_ENV_service_principals table exists
22:19 DEBUG [databricks.labs.ucx.framework.crawlers] [api][execute] CREATE TABLE IF NOT EXISTS hive_metastore.ucx_ssenx.CLOUD_ENV_service_principals (application_id STR... (107 more bytes)
22:19 INFO [databricks.labs.ucx.framework.crawlers] Ensuring ucx_ssenx.clusters table exists
22:19 DEBUG [databricks.labs.ucx.framework.crawlers] [api][execute] CREATE TABLE IF NOT EXISTS hive_metastore.ucx_ssenx.clusters (cluster_id STRING NOT NULL, succes... (91 more bytes)
22:19 INFO [databricks.labs.ucx.framework.crawlers] Ensuring ucx_ssenx.global_init_scripts table exists
22:19 DEBUG [databricks.labs.ucx.framework.crawlers] [api][execute] CREATE TABLE IF NOT EXISTS hive_metastore.ucx_ssenx.global_init_scripts (script_id STRING NOT NU... (120 more bytes)
22:19 INFO [databricks.labs.ucx.framework.crawlers] Ensuring ucx_ssenx.jobs table exists
22:19 INFO [databricks.labs.ucx.framework.crawlers] Ensuring ucx_ssenx.pipelines table exists
22:19 DEBUG [databricks.labs.ucx.framework.crawlers] [api][execute] CREATE TABLE IF NOT EXISTS hive_metastore.ucx_ssenx.jobs (job_id STRING NOT NULL, success LONG N... (79 more bytes)
22:19 INFO [databricks.labs.ucx.framework.crawlers] Ensuring ucx_ssenx.external_locations table exists
22:19 INFO [databricks.labs.ucx.framework.crawlers] Ensuring ucx_ssenx.mounts table exists
22:19 DEBUG [databricks.labs.ucx.framework.crawlers] [api][execute] CREATE TABLE IF NOT EXISTS hive_metastore.ucx_ssenx.pipelines (pipeline_id STRING NOT NULL, succ... (99 more bytes)
22:19 INFO [databricks.labs.ucx.framework.crawlers] Ensuring ucx_ssenx.grants table exists
22:19 DEBUG [databricks.labs.ucx.framework.crawlers] [api][execute] CREATE TABLE IF NOT EXISTS hive_metastore.ucx_ssenx.external_locations (location STRING NOT NULL... (40 more bytes)
22:19 INFO [databricks.labs.ucx.framework.crawlers] Ensuring ucx_ssenx.groups table exists
22:19 DEBUG [databricks.labs.ucx.framework.crawlers] [api][execute] CREATE TABLE IF NOT EXISTS hive_metastore.ucx_ssenx.mounts (name STRING NOT NULL, source STRING ... (21 more bytes)
22:19 INFO [databricks.labs.ucx.framework.crawlers] Ensuring ucx_ssenx.tables table exists
22:19 DEBUG [databricks.labs.ucx.framework.crawlers] [api][execute] CREATE TABLE IF NOT EXISTS hive_metastore.ucx_ssenx.grants (principal STRING NOT NULL, action_ty... (167 more bytes)
22:19 INFO [databricks.labs.ucx.framework.crawlers] Ensuring ucx_ssenx.table_size table exists
22:19 DEBUG [databricks.labs.ucx.framework.crawlers] [api][execute] CREATE TABLE IF NOT EXISTS hive_metastore.ucx_ssenx.groups (id_in_workspace STRING NOT NULL, nam... (179 more bytes)
22:19 INFO [databricks.labs.ucx.framework.crawlers] Ensuring ucx_ssenx.table_failures table exists
22:19 INFO [databricks.labs.ucx.framework.crawlers] Ensuring ucx_ssenx.workspace_objects table exists
22:19 INFO [databricks.labs.ucx.framework.crawlers] Ensuring ucx_ssenx.permissions table exists
22:19 DEBUG [databricks.labs.ucx.framework.crawlers] [api][execute] CREATE TABLE IF NOT EXISTS hive_metastore.ucx_ssenx.tables (catalog STRING NOT NULL, database ST... (189 more bytes)
22:19 INFO [databricks.labs.ucx.framework.crawlers] Ensuring ucx_ssenx.submit_runs table exists
22:19 DEBUG [databricks.labs.ucx.framework.crawlers] [api][execute] CREATE TABLE IF NOT EXISTS hive_metastore.ucx_ssenx.table_size (catalog STRING NOT NULL, databas... (81 more bytes)
22:19 DEBUG [databricks.labs.ucx.framework.crawlers] [api][execute] CREATE TABLE IF NOT EXISTS hive_metastore.ucx_ssenx.table_failures (catalog STRING NOT NULL, dat... (61 more bytes)
22:19 DEBUG [databricks.labs.ucx.framework.crawlers] [api][execute] CREATE TABLE IF NOT EXISTS hive_metastore.ucx_ssenx.workspace_objects (path STRING NOT NULL, obj... (63 more bytes)
22:19 DEBUG [databricks.labs.ucx.framework.crawlers] [api][execute] CREATE TABLE IF NOT EXISTS hive_metastore.ucx_ssenx.permissions (object_id STRING NOT NULL, obje... (57 more bytes)
22:19 DEBUG [databricks.labs.ucx.framework.crawlers] [api][execute] CREATE TABLE IF NOT EXISTS hive_metastore.ucx_ssenx.submit_runs (run_ids STRING NOT NULL, hashed... (58 more bytes)
22:19 INFO [databricks.labs.ucx.install] Fetching warehouse_id from a config
22:19 INFO [databricks.labs.ucx.install] Creating new job configuration for step=assessment
22:19 INFO [databricks.labs.ucx.install] Creating new job configuration for step=remove-workspace-local-backup-groups
22:19 INFO [databricks.labs.ucx.install] Creating new job configuration for step=099-destroy-schema
22:19 INFO [databricks.labs.ucx.install] Creating new job configuration for step=migrate-groups
22:19 INFO [databricks.labs.ucx.install] Creating new job configuration for step=validate-groups-permissions
22:19 INFO [databricks.labs.ucx.framework.crawlers] Ensuring ucx_ssenx.objects view matches queries/views/objects.sql contents
22:19 DEBUG [databricks.labs.ucx.framework.crawlers] [api][execute] CREATE OR REPLACE VIEW hive_metastore.ucx_ssenx.objects AS SELECT "jobs" AS object_type, job_id ... (1639 more bytes)
22:19 INFO [databricks.labs.ucx.framework.crawlers] Ensuring ucx_ssenx.grant_detail view matches queries/views/grant_detail.sql contents
22:19 DEBUG [databricks.labs.ucx.framework.crawlers] [api][execute] CREATE OR REPLACE VIEW hive_metastore.ucx_ssenx.grant_detail AS SELECT CASE WHEN anonymous_funct... (1037 more bytes)
22:19 DEBUG [databricks.labs.ucx.framework.dashboards] Reading dashboard folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/CLOUD_ENV...
22:19 INFO [databricks.labs.ucx.framework.dashboards] Creating dashboard [FVXJ] UCX  Assessment (Azure)...
22:19 DEBUG [databricks.labs.ucx.framework.dashboards] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/views...
22:19 INFO [databricks.labs.ucx.install] Installation completed successfully! Please refer to the https://DATABRICKS_HOST/#workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.fVxJ/README for the next steps.
22:19 DEBUG [databricks.labs.ucx.install] starting assessment job: https://DATABRICKS_HOST#job/1025930174748075
22:26 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 1 cluster_policy permissions fixtures
22:26 DEBUG [databricks.labs.ucx.mixins.fixtures] removing cluster_policy permissions fixture: 001E57F1763E3C24 [group_name admins CAN_USE] -> [group_name ucx_dsKa CAN_USE]
22:26 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 1 cluster policy fixtures
22:26 DEBUG [databricks.labs.ucx.mixins.fixtures] removing cluster policy fixture: CreatePolicyResponse(policy_id='001E57F1763E3C24')
22:26 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 1 workspace user fixtures
22:26 DEBUG [databricks.labs.ucx.mixins.fixtures] removing workspace user fixture: User(active=True, display_name='sdk-t8rn@example.com', emails=[ComplexValue(display=None, primary=True, ref=None, type='work', value='sdk-t8rn@example.com')], entitlements=[], external_id=None, groups=[], id='2957468156724548', name=Name(family_name=None, given_name='sdk-t8rn@example.com'), roles=[], schemas=[<UserSchema.URN_IETF_PARAMS_SCIM_SCHEMAS_CORE_2_0_USER: 'urn:ietf:params:scim:schemas:core:2.0:User'>, <UserSchema.URN_IETF_PARAMS_SCIM_SCHEMAS_EXTENSION_WORKSPACE_2_0_USER: 'urn:ietf:params:scim:schemas:extension:workspace:2.0:User'>], user_name='sdk-t8rn@example.com')
22:26 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 1 account group fixtures
22:26 DEBUG [databricks.labs.ucx.mixins.fixtures] removing account group fixture: Group(display_name='ucx_dsKa', entitlements=[], external_id=None, groups=[], id='820589149262652', members=[ComplexValue(display='sdk-t8rn@example.com', primary=None, ref='Users/2957468156724548', type=None, value='2957468156724548')], meta=None, roles=[], schemas=[<GroupSchema.URN_IETF_PARAMS_SCIM_SCHEMAS_CORE_2_0_GROUP: 'urn:ietf:params:scim:schemas:core:2.0:Group'>])
22:26 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 1 workspace group fixtures
22:26 DEBUG [databricks.labs.ucx.mixins.fixtures] removing workspace group fixture: Group(display_name='ucx_dsKa', entitlements=[ComplexValue(display=None, primary=None, ref=None, type=None, value='allow-cluster-create')], external_id=None, groups=[], id='746253014595047', members=[ComplexValue(display='sdk-t8rn@example.com', primary=None, ref='Users/2957468156724548', type=None, value='2957468156724548')], meta=ResourceMeta(resource_type='WorkspaceGroup'), roles=[], schemas=[<GroupSchema.URN_IETF_PARAMS_SCIM_SCHEMAS_CORE_2_0_GROUP: 'urn:ietf:params:scim:schemas:core:2.0:Group'>])
22:26 INFO [databricks.labs.ucx.install] Deleting UCX v0.14.1+1620240306222619 from https://DATABRICKS_HOST
22:26 INFO [databricks.labs.ucx.install] Deleting inventory database ucx_ssenx
22:26 INFO [databricks.labs.ucx.framework.crawlers] deleting ucx_ssenx database
22:26 DEBUG [databricks.labs.ucx.framework.crawlers] [api][execute] DROP SCHEMA IF EXISTS hive_metastore.ucx_ssenx CASCADE
22:26 INFO [databricks.labs.ucx.install] Deleting jobs
22:26 INFO [databricks.labs.ucx.install] Deleting assessment job_id=1025930174748075.
22:26 INFO [databricks.labs.ucx.install] Deleting remove-workspace-local-backup-groups job_id=129847591460193.
22:26 INFO [databricks.labs.ucx.install] Deleting 099-destroy-schema job_id=1086070116624123.
22:26 INFO [databricks.labs.ucx.install] Deleting migrate-groups job_id=609449686806426.
22:26 INFO [databricks.labs.ucx.install] Deleting validate-groups-permissions job_id=971741018869835.
22:26 INFO [databricks.labs.ucx.install] Deleting cluster policy
22:26 INFO [databricks.labs.ucx.install] Deleting secret scope
22:26 INFO [databricks.labs.ucx.install] UnInstalling UCX complete
22:26 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 1 schema fixtures
22:26 DEBUG [databricks.labs.ucx.mixins.fixtures] removing schema fixture: SchemaInfo(catalog_name='hive_metastore', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='hive_metastore.ucx_ssenx', metastore_id=None, name='ucx_ssenx', owner=None, properties=None, storage_location=None, storage_root=None, updated_at=None, updated_by=None)
22:26 DEBUG [databricks.labs.ucx.framework.crawlers] [api][execute] DROP SCHEMA IF EXISTS hive_metastore.ucx_ssenx CASCADE
[gw9] linux -- Python 3.10.13 /home/runner/work/ucx/ucx/.venv/bin/python

Running from acceptance #1452

nfx (Collaborator, Author) commented on Mar 6, 2024

The failed test is flaky and passes locally:
(screenshot)

nfx merged commit bc843c9 into main on Mar 6, 2024
4 of 7 checks passed
nfx deleted the pylint/nested-blocks branch on March 6, 2024 at 22:45
nkvuong added a commit that referenced this pull request Mar 7, 2024

make fmt

Added `upgraded_from_workspace_id` property to migrated tables to indicate the source workspace. (#987)

Added table parameter `upgraded_from_ws` to migrated tables. The
parameter contains the source workspace id.

Resolves #899

- [ ] added relevant user documentation
- [ ] added new CLI command
- [ ] modified existing command: `databricks labs ucx ...`
- [ ] added a new workflow
- [ ] modified existing workflow: `...`
- [ ] added a new table
- [ ] modified existing table: `...`

- [x] manually tested
- [x] added unit tests
- [x] added integration tests
- [x] verified on staging environment (screenshot attached)
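
To make the `upgraded_from_ws` parameter concrete, here is a minimal sketch of how a source-workspace marker can be attached to a migrated table via Spark SQL table properties. The function below is hypothetical and only mirrors the description above; UCX's actual `Table.sql_alter_from` helper may take different arguments and set additional properties.

```python
def sql_alter_from(full_table_name: str, workspace_id: int) -> str:
    """Build an ALTER TABLE statement recording the source workspace of a migrated table.

    Illustrative sketch only; not the UCX implementation.
    """
    return (
        f"ALTER TABLE {full_table_name} "
        f"SET TBLPROPERTIES ('upgraded_from_ws' = '{workspace_id}')"
    )


assert sql_alter_from("main.sales.orders", 1234567890) == (
    "ALTER TABLE main.sales.orders SET TBLPROPERTIES ('upgraded_from_ws' = '1234567890')"
)
```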

Added group members difference to the output of `validate-groups-membership` cli command (#995)

The `validate-groups-membership` command has been updated to include a
comparison of group memberships at both the account and workspace
levels, displaying the difference in members between the two levels in a
new column. This enhancement allows for a more detailed analysis of
group memberships, with the added functionality implemented in the
`validate_group_membership` function in the `groups.py` file located in
the `databricks/labs/ucx/workspace_access` directory. A new output
field, `group_members_difference`, has been added to represent the
difference in the number of members between a workspace group and an
associated account group. The corresponding unit test file,
`test_groups.py`, has been updated to include a new test case that
verifies the calculation of the `group_members_difference` value. This
change provides users with a more comprehensive view of their group
memberships and allows them to easily identify any discrepancies between
the account and workspace levels. The functionality of the other
commands remains unchanged.
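
The new output column boils down to a member-count comparison. A simplified sketch, assuming plain lists of member names rather than the SCIM `Group` objects UCX actually compares in `validate_group_membership`:

```python
from dataclasses import dataclass, field


@dataclass
class GroupSnapshot:
    """Simplified stand-in for a SCIM group; not the Databricks SDK class."""
    display_name: str
    members: list[str] = field(default_factory=list)


def group_members_difference(workspace_group: GroupSnapshot, account_group: GroupSnapshot) -> int:
    """Positive: the workspace group has more members; negative: the account group does."""
    return len(workspace_group.members) - len(account_group.members)


ws_group = GroupSnapshot("data-eng", members=["alice", "bob", "carol"])
acc_group = GroupSnapshot("data-eng", members=["alice", "bob"])
assert group_members_difference(ws_group, acc_group) == 1
```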

Improved installation integration test flakiness (#998)

- improved `_infer_error_from_job_run` and `_infer_error_from_task_run`
to also catch `KeyError` and `ValueError`
- removed retries for `Unknown` errors for installation tests
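
The flakiness fix is essentially about parsing job-run payloads defensively. A hedged sketch of the pattern, using a made-up helper rather than the real `_infer_error_from_job_run`:

```python
def infer_error_from_run(run_state: dict) -> str:
    """Malformed payloads produce a message instead of crashing the test harness.

    Purely illustrative; the UCX helpers differ in shape and fields.
    """
    try:
        message = run_state["state"]["state_message"]       # KeyError if absent
        attempt = int(run_state.get("attempt_number", 0))   # ValueError if not numeric
        return f"attempt {attempt}: {message}"
    except (KeyError, ValueError) as err:
        return f"could not infer error from run payload: {err!r}"


assert infer_error_from_run({}).startswith("could not infer")
assert infer_error_from_run(
    {"state": {"state_message": "Could not reach driver of cluster"}}
) == "attempt 0: Could not reach driver of cluster"
```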

Expanded end-user documentation with detailed descriptions for workflows and commands (#999)

The Databricks Labs UCX project has been updated with several new
features to assist in upgrading to Unity Catalog. These include various
workflows and command-line utilities, such as an assessment workflow
that generates a detailed compatibility report for workspace entities
and a group migration workflow to upgrade all Databricks workspace
assets. Additionally, new utility commands have been added for managing
cross-workspace installations, and users can now view deployed
workflows' status and repair failed workflows. New end-user
documentation has also been introduced, featuring comprehensive
descriptions of workflows, commands, and an assessment report image. The
Assessment Report, generated from UCX tools, now includes a more
detailed summary of the assessment findings, table counts, database
summaries, and external locations. Improved documentation for external
Hive Metastore integration and a new debugging notebook are also
included in this release. Lastly, the workspace group migration feature
has been expanded to handle potential conflicts when migrating multiple
workspaces with locally scoped group names.

Release v0.14.0 (#1000)

* Added `upgraded_from_workspace_id` property to migrated tables to
indicated the source workspace
([#987](#987)). In this
release, updates have been made to the `_migrate_external_table`,
`_migrate_dbfs_root_table`, and `_migrate_view` methods in the
`table_migrate.py` file to include a new parameter `upgraded_from_ws` in
the SQL commands used to alter tables, views, or managed tables. This
parameter is used to store the source workspace ID in the migrated
tables, indicating the migration origin. A new utility method
`sql_alter_from` has been added to the `Table` class in `tables.py` to
generate the SQL command with the new parameter. Additionally, a new
class-level attribute `UPGRADED_FROM_WS_PARAM` has been added to the
`Table` class in `tables.py` to indicate the source workspace. A new
property `upgraded_from_workspace_id` has been added to migrated tables
to store the source workspace ID. These changes resolve issue
[#899](#899) and are tested
through manual testing, unit tests, and integration tests. No new CLI
commands, workflows, or tables have been added or modified, and there
are no changes to user documentation.
* Added a command to create account level groups if they do not exist
([#763](#763)). This commit
introduces a new feature that enables the creation of account-level
groups if they do not already exist in the account. A new command,
`create-account-groups`, has been added to the `databricks labs ucx`
tool, which crawls all workspaces in the account and creates
account-level groups if a corresponding workspace-local group is not
found. The feature supports various scenarios, including creating
account-level groups that exist in some workspaces but not in others,
and creating multiple account-level groups with the same name but
different members. Several new methods have been added to the
`account.py` file to support the new feature, and the `test_account.py`
file has been updated with new tests to ensure the correct behavior of
the `create_account_level_groups` method. Additionally, the `cli.py`
file has been updated to include the new `create-account-groups`
command. With these changes, users can easily manage account-level
groups and ensure that they are consistent across all workspaces in the
account, improving the overall user experience.
* Added assessment for the incompatible `RunSubmit` API usages
([#849](#849)). In this
release, the assessment functionality for incompatible `RunSubmit` API
usages has been significantly enhanced through various changes. The
'clusters.py' file has seen improvements in clarity and consistency with
the renaming of private methods `check_spark_conf` to
`_check_spark_conf` and `check_cluster_failures` to
`_check_cluster_failures`. The `_assess_clusters` method has been
updated to call the renamed `_check_cluster_failures` method for
thorough checks of cluster configurations, resulting in better
assessment functionality. A new `SubmitRunsCrawler` class has been added
to the `databricks.labs.ucx.assessment.jobs` module, implementing
`CrawlerBase`, `JobsMixin`, and `CheckClusterMixin` classes. This class
crawls and assesses job runs based on their submitted runs, ensuring
compatibility and identifying failure issues. Additionally, a new
configuration attribute, `num_days_submit_runs_history`, has been
introduced in the `WorkspaceConfig` class of the `config.py` module,
controlling the number of days for which submission history of
`RunSubmit` API calls is retained. Lastly, various new JSON files have
been added for unit testing, assessing the `RunSubmit` API usages
related to different scenarios like dbt task runs, Git source-based job
runs, JAR file runs, and more. These tests will aid in identifying and
addressing potential compatibility issues with the `RunSubmit` API.
* Added group members difference to the output of
`validate-groups-membership` cli command
([#995](#995)). The
`validate-groups-membership` command has been updated to include a
comparison of group memberships at both the account and workspace
levels. This enhancement is implemented through the
`validate_group_membership` function, which has been updated to
calculate the difference in members between the two levels and display
it in a new `group_members_difference` column. This allows for a more
detailed analysis of group memberships and easily identifies any
discrepancies between the account and workspace levels. The
corresponding unit test file, "test_groups.py," has been updated to
include a new test case that verifies the calculation of the
`group_members_difference` value. The functionality of the other
commands remains unchanged. The new `group_members_difference` value is
calculated as the difference in the number of members in the workspace
group and the account group, with a positive value indicating more
members in the workspace group and a negative value indicating more
members in the account group. The table template in the labs.yml file
has also been updated to include the new column for the group membership
difference.
* Added handling for empty `directory_id` if managed identity
encountered during the crawling of StoragePermissionMapping
([#986](#986)). This PR adds
a `type` field to the `StoragePermissionMapping` and `Principal`
dataclasses to differentiate between service principals and managed
identities, allowing `None` for the `directory_id` field if the
principal is not a service principal. During the migration to UC storage
credentials, managed identities are currently ignored. These changes
improve handling of managed identities during the crawling of
`StoragePermissionMapping`, prevent errors when creating storage
credentials with managed identities, and address issue
[#339](#339). The changes
are tested through unit tests, manual testing, and integration tests,
and only affect the `StoragePermissionMapping` class and related
methods, without introducing new commands, workflows, or tables.
* Added migration for Azure Service Principals with secrets stored in
Databricks Secret to UC Storage Credentials
([#874](#874)). In this
release, we have made significant updates to migrate Azure Service
Principals with their secrets stored in Databricks Secret to UC Storage
Credentials, enhancing security and management of storage access. The
changes include: Addition of a new `migrate_credentials` command in the
`labs.yml` file to migrate credentials for storage access to UC storage
credential. Modification of `secrets.py` to handle the case where a
secret has been removed from the backend and to log warning messages for
secrets with invalid Base64 bytes. Introduction of the
`StorageCredentialManager` and `ServicePrincipalMigration` classes in
`credentials.py` to manage Azure Service Principals and their associated
client secrets, and to migrate them to UC Storage Credentials. Addition
of a new `directory_id` attribute in the `Principal` class and its
associated dataclass in `resources.py` to store the directory ID for
creating UC storage credentials using a service principal. Creation of a
new pytest fixture, `make_storage_credential_spn`, in `fixtures.py` to
simplify writing tests requiring Databricks Storage Credentials with
Azure Service Principal auth. Addition of a new test file for the Azure
integration of the project, including new classes, methods, and test
cases for testing the migration of Azure Service Principals to UC
Storage Credentials. These improvements will ensure better security and
management of storage access using Azure Service Principals, while
providing more efficient and robust testing capabilities.
* Added permission migration support for feature tables and the root
permissions for models and feature tables
([#997](#997)). This commit
introduces support for migration of permissions related to feature
tables and sets root permissions for models and feature tables. New
functions such as `feature_store_listing`, `feature_tables_root_page`,
`models_root_page`, and `tokens_and_passwords` have been added to
facilitate population of a workspace access page with necessary
permissions information. The `factory` function in `manager.py` has been
updated to include new listings for models' root page, feature tables'
root page, and the feature store for enhanced management and access
control of models and feature tables. New classes and methods have been
implemented to handle permissions for these resources, utilizing
`GenericPermissionsSupport`, `AccessControlRequest`, and `MigratedGroup`
classes. Additionally, new test methods have been included to verify
feature tables listing functionality and root page listing functionality
for feature tables and registered models. The test manager method has
been updated to include `feature-tables` in the list of items to be
checked for permissions, ensuring comprehensive testing of permission
functionality related to these new feature tables.
* Added support for serving endpoints
([#990](#990)). In this
release, we have made significant enhancements to support serving
endpoints in our open-source library. The `fixtures.py` file in the
`databricks.labs.ucx.mixins` module has been updated with new classes
and functions to create and manage serving endpoints, accompanied by
integration tests to verify their functionality. We have added a new
listing for serving endpoints in the assessment's permissions crawling,
using the `ws.serving_endpoints.list` function and the
`serving-endpoints` category. A new integration test, "test_endpoints,"
has been added to verify that assessments now crawl permissions for
serving endpoints. This test demonstrates the ability to migrate
permissions from one group to another. The test suite has been updated
to ensure the proper functioning of the new feature and improve the
assessment of permissions for serving endpoints, ensuring compatibility
with the updated `test_manager.py` file.
* Expanded end-user documentation with detailed descriptions for
workflows and commands
([#999](#999)). The
Databricks Labs UCX project has been updated with several new features
to assist in upgrading to Unity Catalog, including an assessment
workflow that generates a detailed compatibility report for workspace
entities, a group migration workflow for upgrading all Databricks
workspace assets, and utility commands for managing cross-workspace
installations. The Assessment Report now includes a more detailed
summary of the assessment findings, table counts, database summaries,
and external locations. Additional improvements include expanded
workspace group migration to handle potential conflicts with locally
scoped group names, enhanced documentation for external Hive Metastore
integration, a new debugging notebook, and detailed descriptions of
table upgrade considerations, data access permissions, external storage,
and table crawler.
* Fixed `config.yml` upgrade from very old versions
([#984](#984)). In this
release, we've introduced enhancements to the configuration upgrading
process for `config.yml` in our open-source library. We've replaced the
previous `v1_migrate` class method with a new implementation that
specifically handles migration from version 1. The new method retrieves
the `groups` field, extracts the `selected` value, and assigns it to the
`include_group_names` key in the configuration. The
`backup_group_prefix` value from the `groups` field is assigned to the
`renamed_group_prefix` key, and the `groups` field is removed, with the
version number updated to 2. These changes simplify the code and improve
readability, enabling users to upgrade smoothly from version 1 of the
configuration. Furthermore, we've added new unit tests to the
`test_config.py` file to ensure backward compatibility. Two new tests,
`test_v1_migrate_zeroconf` and `test_v1_migrate_some_conf`, have been
added, utilizing the `MockInstallation` class and loading the
configuration using `WorkspaceConfig`. These tests enhance the
robustness and reliability of the migration process for `config.yml`.
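(An illustrative sketch of this v1-to-v2 transform appears right after this release-notes list.)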
* Renamed columns in assessment SQL queries to use actual names, not
aliases ([#983](#983)). In
this update, we have resolved an issue where aliases used for column
references in SQL queries caused errors in certain setups by renaming
them to use actual names. Specifically, for assessment SQL queries, we
have modified the definition of the `is_delta` column to use the actual
`table_format` name instead of the alias `format`. This change improves
compatibility and enhances the reliability of query execution. As a
software engineer, you will appreciate that this modification ensures
consistent interpretation of column references across various setups,
thereby avoiding potential errors caused by aliases. This change does
not introduce any new methods, but instead modifies existing
functionality to use actual column names, ensuring a more reliable and
consistent SQL query for the `05_0_all_tables` assessment.
* Updated groups permissions validation to use Table ACL cluster
([#979](#979)). In this
update, the `validate_groups_permissions` task has been modified to
utilize the Table ACL cluster, as indicated by the inclusion of
`job_cluster="tacl"`. This task is responsible for ensuring that all
crawled permissions are accurately applied to the destination groups by
calling the `permission_manager.apply_group_permissions` method during
the migration state. This modification enhances the validation of group
permissions by performing it on the Table ACL cluster, potentially
improving performance or functionality. If you are implementing this
project, it is crucial to comprehend the consequences of this change on
your permissions validation process and adjust your workflows
appropriately.
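
The `config.yml` upgrade from version 1 described above ([#984](#984)) is a small key-renaming transform. The sketch below restates it under the assumption that the config is a plain dict; the real UCX migration code is not reproduced here and may handle more edge cases:

```python
def migrate_v1_config(raw: dict) -> dict:
    """Illustrative v1 -> v2 config transform, not the UCX implementation."""
    conf = dict(raw)
    groups = conf.pop("groups", {})
    if "selected" in groups:
        conf["include_group_names"] = groups["selected"]
    if "backup_group_prefix" in groups:
        conf["renamed_group_prefix"] = groups["backup_group_prefix"]
    conf["version"] = 2
    return conf


old = {"version": 1, "groups": {"selected": ["sales"], "backup_group_prefix": "db-temp-"}}
assert migrate_v1_config(old) == {
    "version": 2,
    "include_group_names": ["sales"],
    "renamed_group_prefix": "db-temp-",
}
```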

Run integration tests only for pull requests ready for review (#1002)

Tested on https://github.com/databrickslabs/blueprint

Reducing flakiness of create account groups (#1003)

Prompt user if Terraform utilised for deploying infrastructure (#1004)

Added an `is_terraform_used` prompt and stored the answer in the
`WorkspaceInstaller` config.

Resolves #393
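
A minimal sketch of what the new prompt and config flag could look like, assuming a plain dataclass and an injectable prompt function; UCX's actual `WorkspaceConfig` and `Prompts` classes are richer than this:

```python
from dataclasses import dataclass
from typing import Callable


@dataclass
class InstallerConfig:
    """Simplified stand-in for WorkspaceConfig; only the new flag is sketched."""
    inventory_database: str
    is_terraform_used: bool = False


def ask_terraform_usage(ask: Callable[[str], str]) -> bool:
    """Yes/no prompt; illustrative only, UCX uses its own Prompts helper."""
    answer = ask("Do you use Terraform to deploy your infrastructure? (yes/no) ")
    return answer.strip().lower() in ("y", "yes")


config = InstallerConfig("ucx", is_terraform_used=ask_terraform_usage(lambda _: "yes"))
assert config.is_terraform_used
```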

---------

Co-authored-by: Serge Smertin <259697+nfx@users.noreply.github.com>

Update CONTRIBUTING.md (#1005)

Closes #850

Fix gitguardian warning caused by "hello world" secret used in unit test (#1010)

Replace the plain pre-encoded string with a `base64.b64encode` call to
mitigate the GitGuardian warning.
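
The idea is simply to keep the pre-encoded literal out of the source so secret scanners have nothing to match. A minimal sketch, not the actual unit test from this change:

```python
import base64

# Encode the harmless test payload at runtime instead of committing the
# already-encoded string, which GitGuardian can flag as a leaked credential.
PLACEHOLDER_SECRET = base64.b64encode(b"hello world").decode()


def test_placeholder_secret_roundtrip():
    assert base64.b64decode(PLACEHOLDER_SECRET) == b"hello world"
```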

Resolves #..

Create UC external locations in Azure based on migrated storage credentials (#992)

Handle widget delete on upgrade platform bug (#1011)

Deprecate legacy installer (#1014)

Automatically upgrade existing installations to avoid breaking changes (#985)

This PR incorporates the work from
databrickslabs/blueprint#50, which enables
smoother cross-version upgrades.

Fix #471

Added missing documentation for `create-uber-principal` command (#1015)

Add `migrate-locations` command (#1016)

Add CLI command `migrate_locations` to create UC external locations.

Fix document for `migrate-locations` command (#1017)

Make code more readable by enforcing `max-nested-blocks = 3` with `pylint` (#1018)

No logic changes, just for readability and to spare code reviewer's
sanity.

Added AWS S3 support for `migrate-locations` command (#1009)

Release v0.15.0 (#1020)

* Added AWS S3 support for `migrate-locations` command
([#1009](#1009)). In this
release, the open-source library has been enhanced with AWS S3 support
for the `migrate-locations` command, enabling efficient and secure
management of S3 data. The new functionality includes the identification
of missing S3 prefixes and the creation of corresponding roles and
policies through the addition of methods `_identify_missing_paths`,
`_get_existing_credentials_dict`, and `create_external_locations`. The
library now also includes new classes `AwsIamRole`,
`ExternalLocationInfo`, and `StorageCredentialInfo` for better handling
of AWS-related functionality. Additionally, two new tests,
`test_create_external_locations` and
`test_create_external_locations_skip_existing`, have been added to
ensure the correct behavior of the new AWS-related functionality. The
new test function `test_migrate_locations_aws` checks the AWS-specific
implementation of the `migrate-locations` command, while
`test_missing_aws_cli` verifies the correct error message is displayed
when the AWS CLI is not found in the system path. These changes enhance
the library's capabilities, improving data security, privacy, and
overall performance for users working with AWS S3.
* Added `databricks labs ucx create-uber-principal` command to create
Azure Service Principal for migration
([#976](#976)). The new CLI
command, `databricks labs ucx create-uber-principal`, has been
introduced to create an Azure Service Principal (SPN) and grant it
STORAGE BLOB READER access on all the storage accounts used by the
tables in the workspace. The SPN information is then stored in the UCX
cluster policy. A new class, AzureApiClient, has been added to isolate
Azure API calls, and unit and integration tests have been included to
verify the functionality. This development enhances migration
capabilities for Azure workspaces, providing a more streamlined and
automated way to create and manage Service Principals, and improves the
functionality and usability of the UCX tool. The changes are
well-documented and follow the project's coding standards.
* Added `migrate-locations` command
([#1016](#1016)). In this
release, we've added a new CLI command, `migrate_locations`, to create
Unity Catalog (UC) external locations. This command extracts candidates
for location creation from the `guess_external_locations` assessment
task and checks if corresponding UC Storage Credentials exist before
creating the locations. Currently, the command only supports Azure, with
plans to add support for AWS and GCP in the future. The
`migrate_locations` function is marked with the `ucx.command` decorator
and is available as a command-line interface (CLI) command. The pull
request also includes unit tests for this new command, which check the
environment (Azure, AWS, or GCP) before executing the migration and log
a message if the environment is AWS or GCP, indicating that the
migration is not yet supported on those platforms. No changes have been
made to existing workflows, commands, or tables.
* Added handling for widget delete on upgrade platform bug
([#1011](#1011)). In this
release, the `_install_dashboard` method in `dashboards.py` has been
updated to handle a platform bug that occurred during the deletion of
dashboard widgets during an upgrade process (issue
[#1011](#1011)). Previously,
the method attempted to delete each widget using the
`self._ws.dashboard_widgets.delete(widget.id)` command, which resulted
in a `TypeError` when attempting to delete a widget. The updated method
now includes a try/except block that catches this `TypeError` and logs a
warning message, while also tracking the issue under bug ES-1061370. The
rest of the method remains unchanged, creating a dashboard with the
given name, role, and parent folder ID if no widgets are present. This
enhancement improves the robustness of the `_install_dashboard` method
by adding error handling for the SDK API response when deleting
dashboard widgets, ensuring a smoother upgrade process.
* Create UC external locations in Azure based on migrated storage
credentials ([#992](#992)).
The `locations.py` file in the `databricks.labs.ucx.azure` package has
been updated to include a new class `ExternalLocationsMigration`, which
creates UC external locations in Azure based on migrated storage
credentials. This class takes various arguments, including
`WorkspaceClient`, `HiveMetastoreLocations`, `AzureResourcePermissions`,
and `AzureResources`. It has a `run()` method that lists any missing
external locations in UC, extracts their location URLs, and attempts to
create a UC external location with a mapped storage credential name if
the missing external location is in the mapping. The class also includes
helper methods for generating credential name mappings. Additionally,
the `resources.py` file in the same package has been modified to include
a new method `managed_identity_client_id`, which retrieves the client ID
of a managed identity associated with a given access connector. Test
functions for the `ExternalLocationsMigration` class and Azure external
locations functionality have been added in the new file
`test_locations.py`. The `test_resources.py` file has been updated to
include tests for the `managed_identity_client_id` method. A new
`mappings.json` file has also been added for tests related to Azure
external location mappings based on migrated storage credentials.
* Deprecate legacy installer
([#1014](#1014)). In this
release, we have deprecated the legacy installer for the UCX project,
which was previously implemented as a bash script. A warning message has
been added to inform users about the deprecation and direct them to the
UCX installation instructions. The functionality of the script remains
unchanged, and it still performs tasks such as installing Python
dependencies and building Python bindings. The script will eventually be
replaced with the `databricks labs install ucx` command. This change is
part of issue [#1014](#1014)
and is intended to streamline the installation process and improve the
overall user experience. We recommend that users update their
installation process to the new recommended method as soon as possible
to avoid any issues with the legacy installer in the future.
* Prompt user if Terraform utilised for deploying infrastructure
([#1004](#1004)). In this
update, the `config.py` file has been modified to include a new
attribute, `is_terraform_used`, in the `WorkspaceConfig` class. This
boolean flag indicates whether Terraform has been used for deploying
certain entities in the workspace. Issue
[#393](#393) has been
addressed with this change. The `WorkspaceInstaller` configuration has
also been updated to take advantage of this new attribute, allowing
developers to determine if Terraform was used for infrastructure
deployment, thereby increasing visibility into the deployment process.
Additionally, a new prompt has been added to the `warehouse_type`
function to ascertain if Terraform is being utilized for infrastructure
deployment, setting the `is_terraform_used` variable to True if it is.
This improvement is intended for software engineers adopting this
open-source library.
* Updated CONTRIBUTING.md
([#1005](#1005)). In this
contribution to the open-source library, the CONTRIBUTING.md file has
been significantly updated with clearer instructions on how to
effectively contribute to the project. The previous command to print the
Python path has been removed, as the IDE is now advised to be configured
to use the Python interpreter from the virtual environment. A new step
has been added, recommending the use of a consistent styleguide and
formatting of the code before every commit. Moreover, it is now
encouraged to run tests before committing to minimize potential issues
during the review process. The steps on how to make a Fork from the ucx
repo and create a PR have been updated with links to official
documentation. Lastly, the commit now includes information on handling
dependency errors that may occur after `git pull`.
* Updated databricks-labs-blueprint requirement from ~=0.2.4 to ~=0.3.0
([#1001](#1001)). In this
pull request update, the requirements file, pyproject.toml, has been
modified to upgrade the databricks-labs-blueprint package from version
~0.2.4 to ~0.3.0. This update integrates the latest features and bug
fixes of the package, including an automated upgrade framework, a
brute-forcing approach for handling SerdeError, and enhancements for
running nightly integration tests with service principals. These
improvements increase the testability and functionality of the software,
ensuring its stable operation with service principals during nightly
integration tests. Furthermore, the reliability of the test for
detecting existing installations has been reinforced by adding a new
test function that checks for the correct detection of existing
installations and retries the test for up to 15 seconds if they are not.

Dependency updates:

* Updated databricks-labs-blueprint requirement from ~=0.2.4 to ~=0.3.0
([#1001](#1001)).
dmoore247 pushed a commit that referenced this pull request Mar 23, 2024
…lint` (#1018)

No logic changes, just for readability and to spare code reviewer's
sanity.