Added test to submit and run various Python tasks on multiple DBR versions #806
Conversation
internal/bundle/helpers.go
Outdated
```go
var sparkVersions = []string{
	"11.3.x-scala2.12",
	"12.2.x-scala2.12",
	"13.0.x-scala2.12",
	"13.1.x-scala2.12",
	"13.2.x-scala2.12",
	"13.3.x-scala2.12",
	"14.0.x-scala2.12",
	"14.1.x-scala2.12",
}
```
This list will get out of date in a couple of months. Can you use `w.Clusters.SparkVersions` to pick the list of runtimes dynamically?
See https://github.com/databricks/databricks-sdk-py/blob/main/tests/integration/test_auth.py#L90-L124
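For reference, a minimal sketch of that suggestion using the Go SDK (not the code in this PR): it assumes `w` is a `*databricks.WorkspaceClient`, and the helper name and filtering rules are illustrative only.

```go
// Sketch: build the DBR version list dynamically instead of hard-coding it.
package bundle_test

import (
	"context"
	"strings"
	"testing"

	"github.com/databricks/databricks-sdk-go"
)

func testSparkVersions(t *testing.T, w *databricks.WorkspaceClient) []string {
	resp, err := w.Clusters.SparkVersions(context.Background())
	if err != nil {
		t.Fatal(err)
	}
	var keys []string
	for _, v := range resp.Versions {
		// Skip ML, GPU, and Photon flavors to keep the test matrix small;
		// the real filtering criteria could differ.
		if strings.Contains(v.Key, "-ml-") ||
			strings.Contains(v.Key, "-gpu-") ||
			strings.Contains(v.Key, "-photon-") {
			continue
		}
		keys = append(keys, v.Key)
	}
	return keys
}
```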
I considered that, but for now we want to keep it simple since we don't run them continuously yet. Once we do, we will indeed fetch the list dynamically.
internal/bundle/helpers.go
Outdated
```go
SparkVersion:     sparkVersions[i],
NumWorkers:       1,
NodeTypeId:       nodeTypeId,
DataSecurityMode: compute.DataSecurityModeUserIsolation,
```
This is only for Unity Catalog-enabled workspaces, but it doesn't check for `DataSecurityModeSingleUser` and `DataSecurityModeNone`. What about non-UC workspaces, like `DataSecurityModeLegacySingleUser`?
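For illustration, here is a minimal sketch of how mode selection could branch on the workspace type. This is not the PR's code (which hard-codes `DataSecurityModeUserIsolation`); the helper and its boolean flags are hypothetical.

```go
// Hypothetical helper: pick a data security mode per workspace type.
package bundle_test

import "github.com/databricks/databricks-sdk-go/service/compute"

func chooseDataSecurityMode(isUC, singleUser bool) compute.DataSecurityMode {
	switch {
	case isUC && singleUser:
		return compute.DataSecurityModeSingleUser
	case isUC:
		return compute.DataSecurityModeUserIsolation
	default:
		// Non-UC workspaces don't support the UC modes; this sketch falls
		// back to a legacy single-user mode.
		return compute.DataSecurityModeLegacySingleUser
	}
}
```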
internal/bundle/python_tasks_test.go
Outdated
```go
func TestAccRunPythonTaskWorkspace(t *testing.T) {
	// TODO: remove RUN_PYTHON_TASKS_TEST when ready to be executed as part of nightly
	internal.GetEnvOrSkipTest(t, "RUN_PYTHON_TASKS_TEST")
```
There's a `ucws` testing environment, which tests against a UC-enabled workspace; use those env vars.
At the moment we don't plan to execute it on a continuous basis yet, just as a script to be run manually to generate the data.
Yep, but you can still execute it with `deco env shell` and pick a `*-ucws` environment.
General comment; since we don't run this as part of the regular suites, I think it should live in its own package. The helpers you introduce here are generic enough that they should be reusable from more testing packages.
Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
Please include the source for this wheel file.
Stamp to unblock.
CLI:
* Refactor change computation for sync ([#785](#785)).

Bundles:
* Allow digits in the generated short name ([#820](#820)).
* Emit an error when incompatible all purpose cluster used with Python wheel tasks ([#823](#823)).
* Use normalized short name for tag value in development mode ([#821](#821)).
* Added `python.DetectInterpreters` and other utils ([#805](#805)).
* Mark artifacts properties as optional ([#834](#834)).
* Added support for glob patterns in pipeline libraries section ([#833](#833)).

Internal:
* Run tests to verify backend tag validation behavior ([#814](#814)).
* Library to validate and normalize cloud specific tags ([#819](#819)).
* Added test to submit and run various Python tasks on multiple DBR versions ([#806](#806)).
* Create a release PR in setup-cli repo on tag push ([#827](#827)).

API Changes:
* Changed `databricks account metastore-assignments list` command to return .
* Changed `databricks jobs cancel-all-runs` command with new required argument order.
* Added `databricks account o-auth-published-apps` command group.
* Changed `databricks serving-endpoints query` command . New request type is .
* Added `databricks serving-endpoints patch` command.
* Added `databricks credentials-manager` command group.
* Added `databricks settings` command group.
* Changed `databricks clean-rooms list` command to require request of .
* Changed `databricks statement-execution execute-statement` command with new required argument order.

OpenAPI commit bcbf6e851e3d82fd910940910dd31c10c059746c (2023-10-02)

Dependency updates:
* Bump github.com/google/uuid from 1.3.0 to 1.3.1 ([#825](#825)).
* Updated Go SDK to 0.22.0 ([#831](#831)).
Changes
These tests allow us to gather information about the execution context (PYTHONPATH, CWD) for various Python tasks and different cluster setups.
Note: this test won't be executed automatically as part of nightly builds, since it requires the RUN_PYTHON_TASKS_TEST environment variable to be set.
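A rough sketch of the shape of such a test follows. It is not the code in internal/bundle/python_tasks_test.go: the local getEnvOrSkip helper merely stands in for internal.GetEnvOrSkipTest, and the version list is a small subset for illustration.

```go
package bundle_test

import (
	"os"
	"testing"
)

// getEnvOrSkip: skip the test unless the named environment variable is set
// (a stand-in for internal.GetEnvOrSkipTest).
func getEnvOrSkip(t *testing.T, name string) string {
	v := os.Getenv(name)
	if v == "" {
		t.Skipf("environment variable %s is not set", name)
	}
	return v
}

func TestRunPythonTasksSketch(t *testing.T) {
	// Gate the whole matrix on RUN_PYTHON_TASKS_TEST so it never runs as
	// part of the regular nightly suites.
	getEnvOrSkip(t, "RUN_PYTHON_TASKS_TEST")

	for _, v := range []string{"13.3.x-scala2.12", "14.1.x-scala2.12"} {
		v := v
		t.Run(v, func(t *testing.T) {
			// Here the real test would submit the various Python task
			// flavors on this DBR version and record the execution
			// context (PYTHONPATH, CWD) each one reports.
			t.Logf("would run Python tasks on DBR %s", v)
		})
	}
}
```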
Tests
Integration tests ran successfully.