Steps to reproduce the behavior:
1. Have an asset bundle configuration that includes a jar library on the local filesystem, listed as a relative path.
2. Have an artifact configured to create that jar file.
3. Have the source value be a relative path within the current repo, matching what is in the task's libraries field.
4. Run databricks bundle deploy.
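A configuration along these lines reproduces the problem. The bundle name and jar path are taken from the logs below; the build command, job, and task names are illustrative assumptions, not the reporter's actual project:

```yaml
# databricks.yml -- illustrative sketch; build command and job/task names are assumed
bundle:
  name: dbx_pipeline_events

artifacts:
  convert_events:
    type: jar
    build: sbt package
    files:
      # Relative source path -- this is what the affected CLI fails to match
      - source: ./convert_events/target/scala-2.12/convert_events-2.0.jar

resources:
  jobs:
    events_job:
      tasks:
        - task_key: convert
          libraries:
            # Same relative path as the artifact's files source
            - jar: ./convert_events/target/scala-2.12/convert_events-2.0.jar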
Expected Behavior
I would expect the relative path to be found and linked to the library without any issue, and without having to list the absolute path. This is my first time doing this with a jar file, but with a Python wheel task you don't even need to list the files/sources; it just works.
I found it especially surprising that the deploy still failed even with the source written exactly as it appears in the libraries section of the task. The expected output would look like this:
Building convert_events...
Uploading convert_events-2.0.jar...
Uploading bundle files to /Users/<username>/.bundle/dbx_pipeline_events/dev/files...
Deploying resources...
Updating deployment state...
Deployment complete!
Actual Behavior
The jar file is built, but the file is not uploaded; this less-than-useful error message is shown instead:
Building convert_events...
artifact section is not defined for file at /home/jessica/dbx-pipeline-events/convert_events/target/scala-2.12/convert_events-2.0.jar. Skipping uploading. In order to use the define 'artifacts' section
Uploading bundle files to /Users/<username>/.bundle/dbx_pipeline_events/dev/files...
Deploying resources...
Updating deployment state...
Deployment complete!
OS and CLI version
Databricks CLI v0.212.2
Ubuntu 22.04 - WSL2 via Windows 11
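For anyone stuck on an affected CLI version: the issue summary notes that an absolute path in the files source is recognized, so spelling the path out works as a stopgap (the exact path below is illustrative):

```yaml
artifacts:
  convert_events:
    type: jar
    build: sbt package
    files:
      # Absolute-path workaround for CLI versions without relative-path support
      - source: /home/<username>/dbx-pipeline-events/convert_events/target/scala-2.12/convert_events-2.0.jar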
It appears that the artifacts section doesn't do globbing for any artifact type; it's not just jar files. I tried adding the files list to my working Python wheel artifact, and a relative path or a glob pattern does not work there either. So the title may be more accurate without the jar artifact specifier.
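The behavior being asked for here, resolving relative and glob sources against the bundle root before matching, can be sketched in a few lines. This is illustrative Python only; the Databricks CLI itself is written in Go, and this is not its actual implementation:

```python
import glob
import os


def resolve_artifact_sources(bundle_root: str, patterns: list[str]) -> list[str]:
    """Resolve each (possibly relative, possibly glob) source pattern
    against the bundle root and return the matching absolute paths."""
    resolved = []
    for pattern in patterns:
        # Anchor relative patterns at the bundle root, not the current directory.
        if not os.path.isabs(pattern):
            pattern = os.path.join(bundle_root, pattern)
        resolved.extend(sorted(glob.glob(pattern)))
    return resolved
```

With this approach, a source of ./target/scala-2.12/*.jar and an absolute path to the same jar would both resolve to the same file, which is what the report expects.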
I think this code is the reason why Python wheels will upload when no file list is provided, so maybe it could be extended to cover more situations?
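The comment above points at the code path that lets Python wheels upload with no files list; extending that fallback to other artifact types might look roughly like this. The default patterns and function name are assumptions for illustration; the CLI's real logic is in Go and may differ:

```python
import glob
import os

# Conventional build output locations per artifact type.
# These patterns are assumptions for illustration, not the CLI's actual defaults.
DEFAULT_OUTPUT_PATTERNS = {
    "whl": ["dist/*.whl"],
    "jar": ["target/*/*.jar"],
}


def infer_artifact_files(artifact_root: str, artifact_type: str) -> list[str]:
    """When no files list is configured, fall back to the conventional
    build output directory for the artifact type."""
    matches = []
    for pattern in DEFAULT_OUTPUT_PATTERNS.get(artifact_type, []):
        matches.extend(glob.glob(os.path.join(artifact_root, pattern)))
    return sorted(matches)
```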
Hi @NodeJSmith! Indeed, your observations are correct. At a bare minimum, DABs should support relative paths in the source section, and ideally should not require specifying it at all. I'll assign the issue to myself.
Support relative paths in artifact files source section and always upload all artifact files (#1247)
Fixes #1156
## Tests
Added unit tests
Describe the issue
When deploying an asset bundle that uses a local jar file, the artifact files source requires an absolute path in order to be recognized.
Is this a regression?
Not sure
Debug Logs
databricks_cli_jar_issue_redacted_logs.log