
Base64 encode the pickled code #1476

Merged 1 commit into kubeflow:master on Jun 14, 2019
Conversation

@kvalev (Contributor) commented Jun 9, 2019

Due to its nature, Argo will replace any string it encounters that is enclosed in double curly braces, which makes the pickled code non-executable. To work around this, the code is base64-encoded in the Argo YAML template and decoded on the fly before execution.

Fixes #1453
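The idea behind the fix can be shown with a minimal, self-contained sketch. The actual pipeline code serializes the user's function with cloudpickle; here the standard library's pickle and a stand-in payload are used for brevity:

```python
import base64
import pickle

# A payload standing in for the pickled user code; the literal '{{' in the
# string survives into the raw pickle bytes, which is exactly what Argo's
# templating would mangle if embedded directly in the YAML.
payload = {"snippet": "print('{{workflow.uid}}')"}
pickled = pickle.dumps(payload)
assert b"{{" in pickled

# Base64 output uses only [A-Za-z0-9+/=], so the encoded form can never
# contain '{{' and passes through Argo's templating untouched.
encoded = base64.b64encode(pickled).decode("ascii")
assert "{{" not in encoded

# The generated program decodes on the fly, right before execution.
restored = pickle.loads(base64.b64decode(encoded))
assert restored == payload
```

Base64 is attractive here precisely because its alphabet contains no braces, so no escaping rules are needed at all.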



@k8s-ci-robot (Contributor)

Hi @kvalev. Thanks for your PR.

I'm waiting for a kubeflow member to verify that this patch is reasonable to test. If it is, they should reply with /ok-to-test on its own line. Until that is done, I will not automatically test new commits in this PR, but the usual testing commands by org members will still work. Regular contributors should join the org to skip this step.

Once the patch is verified, the new status will be reflected by the ok-to-test label.


Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes/test-infra repository.

@kvalev (Contributor, Author) commented Jun 9, 2019

/assign @gaoning777

@Ark-kun (Contributor) commented Jun 10, 2019

We only really need to replace double opening braces {{
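The narrower alternative hinted at here could be sketched as follows (a hypothetical escape scheme, not what this PR implements). Its weakness is that the sentinel must be chosen so the substitution stays unambiguous and reversible:

```python
# Hypothetical: hide only the '{{' marker from Argo's templating by
# inserting a sentinel between the braces, and strip it again at runtime.
SENTINEL = "\x00"  # assumed never to occur in user code


def escape(code: str) -> str:
    return code.replace("{{", "{" + SENTINEL + "{")


def unescape(code: str) -> str:
    return code.replace("{" + SENTINEL + "{", "{{")


src = "print('{{workflow.uid}}')"
assert "{{" not in escape(src)       # Argo no longer sees a placeholder
assert unescape(escape(src)) == src  # round-trip is lossless
```

Compared with base64, this keeps the code human-readable inside the YAML, at the cost of relying on the sentinel assumption.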

@Ark-kun (Contributor) commented Jun 10, 2019

/lgtm

@Ark-kun (Contributor) commented Jun 10, 2019

> Due to its nature, Argo will replace any strings it encounters that are enclosed in double curly braces, which will make the code non-executable. To workaround this, the code is encoded in the Argo yaml template and decoded on the fly, before the execution.
>
> Fixes #1453

Please note that this won't fully "fix" your old workflow. The code will stop crashing, but the placeholders won't be substituted.

The best way to use them is:

    unique_task_id_template = '{{workflow.uid}}_{{pod.name}}'  # We should probably move that to DSL to hide the actual template strings.

    some_op(output_path=prefix + '/' + unique_task_id_template + '/data')

@kvalev (Contributor, Author) commented Jun 10, 2019

Actually, it does fix it (in a way). The DSL compiler explicitly declares all arguments in the container.args section of the generated Argo template, so the default function values are ignored anyway. And Argo seems to correctly substitute those values, so in the end it works out.

Another solution I had in mind was modifying the method metadata to remove the default values before serializing the function, but the metadata appears to be immutable, so that was not easily doable (if at all).

The reason for passing details such as the expected MinIO path in the first place is that I want to visualize some data using the output viewers. I was hoping I could simply declare the local file as the source in mlpipeline-ui-metadata.json, and the file would automatically be stored in the configured storage, with the local path in the metadata file replaced on the fly by the remote one. However, that does not seem to be the case. The only way to currently achieve something similar is to declare the local file as an output artifact (so that it is archived by Argo) and then write the expected MinIO location into the mlpipeline-ui-metadata.json file, so that the file contents can be visualized afterwards.
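The workaround described above could look roughly like this inside a pipeline step. The bucket, key layout, and column names here are illustrative assumptions, not the actual pipeline configuration:

```python
import json

# Hypothetical sketch: the step writes its UI metadata file, pointing
# "source" at the MinIO location where Argo is expected to archive the
# declared output artifact.
metadata = {
    "outputs": [
        {
            "type": "table",
            "format": "csv",
            "header": ["col_a", "col_b"],
            # Assumed artifact location; must match where Argo archives
            # the declared output artifact for this step.
            "source": "minio://mlpipeline/artifacts/my-run/my-pod/data.csv",
        }
    ]
}

# In a KFP container step this file lives at /mlpipeline-ui-metadata.json;
# a relative path is used here so the sketch runs anywhere.
with open("mlpipeline-ui-metadata.json", "w") as f:
    json.dump(metadata, f)
```

The fragile part is exactly what the comment above notes: the remote path has to be predicted by hand rather than being substituted automatically.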

@Ark-kun (Contributor) commented Jun 12, 2019

/lgtm
/approve
/hold

@k8s-ci-robot (Contributor)

[APPROVALNOTIFIER] This PR is APPROVED

This pull-request has been approved by: Ark-kun

The full list of commands accepted by this bot can be found here.

The pull request process is described here.

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@Ark-kun (Contributor) commented Jun 14, 2019

/hold cancel

@k8s-ci-robot k8s-ci-robot merged commit 8938669 into kubeflow:master Jun 14, 2019
@kvalev kvalev deleted the kvalev/base64-pickled-code branch June 14, 2019 11:46
magdalenakuhn17 pushed a commit to magdalenakuhn17/pipelines that referenced this pull request Oct 22, 2023
…rving-custom-model/model-server (kubeflow#1476)

Bumps [pillow](https://github.com/python-pillow/Pillow) from 7.1.0 to 8.1.1.
- [Release notes](https://github.com/python-pillow/Pillow/releases)
- [Changelog](https://github.com/python-pillow/Pillow/blob/master/CHANGES.rst)
- [Commits](python-pillow/Pillow@7.1.0...8.1.1)

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Development

Successfully merging this pull request may close these issues.

Cloudpickle fails to deserialize a function
4 participants