
pipeline can't parse input parameters in volumes field #1602

Closed
TaylorHere opened this issue Jul 9, 2019 · 4 comments
Comments

@TaylorHere

What happened:
when I do this:

    from kubernetes.client import V1Volume, V1HostPathVolumeSource

    dop = DataBrokerOP(query=query, return_key=query_return_key, data_path=data_path)
    volume = V1Volume(
        name='train-data',
        host_path=V1HostPathVolumeSource(path=volume_path),
    )
    dop.add_pvolumes({data_path: volume})

the compiled YAML file contains:

  volumes:
  - hostPath:
      path: '{{inputs.parameters.volume-path}}'
    name: train-data

but when I run it and describe the pod, I see:

Volumes:
  podmetadata:
    Type:  DownwardAPI (a volume populated by information about the pod)
    Items:
      metadata.annotations -> annotations
  docker-lib:
    Type:          HostPath (bare host directory volume)
    Path:          /var/lib/docker
    HostPathType:  Directory
  docker-sock:
    Type:          HostPath (bare host directory volume)
    Path:          /var/run/docker.sock
    HostPathType:  Socket
  train-data:
    Type:          HostPath (bare host directory volume)
    Path:          {{inputs.parameters.volume-path}}
    HostPathType:
  pipeline-runner-token-jzpd6:
    Type:        Secret (a volume populated by a Secret)
    SecretName:  pipeline-runner-token-jzpd6
    Optional:    false

The {{inputs.parameters.volume-path}} placeholder is still a literal string here; it was not resolved to the input argument.

What did you expect to happen:
I expected the volume's host path to be set from the input parameter.

What steps did you take:
See the code above.

@TaylorHere
Author

In our use case, we create a data broker that downloads the training data into {{inputs.parameters.volume-path}} (e.g. /data/vol/001). We then use this path to create a volume named train-volume.
Later, the volume is attached to the train op and mounted at /data/vol/, so our training code can ignore the data path details and always read from /data/vol/.
So this bug matters to us.
We hope someone can fix it.
Thanks.
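To make the failure mode concrete, here is a minimal, self-contained sketch (plain Python, no KFP or Kubernetes dependency; all names are illustrative) of the compiled volume spec and the kind of parameter substitution that Argo is expected to perform at runtime:

```python
# Shape of the volume entry as it appears in the compiled workflow YAML.
# Before Argo v2.3.0, parameters inside `volumes` were not substituted,
# so this literal placeholder string reached the pod spec unchanged.
volume_spec = {
    "name": "train-data",
    "hostPath": {"path": "{{inputs.parameters.volume-path}}"},
}

def resolve(spec, params):
    # Naive placeholder substitution, for illustration only;
    # Argo performs this itself from v2.3.0 onward.
    path = spec["hostPath"]["path"]
    for key, value in params.items():
        path = path.replace("{{inputs.parameters.%s}}" % key, value)
    return {**spec, "hostPath": {"path": path}}

resolved = resolve(volume_spec, {"volume-path": "/data/vol/001"})
print(resolved["hostPath"]["path"])  # /data/vol/001
```

With the fix in place, the pod's HostPath volume points at the concrete path rather than the raw template string.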

@elikatsis
Member

elikatsis commented Jul 9, 2019

Hello.
This seems like a duplicate of #1327.
Could you check if the solution described there solves your problem?

Note: the solution there mentions Argo v2.3.0-rc3. Since release v2.3.0 is now available, you should use that instead.

@elikatsis
Member

elikatsis commented Jul 9, 2019

This issue will be resolved in Kubeflow 0.6. All the required updates have been merged.

@TaylorHere
Author

Thank you for your reply! That might be helpful!

magdalenakuhn17 pushed a commit to magdalenakuhn17/pipelines that referenced this issue Oct 22, 2023
…ocol (kubeflow#1602)

* Update torchserve config file

* Add doc for torchserve config.properties

* Update config.properties for torchserve