This requires backend support work, an API, and a frontend button to call the API. @Ark-kun has done some work exploring Argo's `activeDeadlineSeconds` field. We also need proper messaging to set the user's expectation that this will stop the workflow with no ability to resume.
@Ark-kun, if I understand correctly, this will also terminate all of the workflow's pods, not just stop the scheduling of new ones, correct?
@ajayalfred, we'll need a spec and messaging around this experience.
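For reference, the mechanism under discussion can be sketched with the Kubernetes Python client: Argo fails a workflow whose running time exceeds `activeDeadlineSeconds` and tears down its pods, so patching the field to `0` terminates it immediately. The namespace, workflow name, and client setup here are illustrative assumptions, not the actual KFP backend implementation:

```python
# Sketch: terminate an Argo Workflow by patching activeDeadlineSeconds to 0.
# Argo treats a workflow that has exceeded activeDeadlineSeconds as failed
# and deletes its pods, so a value of 0 stops it immediately; there is no
# way to resume afterwards.

def make_termination_patch():
    """Build the JSON merge-patch body that forces immediate termination."""
    return {"spec": {"activeDeadlineSeconds": 0}}

def terminate_workflow(name, namespace="kubeflow"):
    # Requires the `kubernetes` package and a reachable cluster; the
    # namespace default is an assumption for illustration.
    from kubernetes import client, config
    config.load_kube_config()
    api = client.CustomObjectsApi()
    return api.patch_namespaced_custom_object(
        group="argoproj.io",
        version="v1alpha1",
        namespace=namespace,
        plural="workflows",
        name=name,
        body=make_termination_patch(),
    )
```

This matches the question above: because the deadline applies to the whole workflow, Argo deletes running pods rather than merely stopping the scheduling of new ones.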
Note that for pipelines that start jobs in GCP, such as those using DataProc, terminating the pipeline results only in a best-effort attempt to stop the running jobs as soon as possible.
For example, I ran this notebook twice: https://github.com/kubeflow/pipelines/tree/master/components/gcp/dataproc/submit_hive_job
The first time I terminated it, the Hive query was stopped. The second time, the pipeline was terminated (the UI showed it as terminated), but the Hive query completed before the "cancel" command reached it.
This behavior should be documented to avoid surprising users.
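The race described above can be sketched as a cancel-then-poll loop. `cancel_job`, `get_job_state`, and the state names are hypothetical stand-ins for whatever the underlying service (e.g. DataProc) exposes, not real API calls:

```python
import time

def best_effort_cancel(job, cancel_job, get_job_state,
                       poll_interval=1.0, timeout=30.0):
    """Request cancellation and report whether it won the race.

    Returns True if the job ended in a cancelled state, False if it
    finished on its own before the cancel request took effect (the
    second scenario observed with the Hive query above).
    """
    cancel_job(job)  # fire the cancel request; it may arrive too late
    deadline = time.time() + timeout
    while time.time() < deadline:
        state = get_job_state(job)
        if state == "CANCELLED":
            return True   # cancel landed before the job finished
        if state == "DONE":
            return False  # job completed before the cancel reached it
        time.sleep(poll_interval)
    return False  # still running at timeout; treat as not cancelled
```

The return value is exactly what the UI would need in order to tell the user whether "terminated" also meant the external job was actually stopped.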