Pipelines should allow events to be triggered at different steps and provide handlers that act in response to those events. Going further, the eventing solution should ideally be plugin-based: if someone is using Apache OpenWhisk or Knative Eventing as their eventing platform, that engine can be plugged in.
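A plugin-based eventing layer could be expressed as a small adapter interface that each platform implements. The sketch below is purely illustrative: `EventingBackend`, `subscribe`, and `publish` are hypothetical names, not an existing API, and a real OpenWhisk or Knative plugin would wrap that platform's SDK behind the same interface.

```python
from abc import ABC, abstractmethod
from typing import Callable, Dict, List

class EventingBackend(ABC):
    """Hypothetical adapter that lets an eventing platform plug into the pipeline engine."""

    @abstractmethod
    def subscribe(self, event_type: str, handler: Callable[[dict], None]) -> None:
        """Invoke `handler` whenever an event of `event_type` arrives."""

class InMemoryBackend(EventingBackend):
    """Trivial backend for local testing; real plugins would delegate to a platform SDK."""

    def __init__(self) -> None:
        self._handlers: Dict[str, List[Callable[[dict], None]]] = {}

    def subscribe(self, event_type: str, handler: Callable[[dict], None]) -> None:
        self._handlers.setdefault(event_type, []).append(handler)

    def publish(self, event_type: str, payload: dict) -> None:
        # Fan the event out to every handler registered for this type.
        for handler in self._handlers.get(event_type, []):
            handler(payload)

# Usage: a pipeline step registers a handler for a "new_model" event.
backend = InMemoryBackend()
received = []
backend.subscribe("new_model", received.append)
backend.publish("new_model", {"model": "m1"})
```

Because steps only depend on the abstract interface, swapping OpenWhisk for Knative Eventing would be a configuration change rather than a pipeline change.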
The user can create an event-based workflow from a set of (query, workflow step) pairs, where:
1.a A query is made against our metadata DB (e.g. trigger each time there is a new model, each time there is new data, each time there is a new event of type T, etc.)
1.b The workflow step corresponding to the query is executed each time the query returns new data. Here, a workflow step could be backed by a container, an Argo workflow (i.e. a subworkflow), etc.
Events can be added to the metadata DB from various sources (webhooks, pub/sub, OpenWhisk, etc.).
This design would decouple the systems that can be sources of events from the orchestration system, with the metadata API serving as the integration point.
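The (query, workflow step) pairing above can be sketched as a simple polling loop over the metadata DB. Everything here is an assumption for illustration: the DB is modeled as a list of event rows, and `Trigger`, `poll`, and the cursor field are hypothetical names, not part of any existing system.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Trigger:
    """One (query, workflow step) pair, plus a cursor marking events already handled."""
    query: Callable[[List[dict], int], List[dict]]  # returns rows newer than the cursor
    step: Callable[[dict], None]                    # workflow step to run per new row
    cursor: int = 0                                 # highest event id already processed

def poll(metadata_db: List[dict], triggers: List[Trigger]) -> None:
    """One polling pass: run each trigger's step for every new row its query returns."""
    for t in triggers:
        for row in t.query(metadata_db, t.cursor):
            t.step(row)
            t.cursor = max(t.cursor, row["id"])

# Example: trigger each time there is a new event of type "new_data".
db = [{"id": 1, "type": "new_data"}, {"id": 2, "type": "new_model"}]
seen = []
trigger = Trigger(
    query=lambda rows, cur: [r for r in rows
                             if r["id"] > cur and r["type"] == "new_data"],
    step=seen.append,
)
poll(db, [trigger])                       # picks up event 1
db.append({"id": 3, "type": "new_data"})
poll(db, [trigger])                       # picks up event 3 only; event 1 is not re-run
```

The cursor is what makes the metadata API a clean integration point: any source (webhook, pub/sub, OpenWhisk) only has to append rows, and the orchestrator never needs to know where a row came from.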