From 9d11fd181853c85cf887404e714fae6be6ddc5db Mon Sep 17 00:00:00 2001 From: Chris Sommers Date: Thu, 18 Aug 2022 10:51:27 -0700 Subject: [PATCH 1/7] Document third-party workflows using DASH as a Git submodule. --- dash-pipeline/README-dash-as-submodule.md | 143 ++++++++++++++++++ .../images/dash-submodule-git-hierarchy.svg | 4 + .../images/dash-submodule-workflow.svg | 4 + 3 files changed, 151 insertions(+) create mode 100644 dash-pipeline/README-dash-as-submodule.md create mode 100644 dash-pipeline/images/dash-submodule-git-hierarchy.svg create mode 100644 dash-pipeline/images/dash-submodule-workflow.svg diff --git a/dash-pipeline/README-dash-as-submodule.md b/dash-pipeline/README-dash-as-submodule.md new file mode 100644 index 000000000..d86160aaa --- /dev/null +++ b/dash-pipeline/README-dash-as-submodule.md @@ -0,0 +1,143 @@ +**C'Mon, I want a [Quick-Start](#quick-start)!** + +**Table of Contents** + +- [Importing the DASH project into another project](#importing-the-dash-project-into-another-project) +- [Quick-Start](#quick-start) +- [How to use DASH as a Git Submodule](#how-to-use-dash-as-a-git-submodule) +- [Third-Party Workflow & DASH Workflow Reuse](#third-party-workflow--dash-workflow-reuse) + - [Recap: DASH bmv2 workflow](#recap-dash-bmv2-workflow) + - [Custom DASH Workflow](#custom-dash-workflow) + - [Reusable Build toolchain and artifacts](#reusable-build-toolchain-and-artifacts) + - [Required Custom Tools and Artifacts](#required-custom-tools-and-artifacts) + - [Custom Traffic Test Harness](#custom-traffic-test-harness) + - [Custom Tests](#custom-tests) + - [Third-Party CI Pipeline Automation (Git Actions)](#third-party-ci-pipeline-automation-git-actions) +# Importing the DASH project into another project + +The [Azure/DASH project](https://github.com/Azure/DASH) can be used as a resouce within other projects, such as third-party, commercial or open-source DASH implementations. 
For example, a commercial DPU vendor can incorporate the DASH project into a private Git repository and utilize many of the components, providing consistency with the community implementation and definition, reusing test-cases, and avoiding duplication of efforts.
+
+# Quick-Start
+This will show you how to import the [Azure/DASH project](https://github.com/Azure/DASH) into your own Git project.
+
+1. Start with a Git project, either a new or existing one of your choosing. You might want to make a scratch project just to try this out.
+2. Copy the [Makefile.3rdpty](Makefile.3rdpty) into your project. You can put it into its own subdirectory and/or rename it to suit. If you rename it, please interpret the subsequent instructions accordingly.
+3. Choose a subdirectory in which to import the DASH project as a submodule. The sample [Makefile.3rdpty](Makefile.3rdpty) assumes a directory `./DASH` relative to the Makefile location. Edit the following line in the makefile to change this (or set the environment variable `DASHDIR` before calling `make`):
+   ```
+   DASHDIR ?=DASH
+   ```
+4. Import the DASH repository as a submodule using the following command. Modify the final parameter to match the relative directory in your project where you want the submodule cloned:
+   ```
+   git submodule add -b main --name DASH git@github.com:Azure/DASH.git DASH
+   ```
+5. Commit the changes now or later; see [How to use DASH as a Git Submodule](#how-to-use-dash-as-a-git-submodule).
+6. To verify the DASH submodule was imported correctly and crucial steps function properly, execute the following. (If you rename the file to `Makefile` you can omit the `-f` option.)
+   ```
+   make [-f Makefile.3rdpty] clean
+   make [-f Makefile.3rdpty] all
+   ```
+   This will run selected build steps from DASH - those which don't depend upon third-party implementations. This includes compiling the P4 code and generating SAI headers; it also pulls in several Docker images.
It's a great starting point. See the detailed descriptions elsewhere in this document for next steps.
+7. *OPTIONAL:* To perform the entire dash-pipeline build process, execute the following:
+   ```
+   make [-f Makefile.3rdpty] dash-pipeline-regression
+   ```
+   This will run `make clean && make all` from the dash project. You don't *have* to do this since many of the artifacts are irrelevant for third-party adaptations.
+8. *OPTIONAL:* You can also `cd /dash-pipeline` and run any of the steps outlined in the DASH bmv2 [workflows](README-dash-workflows.md), such as the following. This has the benefit of verifying the function of SW traffic generators etc. in your environment. You can use this to confirm funtional tests against the reference implementation.
+   ```
+   make run-switch             # console 1
+   make run-saithrift-server   # console 2
+   make run-all-tests          # console 3
+   ```
+
+# How to use DASH as a Git Submodule
+A third-party project can import the DASH project as a Git Submodule. See [about-git-submodules](README-dash-workflows.md#about-git-submodules) for background. In this example, DASH is imported at the top level of the project using the following command. (See the documentation for `git submodule add` for other options.)
+
+```
+git submodule add -b main --name DASH git@github.com:Azure/DASH.git DASH
+```
+
+The effects of this command are:
+- Clone the DASH repository in place under the directory `DASH` (relative to the working directory of the command).
+- Make an entry in the `.gitmodules` file (creating it if needed). For example:
+  ```
+  [submodule "DASH"]
+    path = DASH
+    url = git@github.com:Azure/DASH.git
+    branch = main
+  ```
+- Store the imported repository's Git index/database under the parent project's `.git/modules` directory.
+
+Importing the submodule also creates new items for `DASH` and `.gitmodules` which need to be committed.
For example:
+```
+chris@chris-z4:~/dashsubmodule$ git status
+On branch main
+Your branch is up to date with 'origin/main'.
+
+Changes to be committed:
+  (use "git restore --staged <file>..." to unstage)
+	new file:   .gitmodules
+	new file:   DASH
+
+Changes not staged for commit:
+  (use "git add <file>..." to update what will be committed)
+  (use "git restore <file>..." to discard changes in working directory)
+  (commit or discard the untracked or modified content in submodules)
+	modified:   DASH (modified content, untracked content)
+```
+To commit to your project:
+```
+git add .gitmodules DASH
+git commit
+[git push]
+```
+The resulting Git structure is as follows. DASH is imported as a submodule. Furthermore, the DASH project itself contains multiple levels of submodules. Via `git submodule update --init` instructions (in `DASH/dash-pipeline/Makefile`), these repositories are cloned in place.
+
+![Git Hierarchy](images/dash-submodule-git-hierarchy.svg)
+
+# Third-Party Workflow & DASH Workflow Reuse
+## Recap: DASH bmv2 workflow
+The figure below shows the traditional bmv2-based workflow and is described in [README-dash-workflows](README-dash-workflows.md).
+
+![dash-p4-bmv2-thrift-workflow](https://github.com/Azure/DASH/raw/main/dash-pipeline/images/dash-p4-bmv2-thrift-workflow.svg)
+
+## Custom DASH Workflow
+The reference project contains a `Makefile.3rdpty` to serve as a starting point. It has make targets which are just wrappers to invoke predefined Makefile targets in the DASH repository (e.g. using `make -C ...`). It also has placeholder make targets where third-party cusomization is required. You can modify it arbitrarily. The intent was to reuse as much as possible from DASH.
+
+The drawing below shows where third-party customization will be needed, using "exciting" colors.
+
+![Custom Dash Workflow](images/dash-submodule-workflow.svg)
+
+The main objective is to re-use DASH artifacts, Makefiles, Dockerfiles, etc.
where possible and replace (or augment) certain resources with third-party implementations.
+
+## Reusable Build toolchain and artifacts
+
+The following toolchains and output artifacts *should* be reusable as-is from the DASH project, with no (or very few) modifications:
+* dash-pipeline P4 source code (for SAI header generation).
+* P4 behavioral model code compilation. The primary artifact of interest is just the P4Info file used to auto-generate the SAI headers for overlay services. Both the dockerized `dash-p4c-bmv2` container and the output artifacts should be reusable as-is from DASH.
+* SAI experimental headers describing the interface to the dataplane, derived from P4Info. A code generator script [SAI/sai_api_gen.py](SAI/sai_api_gen.py) produces SAI headers derived from the P4 code, emitted into [SAI/SAI/experimental](SAI/SAI/experimental). It also generates a SAI-to-P4Runtime adaptor layer for the bmv2 implementation, emitted into [SAI/lib](SAI/lib). Third-party workflows can ignore the bmv2 adaptor layer or use it as inspiration.
+* SAI metadata derived from the combination of standard SAI headers and the DASH headers. This is done by makefiles and scripts inside the SAI submodule and also uses the `dash-saithrift-bldr` container.
+* `saithrift-client-bldr` base docker container, retrieved from a docker registry and built in a standard way. It contains base tools and packages.
+* `dash-saithrift-client` docker container which includes all tools and artifacts needed to perform dataplane tests. The artifacts are generated based on the outputs of the saithrift-server build step (below), which might need third-party customization as described below.
+  >**Note:** The community dash-pipeline bmv2 build workflows assume that saithrift-server is built first, then saithrift-client is built next.
In principle the saithrift client is target-agnostic and should not depend upon the saithrift server, but the build processes for the saithrift client and server are somewhat intertwined. To make a saithrift client for a third-party implementation *without* depending upon a third-party saithrift server (which depends on third-party `libsai`), just run the dash-pipeline `make all` and use the resulting saithrift-client docker image.
+
+## Required Custom Tools and Artifacts
+The following will undoubtedly be developed uniquely for each DASH implementation:
+* DASH Dataplane - this is the primary focus of third-party DASH implementations and can be any mix of hardware and/or software.
+* SAI adaptor layer to translate SAI API calls into the underlying dataplane configuration "SDK." You might want to adapt the code generator [SAI/sai_api_gen.py](SAI/sai_api_gen.py) to produce your own adaptor layer, or a skeleton thereof.
+* saithrift-server [Makefile](SAI/saithrift/Makefile) and `dash-saithrift-bldr` container to run the saithrift code generator and link to the third-party `libsai.so`.
+
+  >**Note:** this might require significant third-party customizations to compile for certain architectures. The default is Ubuntu 20.04 running on an x86 "device." Some implementations, e.g. the bmv2/P4Runtime implementation, use an RPC between the SAI adaptor layer and the underlying device "SDK," which means the saithrift server *could* run on one processor while the dataplane and device SDK run in a DPU with a different architecture. For example, if a third-party dataplane has a native RPC such as gRPC, it could serve the same role as the P4Runtime API in the community bmv2 architecture. If so, then the saithrift server could be compiled using the community workflow; presumably the custom `libsai` would translate SAI calls into native third-party gRPC calls, which means the saithrift server runs in a different process than the dataplane/native gRPC server.
The [SAI/saithrift/Makefile](SAI/saithrift/Makefile) will probably need modifications to pass in different `SAIRPC_EXTRA_LIBS` at a minimum.
+
+## Custom Traffic Test Harness
+The community DASH bmv2 test workflow includes SW traffic generators connected to the SW dataplane via `veth` ports. Third-party integrations can continue to use this method, or others, including:
+* Physical NIC devices driven by SW traffic generators, cabled to SW dataplanes bound to other physical NIC ports (high-performance SW implementations relying on NIC devices).
+* HW implementations of DASH dataplanes, e.g. "real DPUs" or physical emulations thereof, cabled to SW traffic generators which are bound to physical NIC ports.
+* HW or SW DASH dataplane implementations cabled to HW-based traffic generators such as IXIA chassis, etc.
+
+If test ports other than `veth0/1` and `veth2/3` are used, some modifications of setup scripts may be required:
+* PTF tests using scapy for SW traffic generation can be parameterized to specify logical-to-physical port mappings.
+* Pytests using the ixia-c SW traffic generator are set up using docker-compose topology files under [DASH/test/third-party/traffic_gen/deployment](https://github.com/Azure/DASH/tree/main/test/third-party/traffic_gen/deployment)
+
+## Custom Tests
+You can use the tests under DASH by calling the appropriate DASH make targets from the parent project. You can also keep private tests in your own project repository and invoke them from your Makefiles. If you write new tests that are generally applicable, we recommend upstreaming them to the Community repository.
+## Third-Party CI Pipeline Automation (Git Actions)
+You should be able to adapt the CI automation files from the dash project as located under [.github/workflows](../.github/workflows). You will need to modify them to suit your project by changing the trigger conditions (e.g. filesystem paths) and steps.
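The submodule import-and-commit flow described above can be sketched end-to-end as a self-contained script. A throwaway local repository stands in for `git@github.com:Azure/DASH.git` so the sketch runs offline, and the identity/name values are placeholders; substitute the real URL in practice. (Assumes git >= 2.28 for `init -b`; the `protocol.file.allow` override is only needed on newer git versions that restrict file-protocol submodules.)

```shell
#!/bin/sh
set -e
work=$(mktemp -d)
cd "$work"

# Throwaway local repo standing in for the upstream DASH repository.
git init -q -b main upstream
git -C upstream -c user.email=dev@example.com -c user.name=dev \
    commit -q --allow-empty -m "seed"

# The parent ("third-party") project.
git init -q -b main parent
cd parent
git config user.email dev@example.com
git config user.name dev
git commit -q --allow-empty -m "initial commit"

# Import the stand-in as a submodule named DASH, tracking branch main
# (mirrors: git submodule add -b main --name DASH <DASH repo URL> DASH).
git -c protocol.file.allow=always \
    submodule add -b main --name DASH "$work/upstream" DASH

# Two new items are now staged: .gitmodules and the DASH gitlink.
git status --porcelain
grep -q 'branch = main' .gitmodules && echo "gitmodules ok"

git add .gitmodules DASH
git commit -q -m "Import DASH as a Git submodule"
echo "committed"
```

After this runs, `.gitmodules` carries the `path`/`url`/`branch` entry shown earlier and the submodule's object database lives under the parent's `.git/modules`.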
\ No newline at end of file diff --git a/dash-pipeline/images/dash-submodule-git-hierarchy.svg b/dash-pipeline/images/dash-submodule-git-hierarchy.svg new file mode 100644 index 000000000..8c3716e11 --- /dev/null +++ b/dash-pipeline/images/dash-submodule-git-hierarchy.svg @@ -0,0 +1,4 @@ + + + +
[Figure text from dash-submodule-git-hierarchy.svg: the parent project (github.com/parent-project) directly imports the DASH submodule (github.com/Azure/DASH) at /DASH; the DASH project itself indirectly imports github.com/opencomputeproject/SAI at ./dash-pipeline/SAI/SAI and github.com/p4lang/ptf at ./test/ptf. A legend distinguishes Git repos from filesystem directories.]
\ No newline at end of file diff --git a/dash-pipeline/images/dash-submodule-workflow.svg b/dash-pipeline/images/dash-submodule-workflow.svg new file mode 100644 index 000000000..96de7a3f6 --- /dev/null +++ b/dash-pipeline/images/dash-submodule-workflow.svg @@ -0,0 +1,4 @@ + + + +
[Figure text from dash-submodule-workflow.svg: the custom DASH workflow. In DASH/dash-pipeline, `make P4` compiles the DASH P4 behavioral model (the source of truth) with p4c, producing dash_pipeline.json/P4 Info; generate_dash_api.sh (`make sai`) generates the DASH SAI header files (overlay) alongside the standard OCP SAI header-file subset (underlay) from the opencompute/SAI Git submodule; the saithrift code generator (meta/make, meta/gensairpc.pl) emits Thrift server skeleton C++ code, which links with libsai and the SAI implementation code into the saithrift server (`make run-switch`). A "?" marks the custom third-party SAI implementation (socket or in-process) fronting the DASH Dataplane. Test scripts (PTF/Pytest, built into the saithrift-client container or mounted from the host dev env under /test and /test-dev) drive the dataplane via runtime socket communications (RPC commands or test traffic), using Scapy, the SAI PTF Framework, and the ixia-c traffic generator (`make deploy-ixia-c`, Tgen Commands, `make run-saithrift_XXXtests`, `make run-saithrift_dev-XXXtests`). Build-time and run-time containers (dash-XXX) provide the build & run environment, retrieved from various repos (Ubuntu, p4.org, etc.) via `make docker-XXX-pull` (explicit) or docker-run (implicit) and published with `make docker-XXX-publish`. Legend: `make <target>` = make target or script in dash-pipeline vs. in another repo (e.g. SAI/meta); resources from external repos (resources assumed to be in this repo otherwise); build steps producing artifacts.]
\ No newline at end of file From 69eccce14861d9079ab1e3a32b1f81d7ca7f7705 Mon Sep 17 00:00:00 2001 From: Chris Sommers Date: Thu, 18 Aug 2022 12:04:44 -0700 Subject: [PATCH 2/7] Add URL to sample project. --- dash-pipeline/README-dash-as-submodule.md | 3 +++ 1 file changed, 3 insertions(+) diff --git a/dash-pipeline/README-dash-as-submodule.md b/dash-pipeline/README-dash-as-submodule.md index d86160aaa..4db1f9550 100644 --- a/dash-pipeline/README-dash-as-submodule.md +++ b/dash-pipeline/README-dash-as-submodule.md @@ -20,6 +20,9 @@ The [Azure/DASH project](https://github.com/Azure/DASH) can be used as a resouce # Quick-Start This will show you how to import the [Azure/DASH project](https://github.com/Azure/DASH) project into your own Git project. +A minimal sample project created using this recipe can be found here: https://github.com/chrispsommers/dashsubmodule + + 1. Start with a Git project, either a new or existing one of your choosing. You might want to make a scratch project just to try this out. 2. Copy the [Makefile.3rdpty](Makefile.3rdpty) into your project. You can put it into its own subdirectory and/or or rename it to suit. If you rename it, please interpret the subsequent instructions accordingly. 3. Choose a subdirectory in which to import the DASH project as a submodule. The sample [Makefile.3rdpty](Makefile.3rdpty) assumes a directory `./DASH` relative to the Makefile location. Edit the following line in the makefile to change this (or, set the environment variable `DASHDIR` before calling `make`): From 6c6084b48c5fb9226d485563167bd9776971b889 Mon Sep 17 00:00:00 2001 From: Chris Sommers Date: Tue, 6 Sep 2022 15:42:49 -0700 Subject: [PATCH 3/7] Incorporate review feedback (typo; missing file). 
--- dash-pipeline/Makefile.3rdpty | 66 +++++++++++++++++++++++ dash-pipeline/README-dash-as-submodule.md | 2 +- 2 files changed, 67 insertions(+), 1 deletion(-) create mode 100644 dash-pipeline/Makefile.3rdpty diff --git a/dash-pipeline/Makefile.3rdpty b/dash-pipeline/Makefile.3rdpty new file mode 100644 index 000000000..684f75572 --- /dev/null +++ b/dash-pipeline/Makefile.3rdpty @@ -0,0 +1,66 @@ +all: dash-p4 sai thirdparty-saithrift-server docker-saithrift-client + +clean: dash-p4-clean dash-sai-clean + +# Submodule location relative to this Makefile +DASHDIR ?=DASH + +DASH: dash-submodule + +.PHONY: dash-submodule +dash-submodule: + @echo "Initializing DASH submodule..." + git submodule update --init DASH + +# Build entire dash-pipeline codebase as sanity check +dash-pipeline-regression: DASH + $(MAKE) -C $(DASHDIR)/dash-pipeline clean + $(MAKE) -C $(DASHDIR)/dash-pipeline all + +dash-pipeline-clean: DASH + $(MAKE) -C $(DASHDIR)/dash-pipeline clean + +.PHONY: sai +sai: dash-sai-clean dash-sai-headers dash-sai-meta thirdparty-libsai + + +# Build behavioral model code, needed for SAI headers +dash-p4: + $(MAKE) -C $(DASHDIR)/dash-pipeline p4 + +dash-p4-clean: + $(MAKE) -C $(DASHDIR)/dash-pipeline p4-clean + +dash-sai-clean: + $(MAKE) -C $(DASHDIR)/dash-pipeline sai-clean + +# Autogenerate SAI headers +dash-sai-headers: + $(MAKE) -C $(DASHDIR)/dash-pipeline sai-headers + +# Autogenerate SAI meta, needed for saithrift client/server +dash-sai-meta: + $(MAKE) -C $(DASHDIR)/dash-pipeline sai-meta + +# Implementation-dependent libsai library +thirdparty-libsai: + @echo "Build third-pary libsai" + @echo " Put libsai.so under $(DASHDIR)/dash-pipeline/SAI/lib" + @echo " For use by saithrift-server build stage." 
+ +thirdparty-saithrift-server: + @echo "Build third-party saithrift-server" + @echo " Expects libsai.so under $(DASHDIR)/dash-pipeline/SAI/lib" + # For reference: + # $(MAKE) -C $(DASHDIR)/dash-pipeline saithrift-server + +docker-saithrift-client: + @echo "Build third-pary saithrift-client" + @echo " Expects saithrift-server already built" + # Uncomment when saithrift server can be built + # $(MAKE) -C $(DASHDIR)/dash-pipeline docker-saithrift-client + +run-all-tests: + # Uncomment when saithrift client & server can be built + # Can add more custom tests in addition to DASH tests + # make -C $(DASHDIR)/dash-pipeline run-all-tests diff --git a/dash-pipeline/README-dash-as-submodule.md b/dash-pipeline/README-dash-as-submodule.md index 4db1f9550..b86a80e1c 100644 --- a/dash-pipeline/README-dash-as-submodule.md +++ b/dash-pipeline/README-dash-as-submodule.md @@ -15,7 +15,7 @@ - [Third-Party CI Pipeline Automation (Git Actions)](#third-party-ci-pipeline-automation-git-actions) # Importing the DASH project into another project -The [Azure/DASH project](https://github.com/Azure/DASH) can be used as a resouce within other projects, such as third-party, commercial or open-source DASH implementations. For example, a commercial DPU vendor can incorporate the DASH project into a private Git repository and utilize many of the components, providing consistency with the community implementation and definition, reusing test-cases, and avoiding duplication of efforts. +The [Azure/DASH project](https://github.com/Azure/DASH) can be used as a resource within other projects, such as third-party, commercial or open-source DASH implementations. For example, a commercial DPU vendor can incorporate the DASH project into a private Git repository and utilize many of the components, providing consistency with the community implementation and definition, reusing test-cases, and avoiding duplication of efforts. 
# Quick-Start This will show you how to import the [Azure/DASH project](https://github.com/Azure/DASH) project into your own Git project. From 8e6adade5726ab6f9e6815209c253e2f5f661d1a Mon Sep 17 00:00:00 2001 From: Chris Sommers Date: Tue, 6 Sep 2022 16:10:22 -0700 Subject: [PATCH 4/7] Spellcheck fixes. --- .wordlist.txt | 9 +++++++++ dash-pipeline/README-dash-as-submodule.md | 8 ++++---- .../high-avail/design/AMD-Pensando_HA_Proposal.md | 4 ++-- 3 files changed, 15 insertions(+), 6 deletions(-) diff --git a/.wordlist.txt b/.wordlist.txt index 82f618ac7..81c2e476b 100644 --- a/.wordlist.txt +++ b/.wordlist.txt @@ -1,3 +1,4 @@ +3rdpty ABNF accessor accessors @@ -64,6 +65,7 @@ Cfg cfg checkboxes chris +chrispsommers ci CLA cla @@ -99,6 +101,7 @@ CurrentUdpFlow customizable Cx cyberithub +dashsubmodule DASHOrch dashorch DashOrch @@ -109,6 +112,8 @@ Datagram datagrams datapath Datapath +dataplane +Dataplane dataplanes datastore DBs @@ -136,6 +141,7 @@ Dockerfiles dockerfiles dockerhub Dockerhub +dockerized DoS DotNet downcasting @@ -236,6 +242,7 @@ InbfromLB INIT Init initializer +integrations integrators interoperable io @@ -283,6 +290,7 @@ lts Macsec Makefile Makefiles +makefiles MatchedHalfOpenFlow MatchedOtherFlow MatchedTcpFlow @@ -548,6 +556,7 @@ unpair Unpair untracked upcasting +upstreaming vcpus veth VFP diff --git a/dash-pipeline/README-dash-as-submodule.md b/dash-pipeline/README-dash-as-submodule.md index b86a80e1c..291906634 100644 --- a/dash-pipeline/README-dash-as-submodule.md +++ b/dash-pipeline/README-dash-as-submodule.md @@ -1,4 +1,4 @@ -**C'Mon, I want a [Quick-Start](#quick-start)!** +**I want a [Quick-Start](#quick-start)!** **Table of Contents** @@ -45,7 +45,7 @@ A minimal sample project created using this recipe can be found here: https://gi make [-f Makefile.3rdpty] dash-pipeline-regression ``` This will run `make clean && make all` from the dash project. 
You don't *have* to do this since many of the artifacts are irrelevant for third-party adaptations. -8. *OPTIONAL:* You can also `cd /dash-pipeline` and run any of the steps outlined in the DASH bmv2 [workflows](README-dash-workflows.md), such as the following. This has the benefit of verifying the function of SW traffic generators etc. in your environment. You can use this to confirm funtional tests against the reference implementation. +8. *OPTIONAL:* You can also `cd /dash-pipeline` and run any of the steps outlined in the DASH bmv2 [workflows](README-dash-workflows.md), such as the following. This has the benefit of verifying the function of SW traffic generators etc. in your environment. You can use this to confirm functional tests against the reference implementation. ``` make run-switch # console 1 make run-saithrift-server # console 2 @@ -103,7 +103,7 @@ The figure below shows the traditional bmv2-based workflow and is described in [ ![dash-p4-bmv2-thrift-workflow](https://github.com/Azure/DASH/raw/main/dash-pipeline/images/dash-p4-bmv2-thrift-workflow.svg ) ## Custom DASH Workflow -The reference project contains a `Makefile.3rdpty` to serve as a starting point. It has make targets which are just wrappers to invoke predefined Makefile targets in the DASH repository (e.g. using `make -C...`). It also has placeholder make targets where third-party cusomization is required. You can modify it arbitrarily. The intent was to reuse as much as possible from DASH. +The reference project contains a `Makefile.3rdpty` to serve as a starting point. It has make targets which are just wrappers to invoke predefined Makefile targets in the DASH repository (e.g. using `make -C...`). It also has placeholder make targets where third-party customization is required. You can modify it arbitrarily. The intent was to reuse as much as possible from DASH. The drawing below shows where third-party customization will be needed, using "exciting" colors. 
@@ -143,4 +143,4 @@ If test ports other than `veth0/1` and `veth2/3` are used, some modifications of ## Custom Tests You can use the tests under DASH by calling the appropriate DASH make targets from the parent project. You can also have private tests in your own project repository which you invoke from your Makefiles. We recommend if you write new tests which are generally applicable that you consider upstreaming to the Community repository. ## Third-Party CI Pipeline Automation (Git Actions) -You should be able to adapt the CI automation files from the dash project as located under [.github/workflows](../.github/workflows). You will need to modify them to suit your project by changing the trigger conditions (e.g. filesystem paths) and steps. \ No newline at end of file +You should be able to adapt the CI automation files from the dash project as located under [.github/workflows](../.github/workflows). You will need to modify them to suit your project by changing the trigger conditions (e.g. file system paths) and steps. \ No newline at end of file diff --git a/documentation/high-avail/design/AMD-Pensando_HA_Proposal.md b/documentation/high-avail/design/AMD-Pensando_HA_Proposal.md index adeb868aa..048b4be73 100644 --- a/documentation/high-avail/design/AMD-Pensando_HA_Proposal.md +++ b/documentation/high-avail/design/AMD-Pensando_HA_Proposal.md @@ -145,7 +145,7 @@ This state is reached when bulk sync is complete and the admin role of the node **Wait Primary** -In this state the node is waiting for the Datapath to signal completion of taking over as primary. The datapath indicates this by notifying the SONIC stack via an oper status update message. At this point the peer is notified to move to standy. +In this state the node is waiting for the Datapath to signal completion of taking over as primary. The datapath indicates this by notifying the SONIC stack via an oper status update message. At this point the peer is notified to move to standby. 
**Activate Secondary** @@ -715,7 +715,7 @@ All flows inserted on the secondary before switchover are after evaluation of pr ![](images/planned_switchover.009.jpeg) -Swithover can be a planned event for maintenance and other reasons. With planned switchover the goal is to have close to zero loss and to coordinate between the primary and secondary to achieve this goal. Both the DP-VIPs will switch roles to primary on this trigger. +Switchover can be a planned event for maintenance and other reasons. With planned switchover the goal is to have close to zero loss and to coordinate between the primary and secondary to achieve this goal. Both the DP-VIPs will switch roles to primary on this trigger. The controller initiates the planned switchover and notifies the secondary DPU to initiate switchover. Once switchover is complete the newly primary DPU relays a SwitchoverDone message to the old primary DPU. The old primary initiates a withdrawal of protocol routes so the network can drain traffic. During this time the old primary continues to forward traffic so any traffic in transit is forwarded without being dropped. During this network convergence timeout both the primary and secondary are forwarding traffic and flow sync messages may be exchanged in both directions. From 1d03d27df9d896b12abc7dec9588fbacfb68dabb Mon Sep 17 00:00:00 2001 From: Chris Sommers Date: Tue, 6 Sep 2022 16:14:28 -0700 Subject: [PATCH 5/7] Spellcheck --- .wordlist.txt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/.wordlist.txt b/.wordlist.txt index 81c2e476b..b635952b8 100644 --- a/.wordlist.txt +++ b/.wordlist.txt @@ -1,4 +1,3 @@ -3rdpty ABNF accessor accessors @@ -400,6 +399,7 @@ Pyunit qcow QoS Radv +rdpty reachability README READMEs From 8082f07c8fa3fd9cfe5c47f35db2c1e9589e33a5 Mon Sep 17 00:00:00 2001 From: Chris Sommers Date: Tue, 6 Sep 2022 16:22:10 -0700 Subject: [PATCH 6/7] Add .wordlist.txt to CI triggers. 
--- .github/workflows/dash-md-spellcheck.yml | 2 ++ 1 file changed, 2 insertions(+) diff --git a/.github/workflows/dash-md-spellcheck.yml b/.github/workflows/dash-md-spellcheck.yml index 5cc9bed4e..9b73265f8 100644 --- a/.github/workflows/dash-md-spellcheck.yml +++ b/.github/workflows/dash-md-spellcheck.yml @@ -3,9 +3,11 @@ on: pull_request: paths: - '**/*.md' + - '.wordlist.txt' push: paths: - '**/*.md' + - '.wordlist.txt' workflow_dispatch: jobs: From b438a404c5c8652a262919e8f825ba910907dbd8 Mon Sep 17 00:00:00 2001 From: Chris Sommers Date: Tue, 6 Sep 2022 16:43:07 -0700 Subject: [PATCH 7/7] Spellcheck wordlist. --- .wordlist.txt | 1 + 1 file changed, 1 insertion(+) diff --git a/.wordlist.txt b/.wordlist.txt index b635952b8..8ba0fc2b9 100644 --- a/.wordlist.txt +++ b/.wordlist.txt @@ -287,6 +287,7 @@ loopback LPM lts Macsec +makefile Makefile Makefiles makefiles
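
---

For reference, the delegation pattern at the heart of `Makefile.3rdpty` (added in patch 3 above) - parent targets that simply forward to the DASH submodule's own Makefile via `$(MAKE) -C` - can be exercised in miniature. This is a self-contained sketch: the stub recipes and their echo messages stand in for the real DASH build steps, while the `dash-p4`/`dash-sai-headers` wrapper targets and `DASHDIR ?=DASH` variable mirror the actual Makefile.3rdpty.

```shell
#!/bin/sh
set -e
work=$(mktemp -d)
cd "$work"
mkdir -p DASH/dash-pipeline

# Stub standing in for DASH/dash-pipeline/Makefile (tab-indented recipes).
printf 'p4:\n\t@echo "compile P4 (stub)"\nsai-headers:\n\t@echo "generate SAI headers (stub)"\n' \
  > DASH/dash-pipeline/Makefile

# Parent wrapper in the style of Makefile.3rdpty: each target delegates
# to the submodule's Makefile with $(MAKE) -C.
printf 'DASHDIR ?=DASH\n\ndash-p4:\n\t$(MAKE) -C $(DASHDIR)/dash-pipeline p4\n\ndash-sai-headers:\n\t$(MAKE) -C $(DASHDIR)/dash-pipeline sai-headers\n' \
  > Makefile

# Running the wrapper targets invokes the submodule's recipes.
make -s dash-p4 dash-sai-headers
```

The same shape scales to the real targets (`dash-pipeline-regression`, `sai`, etc.): the parent Makefile stays a thin shim, so DASH upstream changes flow in with nothing more than a submodule update.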