
Run installer tests in pipeline #47313

Open · wants to merge 5 commits into base: main

Conversation

NikolaMilosavljevic (Member)

This completes the installer package testing work - dotnet/runtime#109848

Features:

  • Adds x64 and arm64 installer test jobs to VMR Validation stage
  • Fixes arm64-specific test code-paths
  • Updates test-image repo
  • Adds filtering of NuGet packages, as the size of PackageArtifacts in official build is about 8GB
  • Enables running tests with a new property - TestInstallerPackages
  • Modifies tests.proj to condition test projects on this new property. This is far from ideal and will become more complicated as more test projects are added

As we need to support both pipeline and developer scenarios, some decisions were made so that both are handled by a single code path:

  • The pipeline contains a merged manifest that could have been used to filter NuGet packages based on the vertical name corresponding to the target test platform. The vertical name isn't ideal and could change, so that approach could be incorporated in the future; for now we filter NuGet packages based on file patterns. Filtering is needed so we don't have to copy all packages to the test image/container
  • Instead of copying packages into the image, I explored sharing packages from the host with the container. This is doable in the pipeline, where the host is a real machine rather than a Docker container. In developer scenarios, however, the host is likely a container, and testing utilizes sibling containers. The developer could have packages (and the repo clone) in the container file system or in a mounted Docker volume. It would be possible to mount the same volume in a test container, but we cannot share packages from the host container's file system
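The file-pattern filtering mentioned above can be sketched as a small shell function. This is an illustration only: the actual patterns and paths live in the test infrastructure and may differ. The idea is that only installer packages for the target architecture survive, so the roughly 8 GB PackageArtifacts set never has to be copied into the test image/container.

```shell
# Sketch of pattern-based package filtering (illustrative names and patterns).
# Keeps only .rpm/.deb installer packages that are not built for the
# architecture we want to exclude; NuGet packages and other files are dropped.
filter_packages() {
  exclude_arch="$1"; shift
  for pkg in "$@"; do
    case "$pkg" in
      *"$exclude_arch"*) ;;        # skip packages built for the other architecture
      *.rpm|*.deb) echo "$pkg" ;;  # keep installer packages only
    esac
  done
}

# Example: exclude aarch64 packages when testing x64.
filter_packages aarch64 \
  dotnet-runtime-9.0.0-x64.rpm \
  dotnet-runtime-9.0.0-aarch64.rpm \
  dotnet-sdk-9.0.100-x64.deb \
  Some.NuGet.Package.9.0.0.nupkg
# prints:
#   dotnet-runtime-9.0.0-x64.rpm
#   dotnet-sdk-9.0.100-x64.deb
```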

This was tested with an earlier dotnet SHA: build

I'm running another verification with the latest SHA to incorporate some recent pipeline changes: build

@Copilot Copilot bot review requested due to automatic review settings March 6, 2025 16:46
@NikolaMilosavljevic NikolaMilosavljevic requested review from a team as code owners March 6, 2025 16:46
@dotnet-issue-labeler bot added labels Area-Infrastructure and untriaged (Request triage from a team member) Mar 6, 2025


PR Overview

This PR adds installer tests to the pipeline and refines test setups on Linux to better handle platform-specific scenarios and package filtering.

  • Adds YAML pipeline templates for validating installer packages for Linux x64 and arm64.
  • Updates Linux installer tests to use platform-specific package architectures and improves package filtering logic.
  • Revises test code constants and downloads for consistency with current runtime dependencies.

Reviewed Changes

File | Description
eng/pipelines/templates/steps/vmr-validate-installers.yml | Introduces tasks to download build artifacts and authenticate feeds for installer tests.
src/SourceBuild/content/test/Microsoft.DotNet.Installer.Tests/LinuxInstallerTests.cs | Updates Linux installer test methods to support refined package filtering and architecture-specific adjustments.
eng/pipelines/templates/stages/vmr-build.yml | Adds new jobs for Linux x64 and Linux arm64 installer validations in the build stage.

Copilot reviewed 5 out of 5 changed files in this pull request and generated no comments.

Comments suppressed due to low confidence (3)

src/SourceBuild/content/test/Microsoft.DotNet.Installer.Tests/LinuxInstallerTests.cs:28

  • [nitpick] Consider renaming _excludeLinuxArch to a more descriptive name (e.g., _oppositeArch) to clearly indicate that it holds the architecture to exclude for package filtering.
private readonly string _excludeLinuxArch;

src/SourceBuild/content/test/Microsoft.DotNet.Installer.Tests/LinuxInstallerTests.cs:122

  • Using .Wait() on an asynchronous task can cause deadlock issues or hide exceptions; consider using GetAwaiter().GetResult() for better exception propagation.
DownloadFileAsync(NetStandard21RpmPackage, Path.Combine(_contextDir, Path.GetFileName(NetStandard21RpmPackage))).Wait();

src/SourceBuild/content/test/Microsoft.DotNet.Installer.Tests/LinuxInstallerTests.cs:98

  • [nitpick] The use of 'aarch64' for RPM packages versus 'arm64' for others might be confusing; consider standardizing the naming for clarity if possible.
string packageArchitecture = Config.Architecture == "x64" ? "x64" : packageType == PackageType.Rpm ? "aarch64" : "arm64";
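The mapping in the ternary above can be sketched as a small shell function to make the convention explicit. This is only an illustration of the logic in the C# line (function and argument names are made up here): RPM package names use "aarch64" for 64-bit Arm, Debian packages use "arm64", and "x64" is used everywhere for x64.

```shell
# Sketch of the architecture-name mapping (illustrative names).
package_arch() {
  config_arch="$1"; package_type="$2"
  if [ "$config_arch" = "x64" ]; then
    echo "x64"
  elif [ "$package_type" = "rpm" ]; then
    echo "aarch64"   # RPM convention for 64-bit Arm
  else
    echo "arm64"     # Debian convention
  fi
}
```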
extraBuildProperties="$extraBuildProperties /p:TestDebPackages=true"
fi
./build.sh \
Member

This additional entry point invocation worries me. The VMR is supposed to receive properties like TargetOS, TargetArchitecture, ContinuousIntegrationBuild, etc that help us distinguish different OSs, architectures, CI vs dev, etc.

Could this yml import the steps/vmr-build.yml which handles all that?

Member Author

That would be ideal, but we don't currently use that for running tests; the only such instance also has its own custom invocation:

- ${{ if eq(parameters.runTests, 'True') }}:
  - script: build.cmd
      $(baseArguments)
      $(targetArguments)
      $(buildPassArguments)
      ${{ parameters.extraProperties }}
      -test
      -excludeCIBinarylog
      /bl:artifacts/log/Release/Test.binlog
    displayName: Run Tests
    workingDirectory: ${{ variables.sourcesPath }}
    timeoutInMinutes: ${{ variables.runTestsTimeout }}

I'll think some more about this.

Member Author

For this to work I would use the jobs/vmr-build.yml template, but first I'd need to modify it to enable a test-only mode (skipBuild or testOnly) that runs tests but skips the build. Today, tests can only run in addition to the regular build step.

This would also likely fix the issue with the missing test status, as test logs are currently not uploaded from this job. The alternative is to add test-log publishing to the new job.
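The test-only switch described above might look roughly like the following sketch for jobs/vmr-build.yml. The parameter name and exact wiring are assumptions, not the actual change:

```yaml
# Hypothetical testOnly parameter for jobs/vmr-build.yml (names illustrative).
parameters:
- name: testOnly
  type: boolean
  default: false

steps:
- ${{ if ne(parameters.testOnly, true) }}:
  - script: ./build.sh $(buildArguments)
    displayName: Build
- ${{ if eq(parameters.runTests, 'True') }}:
  - script: ./build.sh --test $(testArguments)
    displayName: Run Tests
```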

Member

the only such instance also has its own custom invocation

Right; we need the additional invocation because we can't build and run tests from a single invocation today. It also better separates the actions, so I don't see that as a problem. The additional invocation in vmr-build.yml is kept in sync with the build invocation and receives the same properties the build invocation receives; if it doesn't, that's a bug.

My feedback here was about not introducing another YML entry point for the VMR. vmr-build.yml should be able to handle the different scenarios (building and testing today). vmr-final-join has its own entry point, and I think that should also be consolidated eventually.

Member Author

Yeah, this makes sense. I will add the test-only feature to jobs/vmr-build.yml and use that yml as a template for the new installer-testing job - thus removing the separate build.sh invocation.

ViktorHofer (Member) · Mar 10, 2025

Could we extract "just" the build part of the VMR into a steps template?

Member

The essential part here is that there's one yml entry point that controls how to build the VMR. I think a steps template would help with that. I agree that the current vmr-build.yml does too much for this purpose.

Member Author

Could we extract "just" the build part of the VMR into a steps template?

We have two build steps (regular and test), for two different scenarios (source-build or not) - total of 4 build.sh/build.cmd invocations.

For my current scenario I need the test build and test publishing steps, which I have implemented in eng/pipelines/templates/steps/vmr-validate-installers.yml.

Refactoring all the various steps out of vmr-build.yml job is something that should be done - eventually.

Maybe refactor the test build and test publish steps for now, to accommodate my changes?

The value added might be small, though: the eventual build step would only be a single build.sh or build.cmd command consuming a single parameter that specifies the build switches/arguments. All VMR-specific logic would still happen in the jobs/vmr-build.yml template.
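The "single build command consuming one parameter" shape described above might look like this hypothetical steps template. Everything here (file name, parameter names) is illustrative, not the actual refactoring:

```yaml
# Hypothetical eng/pipelines/templates/steps/vmr-build-step.yml
parameters:
- name: buildArguments
  type: string
- name: stepDisplayName
  type: string
  default: Build

steps:
- script: ./build.sh ${{ parameters.buildArguments }}
  displayName: ${{ parameters.stepDisplayName }}
  workingDirectory: $(sourcesPath)
```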

NikolaMilosavljevic (Member Author) · Mar 11, 2025

OK. Three ways to refactor this.

Option 1 is for the steps template to run just one build step, either the regular build or tests. It seems clunky, as we'd need to supply a property that specifies the build type (regular or test) and, based on it, do only that build. This way we can share a single entry point across all VMR pipelines.

Option 2 would run build and test (if requested). This implies that we'd also need to run any steps that are executed between the main and test builds, i.e.:

- ${{ if eq(variables['_EnableDacSigning'], 'true') }}:
  # TODO: Once we turn off the dotnet/runtime official build, move these templates into the VMR's eng folder.
  - template: ${{ variables['Build.SourcesDirectory'] }}/src/runtime/eng/pipelines/coreclr/templates/remove-diagnostic-certs.yml
    parameters:
      isOfficialBuild: ${{ variables.isOfficialBuild }}
      scriptRoot: '$(Build.SourcesDirectory)/src/runtime'

and

- ${{ if ne(parameters.runOnline, 'True' )}}:
  - script: |
      set -ex
      # Update the owner of the staging directory to the current user
      sudo chown -R $(whoami) $(artifactsStagingDir)
    displayName: Update owner of artifacts staging directory

Option 3 would move all of these steps to the new build steps yml:

- ${{ if eq(parameters.targetOS, 'windows') }}:
  # Node 20.x is a toolset dependency to build aspnetcore
  # Keep in sync with aspnetcore: https://github.com/dotnet/aspnetcore/blob/7d5309210d8f7bae8fa074da495e9d009d67f1b4/.azure/pipelines/ci.yml#L719-L722
  - task: NodeTool@0
    displayName: Install Node 20.x
    inputs:
      versionSpec: 20.x
  - ${{ if eq(variables['_EnableDacSigning'], 'true') }}:
    # TODO: Once we turn off the dotnet/runtime official build, move these templates into the VMR's eng folder.
    - template: ${{ variables['Build.SourcesDirectory'] }}/src/runtime/eng/pipelines/coreclr/templates/install-diagnostic-certs.yml
      parameters:
        isOfficialBuild: ${{ variables.isOfficialBuild }}
        certNames:
        - 'dotnetesrp-diagnostics-aad-ssl-cert'
        - 'dotnet-diagnostics-esrp-pki-onecert'
        vaultName: 'clrdiag-esrp-id'
        azureSubscription: 'diagnostics-esrp-kvcertuser'
        scriptRoot: '$(Build.SourcesDirectory)/src/runtime'
  - script: build.cmd
      $(baseArguments)
      $(targetArguments)
      $(signArguments)
      $(buildPassArguments)
      $(ibcArguments)
      $(_SignDiagnosticFilesArgs)
      ${{ parameters.extraProperties }}
    displayName: Build
    workingDirectory: ${{ variables.sourcesPath }}
  - ${{ if eq(variables['_EnableDacSigning'], 'true') }}:
    # TODO: Once we turn off the dotnet/runtime official build, move these templates into the VMR's eng folder.
    - template: ${{ variables['Build.SourcesDirectory'] }}/src/runtime/eng/pipelines/coreclr/templates/remove-diagnostic-certs.yml
      parameters:
        isOfficialBuild: ${{ variables.isOfficialBuild }}
        scriptRoot: '$(Build.SourcesDirectory)/src/runtime'
  - ${{ if eq(parameters.runTests, 'True') }}:
    - script: build.cmd
        $(baseArguments)
        $(targetArguments)
        $(buildPassArguments)
        ${{ parameters.extraProperties }}
        -test
        -excludeCIBinarylog
        /bl:artifacts/log/Release/Test.binlog
      displayName: Run Tests
      workingDirectory: ${{ variables.sourcesPath }}
      timeoutInMinutes: ${{ variables.runTestsTimeout }}
- ${{ else }}:
  - ${{ if eq(parameters.targetOS, 'osx') }}:
    - script: |
        $(sourcesPath)/eng/common/native/install-dependencies.sh osx
      displayName: Install dependencies
  - ${{ if eq(parameters.buildSourceOnly, 'true') }}:
    - script: |
        set -ex
        customPrepArgs=""
        prepSdk=true
        if [[ -n '${{ parameters.artifactsRid }}' ]]; then
          customPrepArgs="${customPrepArgs} --artifacts-rid ${{ parameters.artifactsRid }}"
        fi
        if [[ '${{ parameters.withPreviousSDK }}' == 'True' ]]; then
          # Source-built artifacts are from CentOS 9 Stream. We want to download them without
          # downloading portable versions from the internet.
          customPrepArgs="${customPrepArgs} --no-sdk --no-bootstrap"
          prepSdk=false
        elif [[ '${{ length(parameters.reuseBuildArtifactsFrom) }}' -gt '0' ]]; then
          customPrepArgs="${customPrepArgs} --no-sdk --no-artifacts"
          prepSdk=false
        fi
        if [[ "$prepSdk" == "false" ]]; then
          mkdir $(sourcesPath)/.dotnet
          previousSdkPath="$(sourcesPath)/prereqs/packages/archive/dotnet-sdk-*.tar.gz"
          eval tar -ozxf "$previousSdkPath" -C "$(sourcesPath)/.dotnet"
          eval rm -f "$previousSdkPath"
          echo "##vso[task.setvariable variable=additionalBuildArgs]--with-sdk $(sourcesPath)/.dotnet"
        fi
        ./prep-source-build.sh $customPrepArgs
      displayName: Prep the Build
      workingDirectory: $(sourcesPath)
  - script: |
      set -ex
      df -h
      customEnvVars=""
      customPreBuildArgs=""
      customBuildArgs="--ci --clean-while-building --prepareMachine -c ${{ parameters.configuration }} $(officialBuildParameter)"
      if [[ '${{ parameters.runOnline }}' == 'True' ]]; then
        customBuildArgs="$customBuildArgs --online"
      else
        customPreBuildArgs="$customPreBuildArgs sudo unshare -n"
      fi
      if [[ '${{ parameters.enablePoison }}' == 'True' ]]; then
        customBuildArgs="$customBuildArgs --poison"
      fi
      if [[ '${{ parameters.buildFromArchive }}' == 'True' ]]; then
        customBuildArgs="$customBuildArgs --source-repository https://github.com/dotnet/dotnet"
        customBuildArgs="$customBuildArgs --source-version $(git -C "$(vmrPath)" rev-parse HEAD)"
      fi
      if [[ '${{ parameters.buildSourceOnly }}' == 'True' ]]; then
        customBuildArgs="$customBuildArgs --source-only"
        extraBuildProperties="$extraBuildProperties /p:ReportSbrpUsage=true"
      fi
      if [[ '${{ parameters.useMonoRuntime }}' == 'True' ]]; then
        customBuildArgs="$customBuildArgs --use-mono-runtime"
      fi
      if [[ '${{ parameters.sign }}' == 'True' ]] && [[ '${{ parameters.buildSourceOnly }}' != 'True' ]]; then
        customBuildArgs="$customBuildArgs --sign"
        if [[ '$(_SignType)' == 'real' ]] || [[ '$(_SignType)' == 'test' ]]; then
          # Force dry run signing until https://github.com/dotnet/source-build/issues/4793 is resolved - https://github.com/dotnet/source-build/issues/4678
          extraBuildProperties="$extraBuildProperties /p:DotNetSignType=$(_SignType) /p:TeamName=$(_TeamName) /p:ForceDryRunSigning=true"
        else
          extraBuildProperties="$extraBuildProperties /p:ForceDryRunSigning=true"
        fi
      fi
      if [[ -n "${{ parameters.targetRid }}" ]]; then
        customBuildArgs="$customBuildArgs --target-rid ${{ parameters.targetRid }}"
      fi
      if [[ -n "${{ parameters.crossRootFs }}" ]]; then
        customEnvVars="$customEnvVars ROOTFS_DIR=${{ parameters.crossRootFs}}"
        if [[ '${{ parameters.targetArchitecture }}' != 'wasm' ]]; then
          extraBuildProperties="$extraBuildProperties /p:CrossBuild=true"
        fi
      fi
      if [[ ! -z '${{ parameters.targetOS }}' ]]; then
        extraBuildProperties="$extraBuildProperties /p:TargetOS=${{ parameters.targetOS }}"
      fi
      if [[ ! -z '${{ parameters.targetArchitecture }}' ]]; then
        extraBuildProperties="$extraBuildProperties /p:TargetArchitecture=${{ parameters.targetArchitecture }}"
      fi
      if [[ -n "${{ parameters.buildPass }}" ]]; then
        extraBuildProperties="$extraBuildProperties /p:DotNetBuildPass=${{ parameters.buildPass }}"
      fi
      if [[ -n "${{ parameters.extraProperties }}" ]]; then
        extraBuildProperties="$extraBuildProperties ${{ parameters.extraProperties }}"
      fi
      extraBuildProperties="$extraBuildProperties /p:VerticalName=$(Agent.JobName)"
      extraBuildProperties="$extraBuildProperties /p:ArtifactsStagingDir=$(artifactsStagingDir)"
      buildArgs="$(additionalBuildArgs) $customBuildArgs $extraBuildProperties"
      for envVar in $customEnvVars; do
        customEnvVarsWithBashSyntax="$customEnvVarsWithBashSyntax export $envVar;"
      done
      eval $customEnvVarsWithBashSyntax
      $customPreBuildArgs ./build.sh $buildArgs
    displayName: Build
    workingDirectory: $(sourcesPath)
  - ${{ if ne(parameters.runOnline, 'True' )}}:
    - script: |
        set -ex
        # Update the owner of the staging directory to the current user
        sudo chown -R $(whoami) $(artifactsStagingDir)
      displayName: Update owner of artifacts staging directory
  # Only run tests if enabled
  - ${{ if eq(parameters.runTests, 'True') }}:
    # Setup the NuGet sources used by the tests to use private feeds. This is necessary when testing internal-only product
    # builds where the packages are only available in the private feeds. This allows the tests to restore from those feeds.
    - ${{ if eq(variables['System.TeamProject'], 'internal') }}:
      - task: Bash@3
        displayName: Setup Private Feeds Credentials
        inputs:
          filePath: $(sourcesPath)/src/sdk/eng/common/SetupNugetSources.sh
          arguments: $(sourcesPath)/src/sdk/NuGet.config $Token
        env:
          Token: $(dn-bot-dnceng-artifact-feeds-rw)
      - script: cp "$(sourcesPath)/src/sdk/NuGet.config" "$(sourcesPath)/test/Microsoft.DotNet.SourceBuild.Tests/assets/online.NuGet.Config"
        displayName: Copy Test NuGet Config for Smoke Tests
    - script: |
        set -ex
        customPreBuildArgs=''
        customBuildArgs=''
        extraBuildProperties=''
        customBuildArgs="--ci --prepareMachine -c ${{ parameters.configuration }} $(officialBuildParameter)"
        if [[ '${{ parameters.runOnline }}' == 'False' ]]; then
          customPreBuildArgs="$customPreBuildArgs sudo"
        fi
        if [[ ! -z '${{ parameters.targetOS }}' ]]; then
          extraBuildProperties="$extraBuildProperties /p:TargetOS=${{ parameters.targetOS }}"
        fi
        if [[ ! -z '${{ parameters.targetArchitecture }}' ]]; then
          extraBuildProperties="$extraBuildProperties /p:TargetArchitecture=${{ parameters.targetArchitecture }}"
        fi
        if [[ '${{ parameters.buildSourceOnly }}' == 'True' ]]; then
          if [[ '${{ parameters.enablePoison }}' == 'True' ]]; then
            customBuildArgs="$customBuildArgs --poison"
          fi
          customBuildArgs="$customBuildArgs --source-only /p:SourceBuildTestsExcludeOmniSharpTests=${{ parameters.excludeOmniSharpTests }}"
        fi
        if [[ -n "${{ parameters.targetRid }}" ]]; then
          customBuildArgs="$customBuildArgs --target-rid ${{ parameters.targetRid }}"
        fi
        extraBuildProperties="$extraBuildProperties /p:VerticalName=$(Agent.JobName)"
        extraBuildProperties="$extraBuildProperties /p:ArtifactsStagingDir=$(artifactsStagingDir)"
        if [[ -n "${{ parameters.extraProperties }}" ]]; then
          extraBuildProperties="$extraBuildProperties ${{ parameters.extraProperties }}"
        fi
        cd $(sourcesPath)
        $customPreBuildArgs ./build.sh --test --excludeCIBinarylog /bl:artifacts/log/Release/Test.binlog $customBuildArgs $extraBuildProperties $(additionalBuildArgs)
      displayName: Run Tests
      timeoutInMinutes: ${{ variables.runTestsTimeout }}

Member

Option 3 sounds best but I'm not sure what others think. cc @mmitche

@@ -92,31 +95,46 @@ public void DebTest(string repo, string tag)

private void InitializeContext(PackageType packageType)
{
string packageArchitecture =
Config.Architecture == "x64" ?
Member

Consider defining consts for the architectures to help avoid possible bugs.

Member Author

Fixed with 4c52a0c

@@ -20,7 +20,11 @@
<_RunScenarioTests Condition="'$(DotNetBuildPass)' != '' and '$(DotNetBuildPass)' != '1'">false</_RunScenarioTests>
</PropertyGroup>

<ItemGroup>
<ItemGroup Condition="'$(TestInstallerPackages)' == 'true'">
<ProjectReference Condition="'$(TestRpmPackages)' == 'true' or '$(TestDebPackages)' == 'true'" Include="Microsoft.DotNet.Installer.Tests\Microsoft.DotNet.Installer.Tests.csproj" BuildInParallel="false" />
Member

Are these conditions really necessary here? It seems unnecessary and a bit of a maintenance concern as new installer tests are added.

Member Author

Oh, I did forget to remove Condition="'$(TestRpmPackages)' == 'true' or '$(TestDebPackages)' == 'true'"
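For illustration, dropping the redundant condition would leave the outer TestInstallerPackages check as the single gate. This is a sketch based on the diff quoted above, not the exact committed change:

```xml
<ItemGroup Condition="'$(TestInstallerPackages)' == 'true'">
  <!-- The per-package-type conditions are gone; TestInstallerPackages alone
       decides whether installer test projects are included. -->
  <ProjectReference Include="Microsoft.DotNet.Installer.Tests\Microsoft.DotNet.Installer.Tests.csproj" BuildInParallel="false" />
</ItemGroup>
```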

Member Author

Fixed with 4c52a0c

Member

I was expecting to see tests listed under the tests tab of the build you linked to - https://dev.azure.com/dnceng/internal/_build/results?buildId=2655424&view=ms.vss-test-web.build-test-results-tab

Also, do you happen to have a build where there is a failure? I am curious what the UX is in that case for seeing which scenario test failed.

Member Author

Hmm, interesting - will check what's going on.

I do not have a pipeline run with failing scenario tests. I could share the output of a local run.

Member Author

Fixed with defc5d2

This will still likely get refactored further to utilize the shared build.sh invocation from jobs/vmr-build.yml.

NikolaMilosavljevic (Member Author) · Mar 10, 2025

There was an issue with the search folder, so test results weren't being published. This is now fixed with 08f3e6b

Verification build with tests showing up correctly: https://dev.azure.com/dnceng/internal/_build/results?buildId=2659027&view=ms.vss-test-web.build-test-results-tab
