diff --git a/docs/index.md b/docs/index.md index 2420ea9f..f950012d 100644 --- a/docs/index.md +++ b/docs/index.md @@ -118,8 +118,8 @@ requirements: run_constrained: - xsimd >=8.0.3,<10 -test: - commands: +tests: + - script: - if: unix or emscripten then: - test -d ${PREFIX}/include/xtensor @@ -181,13 +181,11 @@ requirements: - python 3.10 - typing_extensions >=4.0.0,<5.0.0 -test: - imports: - - rich - commands: - - pip check - requires: - - pip +tests: + - python: + imports: + - rich + pip_check: true about: homepage: https://github.com/Textualize/rich diff --git a/docs/recipe_file.md b/docs/recipe_file.md index 26bf26ec..66bd046f 100644 --- a/docs/recipe_file.md +++ b/docs/recipe_file.md @@ -30,10 +30,11 @@ The reason for a new spec are: - no full Jinja2 support: no conditional or `{% set ...` support, only string interpolation. Variables can be set in the toplevel "context" which is valid YAML -- Jinja string interpolation needs to be preceded by a dollar sign at the beginning of a string, - e.g. `- ${{ version }}` in order for it to be valid YAML -- Selectors use a YAML dictionary style (vs. comments in conda-build). Instead of `- somepkg #[osx]` - we use +- Jinja string interpolation needs to be preceded by a dollar sign at the + beginning of a string, e.g. `- ${{ version }}` in order for it to be valid + YAML +- Selectors use a YAML dictionary style (vs. comments in conda-build). Instead + of `- somepkg #[osx]` we use ```yaml if: osx then: @@ -60,9 +61,10 @@ The recipe spec has the following parts: Spec reference -------------- -The spec is also made available through a JSON Schema (which is used for validation). -The schema (and pydantic source file) can be found in this repository: https://github.com/prefix-dev/recipe-format. -To use with VSCode or other IDEs, start the document with the following line: +The spec is also made available through a JSON Schema (which is used for +validation). 
The schema (and pydantic source file) can be found in this
+repository: https://github.com/prefix-dev/recipe-format. To use with VSCode or
+other IDEs, start the document with the following line:
 
 ```yaml
 # yaml-language-server: $schema=https://raw.githubusercontent.com/prefix-dev/recipe-format/main/schema.json
 ```
 
@@ -193,9 +195,11 @@ source:
     lfs: true # defaults to false
 ```
 
-Note: `git_rev` may not be available within commit depth range, consider avoiding use of both simultaneously.
+Note: `git_rev` may not be available within the commit depth range, so consider
+avoiding the use of both simultaneously.
 
-When you want to use git-lfs, you need to set `lfs: true`. This will also pull the lfs files from the repository.
+When you want to use git-lfs, you need to set `lfs: true`. This will also pull
+the lfs files from the repository.
 
 #### Source from a local path
 
@@ -208,8 +212,9 @@ source is copied to the work directory before building.
     use_gitignore: false # (defaults to true)
 ```
 
-By default, all files in the local path that are ignored by git are also ignored by rattler-build.
-You can disable this behavior by setting `use_gitignore` to `false`.
+By default, all files in the local path that are ignored by git are also ignored
+by rattler-build. You can disable this behavior by setting `use_gitignore` to
+`false`.
 
 #### Patches
 
@@ -227,10 +232,10 @@ Patches may optionally be applied to the source.
 #### Destination path
 
 Within boa's work directory, you may specify a particular folder to place source
-into. `rattler-build` will always drop you into the same folder (build folder/work), but
-it's up to you whether you want your source extracted into that folder, or
-nested deeper. This feature is particularly useful when dealing with multiple
-sources, but can apply to recipes with single sources as well.
+into. `rattler-build` will always drop you into the same folder (build
+folder/work), but it's up to you whether you want your source extracted into
+that folder, or nested deeper.
This feature is particularly useful when dealing +with multiple sources, but can apply to recipes with single sources as well. ```yaml source: @@ -498,118 +503,180 @@ This is the version bound consistent with CentOS 6. Software built against glibc mamba tell the user that a given package can't be installed if their system glibc version is too old. -Test section ------------- +Tests section +------------- -If this section exists or if there is a `run_test.[py,pl,sh,bat]` file in the -recipe, the package is installed into a test environment after the build is -finished and the tests are run there. +Rattler-build supports 4 different types of tests. The "script" test installs +the package and runs a list of commands. The python test attempts to import a +list of python modules and runs `pip check`. The downstream test runs the tests +of a downstream package that reverse depends on the package being built. - +#### Extra Test Files -### Test requirements +Test files that are copied from the source work directory into the temporary +test directory and are needed during testing (note that the source work +directory is otherwise not available at all during testing). -In addition to the runtime requirements, you can specify requirements needed -during testing. The runtime requirements that you specified in the "run" section -described above are automatically included during testing. +You can also include files that come from the `recipe` folder. They are copied +into the test directory as well. + +At test execution time, the test directory is the current working directory. ```yaml -test: - requires: - - nose +tests: + - script: + - ls + files: + source: + - myfile.txt + - tests/ + - some/directory/pattern*.sh + recipe: + - extra-file.txt ``` -### Test commands +--> -Commands that are run as part of the test. +#### Test requirements + +In addition to the runtime requirements, you can specify requirements needed +during testing. 
The runtime requirements that you specified in the "run" section
+described above are automatically included during testing (because the built
+package is installed like any regular package).
+
+In the `build` section you can specify additional requirements that are only
+needed on the build system for cross-compilation (e.g. emulators or compilers).
 
 ```yaml
-test:
-  commands:
-    - bsdiff4 -h
-    - bspatch4 -h
+tests:
+  - script:
+      - echo "hello world"
+    requirements:
+      build:
+        - myemulator
+      run:
+        - nose
 ```
 
-### Python imports
+### Python tests
+
+For this test type you can list a set of Python modules that need to be
+importable. The test will fail if any of the modules cannot be imported.
+
+The test will also automatically run `pip check` to check for any broken
+dependencies. This can be disabled by setting `pip_check: false` in the YAML.
 
-List of Python modules or packages that will be imported in the test
-environment.
 
 ```yaml
-test:
-  imports:
-    - bsdiff4
+tests:
+  - python:
+      imports:
+        - bsdiff4
+        - bspatch4
+      pip_check: true # can be left out because this is the default
 ```
 
-This would be equivalent to having a `run_test.py` with the following:
+Internally this will write a small Python script that imports the modules:
 
 ```python
 import bsdiff4
+import bspatch4
 ```
 
+
 ### Check for package-contents
 
-Checks if the built package contains the mentioned items.
+Checks if the built package contains the mentioned items. These checks are executed directly at
+the end of the build process to make sure that all expected files are present in the package.
```yaml -test: - package-contents: - # checks for the existence of files inside $PREFIX or %PREFIX% - # or, checks that there is at least one file matching the specified `glob` - # pattern inside the prefix - files: - - etc/libmamba/test.txt - - etc/libmamba - - etc/libmamba/*.mamba.txt - - # checks for the existence of `mamba/api/__init__.py` inside of the - # Python site-packages directory (note: also see Python import checks) - site_packages: - - mamba.api - - - # looks in $PREFIX/bin/mamba for unix and %PREFIX%\Library\bin\mamba.exe on Windows - # note: also check the `commands` and execute something like `mamba --help` to make - # sure things work fine - bins: - - mamba - - # searches for `$PREFIX/lib/libmamba.so` or `$PREFIX/lib/libmamba.dylib` on Linux or macOS, - # on Windows for %PREFIX%\Library\lib\mamba.dll & %PREFIX%\Library\bin\mamba.bin - libs: - - mamba - - # searches for `$PREFIX/include/libmamba/mamba.hpp` on unix, and - # on Windows for `%PREFIX%\Library\include\mamba.hpp` - includes: - - libmamba/mamba.hpp +tests: + - package-contents: + # checks for the existence of files inside $PREFIX or %PREFIX% + # or, checks that there is at least one file matching the specified `glob` + # pattern inside the prefix + files: + - etc/libmamba/test.txt + - etc/libmamba + - etc/libmamba/*.mamba.txt + + # checks for the existence of `mamba/api/__init__.py` inside of the + # Python site-packages directory (note: also see Python import checks) + site_packages: + - mamba.api + + + # looks in $PREFIX/bin/mamba for unix and %PREFIX%\Library\bin\mamba.exe on Windows + # note: also check the `commands` and execute something like `mamba --help` to make + # sure things work fine + bin: + - mamba + + # searches for `$PREFIX/lib/libmamba.so` or `$PREFIX/lib/libmamba.dylib` on Linux or macOS, + # on Windows for %PREFIX%\Library\lib\mamba.dll & %PREFIX%\Library\bin\mamba.bin + lib: + - mamba + + # searches for `$PREFIX/include/libmamba/mamba.hpp` on unix, and + # on Windows 
for `%PREFIX%\Library\include\mamba.hpp` + includes: + - libmamba/mamba.hpp ``` +### Downstream tests + +!!! warning + Downstream tests are not yet implemented in `rattler-build`. + +A downstream test can mention a single package that has a dependency on the package being built. +The test will install the package and run the tests of the downstream package with our current +package as a dependency. + +Sometimes downstream packages do not resolve. In this case, the test is ignored. + +```yaml +tests: + - downstream: numpy +``` Outputs section --------------- diff --git a/docs/testing.md b/docs/testing.md index 9536a2d4..5c82db90 100644 --- a/docs/testing.md +++ b/docs/testing.md @@ -1,18 +1,23 @@ # Testing packages -When you are developing a package, you should write tests for it. -The tests are automatically executed right after the package build has finished. +When you are developing a package, you should write tests for it. The tests are +automatically executed right after the package build has finished. -The tests from the test section are actually packaged _into_ your package and can -also be executed straight from the existing package. For this, we have the `test` subcommand: +The tests from the test section are actually packaged _into_ your package and +can also be executed straight from the existing package. + +The idea behind adding the tests into the package is that you can execute the +tests independently from building the package. That is also why we are shipping +a `test` subcommand that takes as input an existing package and executes the +tests: ```bash rattler-build test --package-file ./xtensor-0.24.6-h60d57d3_0.tar.bz2 ``` -Running the above command will extract the package and create a clean environment -where the package and dependencies are installed. Then the tests are executed in -this environment. +Running the above command will extract the package and create a clean +environment where the package and dependencies are installed. 
Then the tests are
+executed in this environment.
 
 If you inspect the package contents, you would find the test files under
 `info/test/*`.
@@ -22,52 +27,70 @@ If you inspect the package contents, you would find the test files under
 The test section allows you to specify the following things:
 
 ```yaml
-test:
-  # commands to run to test the package. If any of the commands
-  # returns with an error code, the test is considered failed.
-  commands:
-    - echo "Hello world"
-    - pip check
+tests:
+  - script:
+      # commands to run to test the package. If any of the commands
+      # returns with an error code, the test is considered failed.
+      - echo "Hello world"
+      - pytest ./tests
+
+    # additional requirements at test time
+    requirements:
+      run:
+        - pytest
+
+    files:
+      # Extra files to be copied to the test directory from the "work directory"
+      source:
+        - tests/
+        - test.py
+        - "*.sh"
+      recipe:
+        - more_tests/*.py
 
   # This test section tries to import the Python modules and errors if it can't
-  imports:
-    - mypkg
-    - mypkg.subpkg
-
-  # additional requirements at test time (only in the target platform architecture)
-  requires:
-    - pip
-
-  # Extra files to be copied to the test directory from the build dir (can be globs)
-  files:
-    - test.py
-    - "*.sh"
-
-  # Extra files to be copied to the test directory from the source directory (can be globs)
-  source_files:
-    - test_files/
+  - python:
+      imports:
+        - mypkg
+        - mypkg.subpkg
 ```
 
-The files from the `files` and `source_files` sections are copied into the
-`info/test/` folder. The `commands` section is turned into a `run_test.sh`
-or `run_test.bat` file, depending on the platform. For a `noarch` package,
-both are created. The imports section is turned into a `run_test.py` script.
+When you are writing a test for your package, additional files are created and
+added to your package. These files are placed under the `info/tests/{index}/`
+folder per test.
-## Internals
+For a script test:
 
-When you are writing a test for your package, additional files are created and added to your package.
+- All the files are copied straight into the test folder (under
+  `info/tests/{index}/`)
+- The script is turned into a `run_test.sh` or `run_test.bat` file
+- The extra requirements are stored as a JSON file called
+  `test_time_dependencies.json`
 
-The files are:
+For a Python import test:
 
-- `run_test.sh` (Unix)
-- `run_test.bat` (Windows)
-- `run_test.py` (for the Python import tests)
+- A JSON file called `python_test.json` is created that stores the
+  imports to be tested and whether to execute `pip check` or not. This file is
+  placed under `info/tests/{index}/`
+
+For a downstream test:
+
+- A JSON file called `downstream_test.json` is created that stores the
+  downstream tests to be executed. This file is placed under
+  `info/tests/{index}/`
 
-These files are created under the `info/test` directory of the package.
-Additionally, any `source_files` or `files` are also moved into this directory.
+## Legacy tests
 
-The tests are executed pointing to this directory as the current working directory.
+Legacy tests (from conda-build) are still supported for execution. These tests
+are stored as files under the `info/test/` folder.
+
+The files are:
+
+- `run_test.sh` (Unix)
+- `run_test.bat` (Windows)
+- `run_test.py` (for the Python import tests)
+- `test_time_dependencies.json` (for additional dependencies at test time)
 
-The idea behind adding the tests into the package is that you can execute the tests independent
-from building the package. That is also why we are shipping a `test` subcommand that takes
-as input an existing package and executes the tests.
+Additionally, the `test` folder contains all the files specified in the test
+section as `source_files` and `files`. The tests are executed pointing to this
+directory as the current working directory.
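Reviewer note (not part of the patch): the `python_test.json` flow that the testing.md changes above describe is easy to sketch. The helper below is hypothetical; it mirrors what `run_python_test` in `src/package_test.rs` does when it turns the stored `imports` list into the temporary import script, using the field names (`imports`, `pip_check`) that appear in this diff.

```python
import json

def build_import_script(python_test_json: str) -> str:
    """Sketch of how the runner could turn a python_test.json payload
    into the script it executes: one `import <module>` line per module."""
    test = json.loads(python_test_json)
    lines = [f"import {module}" for module in test.get("imports", [])]
    return "\n".join(lines) + "\n"

# Example payload matching the schema used in this patch
payload = json.dumps({"imports": ["bsdiff4", "bspatch4"], "pip_check": True})
print(build_import_script(payload), end="")
```

The `pip_check` flag is read but unused in this sketch; in the Rust code it additionally triggers a `pip check` run in the freshly created test environment.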
diff --git a/examples/rich/recipe.yaml b/examples/rich/recipe.yaml index 6c1056f4..34ca8fc6 100644 --- a/examples/rich/recipe.yaml +++ b/examples/rich/recipe.yaml @@ -29,13 +29,10 @@ requirements: - python 3.10 - typing_extensions >=4.0.0,<5.0.0 -test: - imports: - - rich - commands: - - pip check - requires: - - pip +tests: + - python: + imports: + - rich about: homepage: https://github.com/Textualize/rich diff --git a/examples/xtensor/recipe.yaml b/examples/xtensor/recipe.yaml index 39abf32e..0dfdef1a 100644 --- a/examples/xtensor/recipe.yaml +++ b/examples/xtensor/recipe.yaml @@ -40,19 +40,19 @@ requirements: run_constrained: - xsimd >=8.0.3,<10 -test: - commands: - - if: unix or emscripten - then: - - test -d ${PREFIX}/include/xtensor - - test -f ${PREFIX}/include/xtensor/xarray.hpp - - test -f ${PREFIX}/share/cmake/xtensor/xtensorConfig.cmake - - test -f ${PREFIX}/share/cmake/xtensor/xtensorConfigVersion.cmake - - if: win - then: - - if not exist %LIBRARY_PREFIX%\include\xtensor\xarray.hpp (exit 1) - - if not exist %LIBRARY_PREFIX%\share\cmake\xtensor\xtensorConfig.cmake (exit 1) - - if not exist %LIBRARY_PREFIX%\share\cmake\xtensor\xtensorConfigVersion.cmake (exit 1) +tests: + - script: + - if: unix or emscripten + then: + - test -d ${PREFIX}/include/xtensor + - test -f ${PREFIX}/include/xtensor/xarray.hpp + - test -f ${PREFIX}/share/cmake/xtensor/xtensorConfig.cmake + - test -f ${PREFIX}/share/cmake/xtensor/xtensorConfigVersion.cmake + - if: win + then: + - if not exist %LIBRARY_PREFIX%\include\xtensor\xarray.hpp (exit 1) + - if not exist %LIBRARY_PREFIX%\share\cmake\xtensor\xtensorConfig.cmake (exit 1) + - if not exist %LIBRARY_PREFIX%\share\cmake\xtensor\xtensorConfigVersion.cmake (exit 1) about: homepage: https://github.com/xtensor-stack/xtensor diff --git a/mkdocs.yml b/mkdocs.yml index 54bdf852..09ba44c1 100644 --- a/mkdocs.yml +++ b/mkdocs.yml @@ -72,20 +72,22 @@ extra: nav: - Home: index.md - Highlevel overview: highlevel.md + - Recipe file: 
recipe_file.md + - Selector syntax: selectors.md - Build script: build_script.md - Variants: variants.md - - Package specification: package_spec.md - Compilers and cross compilation: compilers.md + + - CLI Usage: cli_usage.md + - Automatic recipe linting: automatic_linting.md + - Testing packages: testing.md - Reproducible builds: rebuild.md + - Package specification: package_spec.md - Internals: internals.md - - Recipe file: recipe_file.md - - - CLI Usage: cli_usage.md - - Automatic recipe linting: automatic_linting.md plugins: - search diff --git a/rust-tests/src/lib.rs b/rust-tests/src/lib.rs index b68f844f..98c42584 100644 --- a/rust-tests/src/lib.rs +++ b/rust-tests/src/lib.rs @@ -147,6 +147,7 @@ mod tests { } unreachable!("above is an infinite loop") } + fn rattler() -> RattlerBuild { if let Ok(path) = std::env::var("RATTLER_BUILD_PATH") { if let Some(ret) = RattlerBuild::with_binary(path) { @@ -155,9 +156,11 @@ mod tests { } RattlerBuild::with_cargo(".").unwrap() } + fn recipes() -> PathBuf { test_data_dir().join("recipes") } + fn test_data_dir() -> PathBuf { PathBuf::from(shx("cargo locate-project --workspace -q --message-format=plain").unwrap()) .parent() @@ -420,10 +423,13 @@ mod tests { let license = pkg.join("info/licenses/LICENSE.rst"); assert!(license.exists()); - assert!(pkg.join("info/test/run_test.sh").exists()); - assert!(pkg.join("info/test/run_test.bat").exists()); - assert!(pkg.join("info/test/run_test.py").exists()); - assert!(pkg.join("info/test/test_time_dependencies.json").exists()); + assert!(pkg.join("info/tests/1/run_test.sh").exists()); + assert!(pkg.join("info/tests/1/run_test.bat").exists()); + assert!(pkg + .join("info/tests/1/test_time_dependencies.json") + .exists()); + + assert!(pkg.join("info/tests/0/python_test.json").exists()); // make sure that the entry point does not exist assert!(!pkg.join("python-scripts/flask").exists()); diff --git a/src/build.rs b/src/build.rs index e5e8142e..0d1829fe 100644 --- a/src/build.rs +++ 
b/src/build.rs @@ -18,12 +18,12 @@ use rattler_shell::shell; use crate::env_vars::write_env_script; use crate::metadata::{Directories, Output}; +use crate::package_test::TestConfiguration; use crate::packaging::{package_conda, record_files}; -use crate::recipe::parser::ScriptContent; +use crate::recipe::parser::{ScriptContent, TestType}; use crate::render::resolved_dependencies::{install_environments, resolve_dependencies}; use crate::source::fetch_sources; -use crate::test::TestConfiguration; -use crate::{index, test, tool_configuration}; +use crate::{index, package_test, tool_configuration}; const BASH_PREAMBLE: &str = r#" ## Start of bash preamble @@ -317,14 +317,18 @@ pub async fn run_build( ) .into_diagnostic()?; - if let Some(package_content) = output.recipe.test().package_content() { - test::run_package_content_tests( - package_content, - paths_json, - &output.build_configuration.target_platform, - ) - .await - .into_diagnostic()?; + // We run all the package content tests + for test in output.recipe.tests() { + // TODO we could also run each of the (potentially multiple) test scripts and collect the errors + if let TestType::PackageContents(package_contents) = test { + package_test::run_package_content_tests( + package_contents, + &paths_json, + &output.build_configuration.target_platform, + ) + .await + .into_diagnostic()?; + } } if !tool_configuration.no_clean { @@ -347,15 +351,15 @@ pub async fn run_build( } else { tracing::info!("Running tests"); - test::run_test( + package_test::run_test( &result, &TestConfiguration { test_prefix: test_dir.clone(), target_platform: Some(output.build_configuration.target_platform), keep_test_prefix: tool_configuration.no_clean, channels, + tool_configuration: tool_configuration.clone(), }, - &tool_configuration, ) .await .into_diagnostic()?; diff --git a/src/lib.rs b/src/lib.rs index b96c1107..ac554e98 100644 --- a/src/lib.rs +++ b/src/lib.rs @@ -1,14 +1,14 @@ -#![deny(missing_docs)] +// #![deny(missing_docs)] //! 
The library pieces of rattler-build pub mod build; pub mod metadata; +pub mod package_test; pub mod recipe; pub mod render; pub mod selectors; pub mod source; -pub mod test; pub mod tool_configuration; pub mod used_variables; pub mod variant_config; diff --git a/src/main.rs b/src/main.rs index 96767b7c..e38ab47e 100644 --- a/src/main.rs +++ b/src/main.rs @@ -28,9 +28,9 @@ use rattler_build::{ build::run_build, hash::HashInfo, metadata::{BuildConfiguration, Directories, PackageIdentifier}, + package_test::{self, TestConfiguration}, recipe::{parser::Recipe, ParsingError}, selectors::SelectorConfig, - test::{self, TestConfiguration}, tool_configuration, variant_config::{ParseErrors, VariantConfig}, }; @@ -155,6 +155,10 @@ struct BuildOpts { #[derive(Parser)] struct TestOpts { + /// Channels to use when testing + #[arg(short = 'c', long)] + channel: Option>, + /// The package file to test #[arg(short, long)] package_file: PathBuf, @@ -322,13 +326,6 @@ async fn run_test_from_args(args: TestOpts) -> miette::Result<()> { let test_prefix = PathBuf::from("test-prefix"); fs::create_dir_all(&test_prefix).into_diagnostic()?; - let test_options = TestConfiguration { - test_prefix, - target_platform: Some(Platform::current()), - keep_test_prefix: false, - channels: vec!["conda-forge".to_string(), "./output".to_string()], - }; - let client = AuthenticatedClient::from_client( reqwest::Client::builder() .no_gzip() @@ -337,14 +334,23 @@ async fn run_test_from_args(args: TestOpts) -> miette::Result<()> { get_auth_store(args.common.auth_file), ); - let global_configuration = tool_configuration::Configuration { - client, - multi_progress_indicator: MultiProgress::new(), - no_clean: test_options.keep_test_prefix, - ..Default::default() + let test_options = TestConfiguration { + test_prefix, + target_platform: Some(Platform::current()), + keep_test_prefix: false, + channels: args + .channel + .unwrap_or_else(|| vec!["conda-forge".to_string()]), + tool_configuration: 
tool_configuration::Configuration { + client, + multi_progress_indicator: MultiProgress::new(), + // duplicate from `keep_test_prefix`? + no_clean: false, + ..Default::default() + }, }; - test::run_test(&package_file, &test_options, &global_configuration) + package_test::run_test(&package_file, &test_options) .await .into_diagnostic()?; diff --git a/src/test.rs b/src/package_test.rs similarity index 73% rename from src/test.rs rename to src/package_test.rs index 062b2091..036a4c76 100644 --- a/src/test.rs +++ b/src/package_test.rs @@ -7,9 +7,9 @@ //! * `imports` - import a list of modules and check if they can be imported //! * `files` - check if a list of files exist +use fs_err as fs; use std::{ - fs::{self}, - io::{Read, Write}, + io::Write, path::{Path, PathBuf}, str::FromStr, }; @@ -17,7 +17,7 @@ use std::{ use dunce::canonicalize; use rattler::package_cache::CacheKey; use rattler_conda_types::{ - package::{ArchiveIdentifier, ArchiveType, PathsJson}, + package::{ArchiveIdentifier, PathsJson}, MatchSpec, Platform, }; use rattler_shell::{ @@ -26,7 +26,10 @@ use rattler_shell::{ }; use crate::{ - env_vars, index, render::solver::create_environment, tool_configuration::Configuration, + env_vars, index, + recipe::parser::{CommandsTestRequirements, PackageContents, PythonTest}, + render::solver::create_environment, + tool_configuration, }; #[allow(missing_docs)] @@ -57,7 +60,7 @@ pub enum TestError { TestEnvironmentSetup(#[from] anyhow::Error), #[error("Failed to setup test environment: {0}")] - TestEnvironementActivation(#[from] ActivationError), + TestEnvironmentActivation(#[from] ActivationError), #[error("Failed to parse JSON from test files: {0}")] TestJSONParseError(#[from] serde_json::Error), @@ -181,10 +184,10 @@ impl Tests { } } -async fn tests_from_folder(pkg: &Path) -> Result<(PathBuf, Vec), TestError> { +async fn legacy_tests_from_folder(pkg: &Path) -> Result<(PathBuf, Vec), TestError> { let mut tests = Vec::new(); - let test_folder = 
pkg.join("info").join("test"); + let test_folder = pkg.join("info/test"); if !test_folder.exists() { return Ok((test_folder, tests)); @@ -212,52 +215,8 @@ async fn tests_from_folder(pkg: &Path) -> Result<(PathBuf, Vec), TestErro Ok((test_folder, tests)) } -fn file_from_tar_bz2(archive_path: &Path, find_path: &Path) -> Result { - let reader = std::fs::File::open(archive_path)?; - let mut archive = rattler_package_streaming::read::stream_tar_bz2(reader); - - for entry in archive.entries()? { - let mut entry = entry?; - let path = entry.path()?; - if path == find_path { - let mut contents = String::new(); - entry.read_to_string(&mut contents)?; - return Ok(contents); - } - } - Err(std::io::Error::new( - std::io::ErrorKind::NotFound, - format!("{:?} not found in {:?}", find_path, archive_path), - )) -} - -fn file_from_conda(archive_path: &Path, find_path: &Path) -> Result { - let reader = std::fs::File::open(archive_path)?; - - let mut archive = if find_path.starts_with("info") { - rattler_package_streaming::seek::stream_conda_info(reader) - .expect("Could not open conda file") - } else { - todo!("Not implemented yet"); - }; - - for entry in archive.entries()? 
{ - let mut entry = entry?; - let path = entry.path()?; - if path == find_path { - let mut contents = String::new(); - entry.read_to_string(&mut contents)?; - return Ok(contents); - } - } - Err(std::io::Error::new( - std::io::ErrorKind::NotFound, - format!("{:?} not found in {:?}", find_path, archive_path), - )) -} - /// The configuration for a test -#[derive(Default, Debug)] +#[derive(Default)] pub struct TestConfiguration { /// The test prefix directory (will be created) pub test_prefix: PathBuf, @@ -268,6 +227,8 @@ pub struct TestConfiguration { /// The channels to use for the test – do not forget to add the local build outputs channel /// if desired pub channels: Vec, + /// The tool configuration + pub tool_configuration: tool_configuration::Configuration, } /// Run a test for a single package @@ -292,11 +253,7 @@ pub struct TestConfiguration { /// /// * `Ok(())` if the test was successful /// * `Err(TestError::TestFailed)` if the test failed -pub async fn run_test( - package_file: &Path, - config: &TestConfiguration, - global_configuration: &Configuration, -) -> Result<(), TestError> { +pub async fn run_test(package_file: &Path, config: &TestConfiguration) -> Result<(), TestError> { let tmp_repo = tempfile::tempdir()?; let target_platform = config.target_platform.unwrap_or_else(Platform::current); @@ -312,31 +269,6 @@ pub async fn run_test( ), )?; - let archive_type = - ArchiveType::try_from(package_file).ok_or(TestError::ArchiveTypeNotSupported)?; - let test_dep_json = PathBuf::from("info/test/test_time_dependencies.json"); - let test_dependencies = match archive_type { - ArchiveType::TarBz2 => file_from_tar_bz2(package_file, &test_dep_json), - ArchiveType::Conda => file_from_conda(package_file, &test_dep_json), - }; - - let mut dependencies: Vec = match test_dependencies { - Ok(contents) => { - let test_deps: Vec = serde_json::from_str(&contents)?; - test_deps - .iter() - .map(|s| MatchSpec::from_str(s)) - .collect::, _>>()? 
- } - Err(error) => { - if error.kind() == std::io::ErrorKind::NotFound { - Vec::new() - } else { - return Err(TestError::TestFailed); - } - } - }; - // index the temporary channel index::index(tmp_repo.path(), Some(&target_platform))?; @@ -349,15 +281,10 @@ pub async fn run_test( let package_folder = cache_dir.join("pkgs").join(cache_key.to_string()); if package_folder.exists() { - tracing::info!("Removing previously cached package {:?}", package_folder); - fs::remove_dir_all(package_folder)?; + tracing::info!("Removing previously cached package {:?}", &package_folder); + fs::remove_dir_all(&package_folder)?; } - let match_spec = - MatchSpec::from_str(format!("{}={}={}", pkg.name, pkg.version, pkg.build_string).as_str()) - .map_err(|e| TestError::MatchSpecParse(e.to_string()))?; - dependencies.push(match_spec); - let prefix = canonicalize(&config.test_prefix)?; tracing::info!("Creating test environment in {:?}", prefix); @@ -368,32 +295,203 @@ pub async fn run_test( Platform::current() }; + let mut channels = config.channels.clone(); + channels.insert(0, tmp_repo.path().to_string_lossy().to_string()); + + tracing::info!("Collecting tests from {:?}", package_folder); + + rattler_package_streaming::fs::extract(package_file, &package_folder).map_err(|e| { + tracing::error!("Failed to extract package: {:?}", e); + TestError::TestFailed + })?; + + // extract package in place + if package_folder.join("info/test").exists() { + let test_dep_json = PathBuf::from("info/test/test_time_dependencies.json"); + let test_dependencies: Vec = if package_folder.join(&test_dep_json).exists() { + serde_json::from_str(&std::fs::read_to_string( + package_folder.join(&test_dep_json), + )?)? 
+        } else {
+            Vec::new()
+        };
+
+        let mut dependencies: Vec<MatchSpec> = test_dependencies
+            .iter()
+            .map(|s| MatchSpec::from_str(s))
+            .collect::<Result<Vec<_>, _>>()?;
+
+        tracing::info!("Creating test environment in {:?}", prefix);
+        let match_spec = MatchSpec::from_str(
+            format!("{}={}={}", pkg.name, pkg.version, pkg.build_string).as_str(),
+        )
+        .map_err(|e| TestError::MatchSpecParse(e.to_string()))?;
+        dependencies.push(match_spec);
+
+        create_environment(
+            &dependencies,
+            &platform,
+            &prefix,
+            &channels,
+            &config.tool_configuration,
+        )
+        .await
+        .map_err(TestError::TestEnvironmentSetup)?;
+
+        // These are the legacy tests
+        let (test_folder, tests) = legacy_tests_from_folder(&package_folder).await?;
+
+        for test in tests {
+            test.run(&prefix, &test_folder)?;
+        }
+
+        tracing::info!(
+            "{} all tests passed!",
+            console::style(console::Emoji("✔", "")).green()
+        );
+    }
+
+    if package_folder.join("info/tests").exists() {
+        // These are the new style tests
+        let test_folder = package_folder.join("info/tests");
+        let mut read_dir = tokio::fs::read_dir(&test_folder).await?;
+
+        // for each enumerated test, we load and run it
+        while let Some(entry) = read_dir.next_entry().await?
+        {
+            println!("test {:?}", entry.path());
+            run_individual_test(&pkg, &entry.path(), &prefix, config).await?;
+        }
+    }
+
+    fs::remove_dir_all(prefix)?;
+
+    Ok(())
+}
+
+async fn run_python_test(
+    pkg: &ArchiveIdentifier,
+    path: &Path,
+    prefix: &Path,
+    config: &TestConfiguration,
+) -> Result<(), TestError> {
+    let test_file = path.join("python_test.json");
+    let test: PythonTest = serde_json::from_str(&fs::read_to_string(test_file)?)?;
+
+    let match_spec =
+        MatchSpec::from_str(format!("{}={}={}", pkg.name, pkg.version, pkg.build_string).as_str())
+            .unwrap();
+    let mut dependencies = vec![match_spec];
+    if test.pip_check {
+        dependencies.push(MatchSpec::from_str("pip").unwrap());
+    }
+
+    let platform = Platform::current();
+
     create_environment(
         &dependencies,
         &platform,
-        &prefix,
+        prefix,
         &config.channels,
-        global_configuration,
+        &config.tool_configuration,
     )
     .await
     .map_err(TestError::TestEnvironmentSetup)?;
 
-    let cache_key = CacheKey::from(pkg);
-    let dir = cache_dir.join("pkgs").join(cache_key.to_string());
+    let default_shell = ShellEnum::default();
 
-    tracing::info!("Collecting tests from {:?}", dir);
-    let (test_folder, tests) = tests_from_folder(&dir).await?;
+    let mut test_file = tempfile::Builder::new()
+        .prefix("rattler-test-")
+        .suffix(".py")
+        .tempfile()?;
 
-    for test in tests {
-        test.run(&prefix, &test_folder)?;
+    for import in test.imports {
+        writeln!(test_file, "import {}", import)?;
     }
 
-    tracing::info!(
-        "{} all tests passed!",
-        console::style(console::Emoji("✔", "")).green()
-    );
+    run_in_environment(
+        default_shell.clone(),
+        format!("python {}", test_file.path().to_string_lossy()),
+        path,
+        prefix,
+    )?;
 
-    fs::remove_dir_all(prefix)?;
+    if test.pip_check {
+        run_in_environment(default_shell, "pip check".into(), path, prefix)
+    } else {
+        Ok(())
+    }
+}
+
+async fn run_shell_test(
+    pkg: &ArchiveIdentifier,
+    path: &Path,
+    prefix: &Path,
+    config: &TestConfiguration,
+) -> Result<(), TestError> {
+    let deps = if path.join("test_time_dependencies.json").exists() {
+        let test_dep_json = path.join("test_time_dependencies.json");
+        serde_json::from_str(&fs::read_to_string(test_dep_json)?)?
+    } else {
+        CommandsTestRequirements::default()
+    };
+
+    if !deps.build.is_empty() {
+        todo!("build dependencies not implemented yet");
+    }
+
+    let mut dependencies = deps
+        .run
+        .iter()
+        .map(|s| MatchSpec::from_str(s))
+        .collect::<Result<Vec<_>, _>>()?;
+
+    // create environment with the test dependencies
+    dependencies.push(MatchSpec::from_str(
+        format!("{}={}={}", pkg.name, pkg.version, pkg.build_string).as_str(),
+    )?);
+
+    let platform = Platform::current();
+
+    create_environment(
+        &dependencies,
+        &platform,
+        prefix,
+        &config.channels,
+        &config.tool_configuration,
+    )
+    .await
+    .map_err(TestError::TestEnvironmentSetup)?;
+
+    let default_shell = ShellEnum::default();
+
+    let test_file_path = if platform.is_windows() {
+        path.join("run_test.bat")
+    } else {
+        path.join("run_test.sh")
+    };
+
+    let contents = fs::read_to_string(test_file_path)?;
+
+    tracing::info!("Testing commands:");
+    run_in_environment(default_shell, contents, path, prefix)?;
+
+    Ok(())
+}
+
+async fn run_individual_test(
+    pkg: &ArchiveIdentifier,
+    path: &Path,
+    prefix: &Path,
+    config: &TestConfiguration,
+) -> Result<(), TestError> {
+    if path.join("python_test.json").exists() {
+        run_python_test(pkg, path, prefix, config).await?;
+    } else if path.join("run_test.sh").exists() || path.join("run_test.bat").exists() {
+        // run shell test
+        run_shell_test(pkg, path, prefix, config).await?;
+    } else {
+        // no test found
+    }
 
     Ok(())
 }
 
@@ -408,8 +506,8 @@ pub async fn run_test(
 /// * `Ok(())` if the test was successful
 /// * `Err(TestError::TestFailed)` if the test failed
 pub async fn run_package_content_tests(
-    package_content: &crate::recipe::parser::PackageContent,
-    paths_json: PathsJson,
+    package_content: &PackageContents,
+    paths_json: &PathsJson,
     target_platform: &Platform,
 ) -> Result<(), TestError> {
     // files
globset diff --git a/src/packaging.rs b/src/packaging.rs index 55750d2a..beec68d5 100644 --- a/src/packaging.rs +++ b/src/packaging.rs @@ -28,6 +28,7 @@ use rattler_package_streaming::write::{ use crate::macos; use crate::metadata::Output; +use crate::recipe::parser::{DownstreamTest, PythonTest, TestType}; use crate::{linux, post}; #[derive(Debug, thiserror::Error)] @@ -624,17 +625,6 @@ fn copy_license_files( let licenses_folder = tmp_dir_path.join("info/licenses/"); fs::create_dir_all(&licenses_folder)?; - for license_glob in license_globs - .iter() - // Only license globs that do not end with '/' or '*' - .filter(|license_glob| !license_glob.ends_with('/') && !license_glob.ends_with('*')) - { - let filepath = licenses_folder.join(license_glob); - if !filepath.exists() { - tracing::warn!(path = %filepath.display(), "File does not exist"); - } - } - let copy_dir = crate::source::copy_dir::CopyDir::new( &output.build_configuration.directories.recipe_dir, &licenses_folder, @@ -643,7 +633,7 @@ fn copy_license_files( .use_gitignore(false) .run()?; - let copied_files_recipe_dir = copy_dir.copied_pathes(); + let copied_files_recipe_dir = copy_dir.copied_paths(); let any_include_matched_recipe_dir = copy_dir.any_include_glob_matched(); let copy_dir = crate::source::copy_dir::CopyDir::new( @@ -654,7 +644,7 @@ fn copy_license_files( .use_gitignore(false) .run()?; - let copied_files_work_dir = copy_dir.copied_pathes(); + let copied_files_work_dir = copy_dir.copied_paths(); let any_include_matched_work_dir = copy_dir.any_include_glob_matched(); let copied_files = copied_files_recipe_dir @@ -712,103 +702,127 @@ fn filter_pyc(path: &Path, new_files: &HashSet) -> bool { fn write_test_files(output: &Output, tmp_dir_path: &Path) -> Result, PackagingError> { let mut test_files = Vec::new(); - let test = output.recipe.test(); - if !test.is_empty() { - let test_folder = tmp_dir_path.join("info/test/"); - fs::create_dir_all(&test_folder)?; - - if !test.imports().is_empty() { - let 
test_file = test_folder.join("run_test.py"); - let mut file = File::create(&test_file)?; - for el in test.imports() { - writeln!(file, "import {}\n", el)?; + + for (idx, test) in output.recipe.tests().iter().enumerate() { + if let Some(files) = match test { + TestType::Python(python_test) => { + Some(serialize_python_test(python_test, idx, tmp_dir_path)?) } - test_files.push(test_file); + TestType::Command(command_test) => Some(serialize_command_test( + command_test, + idx, + output, + tmp_dir_path, + )?), + TestType::Downstream(downstream_test) => Some(serialize_downstream_test( + downstream_test, + idx, + tmp_dir_path, + )?), + TestType::PackageContents(_) => None, + } { + test_files.extend(files); } + } - if !test.commands().is_empty() { - let mut command_files = vec![]; - let target_platform = &output.build_configuration.target_platform; - if target_platform.is_windows() || target_platform == &Platform::NoArch { - command_files.push(test_folder.join("run_test.bat")); - } - if target_platform.is_unix() || target_platform == &Platform::NoArch { - command_files.push(test_folder.join("run_test.sh")); - } + Ok(test_files) +} - for cf in command_files { - let mut file = File::create(&cf)?; - for el in test.commands() { - writeln!(file, "{}\n", el)?; - } - test_files.push(cf); - } - } +fn serialize_downstream_test( + downstream_test: &DownstreamTest, + idx: usize, + tmp_dir_path: &Path, +) -> Result, PackagingError> { + let folder = tmp_dir_path.join(format!("info/tests/{}", idx)); + fs::create_dir_all(&folder)?; - if !test.requires().is_empty() { - let test_dependencies = test.requires(); - let test_file = test_folder.join("test_time_dependencies.json"); - let mut file = File::create(&test_file)?; - file.write_all(serde_json::to_string(test_dependencies)?.as_bytes())?; - test_files.push(test_file); - } + let path = folder.join("downstream_test.json"); + let mut file = File::create(&path)?; + file.write_all(serde_json::to_string(downstream_test)?.as_bytes())?; - if 
!test.files().is_empty() { - let globs = test.files(); - let include_globs = globs - .iter() - .filter(|glob| !glob.trim_start().starts_with('~')) - .map(AsRef::as_ref) - .collect::>(); - - let exclude_globs = globs - .iter() - .filter(|glob| glob.trim_start().starts_with('~')) - .map(AsRef::as_ref) - .collect::>(); - - let copy_dir = crate::source::copy_dir::CopyDir::new( - &output.build_configuration.directories.recipe_dir, - &test_folder, - ) - .with_include_globs(include_globs) - .with_exclude_globs(exclude_globs) - .use_gitignore(true) - .run()?; - - test_files.extend(copy_dir.copied_pathes().iter().cloned()); - } + Ok(vec![path]) +} + +fn serialize_command_test( + command_test: &crate::recipe::parser::CommandsTest, + idx: usize, + output: &Output, + tmp_dir_path: &Path, +) -> Result, PackagingError> { + let mut command_files = vec![]; + let mut test_files = vec![]; + + let test_folder = tmp_dir_path.join(format!("info/tests/{}", idx)); + fs::create_dir_all(&test_folder)?; - if !test.source_files().is_empty() { - let globs = test.source_files(); - let include_globs = globs - .iter() - .filter(|glob| !glob.trim_start().starts_with('~')) - .map(AsRef::as_ref) - .collect::>(); - - let exclude_globs = globs - .iter() - .filter(|glob| glob.trim_start().starts_with('~')) - .map(AsRef::as_ref) - .collect::>(); - - let copy_dir = crate::source::copy_dir::CopyDir::new( - &output.build_configuration.directories.work_dir, - &test_folder, - ) - .with_include_globs(include_globs) - .with_exclude_globs(exclude_globs) - .use_gitignore(true) - .run()?; - - test_files.extend(copy_dir.copied_pathes().iter().cloned()); + let target_platform = &output.build_configuration.target_platform; + if target_platform.is_windows() || target_platform == &Platform::NoArch { + command_files.push(test_folder.join("run_test.bat")); + } + + if target_platform.is_unix() || target_platform == &Platform::NoArch { + command_files.push(test_folder.join("run_test.sh")); + } + + for cf in command_files 
{ + let mut file = File::create(&cf)?; + for el in &command_test.script { + writeln!(file, "{}\n", el)?; } + test_files.push(cf); + } + + if !command_test.requirements.is_empty() { + let test_dependencies = &command_test.requirements; + let test_file = test_folder.join("test_time_dependencies.json"); + let mut file = File::create(&test_file)?; + file.write_all(serde_json::to_string(&test_dependencies)?.as_bytes())?; + test_files.push(test_file); + } + + if !command_test.files.recipe.is_empty() { + let globs = &command_test.files.recipe; + let copy_dir = crate::source::copy_dir::CopyDir::new( + &output.build_configuration.directories.recipe_dir, + &test_folder, + ) + .with_parse_globs(globs.iter().map(AsRef::as_ref)) + .use_gitignore(true) + .run()?; + + test_files.extend(copy_dir.copied_paths().iter().cloned()); + } + + if !command_test.files.source.is_empty() { + let globs = &command_test.files.source; + let copy_dir = crate::source::copy_dir::CopyDir::new( + &output.build_configuration.directories.work_dir, + &test_folder, + ) + .with_parse_globs(globs.iter().map(AsRef::as_ref)) + .use_gitignore(true) + .run()?; + + test_files.extend(copy_dir.copied_paths().iter().cloned()); } Ok(test_files) } +fn serialize_python_test( + python_test: &PythonTest, + idx: usize, + tmp_dir_path: &Path, +) -> Result, PackagingError> { + let folder = tmp_dir_path.join(format!("info/tests/{}", idx)); + fs::create_dir_all(&folder)?; + + let path = folder.join("python_test.json"); + serde_json::to_writer(&File::create(&path)?, python_test)?; + + Ok(vec![path]) +} + fn write_recipe_folder( output: &Output, tmp_dir_path: &Path, @@ -818,7 +832,7 @@ fn write_recipe_folder( let copy_result = crate::source::copy_dir::CopyDir::new(recipe_dir, &recipe_folder).run()?; - let mut files = Vec::from(copy_result.copied_pathes()); + let mut files = Vec::from(copy_result.copied_paths()); // write the variant config to the appropriate file let variant_config_file = 
recipe_folder.join("variant_config.yaml"); let mut variant_config = File::create(&variant_config_file)?; diff --git a/src/recipe/parser.rs b/src/recipe/parser.rs index f5aede81..4f88c0cf 100644 --- a/src/recipe/parser.rs +++ b/src/recipe/parser.rs @@ -35,7 +35,10 @@ pub use self::{ }, script::{Script, ScriptContent}, source::{Checksum, GitRev, GitSource, GitUrl, PathSource, Source, UrlSource}, - test::{PackageContent, Test}, + test::{ + CommandsTest, CommandsTestFiles, CommandsTestRequirements, DownstreamTest, PackageContents, + PythonTest, TestType, + }, }; use super::custom_yaml::Node; @@ -53,12 +56,13 @@ pub struct Recipe { /// The information about the requirements pub requirements: Requirements, /// The information about how to test the package - #[serde(default, skip_serializing_if = "Test::is_default")] - pub test: Test, + #[serde(default, skip_serializing_if = "Vec::is_empty")] + pub tests: Vec, /// The information about the package #[serde(default, skip_serializing_if = "About::is_default")] pub about: About, } + pub(crate) trait CollectErrors: Iterator> + Sized { fn collect_errors(self) -> Result<(), Vec> { let err = self @@ -184,7 +188,7 @@ impl Recipe { let mut build = Build::default(); let mut source = Vec::new(); let mut requirements = Requirements::default(); - let mut test = Test::default(); + let mut tests = Vec::default(); let mut about = About::default(); rendered_node @@ -204,7 +208,7 @@ impl Recipe { "source" => source = value.try_convert(key_str)?, "build" => build = value.try_convert(key_str)?, "requirements" => requirements = value.try_convert(key_str)?, - "test" => test = value.try_convert(key_str)?, + "tests" => tests = value.try_convert(key_str)?, "about" => about = value.try_convert(key_str)?, "context" => {} "extra" => {} @@ -237,7 +241,7 @@ impl Recipe { build, source, requirements, - test, + tests, about, }; @@ -265,8 +269,8 @@ impl Recipe { } /// Get the test information. 
- pub const fn test(&self) -> &Test { - &self.test + pub const fn tests(&self) -> &Vec { + &self.tests } /// Get the about information. @@ -278,6 +282,7 @@ impl Recipe { #[cfg(test)] mod tests { use insta::assert_yaml_snapshot; + use rattler_conda_types::Platform; use crate::{assert_miette_snapshot, variant_config::ParseErrors}; @@ -286,12 +291,24 @@ mod tests { #[test] fn it_works() { let recipe = include_str!("../../examples/xtensor/recipe.yaml"); - let recipe = Recipe::from_yaml(recipe, SelectorConfig::default()); - assert!(recipe.is_ok()); - #[cfg(target_family = "unix")] - insta::assert_debug_snapshot!(recipe.unwrap()); - #[cfg(target_family = "windows")] - insta::assert_debug_snapshot!("recipe_windows", recipe.unwrap()); + + let selector_config_win = SelectorConfig { + target_platform: Platform::Win64, + ..SelectorConfig::default() + }; + + let selector_config_unix = SelectorConfig { + target_platform: Platform::Linux64, + ..SelectorConfig::default() + }; + + let unix_recipe = Recipe::from_yaml(recipe, selector_config_unix); + let win_recipe = Recipe::from_yaml(recipe, selector_config_win); + assert!(unix_recipe.is_ok()); + assert!(win_recipe.is_ok()); + + insta::assert_debug_snapshot!("unix_recipe", unix_recipe.unwrap()); + insta::assert_debug_snapshot!("recipe_windows", win_recipe.unwrap()); } #[test] diff --git a/src/recipe/parser/snapshots/rattler_build__recipe__parser__test__test__parsing.snap b/src/recipe/parser/snapshots/rattler_build__recipe__parser__test__test__parsing.snap new file mode 100644 index 00000000..5ddd1c96 --- /dev/null +++ b/src/recipe/parser/snapshots/rattler_build__recipe__parser__test__test__parsing.snap @@ -0,0 +1,9 @@ +--- +source: src/recipe/parser/test.rs +expression: "serde_yaml::to_string(&tests).unwrap()" +--- +- test_type: python + imports: + - import os + - import sys + diff --git a/src/recipe/parser/test.rs b/src/recipe/parser/test.rs index 05a18e39..845f49be 100644 --- a/src/recipe/parser/test.rs +++ 
b/src/recipe/parser/test.rs @@ -1,48 +1,97 @@ +//! Test parser module. + use serde::{Deserialize, Serialize}; use crate::{ _partialerror, recipe::{ - custom_yaml::{HasSpan, RenderedMappingNode, RenderedNode, TryConvertNode}, + custom_yaml::{ + HasSpan, RenderedMappingNode, RenderedNode, RenderedSequenceNode, TryConvertNode, + }, error::{ErrorKind, PartialParsingError}, }, }; use super::FlattenErrors; -/// Define tests in your recipe that are executed after successfully building the package. #[derive(Debug, Clone, Default, PartialEq, Serialize, Deserialize)] -pub struct Test { - /// Try importing a python module as a sanity check - #[serde(default, skip_serializing_if = "Vec::is_empty")] - imports: Vec, - /// Run a list of given commands +pub struct CommandsTestRequirements { #[serde(default, skip_serializing_if = "Vec::is_empty")] - commands: Vec, - /// Extra requirements to be installed at test time + pub run: Vec, + #[serde(default, skip_serializing_if = "Vec::is_empty")] - requires: Vec, - /// Extra files to be copied to the test environment from the source dir (can be globs) + pub build: Vec, +} + +#[derive(Debug, Clone, Default, PartialEq, Serialize, Deserialize)] +pub struct CommandsTestFiles { + // TODO parse as globs #[serde(default, skip_serializing_if = "Vec::is_empty")] - source_files: Vec, - /// Extra files to be copied to the test environment from the build dir (can be globs) + pub source: Vec, #[serde(default, skip_serializing_if = "Vec::is_empty")] - files: Vec, - /// All new test section - #[serde(skip_serializing_if = "Option::is_none")] - package_contents: Option, + pub recipe: Vec, } -impl Test { - /// Returns true if the test has its default configuration. 
-    pub fn is_default(&self) -> bool {
-        self == &Self::default()
+#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)]
+pub struct CommandsTest {
+    pub script: Vec<String>,
+    #[serde(default, skip_serializing_if = "CommandsTestRequirements::is_empty")]
+    pub requirements: CommandsTestRequirements,
+    #[serde(default, skip_serializing_if = "CommandsTestFiles::is_empty")]
+    pub files: CommandsTestFiles,
+}
+
+impl CommandsTestRequirements {
+    pub fn is_empty(&self) -> bool {
+        self.run.is_empty() && self.build.is_empty()
+    }
+}
+
+impl CommandsTestFiles {
+    pub fn is_empty(&self) -> bool {
+        self.source.is_empty() && self.recipe.is_empty()
     }
 }
 
+fn default_pip_check() -> bool {
+    true
+}
+
+fn is_true(value: &bool) -> bool {
+    *value
+}
+
+#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)]
+pub struct PythonTest {
+    /// List of imports to test
+    pub imports: Vec<String>,
+    /// Whether to run `pip check` or not (defaults to true)
+    #[serde(default = "default_pip_check", skip_serializing_if = "is_true")]
+    pub pip_check: bool,
+}
+
+#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)]
+pub struct DownstreamTest {
+    pub downstream: String,
+}
+
+#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)]
+/// The test type enum
+#[serde(rename_all = "snake_case", tag = "test_type")]
+pub enum TestType {
+    /// A Python test.
+ Python(PythonTest), + /// A test that executes multiple commands in a freshly created environment + Command(CommandsTest), + /// A test that runs the tests of a downstream package + Downstream(DownstreamTest), + /// A test that checks the contents of the package + PackageContents(PackageContents), +} + #[derive(Debug, Clone, Default, PartialEq, Serialize, Deserialize)] /// PackageContent -pub struct PackageContent { +pub struct PackageContents { /// file paths, direct and/or globs #[serde(default, skip_serializing_if = "Vec::is_empty")] files: Vec, @@ -61,7 +110,7 @@ pub struct PackageContent { includes: Vec, } -impl PackageContent { +impl PackageContents { /// Get the package files. pub fn files(&self) -> &[String] { &self.files @@ -88,126 +137,333 @@ impl PackageContent { } } -impl TryConvertNode for RenderedNode { - fn try_convert(&self, name: &str) -> Result> { +impl TryConvertNode> for RenderedNode { + fn try_convert(&self, name: &str) -> Result, Vec> { + match self { + RenderedNode::Sequence(seq) => seq.try_convert(name), + RenderedNode::Scalar(_) | RenderedNode::Mapping(_) => Err(vec![_partialerror!( + *self.span(), + ErrorKind::ExpectedSequence, + )])?, + RenderedNode::Null(_) => Ok(vec![]), + } + } +} + +impl TryConvertNode> for RenderedSequenceNode { + fn try_convert(&self, name: &str) -> Result, Vec> { + let mut tests = vec![]; + for value in self.iter() { + let test = value.try_convert(name)?; + tests.push(test); + } + Ok(tests) + } +} + +impl TryConvertNode for RenderedNode { + fn try_convert(&self, name: &str) -> Result> { match self { RenderedNode::Mapping(map) => map.try_convert(name), RenderedNode::Sequence(_) | RenderedNode::Scalar(_) => Err(vec![_partialerror!( *self.span(), ErrorKind::ExpectedMapping, )])?, - RenderedNode::Null(_) => Ok(PackageContent::default()), + RenderedNode::Null(_) => Ok(TestType::PackageContents(PackageContents::default())), } } } -impl TryConvertNode for RenderedMappingNode { - fn try_convert(&self, name: &str) -> 
Result> { - let mut files = vec![]; - let mut site_packages = vec![]; - let mut libs = vec![]; - let mut bins = vec![]; - let mut includes = vec![]; +pub fn as_mapping( + value: &RenderedNode, + name: &str, +) -> Result> { + value.as_mapping().cloned().ok_or_else(|| { + vec![_partialerror!( + *value.span(), + ErrorKind::ExpectedMapping, + help = format!("expected fields for {name} to be a map") + )] + }) +} + +impl TryConvertNode for RenderedMappingNode { + fn try_convert(&self, name: &str) -> Result> { + let mut test = TestType::PackageContents(PackageContents::default()); + self.iter().map(|(key, value)| { let key_str = key.as_str(); + match key_str { - "files" => files = value.try_convert(key_str)?, - "site_packages" => site_packages = value.try_convert(key_str)?, - "libs" => libs = value.try_convert(key_str)?, - "bins" => bins = value.try_convert(key_str)?, - "includes" => includes = value.try_convert(key_str)?, + "python" => { + let imports = as_mapping(value, key_str)?.try_convert(key_str)?; + test = TestType::Python(imports); + } + "script" | "requirements" | "files" => { + let commands = self.try_convert(key_str)?; + test = TestType::Command(commands); + } + "downstream" => { + let downstream = self.try_convert(key_str)?; + test = TestType::Downstream(downstream); + } + "package_contents" => { + let package_contents = as_mapping(value, key_str)?.try_convert(key_str)?; + test = TestType::PackageContents(package_contents); + } invalid => Err(vec![_partialerror!( *key.span(), ErrorKind::InvalidField(invalid.to_string().into()), - help = format!("expected fields for {name} is one of `files`, `site_packages`, `libs`, `bins`, `includes`") + help = format!("expected fields for {name} is one of `python`, `command`, `downstream`, `package_contents`") )])? } Ok(()) }).flatten_errors()?; - Ok(PackageContent { - files, - site_packages, - bins, - libs, - includes, - }) + Ok(test) } } -impl Test { - /// Get package content. 
- pub fn package_content(&self) -> Option<&PackageContent> { - self.package_contents.as_ref() - } +/////////////////////////// +/// Python Test /// +/////////////////////////// - /// Get the imports. - pub fn imports(&self) -> &[String] { - self.imports.as_slice() - } +impl TryConvertNode for RenderedMappingNode { + fn try_convert(&self, name: &str) -> Result> { + let mut imports = vec![]; + let mut pip_check = true; + + self.iter() + .map(|(key, value)| { + let key_str = key.as_str(); + match key_str { + "imports" => imports = value.try_convert(key_str)?, + "pip_check" => pip_check = value.try_convert(key_str)?, + invalid => Err(vec![_partialerror!( + *key.span(), + ErrorKind::InvalidField(invalid.to_string().into()), + help = format!("expected fields for {name} is one of `imports`") + )])?, + } + Ok(()) + }) + .flatten_errors()?; + + if imports.is_empty() { + Err(vec![_partialerror!( + *self.span(), + ErrorKind::MissingField("imports".into()), + help = "expected field `imports` in python test to be a list of imports" + )])?; + } - /// Get the commands. - pub fn commands(&self) -> &[String] { - self.commands.as_slice() + Ok(PythonTest { imports, pip_check }) } +} + +/////////////////////////// +/// Downstream Test /// +/////////////////////////// + +impl TryConvertNode for RenderedMappingNode { + fn try_convert(&self, name: &str) -> Result> { + let mut downstream = String::new(); + + self.iter() + .map(|(key, value)| { + let key_str = key.as_str(); + match key_str { + "downstream" => downstream = value.try_convert(key_str)?, + invalid => Err(vec![_partialerror!( + *key.span(), + ErrorKind::InvalidField(invalid.to_string().into()), + help = format!("expected fields for {name} is one of `downstream`") + )])?, + } + Ok(()) + }) + .flatten_errors()?; - /// Get the requires. 
-    pub fn requires(&self) -> &[String] {
-        self.requires.as_slice()
+        Ok(DownstreamTest { downstream })
     }
+}
+
+///////////////////////////
+///  Commands Test      ///
+///////////////////////////
+
+impl TryConvertNode<CommandsTestRequirements> for RenderedMappingNode {
+    fn try_convert(
+        &self,
+        name: &str,
+    ) -> Result<CommandsTestRequirements, Vec<PartialParsingError>> {
+        let mut run = vec![];
+        let mut build = vec![];
 
-    /// Get the source files.
-    pub fn source_files(&self) -> &[String] {
-        self.source_files.as_slice()
+        self.iter()
+            .map(|(key, value)| {
+                let key_str = key.as_str();
+                match key_str {
+                    "run" => run = value.try_convert(key_str)?,
+                    "build" => build = value.try_convert(key_str)?,
+                    invalid => Err(vec![_partialerror!(
+                        *key.span(),
+                        ErrorKind::InvalidField(invalid.to_string().into()),
+                        help = format!("expected fields for {name} is one of `run`, `build`")
+                    )])?,
+                }
+                Ok(())
+            })
+            .flatten_errors()?;
+
+        Ok(CommandsTestRequirements { run, build })
     }
+}
 
-    /// Get the files.
-    pub fn files(&self) -> &[String] {
-        self.files.as_slice()
+impl TryConvertNode<CommandsTestFiles> for RenderedMappingNode {
+    fn try_convert(&self, name: &str) -> Result<CommandsTestFiles, Vec<PartialParsingError>> {
+        let mut source = vec![];
+        let mut recipe = vec![];
+
+        self.iter()
+            .map(|(key, value)| {
+                let key_str = key.as_str();
+                match key_str {
+                    "source" => source = value.try_convert(key_str)?,
+                    "recipe" => recipe = value.try_convert(key_str)?,
+                    invalid => Err(vec![_partialerror!(
+                        *key.span(),
+                        ErrorKind::InvalidField(invalid.to_string().into()),
+                        help = format!("expected fields for {name} is one of `source`, `recipe`")
+                    )])?,
+                }
+                Ok(())
+            })
+            .flatten_errors()?;
+
+        Ok(CommandsTestFiles { source, recipe })
     }
+}
 
-    /// Check if there is not test commands to be run
-    pub fn is_empty(&self) -> bool {
-        self.commands.is_empty()
+impl TryConvertNode<CommandsTest> for RenderedMappingNode {
+    fn try_convert(&self, name: &str) -> Result<CommandsTest, Vec<PartialParsingError>> {
+        let mut script = vec![];
+        let mut requirements = CommandsTestRequirements::default();
+        let mut files = CommandsTestFiles::default();
+
+        self.iter()
+            .map(|(key,
value)| { + let key_str = key.as_str(); + match key_str { + "script" => script = value.try_convert(key_str)?, + "requirements" => { + requirements = as_mapping(value, key_str)?.try_convert(key_str)? + } + "files" => files = as_mapping(value, key_str)?.try_convert(key_str)?, + invalid => Err(vec![_partialerror!( + *key.span(), + ErrorKind::InvalidField(invalid.to_string().into()), + help = format!( + "expected fields for {name} is one of `script`, `requirements`, `files`" + ) + )])?, + } + Ok(()) + }) + .flatten_errors()?; + + if script.is_empty() { + Err(vec![_partialerror!( + *self.span(), + ErrorKind::MissingField("script".into()), + help = "expected field `script` to be a list of commands" + )])?; + } + + Ok(CommandsTest { + script, + requirements, + files, + }) } } -impl TryConvertNode for RenderedNode { - fn try_convert(&self, name: &str) -> Result> { +/////////////////////////// +/// Package Contents /// +/////////////////////////// + +impl TryConvertNode for RenderedNode { + fn try_convert(&self, name: &str) -> Result> { match self { RenderedNode::Mapping(map) => map.try_convert(name), - RenderedNode::Scalar(_) => Err(vec![_partialerror!( + RenderedNode::Sequence(_) | RenderedNode::Scalar(_) => Err(vec![_partialerror!( *self.span(), ErrorKind::ExpectedMapping, )])?, - RenderedNode::Null(_) => Ok(Test::default()), - RenderedNode::Sequence(_) => todo!("Not implemented yet: sequence on Test"), + RenderedNode::Null(_) => Ok(PackageContents::default()), } } } -impl TryConvertNode for RenderedMappingNode { - fn try_convert(&self, name: &str) -> Result> { - let mut test = Test::default(); +impl TryConvertNode for RenderedMappingNode { + fn try_convert(&self, name: &str) -> Result> { + let mut files = vec![]; + let mut site_packages = vec![]; + let mut libs = vec![]; + let mut bins = vec![]; + let mut includes = vec![]; self.iter().map(|(key, value)| { let key_str = key.as_str(); match key_str { - "package_contents" => test.package_contents = 
value.try_convert(key_str)?, - "imports" => test.imports = value.try_convert(key_str)?, - "commands" => test.commands = value.try_convert(key_str)?, - "requires" => test.requires = value.try_convert(key_str)?, - "source_files" => test.source_files = value.try_convert(key_str)?, - "files" => test.files = value.try_convert(key_str)?, + "files" => files = value.try_convert(key_str)?, + "site_packages" => site_packages = value.try_convert(key_str)?, + "lib" => libs = value.try_convert(key_str)?, + "bin" => bins = value.try_convert(key_str)?, + "include" => includes = value.try_convert(key_str)?, invalid => Err(vec![_partialerror!( *key.span(), ErrorKind::InvalidField(invalid.to_string().into()), - help = format!("expected fields for {name} is one of `imports`, `commands`, `requires`, `source_files`, `files`") + help = format!("expected fields for {name} is one of `files`, `site_packages`, `libs`, `bins`, `includes`") )])? - }; + } Ok(()) }).flatten_errors()?; - Ok(test) + Ok(PackageContents { + files, + site_packages, + bins, + libs, + includes, + }) + } +} + +#[cfg(test)] +mod test { + use super::TestType; + use insta::assert_snapshot; + + use crate::recipe::custom_yaml::{RenderedNode, TryConvertNode}; + + #[test] + fn test_parsing() { + let test_section = r#" + tests: + - python: + imports: + - import os + - import sys + "#; + + // parse the YAML + let yaml_root = RenderedNode::parse_yaml(0, test_section) + .map_err(|err| vec![err]) + .unwrap(); + let tests_node = yaml_root.as_mapping().unwrap().get("tests").unwrap(); + let tests: Vec = tests_node.try_convert("tests").unwrap(); + + assert_snapshot!(serde_yaml::to_string(&tests).unwrap()); } } diff --git a/src/recipe/snapshots/rattler_build__recipe__parser__tests__recipe_windows.snap b/src/recipe/snapshots/rattler_build__recipe__parser__tests__recipe_windows.snap index b93308e5..e4e4a2fd 100644 --- a/src/recipe/snapshots/rattler_build__recipe__parser__tests__recipe_windows.snap +++ 
b/src/recipe/snapshots/rattler_build__recipe__parser__tests__recipe_windows.snap
@@ -1,6 +1,6 @@
 ---
 source: src/recipe/parser.rs
-expression: recipe.unwrap()
+expression: win_recipe.unwrap()
 ---
 Recipe {
     package: Package {
@@ -254,18 +254,25 @@ Recipe {
             from_package: {},
         },
     },
-    test: Test {
-        imports: [],
-        commands: [
-            "if not exist %LIBRARY_PREFIX%\\include\\xtensor\\xarray.hpp (exit 1)",
-            "if not exist %LIBRARY_PREFIX%\\share\\cmake\\xtensor\\xtensorConfig.cmake (exit 1)",
-            "if not exist %LIBRARY_PREFIX%\\share\\cmake\\xtensor\\xtensorConfigVersion.cmake (exit 1)",
-        ],
-        requires: [],
-        source_files: [],
-        files: [],
-        package_contents: None,
-    },
+    tests: [
+        Command(
+            CommandsTest {
+                script: [
+                    "if not exist %LIBRARY_PREFIX%\\include\\xtensor\\xarray.hpp (exit 1)",
+                    "if not exist %LIBRARY_PREFIX%\\share\\cmake\\xtensor\\xtensorConfig.cmake (exit 1)",
+                    "if not exist %LIBRARY_PREFIX%\\share\\cmake\\xtensor\\xtensorConfigVersion.cmake (exit 1)",
+                ],
+                requirements: CommandsTestRequirements {
+                    run: [],
+                    build: [],
+                },
+                files: CommandsTestFiles {
+                    source: [],
+                    recipe: [],
+                },
+            },
+        ),
+    ],
     about: About {
         homepage: Some(
             Url {
diff --git a/src/recipe/snapshots/rattler_build__recipe__parser__tests__it_works.snap b/src/recipe/snapshots/rattler_build__recipe__parser__tests__unix_recipe.snap
similarity index 93%
rename from src/recipe/snapshots/rattler_build__recipe__parser__tests__it_works.snap
rename to src/recipe/snapshots/rattler_build__recipe__parser__tests__unix_recipe.snap
index add76b7c..4053acff 100644
--- a/src/recipe/snapshots/rattler_build__recipe__parser__tests__it_works.snap
+++ b/src/recipe/snapshots/rattler_build__recipe__parser__tests__unix_recipe.snap
@@ -1,6 +1,6 @@
 ---
 source: src/recipe/parser.rs
-expression: recipe.unwrap()
+expression: unix_recipe.unwrap()
 ---
 Recipe {
     package: Package {
@@ -273,19 +273,26 @@ Recipe {
             from_package: {},
         },
     },
-    test: Test {
-        imports: [],
-        commands: [
-            "test -d ${PREFIX}/include/xtensor",
-            "test -f ${PREFIX}/include/xtensor/xarray.hpp",
-            "test -f ${PREFIX}/share/cmake/xtensor/xtensorConfig.cmake",
-            "test -f ${PREFIX}/share/cmake/xtensor/xtensorConfigVersion.cmake",
-        ],
-        requires: [],
-        source_files: [],
-        files: [],
-        package_contents: None,
-    },
+    tests: [
+        Command(
+            CommandsTest {
+                script: [
+                    "test -d ${PREFIX}/include/xtensor",
+                    "test -f ${PREFIX}/include/xtensor/xarray.hpp",
+                    "test -f ${PREFIX}/share/cmake/xtensor/xtensorConfig.cmake",
+                    "test -f ${PREFIX}/share/cmake/xtensor/xtensorConfigVersion.cmake",
+                ],
+                requirements: CommandsTestRequirements {
+                    run: [],
+                    build: [],
+                },
+                files: CommandsTestFiles {
+                    source: [],
+                    recipe: [],
+                },
+            },
+        ),
+    ],
     about: About {
         homepage: Some(
             Url {
diff --git a/src/snapshots/rattler_build__metadata__test__read_full_recipe.snap b/src/snapshots/rattler_build__metadata__test__read_full_recipe.snap
index fce4181b..62f5fc6f 100644
--- a/src/snapshots/rattler_build__metadata__test__read_full_recipe.snap
+++ b/src/snapshots/rattler_build__metadata__test__read_full_recipe.snap
@@ -25,13 +25,16 @@ recipe:
       - "pygments >=2.13.0,<3.0.0"
       - python ==3.10
       - "typing_extensions >=4.0.0,<5.0.0"
-  test:
-    imports:
-      - rich
-    commands:
-      - pip check
-    requires:
-      - pip
+  tests:
+    - test_type: python
+      imports:
+        - rich
+    - test_type: command
+      script:
+        - pip check
+      requirements:
+        run:
+          - pip
   about:
     homepage: "https://github.com/Textualize/rich"
     repository: "https://github.com/Textualize/rich"
@@ -52,13 +55,13 @@ build_configuration:
   hash_input: "{\"target_platform\": \"noarch\"}"
   hash_prefix: py
   directories:
-    host_prefix: /Users/wolfv/Programs/rattler-build/output/bld/rattler-build_rich_1700573439/host_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold
-    build_prefix: /Users/wolfv/Programs/rattler-build/output/bld/rattler-build_rich_1700573439/build_env
-    work_dir: /Users/wolfv/Programs/rattler-build/output/bld/rattler-build_rich_1700573439/work
-    build_dir: /Users/wolfv/Programs/rattler-build/output/bld/rattler-build_rich_1700573439
+    host_prefix: /Users/wolfv/Programs/rattler-build/output/bld/rattler-build_rich_1702492812/host_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold
+    build_prefix: /Users/wolfv/Programs/rattler-build/output/bld/rattler-build_rich_1702492812/build_env
+    work_dir: /Users/wolfv/Programs/rattler-build/output/bld/rattler-build_rich_1702492812/work
+    build_dir: /Users/wolfv/Programs/rattler-build/output/bld/rattler-build_rich_1702492812
   channels:
     - conda-forge
-  timestamp: "2023-11-21T13:30:39.246259Z"
+  timestamp: "2023-12-13T18:40:12.845800Z"
   subpackages:
     rich:
       name: rich
@@ -225,39 +228,6 @@ finalized_dependencies:
     fn: tzdata-2023c-h71feb2d_0.conda
     url: "https://conda.anaconda.org/conda-forge/noarch/tzdata-2023c-h71feb2d_0.conda"
     channel: "https://conda.anaconda.org/conda-forge/"
-  - build: hf2abe2d_0
-    build_number: 0
-    depends:
-      - libsqlite 3.44.0 h091b4b1_0
-      - "libzlib >=1.2.13,<1.3.0a0"
-      - "ncurses >=6.4,<7.0a0"
-      - "readline >=8.2,<9.0a0"
-    license: Unlicense
-    md5: 0080e3f5d7d13d3b1e244ed24642ca9e
-    name: sqlite
-    sha256: 8263043d2a5762a5bbbb4ceee28382d97e70182fff8d45371b65fedda0b709ee
-    size: 800748
-    subdir: osx-arm64
-    timestamp: 1698855055771
-    version: 3.44.0
-    fn: sqlite-3.44.0-hf2abe2d_0.conda
-    url: "https://conda.anaconda.org/conda-forge/osx-arm64/sqlite-3.44.0-hf2abe2d_0.conda"
-    channel: "https://conda.anaconda.org/conda-forge/"
-  - build: h091b4b1_0
-    build_number: 0
-    depends:
-      - "libzlib >=1.2.13,<1.3.0a0"
-    license: Unlicense
-    md5: 28eb31a5b4e704353ed575758e2fcf1d
-    name: libsqlite
-    sha256: 38e98953b572e2871f2b318fa7fe8d9997b0927970916c2d09402273b60ff832
-    size: 815079
-    subdir: osx-arm64
-    timestamp: 1698855024189
-    version: 3.44.0
-    fn: libsqlite-3.44.0-h091b4b1_0.conda
-    url: "https://conda.anaconda.org/conda-forge/osx-arm64/libsqlite-3.44.0-h091b4b1_0.conda"
-    channel: "https://conda.anaconda.org/conda-forge/"
   - build: pyhd8ed1ab_0
     build_number: 0
     depends:
@@ -275,23 +245,56 @@ finalized_dependencies:
     fn: poetry-core-1.8.1-pyhd8ed1ab_0.conda
     url: "https://conda.anaconda.org/conda-forge/noarch/poetry-core-1.8.1-pyhd8ed1ab_0.conda"
     channel: "https://conda.anaconda.org/conda-forge/"
-  - build: h0d3ecfb_0
+  - build: hf2abe2d_0
+    build_number: 0
+    depends:
+      - libsqlite 3.44.2 h091b4b1_0
+      - "libzlib >=1.2.13,<1.3.0a0"
+      - "ncurses >=6.4,<7.0a0"
+      - "readline >=8.2,<9.0a0"
+    license: Unlicense
+    md5: c98aa8eb8f02260610c5bb981027ba5d
+    name: sqlite
+    sha256: b034405d93e7153f777d52c18fe26120356c568e4ca85626712d633d939a8923
+    size: 803166
+    subdir: osx-arm64
+    timestamp: 1700863604745
+    version: 3.44.2
+    fn: sqlite-3.44.2-hf2abe2d_0.conda
+    url: "https://conda.anaconda.org/conda-forge/osx-arm64/sqlite-3.44.2-hf2abe2d_0.conda"
+    channel: "https://conda.anaconda.org/conda-forge/"
+  - build: h091b4b1_0
     build_number: 0
+    depends:
+      - "libzlib >=1.2.13,<1.3.0a0"
+    license: Unlicense
+    md5: d7e1af696cfadec251a0abdd7b79ed77
+    name: libsqlite
+    sha256: f0dc2fe69eddb4bab72ff6bb0da51d689294f466ee1b01e80ced1e7878a21aa5
+    size: 815254
+    subdir: osx-arm64
+    timestamp: 1700863572318
+    version: 3.44.2
+    fn: libsqlite-3.44.2-h091b4b1_0.conda
+    url: "https://conda.anaconda.org/conda-forge/osx-arm64/libsqlite-3.44.2-h091b4b1_0.conda"
+    channel: "https://conda.anaconda.org/conda-forge/"
+  - build: h0d3ecfb_1
+    build_number: 1
     constrains:
       - pyopenssl >=22.1
     depends:
      - ca-certificates
     license: Apache-2.0
     license_family: Apache
-    md5: 5a89552fececf4cd99628318ccbb67a3
+    md5: 47d16d26100f19ca495882882b7bc93b
     name: openssl
-    sha256: 3c715b1d4940c7ad6065935db18924b85a54048dde066f963cfc250340639457
-    size: 2147225
+    sha256: a53e1c6c058b621fd1d13cca6f9cccd534d2b3f4b4ac789fe26f7902031d6c41
+    size: 2856233
     subdir: osx-arm64
-    timestamp: 1698164947105
-    version: 3.1.4
-    fn: openssl-3.1.4-h0d3ecfb_0.conda
-    url: "https://conda.anaconda.org/conda-forge/osx-arm64/openssl-3.1.4-h0d3ecfb_0.conda"
+    timestamp: 1701162541844
+    version: 3.2.0
+    fn: openssl-3.2.0-h0d3ecfb_1.conda
+    url: "https://conda.anaconda.org/conda-forge/osx-arm64/openssl-3.2.0-h0d3ecfb_1.conda"
     channel: "https://conda.anaconda.org/conda-forge/"
   - build: hf0a4a13_0
     build_number: 0
@@ -332,16 +335,16 @@ finalized_dependencies:
       - python >=3.7
     license: MIT
     license_family: MIT
-    md5: 3fc026b9c87d091c4b34a6c997324ae8
+    md5: 1cdea58981c5cbc17b51973bcaddcea7
     name: wheel
     noarch: python
-    sha256: 84c3b57fba778add2bd47b7cc70e86f746d2c55549ffd2ccb6f3d6bf7c94d21d
-    size: 57901
+    sha256: 80be0ccc815ce22f80c141013302839b0ed938a2edb50b846cf48d8a8c1cfa01
+    size: 57553
     subdir: noarch
-    timestamp: 1698670970223
-    version: 0.41.3
-    fn: wheel-0.41.3-pyhd8ed1ab_0.conda
-    url: "https://conda.anaconda.org/conda-forge/noarch/wheel-0.41.3-pyhd8ed1ab_0.conda"
+    timestamp: 1701013309664
+    version: 0.42.0
+    fn: wheel-0.42.0-pyhd8ed1ab_0.conda
+    url: "https://conda.anaconda.org/conda-forge/noarch/wheel-0.42.0-pyhd8ed1ab_0.conda"
     channel: "https://conda.anaconda.org/conda-forge/"
   - build: pyhd8ed1ab_0
     build_number: 0
diff --git a/src/source/copy_dir.rs b/src/source/copy_dir.rs
index ea49ff14..fbf5926b 100644
--- a/src/source/copy_dir.rs
+++ b/src/source/copy_dir.rs
@@ -75,6 +75,7 @@ impl<'a> CopyDir<'a> {
         self
     }
 
+    #[allow(unused)]
     pub fn with_include_globs(mut self, includes: I) -> Self
     where
         I: IntoIterator,
@@ -89,6 +90,7 @@ impl<'a> CopyDir<'a> {
         self
     }
 
+    #[allow(unused)]
     pub fn with_exclude_globs(mut self, excludes: I) -> Self
     where
         I: IntoIterator,
@@ -146,7 +148,7 @@ impl<'a> CopyDir<'a> {
         let folders = Arc::new(folders.into_iter().map(PathBuf::from).collect::>());
 
         let mut result = CopyDirResult {
-            copied_pathes: Vec::with_capacity(0), // do not allocate as we overwrite this anyways
+            copied_paths: Vec::with_capacity(0), // do not allocate as we overwrite this anyways
             include_globs: make_glob_match_map(globs)?,
             exclude_globs: make_glob_match_map(self.exclude_globs)?,
         };
@@ -257,20 +259,20 @@ impl<'a> CopyDir<'a> {
             .filter_map(|res| res.transpose())
             .collect::, SourceError>>()?;
 
-        result.copied_pathes = copied_pathes;
+        result.copied_paths = copied_pathes;
         Ok(result)
     }
 }
 
 pub(crate) struct CopyDirResult<'a> {
-    copied_pathes: Vec,
+    copied_paths: Vec,
     include_globs: HashMap, Match>,
     exclude_globs: HashMap, Match>,
 }
 
 impl<'a> CopyDirResult<'a> {
-    pub fn copied_pathes(&self) -> &[PathBuf] {
-        &self.copied_pathes
+    pub fn copied_paths(&self) -> &[PathBuf] {
+        &self.copied_paths
     }
 
     pub fn include_globs(&self) -> &HashMap, Match> {
@@ -397,8 +399,8 @@ mod test {
             .run()
             .unwrap();
 
-        assert_eq!(copy_dir.copied_pathes().len(), 1);
-        assert_eq!(copy_dir.copied_pathes()[0], dest_dir_2.join("test.txt"));
+        assert_eq!(copy_dir.copied_paths().len(), 1);
+        assert_eq!(copy_dir.copied_paths()[0], dest_dir_2.join("test.txt"));
 
         let dest_dir_3 = tmp_dir_path.as_path().join("test_copy_dir_dest_3");
         // ignore all txt files
@@ -408,13 +410,13 @@ mod test {
             .run()
             .unwrap();
 
-        assert_eq!(copy_dir.copied_pathes().len(), 2);
+        assert_eq!(copy_dir.copied_paths().len(), 2);
         let expected = [
             dest_dir_3.join("test_dir/test.md"),
             dest_dir_3.join("test_dir/test_dir2"),
         ];
         let expected = expected.iter().collect::>();
-        let result = copy_dir.copied_pathes().iter().collect::>();
+        let result = copy_dir.copied_paths().iter().collect::>();
         assert_eq!(result, expected);
     }
 
@@ -434,7 +436,7 @@ mod test {
             .use_gitignore(false)
             .run()
             .unwrap();
-        assert_eq!(copy_dir.copied_pathes().len(), 2);
+        assert_eq!(copy_dir.copied_paths().len(), 2);
 
         fs_extra::dir::create_all(&dest_dir, true).unwrap();
         let copy_dir = super::CopyDir::new(tmp_dir.path(), dest_dir.path())
@@ -443,9 +445,9 @@ mod test {
             .use_gitignore(false)
             .run()
             .unwrap();
-        assert_eq!(copy_dir.copied_pathes().len(), 1);
+        assert_eq!(copy_dir.copied_paths().len(), 1);
         assert_eq!(
-            copy_dir.copied_pathes()[0],
+            copy_dir.copied_paths()[0],
             dest_dir.path().join("test_copy_dir/test_1.txt")
         );
 
@@ -455,9 +457,9 @@ mod test {
             .use_gitignore(false)
             .run()
             .unwrap();
-        assert_eq!(copy_dir.copied_pathes().len(), 1);
+        assert_eq!(copy_dir.copied_paths().len(), 1);
         assert_eq!(
-            copy_dir.copied_pathes()[0],
+            copy_dir.copied_paths()[0],
             dest_dir.path().join("test_copy_dir/test_1.txt")
         );
     }
@@ -493,7 +495,7 @@ mod test {
             .use_gitignore(false)
             .run()
             .unwrap();
-        assert_eq!(copy_dir.copied_pathes().len(), 3);
+        assert_eq!(copy_dir.copied_paths().len(), 3);
 
         let broken_symlink_dest = dest_dir.path().join("broken_symlink");
         assert_eq!(
diff --git a/test-data/recipes/flask/recipe.yaml b/test-data/recipes/flask/recipe.yaml
index ecc7776e..6017bbd5 100644
--- a/test-data/recipes/flask/recipe.yaml
+++ b/test-data/recipes/flask/recipe.yaml
@@ -35,15 +35,17 @@ requirements:
     - importlib-metadata >=3.6.0
     - blinker >=1.6.2
 
-test:
-  requires:
-    - pip
-  imports:
-    - flask
-    - flask.json
-  commands:
-    - flask --help
-    - pip check
+tests:
+  - python:
+      imports:
+        - flask
+        - flask.json
+  - requirements:
+      run:
+        - pip
+    script:
+      - flask --help
+      - pip check
 
 about:
   homepage: https://palletsprojects.com/p/flask
diff --git a/test-data/recipes/package-content-tests/llama-recipe.yaml b/test-data/recipes/package-content-tests/llama-recipe.yaml
index ce8bd13f..94ea7f0a 100644
--- a/test-data/recipes/package-content-tests/llama-recipe.yaml
+++ b/test-data/recipes/package-content-tests/llama-recipe.yaml
@@ -26,11 +26,11 @@ requirements:
     - if: win
       then: ninja
 
-test:
-  package_contents:
-    bins:
-      - main
-      - quantize
+tests:
+  - package_contents:
+      bin:
+        - main
+        - quantize
 
 about:
   homepage: https://github.com/ggerganov/llama.cpp
diff --git a/test-data/recipes/package-content-tests/recipe-test-fail.yaml b/test-data/recipes/package-content-tests/recipe-test-fail.yaml
index c0eb8365..1d98d9d3 100644
--- a/test-data/recipes/package-content-tests/recipe-test-fail.yaml
+++ b/test-data/recipes/package-content-tests/recipe-test-fail.yaml
@@ -10,8 +10,8 @@ build:
       else:
         - echo "Hello World" > %PREFIX%\test-execution.txt
 
-test:
-  package_contents:
-    files:
-      - "**/*.txt"
-      - "**/*.php"
+tests:
+  - package_contents:
+      files:
+        - "**/*.txt"
+        - "**/*.php"
diff --git a/test-data/recipes/package-content-tests/recipe-test-succeed.yaml b/test-data/recipes/package-content-tests/recipe-test-succeed.yaml
index 23571904..c20a958c 100644
--- a/test-data/recipes/package-content-tests/recipe-test-succeed.yaml
+++ b/test-data/recipes/package-content-tests/recipe-test-succeed.yaml
@@ -12,9 +12,9 @@ build:
         - echo "Hello World" > %PREFIX%\test-execution.txt
         - mkdir %PREFIX%\Library\bin && echo "Hello World" > %PREFIX%\Library\bin\rust.exe
 
-test:
-  package_contents:
-    files:
-      - "**/*.txt"
-    bins:
-      - rust
+tests:
+  - package_contents:
+      files:
+        - "**/*.txt"
+      bin:
+        - rust
diff --git a/test-data/recipes/package-content-tests/rich-recipe.yaml b/test-data/recipes/package-content-tests/rich-recipe.yaml
index 7eb10de2..e7641d60 100644
--- a/test-data/recipes/package-content-tests/rich-recipe.yaml
+++ b/test-data/recipes/package-content-tests/rich-recipe.yaml
@@ -29,16 +29,18 @@ requirements:
     - python 3.10
     - typing_extensions >=4.0.0,<5.0.0
 
-test:
-  imports:
-    - rich
-  commands:
+tests:
+  - python:
+      imports:
+        - rich
+  - script:
     - pip check
-  requires:
-    - pip
-  package_contents:
-    site_packages:
-      - rich
+    requirements:
+      run:
+        - pip
+  - package_contents:
+      site_packages:
+        - rich
 
 about:
   homepage: https://github.com/Textualize/rich
diff --git a/test-data/recipes/test-execution/recipe-test-fail.yaml b/test-data/recipes/test-execution/recipe-test-fail.yaml
index 2ef4005a..46a1928d 100644
--- a/test-data/recipes/test-execution/recipe-test-fail.yaml
+++ b/test-data/recipes/test-execution/recipe-test-fail.yaml
@@ -10,8 +10,8 @@ build:
       else:
         - echo "Hello World" > %PREFIX%\test-execution.txt
 
-test:
-  commands:
+tests:
+  - script:
     - if: unix
       then:
         - test -f $PREFIX/fail.txt
diff --git a/test-data/recipes/test-execution/recipe-test-succeed.yaml b/test-data/recipes/test-execution/recipe-test-succeed.yaml
index b791b83c..99b0523a 100644
--- a/test-data/recipes/test-execution/recipe-test-succeed.yaml
+++ b/test-data/recipes/test-execution/recipe-test-succeed.yaml
@@ -10,8 +10,8 @@ build:
       else:
         - echo "Hello World" > %PREFIX%\test-execution.txt
 
-test:
-  commands:
+tests:
+  - script:
     - if: unix
       then:
         - test -f $PREFIX/test-execution.txt
diff --git a/test-data/recipes/test-sources/recipe.yaml b/test-data/recipes/test-sources/recipe.yaml
index 55e74663..de40ef70 100644
--- a/test-data/recipes/test-sources/recipe.yaml
+++ b/test-data/recipes/test-sources/recipe.yaml
@@ -25,16 +25,17 @@ build:
     - test -f ./am-i-renamed.txt
     - test -f ./test.avi
 
-test:
-  source_files:
-    - test.avi
-  files:
-    - test-file.txt
-    - test-folder/
-  commands:
-    - echo "test"
-    - test -f ./test.avi
-    - test -f ./test-file.txt
-    - test -d ./test-folder
-    - test -f ./test-folder/test-file-2.txt
-    - test -f ./test-folder/test-file-3.txt
+tests:
+  - script:
+      - echo "test"
+      - test -f ./test.avi
+      - test -f ./test-file.txt
+      - test -d ./test-folder
+      - test -f ./test-folder/test-file-2.txt
+      - test -f ./test-folder/test-file-3.txt
+    files:
+      source:
+        - test.avi
+      recipe:
+        - test-file.txt
+        - test-folder/
diff --git a/test-data/rendered_recipes/rich_recipe.yaml b/test-data/rendered_recipes/rich_recipe.yaml
index 57cec86c..b6e2cf19 100644
--- a/test-data/rendered_recipes/rich_recipe.yaml
+++ b/test-data/rendered_recipes/rich_recipe.yaml
@@ -21,13 +21,16 @@ recipe:
       - pygments >=2.13.0,<3.0.0
       - python ==3.10
       - typing_extensions >=4.0.0,<5.0.0
-  test:
+  tests:
+    - test_type: python
     imports:
       - rich
-    commands:
+    - test_type: command
+      script:
       - pip check
-    requires:
-      - pip
+      requirements:
+        run:
+          - pip
   about:
     homepage: https://github.com/Textualize/rich
     repository: https://github.com/Textualize/rich
@@ -53,13 +56,13 @@ build_configuration:
   hash_input: '{"target_platform": "noarch"}'
   hash_prefix: py
   directories:
-    host_prefix: /Users/wolfv/Programs/rattler-build/output/bld/rattler-build_rich_1700573439/host_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold
-    build_prefix: /Users/wolfv/Programs/rattler-build/output/bld/rattler-build_rich_1700573439/build_env
-    work_dir: /Users/wolfv/Programs/rattler-build/output/bld/rattler-build_rich_1700573439/work
-    build_dir: /Users/wolfv/Programs/rattler-build/output/bld/rattler-build_rich_1700573439
+    host_prefix: /Users/wolfv/Programs/rattler-build/output/bld/rattler-build_rich_1702492812/host_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold
+    build_prefix: /Users/wolfv/Programs/rattler-build/output/bld/rattler-build_rich_1702492812/build_env
+    work_dir: /Users/wolfv/Programs/rattler-build/output/bld/rattler-build_rich_1702492812/work
+    build_dir: /Users/wolfv/Programs/rattler-build/output/bld/rattler-build_rich_1702492812
   channels:
     - conda-forge
-  timestamp: 2023-11-21T13:30:39.246259Z
+  timestamp: 2023-12-13T18:40:12.845800Z
   subpackages:
     rich:
       name: rich
@@ -226,39 +229,6 @@ finalized_dependencies:
     fn: tzdata-2023c-h71feb2d_0.conda
     url: https://conda.anaconda.org/conda-forge/noarch/tzdata-2023c-h71feb2d_0.conda
     channel: https://conda.anaconda.org/conda-forge/
-  - build: hf2abe2d_0
-    build_number: 0
-    depends:
-      - libsqlite 3.44.0 h091b4b1_0
-      - libzlib >=1.2.13,<1.3.0a0
-      - ncurses >=6.4,<7.0a0
-      - readline >=8.2,<9.0a0
-    license: Unlicense
-    md5: 0080e3f5d7d13d3b1e244ed24642ca9e
-    name: sqlite
-    sha256: 8263043d2a5762a5bbbb4ceee28382d97e70182fff8d45371b65fedda0b709ee
-    size: 800748
-    subdir: osx-arm64
-    timestamp: 1698855055771
-    version: 3.44.0
-    fn: sqlite-3.44.0-hf2abe2d_0.conda
-    url: https://conda.anaconda.org/conda-forge/osx-arm64/sqlite-3.44.0-hf2abe2d_0.conda
-    channel: https://conda.anaconda.org/conda-forge/
-  - build: h091b4b1_0
-    build_number: 0
-    depends:
-      - libzlib >=1.2.13,<1.3.0a0
-    license: Unlicense
-    md5: 28eb31a5b4e704353ed575758e2fcf1d
-    name: libsqlite
-    sha256: 38e98953b572e2871f2b318fa7fe8d9997b0927970916c2d09402273b60ff832
-    size: 815079
-    subdir: osx-arm64
-    timestamp: 1698855024189
-    version: 3.44.0
-    fn: libsqlite-3.44.0-h091b4b1_0.conda
-    url: https://conda.anaconda.org/conda-forge/osx-arm64/libsqlite-3.44.0-h091b4b1_0.conda
-    channel: https://conda.anaconda.org/conda-forge/
   - build: pyhd8ed1ab_0
     build_number: 0
     depends:
@@ -276,23 +246,56 @@ finalized_dependencies:
     fn: poetry-core-1.8.1-pyhd8ed1ab_0.conda
     url: https://conda.anaconda.org/conda-forge/noarch/poetry-core-1.8.1-pyhd8ed1ab_0.conda
     channel: https://conda.anaconda.org/conda-forge/
-  - build: h0d3ecfb_0
+  - build: hf2abe2d_0
+    build_number: 0
+    depends:
+      - libsqlite 3.44.2 h091b4b1_0
+      - libzlib >=1.2.13,<1.3.0a0
+      - ncurses >=6.4,<7.0a0
+      - readline >=8.2,<9.0a0
+    license: Unlicense
+    md5: c98aa8eb8f02260610c5bb981027ba5d
+    name: sqlite
+    sha256: b034405d93e7153f777d52c18fe26120356c568e4ca85626712d633d939a8923
+    size: 803166
+    subdir: osx-arm64
+    timestamp: 1700863604745
+    version: 3.44.2
+    fn: sqlite-3.44.2-hf2abe2d_0.conda
+    url: https://conda.anaconda.org/conda-forge/osx-arm64/sqlite-3.44.2-hf2abe2d_0.conda
+    channel: https://conda.anaconda.org/conda-forge/
+  - build: h091b4b1_0
     build_number: 0
+    depends:
+      - libzlib >=1.2.13,<1.3.0a0
+    license: Unlicense
+    md5: d7e1af696cfadec251a0abdd7b79ed77
+    name: libsqlite
+    sha256: f0dc2fe69eddb4bab72ff6bb0da51d689294f466ee1b01e80ced1e7878a21aa5
+    size: 815254
+    subdir: osx-arm64
+    timestamp: 1700863572318
+    version: 3.44.2
+    fn: libsqlite-3.44.2-h091b4b1_0.conda
+    url: https://conda.anaconda.org/conda-forge/osx-arm64/libsqlite-3.44.2-h091b4b1_0.conda
+    channel: https://conda.anaconda.org/conda-forge/
+  - build: h0d3ecfb_1
+    build_number: 1
     constrains:
       - pyopenssl >=22.1
     depends:
       - ca-certificates
     license: Apache-2.0
     license_family: Apache
-    md5: 5a89552fececf4cd99628318ccbb67a3
+    md5: 47d16d26100f19ca495882882b7bc93b
     name: openssl
-    sha256: 3c715b1d4940c7ad6065935db18924b85a54048dde066f963cfc250340639457
-    size: 2147225
+    sha256: a53e1c6c058b621fd1d13cca6f9cccd534d2b3f4b4ac789fe26f7902031d6c41
+    size: 2856233
     subdir: osx-arm64
-    timestamp: 1698164947105
-    version: 3.1.4
-    fn: openssl-3.1.4-h0d3ecfb_0.conda
-    url: https://conda.anaconda.org/conda-forge/osx-arm64/openssl-3.1.4-h0d3ecfb_0.conda
+    timestamp: 1701162541844
+    version: 3.2.0
+    fn: openssl-3.2.0-h0d3ecfb_1.conda
+    url: https://conda.anaconda.org/conda-forge/osx-arm64/openssl-3.2.0-h0d3ecfb_1.conda
     channel: https://conda.anaconda.org/conda-forge/
   - build: hf0a4a13_0
     build_number: 0
@@ -333,16 +336,16 @@ finalized_dependencies:
       - python >=3.7
     license: MIT
     license_family: MIT
-    md5: 3fc026b9c87d091c4b34a6c997324ae8
+    md5: 1cdea58981c5cbc17b51973bcaddcea7
     name: wheel
     noarch: python
-    sha256: 84c3b57fba778add2bd47b7cc70e86f746d2c55549ffd2ccb6f3d6bf7c94d21d
-    size: 57901
+    sha256: 80be0ccc815ce22f80c141013302839b0ed938a2edb50b846cf48d8a8c1cfa01
+    size: 57553
     subdir: noarch
-    timestamp: 1698670970223
-    version: 0.41.3
-    fn: wheel-0.41.3-pyhd8ed1ab_0.conda
-    url: https://conda.anaconda.org/conda-forge/noarch/wheel-0.41.3-pyhd8ed1ab_0.conda
+    timestamp: 1701013309664
+    version: 0.42.0
+    fn: wheel-0.42.0-pyhd8ed1ab_0.conda
+    url: https://conda.anaconda.org/conda-forge/noarch/wheel-0.42.0-pyhd8ed1ab_0.conda
     channel: https://conda.anaconda.org/conda-forge/
   - build: pyhd8ed1ab_0
     build_number: 0