Merge upstream changes
ewels committed Jul 29, 2020
2 parents 50f54c7 + 24aa954 commit a0a12e8
Showing 31 changed files with 757 additions and 598 deletions.
19 changes: 16 additions & 3 deletions .github/CONTRIBUTING.md
@@ -18,15 +18,29 @@ is as follows:

If you're not used to this workflow with git, you can start with some [basic docs from GitHub](https://help.github.com/articles/fork-a-repo/).

## Installing dev requirements

If you want to work on the nf-core/tools code, you'll need a couple of extra Python packages.
These are listed in `requirements-dev.txt` and can be installed as follows:

```bash
pip install --upgrade -r requirements-dev.txt
```

Then install your local fork of nf-core/tools:

```bash
pip install -e .
```

## Code formatting with Black

All Python code in nf-core/tools must be passed through the [Black Python code formatter](https://black.readthedocs.io/en/stable/).
This ensures a harmonised code formatting style throughout the package, from all contributors.

-You can run Black on the command line - first install using `pip` and then run recursively on the whole repository:
+You can run Black on the command line (it's included in `requirements-dev.txt`) - eg. to run recursively on the whole repository:

```bash
-pip install black
black .
```
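For a sense of the style Black enforces — double quotes, normalised whitespace, two blank lines around top-level definitions — here is a hypothetical before/after sketch (illustrative only; the formatter itself is authoritative):

```python
# Hypothetical contributor code, before formatting:
#   coords = { 'x':1,'y': 2 }
#   def dist(a,b): return ((a['x']-b['x'])**2+(a['y']-b['y'])**2)**0.5

# The same code as Black would lay it out:
coords = {"x": 1, "y": 2}


def dist(a, b):
    return ((a["x"] - b["x"]) ** 2 + (a["y"] - b["y"]) ** 2) ** 0.5


print(dist(coords, {"x": 4, "y": 6}))  # a 3-4-5 triangle, prints 5.0
```

Running `black .` rewrites files in place, so the diff above is what you would see in `git diff` after formatting.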

@@ -61,7 +75,6 @@ New features should also come with new tests, to keep the test-coverage high (we
You can try running the tests locally before pushing code using the following command:

```bash
-pip install --upgrade pip pytest pytest-datafiles pytest-cov mock
pytest --color=yes tests/
```

4 changes: 2 additions & 2 deletions .github/workflows/pytest.yml
@@ -28,8 +28,8 @@ jobs:

- name: Install python dependencies
run: |
-python -m pip install --upgrade pip pytest pytest-datafiles pytest-cov mock jsonschema
-pip install .
+python -m pip install --upgrade pip -r requirements-dev.txt
+pip install -e .
- name: Install Nextflow
run: |
4 changes: 2 additions & 2 deletions .github/workflows/tools-api-docs.yml
@@ -4,8 +4,8 @@ on:
branches: [master, dev]

jobs:
-build-n-publish:
-name: Build and publish nf-core to PyPI
+api-docs:
+name: Build & push Sphinx API docs
runs-on: ubuntu-18.04

steps:
2 changes: 2 additions & 0 deletions .gitignore
@@ -41,6 +41,7 @@ pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
+testing*
htmlcov/
.tox/
.coverage
@@ -107,3 +108,4 @@ ENV/

# backup files
*~
+*\?
3 changes: 2 additions & 1 deletion CHANGELOG.md
@@ -24,7 +24,7 @@ To support these new schema files, nf-core/tools now comes with a new set of com
* Pipeline schema can be generated or updated using `nf-core schema build` - this takes the parameters from
the pipeline config file and prompts the developer for any mismatch between schema and pipeline.
* Once a skeleton Schema file has been built, the command makes use of a new nf-core website tool to provide
-a user friendly graphical interface for developers to add content to their schema: [https://nf-co.re/json_schema_build](https://nf-co.re/json_schema_build)
+a user friendly graphical interface for developers to add content to their schema: [https://nf-co.re/pipeline_schema_builder](https://nf-co.re/pipeline_schema_builder)
* Pipelines will be automatically tested for valid schema that describe all pipeline parameters using the
`nf-core schema lint` command (also included as part of the main `nf-core lint` command).
* Users can validate their set of pipeline inputs using the `nf-core schema validate` command.
@@ -79,6 +79,7 @@ making a pull-request. See [`.github/CONTRIBUTING.md`](.github/CONTRIBUTING.md)
* Fail if `params.input` isn't defined.
* Beautiful new progress bar to look at whilst linting is running and awesome new formatted output on the command line :heart_eyes:
* All made using the excellent [`rich` python library](https://github.com/willmcgugan/rich) - check it out!
+* Tests looking for `TODO` strings should now ignore editor backup files. [#477](https://github.com/nf-core/tools/issues/477)

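The `nf-core schema validate` behaviour described above boils down to checking a params file against the pipeline schema. A minimal stdlib sketch of the idea (the real command performs full JSON Schema Draft 7 validation; `validate_params` and `TYPE_MAP` are illustrative names, not the tools API):

```python
import json

# Map JSON Schema type names to the Python types they accept (subset only).
TYPE_MAP = {"string": str, "integer": int, "number": (int, float), "boolean": bool}

def validate_params(params, schema):
    """Check declared property types - a tiny subset of JSON Schema validation."""
    errors = []
    for name, spec in schema.get("properties", {}).items():
        if name in params and "type" in spec:
            if not isinstance(params[name], TYPE_MAP[spec["type"]]):
                errors.append(f"Param '{name}' should be {spec['type']}")
    return errors

schema = json.loads('{"properties": {"input": {"type": "string"}}}')
print(validate_params({"input": 42}, schema))  # ["Param 'input' should be string"]
```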
### nf-core/tools Continuous Integration

342 changes: 151 additions & 191 deletions README.md

Large diffs are not rendered by default.

38 changes: 36 additions & 2 deletions docs/lint_errors.md
@@ -347,8 +347,42 @@ Finding a placeholder like this means that something was probably copied and pas

Pipelines should have a `nextflow_schema.json` file that describes the different pipeline parameters (eg. `params.something`, `--something`).

-Schema should be valid JSON files and adhere to [JSONSchema](https://json-schema.org/), Draft 7.
-The top-level schema should be an `object`, where each of the `properties` corresponds to a pipeline parameter.
* Schema should be valid JSON files
* Schema should adhere to [JSONSchema](https://json-schema.org/), Draft 7.
* Parameters can be described in two places:
* As `properties` in the top-level schema object
* As `properties` within subschemas listed in a top-level `definitions` object
* The schema must describe at least one parameter
* There must be no duplicate parameter IDs across the schema and definition subschemas
* All subschemas in `definitions` must be referenced in the top-level `allOf` key
* The top-level `allOf` key must not describe any non-existent definitions
* Core top-level schema attributes should exist and be set as follows:
* `$schema`: `https://json-schema.org/draft-07/schema`
* `$id`: URL to the raw schema file, eg. `https://raw.githubusercontent.com/YOURPIPELINE/master/nextflow_schema.json`
* `title`: `YOURPIPELINE pipeline parameters`
* `description`: The pipeline config `manifest.description`

For example, an _extremely_ minimal schema could look like this:

```json
{
"$schema": "https://json-schema.org/draft-07/schema",
"$id": "https://raw.githubusercontent.com/YOURPIPELINE/master/nextflow_schema.json",
"title": "YOURPIPELINE pipeline parameters",
"description": "This pipeline is for testing",
"properties": {
"first_param": { "type": "string" }
},
"definitions": {
"my_first_group": {
"properties": {
"second_param": { "type": "string" }
}
}
},
"allOf": [{"$ref": "#/definitions/my_first_group"}]
}
```
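Several of the rules above (at least one parameter, no duplicate IDs, `allOf` and `definitions` in sync) are mechanical to check. A stdlib sketch of that logic — not the actual `nf-core schema lint` implementation:

```python
def check_schema(schema):
    """Check a parsed schema dict against the example lint rules."""
    errors = []
    # Collect parameter IDs from top-level properties and all definition subschemas.
    seen = list(schema.get("properties", {}))
    for subschema in schema.get("definitions", {}).values():
        for param in subschema.get("properties", {}):
            if param in seen:
                errors.append(f"Duplicate parameter: {param}")
            seen.append(param)
    if not seen:
        errors.append("No parameters found")
    # Every definition must be referenced in allOf, and vice versa.
    referenced = {ref["$ref"].split("/")[-1] for ref in schema.get("allOf", [])}
    defined = set(schema.get("definitions", {}))
    for missing in sorted(defined - referenced):
        errors.append(f"Definition not referenced in allOf: {missing}")
    for extra in sorted(referenced - defined):
        errors.append(f"allOf references a non-existent definition: {extra}")
    return errors

minimal = {
    "properties": {"first_param": {"type": "string"}},
    "definitions": {"my_first_group": {"properties": {"second_param": {"type": "string"}}}},
    "allOf": [{"$ref": "#/definitions/my_first_group"}],
}
print(check_schema(minimal))  # []
```

The minimal schema from the example above passes cleanly; adding `second_param` to the top-level `properties` as well would trigger the duplicate-ID error.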

## Error #15 - Schema config check ## {#15}

24 changes: 15 additions & 9 deletions nf_core/__main__.py
@@ -107,8 +107,8 @@ def nf_core_cli(verbose):
logging.basicConfig(
level=logging.DEBUG if verbose else logging.INFO,
format="%(message)s",
-datefmt=".",
-handlers=[rich.logging.RichHandler(console=stderr)],
+datefmt=" ",
+handlers=[rich.logging.RichHandler(console=stderr, markup=True)],
)


@@ -406,21 +406,20 @@ def schema():
Suite of tools for developers to manage pipeline schema.
All nf-core pipelines should have a nextflow_schema.json file in their
-root directory. This is a JSON Schema that describes the different
-pipeline parameters.
+root directory that describes the different pipeline parameters.
"""
pass


@schema.command(help_priority=1)
@click.argument("pipeline", required=True, metavar="<pipeline name>")
-@click.option("--params", type=click.Path(exists=True), required=True, help="JSON parameter file")
+@click.argument("params", type=click.Path(exists=True), required=True, metavar="<JSON params file>")
def validate(pipeline, params):
"""
Validate a set of parameters against a pipeline schema.
Nextflow can be run using the -params-file flag, which loads
-script parameters from a JSON/YAML file.
+script parameters from a JSON file.
This command takes such a file and validates it against the pipeline
schema, checking whether all schema rules are satisfied.
@@ -447,7 +446,7 @@ def validate(pipeline, params):
@click.option(
"--url",
type=str,
-default="https://nf-co.re/json_schema_build",
+default="https://nf-co.re/pipeline_schema_builder",
help="Customise the builder URL (for development work)",
)
def build(pipeline_dir, no_prompts, web_only, url):
@@ -468,7 +467,7 @@ def build(pipeline_dir, no_prompts, web_only, url):


@schema.command(help_priority=3)
-@click.argument("schema_path", type=click.Path(exists=True), required=True, metavar="<JSON Schema file>")
+@click.argument("schema_path", type=click.Path(exists=True), required=True, metavar="<pipeline schema>")
def lint(schema_path):
"""
Check that a given pipeline schema is valid.
@@ -509,7 +508,14 @@ def bump_version(pipeline_dir, new_version, nextflow):

# First, lint the pipeline to check everything is in order
log.info("Running nf-core lint tests")
-lint_obj = nf_core.lint.run_linting(pipeline_dir, False)
+
+# Run the lint tests
+try:
+lint_obj = nf_core.lint.PipelineLint(pipeline_dir)
+lint_obj.lint_pipeline()
+except AssertionError as e:
+log.error("Please fix lint errors before bumping versions")
+return
if len(lint_obj.failed) > 0:
log.error("Please fix lint errors before bumping versions")
return
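The `bump_version` change above applies a lint-then-act guard: run the lint tests first and abort on any failure. Sketched standalone with a hypothetical `lint` stand-in (the real code calls `nf_core.lint.PipelineLint`; these names are illustrative):

```python
class LintResults:
    """Stand-in for the object returned by the lint step (hypothetical)."""
    def __init__(self, failed=None):
        self.failed = failed or []

def bump_version(pipeline_dir, new_version, lint=lambda d: LintResults()):
    """Only bump the version if linting raised nothing and recorded no failures."""
    try:
        lint_obj = lint(pipeline_dir)
    except AssertionError:
        print("Please fix lint errors before bumping versions")
        return False
    if len(lint_obj.failed) > 0:
        print("Please fix lint errors before bumping versions")
        return False
    print(f"Bumping version in {pipeline_dir} to {new_version}")
    return True

bump_version(".", "1.1")  # lint passes, so the bump proceeds
```

Catching `AssertionError` mirrors the diff above, where individual lint tests signal failure by raising assertions.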
3 changes: 1 addition & 2 deletions nf_core/bump_version.py
@@ -160,8 +160,7 @@ def update_file_version(filename, lint_obj, pattern, newstr, allow_multiple=Fals
log.info(
"Updating version in {}\n".format(filename)
+ "[red] - {}\n".format("\n - ".join(matches_pattern).strip())
+ "[green] + {}\n".format("\n + ".join(matches_newstr).strip()),
extra={"markup": True},
+ "[green] + {}\n".format("\n + ".join(matches_newstr).strip())
)

with open(fn, "w") as fh:
6 changes: 2 additions & 4 deletions nf_core/create.py
@@ -63,8 +63,7 @@ def init_pipeline(self):
"[green bold]!!!!!! IMPORTANT !!!!!!\n\n"
+ "[green not bold]If you are interested in adding your pipeline to the nf-core community,\n"
+ "PLEASE COME AND TALK TO US IN THE NF-CORE SLACK BEFORE WRITING ANY CODE!\n\n"
+ "[default]Please read: [link=https://nf-co.re/developers/adding_pipelines#join-the-community]https://nf-co.re/developers/adding_pipelines#join-the-community[/link]",
extra={"markup": True},
+ "[default]Please read: [link=https://nf-co.re/developers/adding_pipelines#join-the-community]https://nf-co.re/developers/adding_pipelines#join-the-community[/link]"
)

def run_cookiecutter(self):
@@ -149,7 +148,6 @@ def git_init_pipeline(self):
"Done. Remember to add a remote and push to GitHub:\n"
+ "[white on grey23] cd {} \n".format(self.outdir)
+ " git remote add origin git@github.com:USERNAME/REPO_NAME.git \n"
+ " git push --all origin ",
extra={"markup": True},
+ " git push --all origin "
)
log.info("This will also push your newly created dev branch and the TEMPLATE branch for syncing.")