Updated with recent cookiecutter updates #107
Merged · 1 commit · Aug 8, 2018

@@ -28,8 +28,8 @@ install:
- cd ${TRAVIS_BUILD_DIR}/tests

env:
- NXF_VER=0.30.0
- NXF_VER=''
- NXF_VER=0.30.0 # Specify a minimum NF version that should be tested and work
- NXF_VER='' # Plus: get the latest NF version and check that it works

script:
# Lint the pipeline code
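
Travis runs the test suite once per `env` entry: once pinned to the minimum supported Nextflow release and once against the latest. The pinned run can be reproduced locally, since the Nextflow launcher honours the `NXF_VER` environment variable (a sketch; the local pipeline path reuses the download location from the install docs below):

```bash
# Pin the Nextflow version for this shell; the launcher fetches and runs it
export NXF_VER=0.30.0
nextflow run /my-pipelines/{{ cookiecutter.pipeline_slug }}-master -profile test
```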

This file was deleted.

@@ -0,0 +1,13 @@
/*
* -------------------------------------------------
* Nextflow config file for AWS Batch
* -------------------------------------------------
* Imported under the 'awsbatch' Nextflow profile in nextflow.config
* Uses docker for software dependencies automagically, so not specified here.
*/

aws.region = params.awsregion
process.executor = 'awsbatch'
process.queue = params.awsqueue
executor.awscli = '/home/ec2-user/miniconda/bin/aws'
params.tracedir = './'
@@ -26,7 +26,6 @@ process {
errorStrategy = { task.exitStatus in [143,137] ? 'retry' : 'ignore' }
}
$multiqc {
executor = 'local'
errorStrategy = { task.exitStatus in [143,137] ? 'retry' : 'ignore' }
}
}
@@ -41,3 +41,9 @@ unzip master.zip -d /my-pipelines/
cd /my_data/
nextflow run /my-pipelines/{{ cookiecutter.pipeline_slug }}-master
```

To stop Nextflow from looking for updates online, you can tell it to run in offline mode by setting the following environment variable in your `~/.bashrc` file:

```bash
export NXF_OFFLINE='TRUE'
```
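
If you'd rather not edit `~/.bashrc`, the same variable can be set for a single run by prefixing the command (using the download path from the install instructions above):

```bash
NXF_OFFLINE='TRUE' nextflow run /my-pipelines/{{ cookiecutter.pipeline_slug }}-master
```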
Expand Up @@ -49,9 +49,8 @@ Use this parameter to choose a configuration profile. Each profile is designed f
* `docker`
* A generic configuration profile to be used with [Docker](http://docker.com/)
* Runs using the `local` executor and pulls software from dockerhub: [`{{ cookiecutter.pipeline_slug }}`](http://hub.docker.com/r/{{ cookiecutter.pipeline_slug }}/)
* `aws`
* A starter configuration for running the pipeline on Amazon Web Services. Uses docker and Spark.
* See [`docs/configuration/aws.md`](configuration/aws.md)
* `awsbatch`
* A generic configuration profile to be used with AWS Batch.
* `standard`
* The default profile, used if `-profile` is not specified at all. Runs locally and expects all software to be installed and available on the `PATH`.
* This profile is mainly designed to be used as a starting point for other configurations and is inherited by most of the other profiles.
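
For example, to select the Docker profile for a run, or to combine several profiles comma separated as the updated help text describes (the pipeline path is a placeholder):

```bash
# Single profile
nextflow run /my-pipelines/{{ cookiecutter.pipeline_slug }}-master -profile docker

# Multiple profiles, comma separated
nextflow run /my-pipelines/{{ cookiecutter.pipeline_slug }}-master -profile docker,test
```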
@@ -132,6 +131,15 @@ Each step in the pipeline has a default set of requirements for number of CPUs,
### Custom resource requests
Wherever process-specific requirements are set in the pipeline, the default value can be changed by creating a custom config file. See the files in [`conf`](../conf) for examples.
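
For instance, a custom config can override the resources of a single process using the same `$process` selector syntax as `conf/base.config`, then be passed in with `-c` (a sketch; the process name and values are hypothetical):

```bash
# Write a custom config overriding one process, then launch with -c
cat > custom.config <<'EOF'
process {
  $multiqc {
    cpus = 4
    memory = 16.GB
    time = 8.h
  }
}
EOF
nextflow run /my-pipelines/{{ cookiecutter.pipeline_slug }}-master -c custom.config
```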

## AWS Batch specific parameters
Running the pipeline on AWS Batch requires a couple of specific parameters to be set according to your AWS Batch configuration. Please run with `-profile awsbatch` and then specify all of the following parameters.
### `--awsqueue`
The JobQueue that you intend to use on AWS Batch.
### `--awsregion`
The AWS region to run your job in. Default is set to `eu-west-1` but can be adjusted to your needs.

Please make sure to also set the `-w`/`-work-dir` and `--outdir` parameters to an S3 storage bucket of your choice; you'll get an error message if you don't.
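
Putting it together, a launch on AWS Batch might look like this (queue and bucket names are placeholders):

```bash
nextflow run /my-pipelines/{{ cookiecutter.pipeline_slug }}-master -profile awsbatch \
    --awsqueue my-job-queue \
    --awsregion eu-west-1 \
    -w s3://my-bucket/work \
    --outdir s3://my-bucket/results
```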

## Other command line parameters
### `--outdir`
The output directory where the results will be saved.
nf_core/pipeline-template/{{cookiecutter.pipeline_slug}}/main.nf (13 changes: 12 additions & 1 deletion)
@@ -24,7 +24,8 @@ def helpMessage() {
Mandatory arguments:
--reads Path to input data (must be surrounded with quotes)
--genome Name of iGenomes reference
-profile Hardware config to use. docker / aws
-profile Configuration profile to use. Can use multiple (comma separated)
Available: standard, conda, docker, singularity, awsbatch, test

Options:
--singleEnd Specifies that the input is single end reads
@@ -79,6 +80,12 @@ if( !(workflow.runName ==~ /[a-z]+_[a-z]+/) ){
custom_runName = workflow.runName
}

// Check workDir/outdir paths to be S3 buckets if running on AWSBatch
// related: https://github.com/nextflow-io/nextflow/issues/813
if( workflow.profile == 'awsbatch') {
if(!workflow.workDir.startsWith('s3:') || !params.outdir.startsWith('s3:')) exit 1, "Workdir or Outdir not on S3 - specify S3 Buckets for each to run on AWSBatch!"
}

/*
* Create a channel for input read files
*/
@@ -135,6 +142,10 @@ summary['Working dir'] = workflow.workDir
summary['Output dir'] = params.outdir
summary['Script dir'] = workflow.projectDir
summary['Config Profile'] = workflow.profile
if(workflow.profile == 'awsbatch'){
summary['AWS Region'] = params.awsregion
summary['AWS Queue'] = params.awsqueue
}
if(params.email) summary['E-mail Address'] = params.email
log.info summary.collect { k,v -> "${k.padRight(15)}: $v" }.join("\n")
log.info "========================================="
@@ -20,7 +20,10 @@ params {
singleEnd = false
outdir = './results'
igenomes_base = "./iGenomes"
tracedir = "${params.outdir}/pipeline_info"
clusterOptions = false
awsqueue = false
awsregion = 'eu-west-1'
}

profiles {
@@ -29,11 +32,17 @@ profiles {
includeConfig 'conf/base.config'
}
conda { process.conda = "$baseDir/environment.yml" }
docker { docker.enabled = true }
singularity { singularity.enabled = true }
aws {
docker {
docker.enabled = true
process.container = 'nfcore/{{ cookiecutter.pipeline_slug }}:{{ cookiecutter.version }}'
}
singularity {
singularity.enabled = true
process.container = 'shub://nf-core/{{ cookiecutter.pipeline_slug }}:{{ cookiecutter.version }}'
}
awsbatch {
includeConfig 'conf/base.config'
includeConfig 'conf/aws.config'
includeConfig 'conf/awsbatch.config'
includeConfig 'conf/igenomes.config'
}
test {
Expand All @@ -51,19 +60,19 @@ process.shell = ['/bin/bash', '-euo', 'pipefail']

timeline {
enabled = true
file = "${params.outdir}/pipeline_info/{{ cookiecutter.pipeline_name.replace(' ', '-') }}_timeline.html"
file = "${params.tracedir}/pipeline_info/{{ cookiecutter.pipeline_name.replace(' ', '-') }}_timeline.html"
}
report {
enabled = true
file = "${params.outdir}/pipeline_info/{{ cookiecutter.pipeline_name.replace(' ', '-') }}_report.html"
file = "${params.tracedir}/pipeline_info/{{ cookiecutter.pipeline_name.replace(' ', '-') }}_report.html"
}
trace {
enabled = true
file = "${params.outdir}/pipeline_info/{{ cookiecutter.pipeline_name.replace(' ', '-') }}_trace.txt"
file = "${params.tracedir}/pipeline_info/{{ cookiecutter.pipeline_name.replace(' ', '-') }}_trace.txt"
}
dag {
enabled = true
file = "${params.outdir}/pipeline_info/{{ cookiecutter.pipeline_name.replace(' ', '-') }}_dag.svg"
file = "${params.tracedir}/pipeline_info/{{ cookiecutter.pipeline_name.replace(' ', '-') }}_dag.svg"
}

manifest {
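
Because the reporting paths are now rooted at `params.tracedir`, they can be redirected from the command line like any other parameter, which is what lets the new `awsbatch` profile point them at the head job's local directory via `params.tracedir = './'`. A sketch (the path is a placeholder):

```bash
nextflow run /my-pipelines/{{ cookiecutter.pipeline_slug }}-master --tracedir ./pipeline-reports
```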