Commit
Merge pull request #130 from nf-core/dev
Release 1.1.1
skrakau authored Nov 10, 2020
2 parents 4fed04d + ef3bd0b commit a3122b1
Showing 24 changed files with 2,318 additions and 150 deletions.
2 changes: 1 addition & 1 deletion .github/CONTRIBUTING.md
@@ -54,4 +54,4 @@ These tests are run both with the latest available version of `Nextflow` and als

## Getting help

For further information/help, please consult the [nf-core/mag documentation](https://nf-co.re/mag/docs) and don't hesitate to get in touch on the nf-core Slack [#mag](https://nfcore.slack.com/channels/mag) channel ([join our Slack here](https://nf-co.re/join/slack)).
For further information/help, please consult the [nf-core/mag documentation](https://nf-co.re/mag/usage) and don't hesitate to get in touch on the nf-core Slack [#mag](https://nfcore.slack.com/channels/mag) channel ([join our Slack here](https://nf-co.re/join/slack)).
2 changes: 1 addition & 1 deletion .github/ISSUE_TEMPLATE/bug_report.md
@@ -36,7 +36,7 @@ Steps to reproduce the behaviour:

## Container engine

- Engine: <!-- [e.g. Conda, Docker or Singularity] -->
- Engine: <!-- [e.g. Conda, Docker, Singularity or Podman] -->
- version: <!-- [e.g. 1.0.0] -->
- Image tag: <!-- [e.g. nfcore/mag:1.0.0] -->

4 changes: 4 additions & 0 deletions .github/markdownlint.yml
@@ -3,3 +3,7 @@ default: true,
line-length: false
no-duplicate-header:
siblings_only: true
no-inline-html:
allowed_elements:
- img
- p
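
The two elements whitelisted here correspond to the centered workflow figure this same commit adds to the README. With `no-inline-html` configured this way, a snippet like the following (taken from the README change below) passes linting:

```markdown
<p align="center">
    <img src="docs/images/mag_workflow.png" alt="nf-core/mag workflow overview" width="60%">
</p>
```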
10 changes: 4 additions & 6 deletions .github/workflows/awsfulltest.yml
@@ -1,10 +1,12 @@
name: nf-core AWS full size tests
# This workflow is triggered on push to the master branch.
# This workflow is triggered on published releases.
# It can be additionally triggered manually with GitHub actions workflow dispatch.
# It runs the -profile 'test_full' on AWS batch

on:
release:
types: [published]
workflow_dispatch:

jobs:
run-awstest:
@@ -20,10 +20,6 @@ jobs:
- name: Install awscli
run: conda install -c conda-forge awscli
- name: Start AWS batch job
# TODO nf-core: You can customise AWS full pipeline tests as required
# Add full size test data (but still relatively small datasets for few samples)
# on the `test_full.config` test runs with only one set of parameters
# Then specify `-profile test_full` instead of `-profile test` on the AWS batch command
env:
AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
@@ -37,4 +35,4 @@ jobs:
--job-name nf-core-mag \
--job-queue $AWS_JOB_QUEUE \
--job-definition $AWS_JOB_DEFINITION \
--container-overrides '{"command": ["nf-core/mag", "-r '"${GITHUB_SHA}"' -profile test --outdir s3://'"${AWS_S3_BUCKET}"'/mag/results-'"${GITHUB_SHA}"' -w s3://'"${AWS_S3_BUCKET}"'/mag/work-'"${GITHUB_SHA}"' -with-tower"], "environment": [{"name": "TOWER_ACCESS_TOKEN", "value": "'"$TOWER_ACCESS_TOKEN"'"}]}'
--container-overrides '{"command": ["nf-core/mag", "-r '"${GITHUB_SHA}"' -profile test_full --outdir s3://'"${AWS_S3_BUCKET}"'/mag/results-'"${GITHUB_SHA}"' -w s3://'"${AWS_S3_BUCKET}"'/mag/work-'"${GITHUB_SHA}"' -with-tower"], "environment": [{"name": "TOWER_ACCESS_TOKEN", "value": "'"$TOWER_ACCESS_TOKEN"'"}]}'
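
The `--container-overrides` argument above relies on shell quote concatenation: single-quoted JSON segments are interleaved with `'"${VAR}"'` so the shell expands each variable while the JSON string quoting stays intact. A minimal sketch with placeholder values (not the real secrets) shows the pattern and checks that the assembled string is valid JSON:

```shell
# Placeholder values standing in for the GitHub Actions variables/secrets.
GITHUB_SHA="abc1234"
AWS_S3_BUCKET="example-bucket"
TOWER_ACCESS_TOKEN="example-token"

# Same quoting pattern as the workflow: close the single quote, splice in
# a double-quoted variable expansion, then reopen the single quote.
OVERRIDES='{"command": ["nf-core/mag", "-r '"${GITHUB_SHA}"' -profile test_full --outdir s3://'"${AWS_S3_BUCKET}"'/mag/results-'"${GITHUB_SHA}"'"], "environment": [{"name": "TOWER_ACCESS_TOKEN", "value": "'"$TOWER_ACCESS_TOKEN"'"}]}'

# Confirm the assembled string is well-formed JSON.
echo "$OVERRIDES" | python3 -m json.tool > /dev/null && echo "overrides OK"
```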
7 changes: 3 additions & 4 deletions .github/workflows/awstest.yml
@@ -1,11 +1,10 @@
name: nf-core AWS test
# This workflow is triggered on push to the master branch.
# It runs the -profile 'test' on AWS batch
# It can be additionally triggered manually with GitHub actions workflow dispatch.
# It runs the -profile 'test' on AWS batch.

on:
push:
branches:
- master
workflow_dispatch:

jobs:
run-awstest:
8 changes: 4 additions & 4 deletions .github/workflows/ci.yml
@@ -34,13 +34,13 @@ jobs:
- name: Build new docker image
if: env.GIT_DIFF
run: docker build --no-cache . -t nfcore/mag:1.1.0
run: docker build --no-cache . -t nfcore/mag:1.1.1

- name: Pull docker image
if: ${{ !env.GIT_DIFF }}
run: |
docker pull nfcore/mag:dev
docker tag nfcore/mag:dev nfcore/mag:1.1.0
docker tag nfcore/mag:dev nfcore/mag:1.1.1
- name: Check if BUSCO Dockerfile or Conda environment changed
uses: technote-space/get-diff-action@v1
@@ -51,13 +51,13 @@
- name: Build new docker image for BUSCO
if: env.GIT_DIFF
run: docker build --no-cache ./containers/busco/ -t nfcore/magbusco:1.1.0
run: docker build --no-cache ./containers/busco/ -t nfcore/magbusco:1.1.1

- name: Pull docker image for BUSCO
if: ${{ !env.GIT_DIFF }}
run: |
docker pull nfcore/magbusco:dev
docker tag nfcore/magbusco:dev nfcore/magbusco:1.1.0
docker tag nfcore/magbusco:dev nfcore/magbusco:1.1.1
- name: Install Nextflow
run: |
17 changes: 17 additions & 0 deletions CHANGELOG.md
@@ -3,6 +3,23 @@
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/)
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## v1.1.1 - 2020/11/10

### `Added`

- [#121](https://github.com/nf-core/mag/pull/121) - Add full-size test
- [#124](https://github.com/nf-core/mag/pull/124) - Add workflow overview figure to `README`

### `Changed`

- [#123](https://github.com/nf-core/mag/pull/123) - Update to new nf-core 1.11 `TEMPLATE`

### `Fixed`

- [#118](https://github.com/nf-core/mag/pull/118) - Fix `seaborn` to `v0.10.1` to avoid `nanoplot` error
- [#120](https://github.com/nf-core/mag/pull/120) - Fix link to CAT database in help message
- [#124](https://github.com/nf-core/mag/pull/124) - Fix description of `CAT` process in `output.md`

## v1.1.0 - 2020/10/06

### `Added`
6 changes: 3 additions & 3 deletions Dockerfile
@@ -1,4 +1,4 @@
FROM nfcore/base:1.10.2
FROM nfcore/base:1.11
LABEL authors="Hadrien Gourlé <[email protected]>, Daniel Straub <[email protected]>, Sabrina Krakau <[email protected]>" \
description="Docker image containing all software requirements for the nf-core/mag pipeline"

@@ -7,10 +7,10 @@ COPY environment.yml /
RUN conda env create --quiet -f /environment.yml && conda clean -a

# Add conda installation dir to PATH (instead of doing 'conda activate')
ENV PATH /opt/conda/envs/nf-core-mag-1.1.0/bin:$PATH
ENV PATH /opt/conda/envs/nf-core-mag-1.1.1/bin:$PATH

# Dump the details of the installed packages to a file for posterity
RUN conda env export --name nf-core-mag-1.1.0 > nf-core-mag-1.1.0.yml
RUN conda env export --name nf-core-mag-1.1.1 > nf-core-mag-1.1.1.yml

# Instruct R processes to use these empty files instead of clashing with a local version
RUN touch .Rprofile
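
The `ENV PATH` line above makes the conda environment's executables win lookup without ever running `conda activate`. The mechanism is plain PATH precedence, which can be sketched in shell (the directory and tool names below are made up for the demo):

```shell
# Create a throwaway "environment" bin directory with one executable.
mkdir -p /tmp/demo-env/bin
printf '#!/bin/sh\necho "from demo-env"\n' > /tmp/demo-env/bin/mytool
chmod +x /tmp/demo-env/bin/mytool

# Prepending the directory to PATH makes its binaries resolve first,
# which is exactly what the Dockerfile's ENV PATH line relies on.
PATH="/tmp/demo-env/bin:$PATH"
mytool
```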
31 changes: 18 additions & 13 deletions README.md
@@ -14,43 +14,48 @@
## Introduction

This pipeline is for assembly, binning, and annotation of metagenomes.
It supports both short and long reads, quality trims the reads and adapters with [fastp](https://github.com/OpenGene/fastp) and [porechop](https://github.com/rrwick/Porechop), and performs basic QC with [fastqc](https://www.bioinformatics.babraham.ac.uk/projects/fastqc/).

The pipeline then:

* assigns taxonomy to reads using [centrifuge](https://ccb.jhu.edu/software/centrifuge/) and/or [kraken2](https://github.com/DerrickWood/kraken2/wiki)
* performs assembly using [megahit](https://github.com/voutcn/megahit) and [spades](http://cab.spbu.ru/software/spades/), and checks their quality using [quast](http://quast.sourceforge.net/quast)
* performs metagenome binning using [metabat2](https://bitbucket.org/berkeleylab/metabat/src/master/), and checks the quality of the genome bins using [busco](https://busco.ezlab.org/)

Furthermore, the pipeline creates various reports in the results directory specified, including a [multiqc](https://multiqc.info/) report summarizing some of the findings and software versions.
<p align="center">
<img src="docs/images/mag_workflow.png" alt="nf-core/mag workflow overview" width="60%">
</p>

The pipeline is built using [Nextflow](https://www.nextflow.io), a workflow tool to run tasks across multiple compute infrastructures in a very portable manner. It comes with docker containers making installation trivial and results highly reproducible.

## Quick Start

1. Install [`nextflow`](https://nf-co.re/usage/installation)

2. Install either [`Docker`](https://docs.docker.com/engine/installation/) or [`Singularity`](https://www.sylabs.io/guides/3.0/user-guide/) for full pipeline reproducibility _(please only use [`Conda`](https://conda.io/miniconda.html) as a last resort; see [docs](https://nf-co.re/usage/configuration#basic-configuration-profiles))_
2. Install any of [`Docker`](https://docs.docker.com/engine/installation/), [`Singularity`](https://www.sylabs.io/guides/3.0/user-guide/) or [`Podman`](https://podman.io/) for full pipeline reproducibility _(please only use [`Conda`](https://conda.io/miniconda.html) as a last resort; see [docs](https://nf-co.re/usage/configuration#basic-configuration-profiles))_

3. Download the pipeline and test it on a minimal dataset with a single command:

```bash
nextflow run nf-core/mag -profile test,<docker/singularity/conda/institute>
nextflow run nf-core/mag -profile test,<docker/singularity/podman/conda/institute>
```

> Please check [nf-core/configs](https://github.com/nf-core/configs#documentation) to see if a custom config file to run nf-core pipelines already exists for your Institute. If so, you can simply use `-profile <institute>` in your command. This will enable either `docker` or `singularity` and set the appropriate execution settings for your local compute environment.

4. Start running your own analysis!

```bash
nextflow run nf-core/mag -profile <docker/singularity/conda/institute> --input '*_R{1,2}.fastq.gz'
nextflow run nf-core/mag -profile <docker/singularity/podman/conda/institute> --input '*_R{1,2}.fastq.gz'
```

See [usage docs](docs/usage.md) for all of the available options when running the pipeline.
See [usage docs](https://nf-co.re/mag/usage) for all of the available options when running the pipeline.

## Documentation

The nf-core/mag pipeline comes with documentation about the pipeline which you can read at [https://nf-co.re/mag](https://nf-co.re/mag) or find in the [`docs/` directory](docs).
The nf-core/mag pipeline comes with documentation about the pipeline: [usage](https://nf-co.re/mag/usage) and [output](https://nf-co.re/mag/output).

In short, it supports both short and long reads, quality trims the reads and adapters with [fastp](https://github.com/OpenGene/fastp) and [porechop](https://github.com/rrwick/Porechop), and performs basic QC with [fastqc](https://www.bioinformatics.babraham.ac.uk/projects/fastqc/).
The pipeline then:

* assigns taxonomy to reads using [centrifuge](https://ccb.jhu.edu/software/centrifuge/) and/or [kraken2](https://github.com/DerrickWood/kraken2/wiki)
* performs assembly using [megahit](https://github.com/voutcn/megahit) and [spades](http://cab.spbu.ru/software/spades/), and checks their quality using [quast](http://quast.sourceforge.net/quast)
* performs metagenome binning using [metabat2](https://bitbucket.org/berkeleylab/metabat/src/master/), and checks the quality of the genome bins using [busco](https://busco.ezlab.org/)
* assigns taxonomy to bins using [CAT](https://github.com/dutilh/CAT)

Furthermore, the pipeline creates various reports in the results directory specified, including a [multiqc](https://multiqc.info/) report summarizing some of the findings and software versions.

## Credits

6 changes: 3 additions & 3 deletions conf/base.config
@@ -106,23 +106,23 @@ process {
time = { check_max (8.h * task.attempt, 'time' ) }
}
withName: busco {
container = 'nfcore/magbusco:1.1.0'
container = 'nfcore/magbusco:1.1.1'
profiles {
conda {
conda = "$baseDir/containers/busco/environment.yml"
}
}
}
withName: busco_plot {
container = 'nfcore/magbusco:1.1.0'
container = 'nfcore/magbusco:1.1.1'
profiles {
conda {
conda = "$baseDir/containers/busco/environment.yml"
}
}
}
withName: get_busco_version {
container = 'nfcore/magbusco:1.1.0'
container = 'nfcore/magbusco:1.1.1'
profiles {
conda {
conda = "$baseDir/containers/busco/environment.yml"
1 change: 0 additions & 1 deletion conf/test.config
@@ -16,7 +16,6 @@ params {
max_time = 48.h

// Input data
single_end = false
input_paths = [
['test_minigut', ['https://github.com/nf-core/test-datasets/raw/mag/test_data/test_minigut_R1.fastq.gz', 'https://github.com/nf-core/test-datasets/raw/mag/test_data/test_minigut_R2.fastq.gz']],
['test_minigut_sample2', ['https://github.com/nf-core/test-datasets/raw/mag/test_data/test_minigut_sample2_R1.fastq.gz', 'https://github.com/nf-core/test-datasets/raw/mag/test_data/test_minigut_sample2_R2.fastq.gz']]
25 changes: 11 additions & 14 deletions conf/test_full.config
@@ -6,24 +6,21 @@
* to run a full size pipeline test. Use as follows:
* nextflow run nf-core/mag -profile test_full,<docker/singularity>
*/
// TODO adjust for full test dataset and specify in .github/workflows/awsfulltest.yml
// currently this is not used!

params {
config_profile_name = 'Full test profile'
config_profile_description = 'Full test dataset to check pipeline function'

// Input data for full size test
// TODO nf-core: Specify the paths to your full test data ( on nf-core/test-datasets or directly in repositories, e.g. SRA)
// TODO nf-core: Give any required params for the test so that command line flags are not needed
single_end = false
input_paths = [
['test_minigut', ['https://github.com/nf-core/test-datasets/raw/mag/test_data/test_minigut_R1.fastq.gz', 'https://github.com/nf-core/test-datasets/raw/mag/test_data/test_minigut_R2.fastq.gz']],
['test_minigut_sample2', ['https://github.com/nf-core/test-datasets/raw/mag/test_data/test_minigut_sample2_R1.fastq.gz', 'https://github.com/nf-core/test-datasets/raw/mag/test_data/test_minigut_sample2_R2.fastq.gz']]
]
centrifuge_db = "https://github.com/nf-core/test-datasets/raw/mag/test_data/minigut_cf.tar.gz"
kraken2_db = "https://github.com/nf-core/test-datasets/raw/mag/test_data/minigut_kraken.tgz"
skip_krona = true
min_length_unbinned_contigs = 1
max_unbinned_contigs = 2
// hg19 reference with highly conserved and low-complexity regions masked by Brian Bushnell
host_fasta = "s3://nf-core-awsmegatests/mag/input_data/hg19_main_mask_ribo_animal_allplant_allfungus.fa.gz"
manifest = "s3://nf-core-awsmegatests/mag/input_data/manifest.full.txt"

centrifuge_db = "s3://nf-core-awsmegatests/mag/input_data/p_compressed+h+v.tar.gz"
kraken2_db = "s3://nf-core-awsmegatests/mag/input_data/minikraken_8GB_202003.tgz"
cat_db = "s3://nf-core-awsmegatests/mag/input_data/CAT_prepare_20200304.tar.gz"

busco_reference = "s3://nf-core-awsmegatests/mag/input_data/bacteria_odb10.2020-03-06.tar.gz"

// reproducibility options for megahit and spades not turned on
}
3 changes: 1 addition & 2 deletions conf/test_hybrid.config
@@ -16,8 +16,7 @@ params {
max_time = 48.h

// Input data
single_end = false
params.manifest = 'https://github.com/nf-core/test-datasets/raw/mag/test_data/manifest.txt'
manifest = 'https://github.com/nf-core/test-datasets/raw/mag/test_data/manifest.txt'
min_length_unbinned_contigs = 1
max_unbinned_contigs = 2
}
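
The change above (mirrored in `test_hybrid_host_rm.config` below) drops the redundant `params.` prefix: these assignments already sit inside a `params { }` scope, so plain names are the idiomatic form. A minimal sketch, with a placeholder URL:

```groovy
// Minimal Nextflow config sketch; the manifest URL is a placeholder.
params {
    // Preferred: plain assignment inside the params scope.
    manifest = 'https://example.com/test_data/manifest.txt'

    // Writing 'params.manifest = ...' here repeats the scope name
    // and is at best redundant, at worst confusing.
}
```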
2 changes: 1 addition & 1 deletion conf/test_hybrid_host_rm.config
@@ -17,7 +17,7 @@ params {

// Input data
host_fasta = "https://github.com/nf-core/test-datasets/raw/mag/host_reference/genome.hg38.chr21_10000bp_region.fa"
params.manifest = 'https://github.com/nf-core/test-datasets/raw/mag/test_data/manifest_hg38host.txt'
manifest = 'https://github.com/nf-core/test-datasets/raw/mag/test_data/manifest_hg38host.txt'
min_length_unbinned_contigs = 1
max_unbinned_contigs = 2
}
6 changes: 3 additions & 3 deletions containers/busco/Dockerfile
@@ -1,4 +1,4 @@
FROM nfcore/base:1.10.2
FROM nfcore/base:1.11
LABEL authors="Hadrien Gourlé <[email protected]>, Daniel Straub <[email protected]>, Sabrina Krakau <[email protected]>" \
description="Docker image containing BUSCO requirements for the nf-core/mag pipeline"

@@ -11,10 +11,10 @@ RUN apt-get update
RUN apt-get install -y libxt6

# Add conda installation dir to PATH (instead of doing 'conda activate')
ENV PATH /opt/conda/envs/nf-core-mag-busco-1.1.0/bin:$PATH
ENV PATH /opt/conda/envs/nf-core-mag-busco-1.1.1/bin:$PATH

# Dump the details of the installed packages to a file for posterity
RUN conda env export --name nf-core-mag-busco-1.1.0 > nf-core-mag-busco-1.1.0.yml
RUN conda env export --name nf-core-mag-busco-1.1.1 > nf-core-mag-busco-1.1.1.yml

# Instruct R processes to use these empty files instead of clashing with a local version
RUN touch .Rprofile
2 changes: 1 addition & 1 deletion containers/busco/environment.yml
@@ -1,6 +1,6 @@
# You can use this file to create a conda environment for this pipeline:
# conda env create -f environment.yml
name: nf-core-mag-busco-1.1.0
name: nf-core-mag-busco-1.1.1
channels:
- conda-forge
- bioconda
Binary file added docs/images/mag_workflow.png
