Merge pull request #4 from BIDS-Apps/develop
Merge develop changes
gdevenyi authored Aug 22, 2018
2 parents 8fc1202 + 1a3fe43 commit a0772f3
Showing 6 changed files with 165 additions and 127 deletions.
33 changes: 19 additions & 14 deletions Dockerfile
@@ -1,32 +1,37 @@
# Use an image with pre-built ANTs included
FROM gdevenyi/magetbrain-bids-ants:21d7c12ee1e332827b04848eb5f70f55d14cac23
FROM gdevenyi/magetbrain-bids-ants:82dcdd647211004f3220e4073ea4daf06fdf89f9

RUN apt-get update \
&& apt-get install --auto-remove --no-install-recommends -y parallel \
&& apt-get install --auto-remove --no-install-recommends -y parallel git curl gzip bzip2 gnupg2 unzip coreutils ca-certificates \
&& rm -rf /var/lib/apt/lists/*

RUN apt-get update \
&& apt-get install -y --no-install-recommends --auto-remove git curl unzip bzip2 \
&& curl -o anaconda.sh https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh \
&& bash anaconda.sh -b -p /opt/anaconda && rm -f anaconda.sh \
&& git clone https://github.com/CobraLab/antsRegistration-MAGeT.git /opt/antsRegistration-MAGeT \
&& (cd /opt/antsRegistration-MAGeT && git checkout tags/v0.2.2.1) \
&& curl -o /opt/atlases-nifti.zip -sL http://cobralab.net/files/atlases-nifti.zip \
RUN curl --insecure -o anaconda.sh https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh \
&& bash anaconda.sh -b -p /opt/anaconda && rm -f anaconda.sh

RUN curl -o /opt/atlases-nifti.zip -sL http://cobralab.net/files/atlases-nifti.zip \
&& mkdir /opt/atlases-nifti \
&& unzip /opt/atlases-nifti.zip -d /opt \
&& curl -sL http://cobralab.net/files/brains_t1_nifti.tar.bz2 | tar xvj -C /opt/atlases-nifti \
&& curl -o /opt/atlases-nifti/colin.zip -sL http://packages.bic.mni.mcgill.ca/mni-models/colin27/mni_colin27_1998_nifti.zip \
&& mkdir /opt/atlases-nifti/colin && unzip /opt/atlases-nifti/colin.zip -d /opt/atlases-nifti/colin && rm -f /opt/atlases-nifti/colin.zip \
&& curl -sL https://deb.nodesource.com/setup_4.x | bash - \
&& apt-get install -y nodejs \
&& apt-get purge --auto-remove -y curl unzip bzip2 \
&& gzip /opt/atlases-nifti/colin/colin27_t1_tal_lin.nii

RUN curl --insecure -sL https://deb.nodesource.com/setup_10.x | bash - \
&& apt-get install -y --no-install-recommends --auto-remove nodejs \
&& rm -rf /var/lib/apt/lists/*

ENV CONDA_PATH "/opt/anaconda"

RUN /opt/anaconda/bin/pip install git+https://github.com/pipitone/qbatch.git@aade5b9a17c5a5a2fe6b28267b3bca10b05a5936
RUN /opt/anaconda/bin/conda config --append channels conda-forge
RUN /opt/anaconda/bin/conda install -y numpy scipy nibabel pandas
RUN /opt/anaconda/bin/pip install future six
RUN /opt/anaconda/bin/pip install duecredit
RUN /opt/anaconda/bin/pip install pybids
RUN npm install -g [email protected] --unsafe-perm

RUN npm install -g [email protected]
RUN git clone https://github.com/CobraLab/antsRegistration-MAGeT.git /opt/antsRegistration-MAGeT && \
(cd /opt/antsRegistration-MAGeT && git checkout tags/v0.3.1)
RUN /opt/anaconda/bin/pip install git+https://github.com/pipitone/qbatch.git@951dd1bdfdcbb5fd3f27ee6a3e261eaecac1ef70

ENV PATH /opt/ANTs/bin:/opt/anaconda/bin:/opt/antsRegistration-MAGeT/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
ENV QBATCH_SYSTEM local
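For reference, a minimal sketch of building and sanity-checking the resulting image locally; the `magetbrain-bids-local` tag and the entrypoint override are illustrative assumptions, not part of the repository:

```sh
# Build the BIDS-app image from the repository root (tag name is only an example).
docker build -t magetbrain-bids-local .

# Check that the tools pinned in the layers above ended up on PATH inside the image;
# overriding the entrypoint with bash is an assumption about the image configuration.
docker run --rm --entrypoint /bin/bash magetbrain-bids-local -c \
    'command -v antsRegistration qbatch && qbatch --help'
```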
54 changes: 29 additions & 25 deletions README.md
@@ -1,14 +1,14 @@
## MAGeTbrain segmentation pipeline

### Description
This pipeline takes in native-space T1 or T2 (or multiple co-registered modalities) brain images and volumetrically segments
them using the MAGeTbrain algorithm.
This pipeline takes in native-space T1 brain images and volumetrically segments
them using the MAGeTbrain algorithm with a variety of input atlases.

### Documentation
Provide a link to the documention of your pipeline.
https://github.com/cobralab/antsRegistration-MAGet.

### How to report errors
Provide instructions for users on how to get help and report errors.
Please open an issue at https://github.com/BIDS-Apps/MAGeTbrain/issues

### Acknowledgements
Describe how would you would like users to acknowledge use of your App in their papers (citation, a paragraph that can be copy pasted, etc.)
@@ -21,24 +21,24 @@ usage: run.py [-h]
[--participant_label PARTICIPANT_LABEL [PARTICIPANT_LABEL ...]]
[--segmentation_type {amygdala,cerebellum,hippocampus-whitematter,colin27-subcortical,all}]
[-v] [--n_cpus N_CPUS] [--fast] [--label-masking] [--no-cleanup]
bids_dir output_dir {participant1,participant2,group}
bids_dir output_dir {participant1,participant2}
MAGeTbrain BIDS App entrypoint script.
positional arguments:
bids_dir The directory with the input dataset formatted
according to the BIDS standard.
output_dir The directory where the output files should be stored.
When you are running group level analysis this folder
When you are running participant2 level analysis this folder
must be prepopulated with the results of
theparticipant level analysis.
{participant1,participant2,group}
the participant1 level analysis.
{participant1,participant2}
Level of the analysis that will be performed. Multiple
participant level analyses can be run independently
(in parallel) using the same output_dir. In MAGeTbrain
parlance, participant1 = template stage, partipant2 =
subject stage group = resample + vote + qc stage. The
proper order is participant1, participant2, group
participant{1,2} level analyses can be run
independently (in parallel) using the same output_dir.
In MAGeTbrain parlance, participant1 = template stage,
participant2 = subject + resample + vote + qc stage. The
proper order is participant1, participant2
optional arguments:
-h, --help show this help message and exit
@@ -53,35 +53,39 @@ optional arguments:
The segmentation label type to be used.
colin27-subcortical, since it is on a different atlas,
is not included in the all setting and must be run
seperately
separately
-v, --version show program's version number and exit
--n_cpus N_CPUS Number of CPUs/cores available to use.
--fast Use faster (less accurate) registration calls
--label-masking Use the input labels as registration masks to reduce
computation and (possibily) improve registration
computation and (possibly) improve registration
--no-cleanup Do not clean up intermediate files after the group phase
```

To run it in participant level mode (for one participant):
To construct the template library, run the participant1 stage:
```sh
docker run -i --rm \
-v /Users/filo/data/ds005:/bids_dataset:ro \
-v /Users/filo/outputs:/outputs \
bids/example \
/bids_dataset /outputs participant --participant_label 01
/bids_dataset /outputs participant1 --participant_label 01
```
After doing this for all subjects (potentially in parallel), the group level analysis
can be run:

After doing this for approximately 21 representative subjects (potentially in parallel),
the subject level labeling can be done:
```sh
docker run -i --rm \
-v /Users/filo/data/ds005:/bids_dataset:ro \
-v /Users/filo/outputs:/outputs \
bids/example \
/bids_dataset /outputs group
/bids_dataset /outputs participant2 --participant_label 01
```
This can also happen in parallel on a per-subject basis.

### Special considerations
Describe whether your app has any special requirements. For example:

- Multiple map reduce steps (participant, group, participant2, group2 etc.)
- Unusual memory requirements
- etc.

### Special considerations
- Output directories must be kept separate for each segmentation_type
- participant1 stages can be run in parallel per subject; approximately 21
subjects should be selected which form a representative subset of the population
under study (see the sketch below)
- participant2 stages can also be run in parallel, but must only be started after
all participant1 stages are complete
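Both stages parallelize per subject. A hedged sketch of launching several participant1 jobs at once with GNU parallel (which the Dockerfile installs); the paths, the `bids/example` tag, and the three-subject list are illustrative only:

```sh
# Run participant1 for several subjects concurrently; in practice the list
# would hold the ~21 representative subjects mentioned above.
SUBJECTS="01 02 03"

parallel -j 3 docker run -i --rm \
    -v /Users/filo/data/ds005:/bids_dataset:ro \
    -v /Users/filo/outputs:/outputs \
    bids/example \
    /bids_dataset /outputs participant1 --participant_label {} ::: $SUBJECTS

# After every participant1 job has finished, the same pattern applies to participant2.
```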
8 changes: 3 additions & 5 deletions ants-build/Dockerfile
@@ -1,17 +1,15 @@
# Use phusion/baseimage as base image
FROM phusion/baseimage:0.10.1

# Use baseimage-docker's init system.
CMD ["/sbin/my_init"]
FROM ubuntu:latest

ENV DEBIAN_FRONTEND noninteractive

RUN buildDeps='cmake build-essential git zlib1g-dev' \
&& apt-get update \
&& apt-get install -y $buildDeps --no-install-recommends \
&& apt-get install -y ca-certificates \
&& rm -rf /var/lib/apt/lists/* \
&& git clone https://github.com/stnava/ANTs.git /opt/ANTs-src \
&& cd /opt/ANTs-src && git checkout 21d7c12ee1e332827b04848eb5f70f55d14cac23 \
&& cd /opt/ANTs-src && git checkout 82dcdd647211004f3220e4073ea4daf06fdf89f9 \
&& mkdir /opt/ANTs-src/build && cd /opt/ANTs-src/build \
&& cmake -DCMAKE_LINKER=/usr/bin/gold -DITK_BUILD_MINC_SUPPORT:BOOL=ON \
-DBUILD_TESTING:BOOL=OFF -DRUN_LONG_TESTS:BOOL=OFF -DRUN_SHORT_TESTS:BOOL=OFF \
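The cmake and make steps are truncated above. A rough sketch of producing this helper image locally, which the main Dockerfile then pulls via its `FROM gdevenyi/magetbrain-bids-ants:<sha>` line; the local tag is an assumption:

```sh
# Build the pre-built-ANTs base image from the ants-build/ subdirectory
# (tag name is only an example; the published image lives under gdevenyi/).
docker build -t magetbrain-bids-ants:local ants-build/
```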
28 changes: 19 additions & 9 deletions circle.yml
@@ -1,6 +1,7 @@
general:
artifacts:
- "~/outputs"
- "~/outputs-colin"
- "~/outputs-colin-labelmask"

machine:
services:
@@ -20,23 +21,32 @@ dependencies:
timeout: 21600
- mkdir -p ~/docker; docker save "bids/${CIRCLE_PROJECT_REPONAME,,}" > ~/docker/image.tar :
timeout: 21600
- mkdir -p ${HOME}/outputs
- mkdir -p ${HOME}/outputs-colin
- mkdir -p ${HOME}/outputs-colin-labelmask

test:
override:
# print version
- docker run -ti --rm --read-only -v /tmp:/tmp -v /var/tmp:/var/tmp -v ${HOME}/data/ds003_downsampled:/bids_dataset bids/${CIRCLE_PROJECT_REPONAME,,} --version
# template level run for downsampled dataset
- docker run -ti --rm --read-only -v /tmp:/tmp -v /var/tmp:/var/tmp -v ${HOME}/data/ds003_downsampled:/bids_dataset -v ${HOME}/outputs:/outputs bids/${CIRCLE_PROJECT_REPONAME,,} --fast --n_cpus 2 --segmentation_type colin27-subcortical /bids_dataset /outputs participant1 --participant_label 01 :
# template level run for downsampled dataset no masking
- docker run -ti --rm --read-only -v /tmp:/tmp -v /var/tmp:/var/tmp -v ${HOME}/data/ds003_downsampled:/bids_dataset -v ${HOME}/outputs-colin:/outputs bids/${CIRCLE_PROJECT_REPONAME,,} --n_cpus 2 --segmentation_type colin27-subcortical /bids_dataset /outputs participant1 --participant_label 01 :
timeout: 21600
- docker run -ti --rm --read-only -v /tmp:/tmp -v /var/tmp:/var/tmp -v ${HOME}/data/ds003_downsampled:/bids_dataset -v ${HOME}/outputs:/outputs bids/${CIRCLE_PROJECT_REPONAME,,} --fast --n_cpus 2 --segmentation_type colin27-subcortical /bids_dataset /outputs participant1 --participant_label 02 :
- docker run -ti --rm --read-only -v /tmp:/tmp -v /var/tmp:/var/tmp -v ${HOME}/data/ds003_downsampled:/bids_dataset -v ${HOME}/outputs-colin:/outputs bids/${CIRCLE_PROJECT_REPONAME,,} --n_cpus 2 --segmentation_type colin27-subcortical /bids_dataset /outputs participant1 --participant_label 02 :
timeout: 2160
# participant level tests for a longitudinal dataset
- docker run -ti --rm --read-only -v /tmp:/tmp -v /var/tmp:/var/tmp -v ${HOME}/data/ds003_downsampled:/bids_dataset -v ${HOME}/outputs:/outputs bids/${CIRCLE_PROJECT_REPONAME,,} --fast --n_cpus 2 --segmentation_type colin27-subcortical /bids_dataset /outputs participant2 --participant_label 01 :
# participant level tests for a longitudinal dataset no masking
- docker run -ti --rm --read-only -v /tmp:/tmp -v /var/tmp:/var/tmp -v ${HOME}/data/ds003_downsampled:/bids_dataset -v ${HOME}/outputs-colin:/outputs bids/${CIRCLE_PROJECT_REPONAME,,} --n_cpus 2 --segmentation_type colin27-subcortical /bids_dataset /outputs participant2 --participant_label 01 :
timeout: 21600
- docker run -ti --rm --read-only -v /tmp:/tmp -v /var/tmp:/var/tmp -v ${HOME}/data/ds003_downsampled:/bids_dataset -v ${HOME}/outputs:/outputs bids/${CIRCLE_PROJECT_REPONAME,,} --fast --n_cpus 2 --segmentation_type colin27-subcortical /bids_dataset /outputs participant2 --participant_label 02 :
- docker run -ti --rm --read-only -v /tmp:/tmp -v /var/tmp:/var/tmp -v ${HOME}/data/ds003_downsampled:/bids_dataset -v ${HOME}/outputs-colin:/outputs bids/${CIRCLE_PROJECT_REPONAME,,} --n_cpus 2 --segmentation_type colin27-subcortical /bids_dataset /outputs participant2 --participant_label 02 :
timeout: 21600
- docker run -ti --rm --read-only -v /tmp:/tmp -v /var/tmp:/var/tmp -v ${HOME}/data/ds003_downsampled:/bids_dataset -v ${HOME}/outputs:/outputs bids/${CIRCLE_PROJECT_REPONAME,,} --fast --n_cpus 2 --segmentation_type colin27-subcortical /bids_dataset /outputs group :
# template level run for downsampled dataset with masking
- docker run -ti --rm --read-only -v /tmp:/tmp -v /var/tmp:/var/tmp -v ${HOME}/data/ds003_downsampled:/bids_dataset -v ${HOME}/outputs-colin-labelmask:/outputs bids/${CIRCLE_PROJECT_REPONAME,,} --n_cpus 2 --label-masking --segmentation_type colin27-subcortical /bids_dataset /outputs participant1 --participant_label 01 :
timeout: 21600
- docker run -ti --rm --read-only -v /tmp:/tmp -v /var/tmp:/var/tmp -v ${HOME}/data/ds003_downsampled:/bids_dataset -v ${HOME}/outputs-colin-labelmask:/outputs bids/${CIRCLE_PROJECT_REPONAME,,} --n_cpus 2 --label-masking --segmentation_type colin27-subcortical /bids_dataset /outputs participant1 --participant_label 02 :
timeout: 2160
# participant level tests for a longitudinal dataset with masking
- docker run -ti --rm --read-only -v /tmp:/tmp -v /var/tmp:/var/tmp -v ${HOME}/data/ds003_downsampled:/bids_dataset -v ${HOME}/outputs-colin-labelmask:/outputs bids/${CIRCLE_PROJECT_REPONAME,,} --n_cpus 2 --label-masking --segmentation_type colin27-subcortical /bids_dataset /outputs participant2 --participant_label 01 :
timeout: 21600
- docker run -ti --rm --read-only -v /tmp:/tmp -v /var/tmp:/var/tmp -v ${HOME}/data/ds003_downsampled:/bids_dataset -v ${HOME}/outputs-colin-labelmask:/outputs bids/${CIRCLE_PROJECT_REPONAME,,} --n_cpus 2 --label-masking --segmentation_type colin27-subcortical /bids_dataset /outputs participant2 --participant_label 02 :
timeout: 21600

deployment:
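The test commands above depend on bash's `${CIRCLE_PROJECT_REPONAME,,}` expansion, which lower-cases the repository name before composing the image tag. A hedged sketch of reproducing the version check outside CircleCI; the repository name and local data path mirror the CI config and are assumptions about the local setup:

```sh
# ${VAR,,} is bash parameter expansion that lower-cases the value,
# so bids/${CIRCLE_PROJECT_REPONAME,,} becomes bids/magetbrain here.
CIRCLE_PROJECT_REPONAME=MAGeTbrain
docker run -ti --rm --read-only \
    -v /tmp:/tmp -v /var/tmp:/var/tmp \
    -v ${HOME}/data/ds003_downsampled:/bids_dataset \
    "bids/${CIRCLE_PROJECT_REPONAME,,}" --version
```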