Testing Nipype

To ensure the stability of each release, Nipype uses two continuous integration services: CircleCI and Travis CI. If both test suites pass, the following badges are shown in green:

https://travis-ci.org/nipy/nipype.png?branch=master https://circleci.com/gh/nipy/nipype/tree/master.svg?style=svg

Installation for developers

To check out the latest development version:

git clone https://github.com/nipy/nipype.git

After cloning:

cd nipype
pip install -r requirements.txt
python setup.py develop

or:

cd nipype
pip install -r requirements.txt
pip install -e .[tests]
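Either way, a quick way to confirm the development install worked is to import the package and print its version:

```shell
# Sanity check: the editable install should be importable
python -c "import nipype; print(nipype.__version__)"
```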

Test implementation

The Nipype testing framework is built on pytest. At the time of writing, Nipype implements 17638 tests.

After installation in developer mode, the tests can be run from the root folder of the project with a single command:

make tests

If make is not installed on the system, the tests can be run with:

py.test --doctest-modules --cov=nipype nipype
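While working on a single module, pytest's standard selection options avoid running the full suite; the module path below is only an example:

```shell
# Run a single test module (example path; any pytest target works)
py.test --doctest-modules nipype/interfaces/tests/test_io.py

# Run only tests whose names match a keyword expression
py.test -k "fsl" nipype
```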

A successful test run should complete in 10-30 minutes and end with something like:

----------------------------------------------------------------------
2445 passed, 41 skipped, 7 xfailed in 1277.66 seconds

No test should fail (unless you are missing a dependency). If the SUBJECTS_DIR environment variable is not set, some FreeSurfer-related tests will fail. If any tests fail, please report them on our bug tracker.
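If FreeSurfer is installed, pointing SUBJECTS_DIR at its default subjects directory is usually sufficient; this config fragment assumes FREESURFER_HOME was already set by FreeSurfer's setup script:

```shell
# Assumes FREESURFER_HOME was set by FreeSurfer's SetUpFreeSurfer.sh
export SUBJECTS_DIR=$FREESURFER_HOME/subjects
```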

On Debian systems, set the following environment variable before running tests:

export MATLABCMD=$pathtomatlabdir/bin/$platform/MATLAB

where $pathtomatlabdir is the path to your MATLAB installation and $platform is the directory for 32-bit or 64-bit builds (typically glnxa64 on 64-bit systems).
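For instance, a hypothetical 64-bit Linux installation of MATLAB R2016b would use:

```shell
# Hypothetical path; adjust to your MATLAB version and platform
export MATLABCMD=/usr/local/MATLAB/R2016b/bin/glnxa64/MATLAB
```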

Skip tests

Nipype will skip some tests depending on the currently available software and data dependencies. Installing the missing software and downloading the necessary data will reduce the number of skipped tests.

Some tests in Nipype use images distributed with the FSL course data. A reduced version of the package can be downloaded here. To enable the tests that depend on these data, unpack the tarball and set the FSL_COURSE_DATA environment variable to point to that folder.
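Assuming the archive was saved as nipype-fsl_course_data.tar.gz (the actual filename may differ), the setup would look like:

```shell
# Unpack the FSL course data and point Nipype at it
# (archive and directory names here are assumptions)
tar xzf nipype-fsl_course_data.tar.gz -C ~/examples
export FSL_COURSE_DATA=~/examples/nipype-fsl_course_data
```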

Xfail tests

Some tests are expected to fail until the code is changed, or for other known reasons.
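In pytest, such tests carry the xfail marker; a minimal sketch (not taken from the Nipype code base):

```python
import pytest

# A test known to fail until the underlying bug is fixed;
# pytest reports it as "xfailed" instead of "failed".
@pytest.mark.xfail(reason="pending fix for a known bug")
def test_known_failure():
    assert 1 + 1 == 3
```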

Avoiding any MATLAB calls from testing

On Unix systems, set an empty environment variable:

export NIPYPE_NO_MATLAB=

This will skip any tests that require MATLAB.
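This works because MATLAB-dependent tests are skipped whenever the variable is defined, even with an empty value; an illustrative pattern (not Nipype's actual implementation):

```python
import os

import pytest

# Skip MATLAB-dependent tests when NIPYPE_NO_MATLAB is defined
# (even with an empty value), mirroring the behaviour described above.
no_matlab = "NIPYPE_NO_MATLAB" in os.environ

@pytest.mark.skipif(no_matlab, reason="NIPYPE_NO_MATLAB is set")
def test_needs_matlab():
    pass
```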

Testing Nipype using Docker

As of nipype-0.13, Nipype is tested inside Docker containers. Once the Docker Engine is installed, Nipype can be tested as follows:

cd path/to/nipype/
docker build -f docker/nipype_test/Dockerfile_py27 -t nipype/nipype_test:py27 .
docker run -it --rm -v /etc/localtime:/etc/localtime:ro \
                    -e FSL_COURSE_DATA="/root/examples/nipype-fsl_course_data" \
                    -v ~/examples:/root/examples:ro \
                    -v ~/scratch:/scratch \
                    -w /root/src/nipype \
                    nipype/nipype_test:py27 /usr/bin/run_pytest.sh

To run Nipype under Python 3.5:

cd path/to/nipype/
docker build -f docker/nipype_test/Dockerfile_py35 -t nipype/nipype_test:py35 .
docker run -it --rm -v /etc/localtime:/etc/localtime:ro \
                    -e FSL_COURSE_DATA="/root/examples/nipype-fsl_course_data" \
                    -v ~/examples:/root/examples:ro \
                    -v ~/scratch:/scratch \
                    -w /root/src/nipype \
                    nipype/nipype_test:py35 /usr/bin/run_pytest.sh