nipype.interfaces.ants package

Top-level namespace for ants.

AI

Link to code

Bases: ANTSCommand

Wrapped executable: antsAI.

Calculate the optimal linear transform parameters for aligning two images.

Examples

>>> AI(
...     fixed_image='structural.nii',
...     moving_image='epi.nii',
...     metric=('Mattes', 32, 'Regular', 1),
... ).cmdline
'antsAI -c [10,1e-06,10] -d 3 -m Mattes[structural.nii,epi.nii,32,Regular,1]
-o initialization.mat -p 0 -s [20,0.12] -t Affine[0.1] -v 0'
>>> AI(fixed_image='structural.nii',
...    moving_image='epi.nii',
...    metric=('Mattes', 32, 'Regular', 1),
...    search_grid=(12, (1, 1, 1)),
... ).cmdline
'antsAI -c [10,1e-06,10] -d 3 -m Mattes[structural.nii,epi.nii,32,Regular,1]
-o initialization.mat -p 0 -s [20,0.12] -g [12.0,1x1x1] -t Affine[0.1] -v 0'
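
A minimal usage sketch, not part of the upstream docstring: the interface can be run directly and the estimated transform read from its outputs (for instance to seed a subsequent Registration via its initial_moving_transform input). The filenames reuse the example data above; the run() call assumes ANTs is installed.

>>> from nipype.interfaces.ants import AI
>>> ai = AI(fixed_image='structural.nii', moving_image='epi.nii',
...         metric=('Mattes', 32, 'Regular', 1))
>>> res = ai.run()                      # doctest: +SKIP
>>> res.outputs.output_transform        # doctest: +SKIP
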
fixed_imagea pathlike object or string representing an existing file

Image to which the moving_image should be transformed.

metrica tuple of the form: (‘Mattes’ or ‘GC’ or ‘MI’, an integer (int or long), ‘Regular’ or ‘Random’ or ‘None’, 0.0 <= a floating point number <= 1.0)

The metric(s) to use. Maps to a command-line argument: -m %s.

moving_imagea pathlike object or string representing an existing file

Image that will be transformed to fixed_image.

argsa unicode string

Additional parameters to the command. Maps to a command-line argument: %s.

convergencea tuple of the form: (1 <= a long integer <= 10000, a float, 1 <= a long integer <= 100)

Convergence. Maps to a command-line argument: -c [%d,%g,%d]. (Nipype default value: (10, 1e-06, 10))

dimension3 or 2

Dimension of output image. Maps to a command-line argument: -d %d. (Nipype default value: 3)

environa dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

fixed_image_maska pathlike object or string representing an existing file

Fixed image mask. Maps to a command-line argument: -x %s.

moving_image_maska pathlike object or string representing an existing file

Moving image mask. Requires inputs: fixed_image_mask.

num_threadsan integer (int or long)

Number of ITK threads to use. (Nipype default value: 1)

output_transforma pathlike object or string representing a file

Output file name. Maps to a command-line argument: -o %s. (Nipype default value: initialization.mat)

principal_axesa boolean

Align using principal axes. Maps to a command-line argument: -p %d. Mutually exclusive with inputs: blobs. (Nipype default value: False)

search_factora tuple of the form: (a float, 0.0 <= a floating point number <= 1.0)

Search factor. Maps to a command-line argument: -s [%g,%g]. (Nipype default value: (20, 0.12))

search_grida tuple of the form: (a float, a tuple of the form: (a float, a float, a float)) or a tuple of the form: (a float, a tuple of the form: (a float, a float))

Translation search grid in mm. Maps to a command-line argument: -g %s.

transforma tuple of the form: (‘Affine’ or ‘Rigid’ or ‘Similarity’, a floating point number > 0.0)

Several transform options are available. Maps to a command-line argument: -t %s[%g]. (Nipype default value: ('Affine', 0.1))

verbosea boolean

Enable verbosity. Maps to a command-line argument: -v %d. (Nipype default value: False)

output_transforma pathlike object or string representing an existing file

Output file name.

ANTS

Link to code

Bases: ANTSCommand

Wrapped executable: ANTS.

ANTS wrapper for registration of images (old, use Registration instead)

Examples

>>> from nipype.interfaces.ants import ANTS
>>> ants = ANTS()
>>> ants.inputs.dimension = 3
>>> ants.inputs.output_transform_prefix = 'MY'
>>> ants.inputs.metric = ['CC']
>>> ants.inputs.fixed_image = ['T1.nii']
>>> ants.inputs.moving_image = ['resting.nii']
>>> ants.inputs.metric_weight = [1.0]
>>> ants.inputs.radius = [5]
>>> ants.inputs.transformation_model = 'SyN'
>>> ants.inputs.gradient_step_length = 0.25
>>> ants.inputs.number_of_iterations = [50, 35, 15]
>>> ants.inputs.use_histogram_matching = True
>>> ants.inputs.mi_option = [32, 16000]
>>> ants.inputs.regularization = 'Gauss'
>>> ants.inputs.regularization_gradient_field_sigma = 3
>>> ants.inputs.regularization_deformation_field_sigma = 0
>>> ants.inputs.number_of_affine_iterations = [10000,10000,10000,10000,10000]
>>> ants.cmdline
'ANTS 3 --MI-option 32x16000 --image-metric CC[ T1.nii, resting.nii, 1, 5 ] --number-of-affine-iterations 10000x10000x10000x10000x10000 --number-of-iterations 50x35x15 --output-naming MY --regularization Gauss[3.0,0.0] --transformation-model SyN[0.25] --use-Histogram-Matching 1'
fixed_imagea list of items which are a pathlike object or string representing an existing file

Image to which the moving image is warped.

metric : a list of items which are ‘CC’ or ‘MI’ or ‘SMI’ or ‘PR’ or ‘SSD’ or ‘MSQ’ or ‘PSE’

metric_weight : a list of items which are a float

The metric weight(s) for each stage. The weights must sum to 1 per stage. Requires inputs: metric. (Nipype default value: [1.0])

moving_imagea list of items which are a pathlike object or string representing an existing file

Image to apply transformation to (generally a coregistered functional). Maps to a command-line argument: %s.

output_transform_prefixa unicode string

Maps to a command-line argument: --output-naming %s. (Nipype default value: out)

radiusa list of items which are an integer (int or long)

Radius of the region (i.e. number of layers around a voxel/pixel) that is used for computing cross correlation. Requires inputs: metric.

transformation_model‘Diff’ or ‘Elast’ or ‘Exp’ or ‘Greedy Exp’ or ‘SyN’

Maps to a command-line argument: %s.

affine_gradient_descent_optiona list of items which are a float

Maps to a command-line argument: %s.

argsa unicode string

Additional parameters to the command. Maps to a command-line argument: %s.

delta_timea float

Requires inputs: number_of_time_steps.

dimension3 or 2

Image dimension (2 or 3). Maps to a command-line argument: %d (position: 1).

environa dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

gradient_step_lengtha float

Requires inputs: transformation_model.

mi_optiona list of items which are an integer (int or long)

Maps to a command-line argument: --MI-option %s.

num_threadsan integer (int or long)

Number of ITK threads to use. (Nipype default value: 1)

number_of_affine_iterationsa list of items which are an integer (int or long)

Maps to a command-line argument: --number-of-affine-iterations %s.

number_of_iterationsa list of items which are an integer (int or long)

Maps to a command-line argument: --number-of-iterations %s.

number_of_time_stepsan integer (int or long)

Requires inputs: gradient_step_length.

regularization‘Gauss’ or ‘DMFFD’

Maps to a command-line argument: %s.

regularization_deformation_field_sigmaa float

Requires inputs: regularization.

regularization_gradient_field_sigmaa float

Requires inputs: regularization.

smoothing_sigmasa list of items which are an integer (int or long)

Maps to a command-line argument: --gaussian-smoothing-sigmas %s.

subsampling_factorsa list of items which are an integer (int or long)

Maps to a command-line argument: --subsampling-factors %s.

symmetry_typea float

Requires inputs: delta_time.

use_histogram_matchinga boolean

Maps to a command-line argument: %s. (Nipype default value: True)

affine_transforma pathlike object or string representing an existing file

Affine transform file.

inverse_warp_transforma pathlike object or string representing an existing file

Inverse warping deformation field.

metaheadera pathlike object or string representing an existing file

VTK metaheader .mhd file.

metaheader_rawa pathlike object or string representing an existing file

VTK metaheader .raw file.

warp_transforma pathlike object or string representing an existing file

Warping deformation field.

AffineInitializer

Link to code

Bases: ANTSCommand

Wrapped executable: antsAffineInitializer.

Initialize an affine transform (as in antsBrainExtraction.sh)

>>> from nipype.interfaces.ants import AffineInitializer
>>> init = AffineInitializer()
>>> init.inputs.fixed_image = 'fixed1.nii'
>>> init.inputs.moving_image = 'moving1.nii'
>>> init.cmdline
'antsAffineInitializer 3 fixed1.nii moving1.nii transform.mat 15.000000 0.100000 0 10'
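
As an illustrative sketch (not from the upstream docstring), the estimated affine is typically used to seed a full registration; the snippet below wires out_file into Registration's initial_moving_transform input inside a Workflow. The Registration input name comes from that interface's spec and is an assumption here.

>>> from nipype import Node, Workflow
>>> from nipype.interfaces.ants import AffineInitializer, Registration
>>> init_node = Node(AffineInitializer(), name='init_affine')
>>> reg_node = Node(Registration(), name='register')
>>> wf = Workflow(name='affine_then_syn')
>>> wf.connect(init_node, 'out_file', reg_node, 'initial_moving_transform')
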
fixed_imagea pathlike object or string representing an existing file

Reference image. Maps to a command-line argument: %s (position: 1).

moving_imagea pathlike object or string representing an existing file

Moving image. Maps to a command-line argument: %s (position: 2).

argsa unicode string

Additional parameters to the command. Maps to a command-line argument: %s.

dimension3 or 2

Dimension. Maps to a command-line argument: %s (position: 0). (Nipype default value: 3)

environa dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

local_searchan integer (int or long)

Determines if a local optimization is run at each search point for the set number of iterations. Maps to a command-line argument: %d (position: 7). (Nipype default value: 10)

num_threadsan integer (int or long)

Number of ITK threads to use. (Nipype default value: 1)

out_filea pathlike object or string representing a file

Output transform file. Maps to a command-line argument: %s (position: 3). (Nipype default value: transform.mat)

principal_axesa boolean

Whether the rotation is searched around an initial principal axis alignment. Maps to a command-line argument: %d (position: 6). (Nipype default value: False)

radian_fraction0.0 <= a floating point number <= 1.0

Search this arc +/- principal axes. Maps to a command-line argument: %f (position: 5). (Nipype default value: 0.1)

search_factora float

Increments (degrees) for affine search. Maps to a command-line argument: %f (position: 4). (Nipype default value: 15.0)

out_filea pathlike object or string representing a file

Output transform file.

AntsJointFusion

Link to code

alias of nipype.interfaces.ants.segmentation.JointFusion

ApplyTransforms

Link to code

Bases: ANTSCommand

Wrapped executable: antsApplyTransforms.

ApplyTransforms, applied to an input image, transforms it according to a reference image and a transform (or a set of transforms).

Examples

>>> from nipype.interfaces.ants import ApplyTransforms
>>> at = ApplyTransforms()
>>> at.inputs.input_image = 'moving1.nii'
>>> at.inputs.reference_image = 'fixed1.nii'
>>> at.inputs.transforms = 'identity'
>>> at.cmdline
'antsApplyTransforms --default-value 0 --float 0 --input moving1.nii --interpolation Linear --output moving1_trans.nii --reference-image fixed1.nii --transform identity'
>>> at = ApplyTransforms()
>>> at.inputs.dimension = 3
>>> at.inputs.input_image = 'moving1.nii'
>>> at.inputs.reference_image = 'fixed1.nii'
>>> at.inputs.output_image = 'deformed_moving1.nii'
>>> at.inputs.interpolation = 'Linear'
>>> at.inputs.default_value = 0
>>> at.inputs.transforms = ['ants_Warp.nii.gz', 'trans.mat']
>>> at.inputs.invert_transform_flags = [False, True]
>>> at.cmdline
'antsApplyTransforms --default-value 0 --dimensionality 3 --float 0 --input moving1.nii --interpolation Linear --output deformed_moving1.nii --reference-image fixed1.nii --transform ants_Warp.nii.gz --transform [ trans.mat, 1 ]'
>>> at1 = ApplyTransforms()
>>> at1.inputs.dimension = 3
>>> at1.inputs.input_image = 'moving1.nii'
>>> at1.inputs.reference_image = 'fixed1.nii'
>>> at1.inputs.output_image = 'deformed_moving1.nii'
>>> at1.inputs.interpolation = 'BSpline'
>>> at1.inputs.interpolation_parameters = (5,)
>>> at1.inputs.default_value = 0
>>> at1.inputs.transforms = ['ants_Warp.nii.gz', 'trans.mat']
>>> at1.inputs.invert_transform_flags = [False, False]
>>> at1.cmdline
'antsApplyTransforms --default-value 0 --dimensionality 3 --float 0 --input moving1.nii --interpolation BSpline[ 5 ] --output deformed_moving1.nii --reference-image fixed1.nii --transform ants_Warp.nii.gz --transform trans.mat'

Identity transforms may be used as part of a chain:

>>> at2 = ApplyTransforms()
>>> at2.inputs.dimension = 3
>>> at2.inputs.input_image = 'moving1.nii'
>>> at2.inputs.reference_image = 'fixed1.nii'
>>> at2.inputs.output_image = 'deformed_moving1.nii'
>>> at2.inputs.interpolation = 'BSpline'
>>> at2.inputs.interpolation_parameters = (5,)
>>> at2.inputs.default_value = 0
>>> at2.inputs.transforms = ['identity', 'ants_Warp.nii.gz', 'trans.mat']
>>> at2.cmdline
'antsApplyTransforms --default-value 0 --dimensionality 3 --float 0 --input moving1.nii --interpolation BSpline[ 5 ] --output deformed_moving1.nii --reference-image fixed1.nii --transform identity --transform ants_Warp.nii.gz --transform trans.mat'
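
A hedged workflow sketch, not part of the upstream docstring: a common pattern is to compute a composite transform with Registration and resample another image with ApplyTransforms. The write_composite_transform input and composite_transform output belong to the Registration interface (documented elsewhere) and are assumptions here; the remaining inputs (input_image, reference_image) would still need to be set.

>>> from nipype import Node, Workflow
>>> from nipype.interfaces.ants import Registration, ApplyTransforms
>>> reg = Node(Registration(write_composite_transform=True), name='register')
>>> resample = Node(ApplyTransforms(interpolation='LanczosWindowedSinc'),
...                 name='resample')
>>> wf = Workflow(name='register_then_resample')
>>> wf.connect(reg, 'composite_transform', resample, 'transforms')
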
input_imagea pathlike object or string representing an existing file

Image to apply transformation to (generally a coregistered functional). Maps to a command-line argument: --input %s.

reference_imagea pathlike object or string representing an existing file

Reference image space that you wish to warp INTO. Maps to a command-line argument: --reference-image %s.

transformsa list of items which are a pathlike object or string representing an existing file or ‘identity’

Transform files: will be applied in reverse order. For example, the last specified transform will be applied first. Maps to a command-line argument: %s.

argsa unicode string

Additional parameters to the command. Maps to a command-line argument: %s.

default_valuea float

Maps to a command-line argument: --default-value %g. (Nipype default value: 0.0)

dimension2 or 3 or 4

This option forces the image to be treated as a specified-dimensional image. If not specified, antsWarp tries to infer the dimensionality from the input image. Maps to a command-line argument: --dimensionality %d.

environa dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

floata boolean

Use float instead of double for computations. Maps to a command-line argument: --float %d. (Nipype default value: False)

input_image_type0 or 1 or 2 or 3

Option specifying the input image type of scalar (default), vector, tensor, or time series. Maps to a command-line argument: --input-image-type %d.

interpolation‘Linear’ or ‘NearestNeighbor’ or ‘CosineWindowedSinc’ or ‘WelchWindowedSinc’ or ‘HammingWindowedSinc’ or ‘LanczosWindowedSinc’ or ‘MultiLabel’ or ‘Gaussian’ or ‘BSpline’

Maps to a command-line argument: %s. (Nipype default value: Linear)

interpolation_parameters : a tuple of the form: (an integer (int or long)) or a tuple of the form: (a float, a float)

invert_transform_flags : a list of items which are a boolean

num_threads : an integer (int or long)

Number of ITK threads to use. (Nipype default value: 1)

out_postfixa unicode string

Postfix that is appended to all output files (default = _trans). (Nipype default value: _trans)

output_imagea unicode string

Output file name. Maps to a command-line argument: --output %s.

print_out_composite_warp_filea boolean

Output a composite warp file instead of a transformed image. Requires inputs: output_image.

output_imagea pathlike object or string representing an existing file

Warped image.

ApplyTransformsToPoints

Link to code

Bases: ANTSCommand

Wrapped executable: antsApplyTransformsToPoints.

ApplyTransformsToPoints, applied to a CSV file, transforms coordinates using the provided transform (or a set of transforms).

Examples

>>> from nipype.interfaces.ants import ApplyTransformsToPoints
>>> at = ApplyTransformsToPoints()
>>> at.inputs.dimension = 3
>>> at.inputs.input_file = 'moving.csv'
>>> at.inputs.transforms = ['trans.mat', 'ants_Warp.nii.gz']
>>> at.inputs.invert_transform_flags = [False, False]
>>> at.cmdline
'antsApplyTransformsToPoints --dimensionality 3 --input moving.csv --output moving_transformed.csv --transform [ trans.mat, 0 ] --transform [ ants_Warp.nii.gz, 0 ]'
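
The sketch below is not part of the upstream docstring; it shows one way to prepare the physical-space CSV that input_file expects (see the column description below), using only the Python standard library and a made-up coordinate.

>>> import csv
>>> with open('moving.csv', 'w', newline='') as f:
...     writer = csv.writer(f)
...     writer.writerow(['x', 'y', 'z'])        # 3D points: x,y,z column headers
...     writer.writerow([10.0, -12.5, 42.0])    # one point, in physical space
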
input_filea pathlike object or string representing an existing file

Currently, the only input supported is a csv file with columns including x,y (2D), x,y,z (3D) or x,y,z,t,label (4D) column headers. The points should be defined in physical space. If in doubt how to convert coordinates from your files to the space required by antsApplyTransformsToPoints try creating/drawing a simple label volume with only one voxel set to 1 and all others set to 0. Write down the voxel coordinates. Then use ImageMaths LabelStats to find out what coordinates for this voxel antsApplyTransformsToPoints is expecting. Maps to a command-line argument: --input %s.

transformsa list of items which are a pathlike object or string representing an existing file

Transforms that will be applied to the points. Maps to a command-line argument: %s.

argsa unicode string

Additional parameters to the command. Maps to a command-line argument: %s.

dimension2 or 3 or 4

This option forces the image to be treated as a specified-dimensional image. If not specified, antsWarp tries to infer the dimensionality from the input image. Maps to a command-line argument: --dimensionality %d.

environa dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

invert_transform_flagsa list of items which are a boolean

List indicating if a transform should be reversed.

num_threadsan integer (int or long)

Number of ITK threads to use. (Nipype default value: 1)

output_filea unicode string

Name of the output CSV file. Maps to a command-line argument: --output %s.

output_filea pathlike object or string representing an existing file

Csv file with transformed coordinates.

Atropos

Link to code

Bases: ANTSCommand

Wrapped executable: Atropos.

A multivariate n-class segmentation algorithm.

A finite mixture modeling (FMM) segmentation approach with possibilities for specifying prior constraints. These prior constraints include the specification of a prior label image, prior probability images (one for each class), and/or an MRF prior to enforce spatial smoothing of the labels. Similar algorithms include FAST and SPM.

Examples

>>> from nipype.interfaces.ants import Atropos
>>> at = Atropos(
...     dimension=3, intensity_images='structural.nii', mask_image='mask.nii',
...     number_of_tissue_classes=2, likelihood_model='Gaussian', save_posteriors=True,
...     mrf_smoothing_factor=0.2, mrf_radius=[1, 1, 1], icm_use_synchronous_update=True,
...     maximum_number_of_icm_terations=1, n_iterations=5, convergence_threshold=0.000001,
...     posterior_formulation='Socrates', use_mixture_model_proportions=True)
>>> at.inputs.initialization = 'Random'
>>> at.cmdline
'Atropos --image-dimensionality 3 --icm [1,1]
--initialization Random[2] --intensity-image structural.nii
--likelihood-model Gaussian --mask-image mask.nii --mrf [0.2,1x1x1] --convergence [5,1e-06]
--output [structural_labeled.nii,POSTERIOR_%02d.nii.gz] --posterior-formulation Socrates[1]
--use-random-seed 1'
>>> at = Atropos(
...     dimension=3, intensity_images='structural.nii', mask_image='mask.nii',
...     number_of_tissue_classes=2, likelihood_model='Gaussian', save_posteriors=True,
...     mrf_smoothing_factor=0.2, mrf_radius=[1, 1, 1], icm_use_synchronous_update=True,
...     maximum_number_of_icm_terations=1, n_iterations=5, convergence_threshold=0.000001,
...     posterior_formulation='Socrates', use_mixture_model_proportions=True)
>>> at.inputs.initialization = 'KMeans'
>>> at.inputs.kmeans_init_centers = [100, 200]
>>> at.cmdline
'Atropos --image-dimensionality 3 --icm [1,1]
--initialization KMeans[2,100,200] --intensity-image structural.nii
--likelihood-model Gaussian --mask-image mask.nii --mrf [0.2,1x1x1] --convergence [5,1e-06]
--output [structural_labeled.nii,POSTERIOR_%02d.nii.gz] --posterior-formulation Socrates[1]
--use-random-seed 1'
>>> at = Atropos(
...     dimension=3, intensity_images='structural.nii', mask_image='mask.nii',
...     number_of_tissue_classes=2, likelihood_model='Gaussian', save_posteriors=True,
...     mrf_smoothing_factor=0.2, mrf_radius=[1, 1, 1], icm_use_synchronous_update=True,
...     maximum_number_of_icm_terations=1, n_iterations=5, convergence_threshold=0.000001,
...     posterior_formulation='Socrates', use_mixture_model_proportions=True)
>>> at.inputs.initialization = 'PriorProbabilityImages'
>>> at.inputs.prior_image = 'BrainSegmentationPrior%02d.nii.gz'
>>> at.inputs.prior_weighting = 0.8
>>> at.inputs.prior_probability_threshold = 0.0000001
>>> at.cmdline
'Atropos --image-dimensionality 3 --icm [1,1]
--initialization PriorProbabilityImages[2,BrainSegmentationPrior%02d.nii.gz,0.8,1e-07]
--intensity-image structural.nii --likelihood-model Gaussian --mask-image mask.nii
--mrf [0.2,1x1x1] --convergence [5,1e-06]
--output [structural_labeled.nii,POSTERIOR_%02d.nii.gz]
--posterior-formulation Socrates[1] --use-random-seed 1'
>>> at = Atropos(
...     dimension=3, intensity_images='structural.nii', mask_image='mask.nii',
...     number_of_tissue_classes=2, likelihood_model='Gaussian', save_posteriors=True,
...     mrf_smoothing_factor=0.2, mrf_radius=[1, 1, 1], icm_use_synchronous_update=True,
...     maximum_number_of_icm_terations=1, n_iterations=5, convergence_threshold=0.000001,
...     posterior_formulation='Socrates', use_mixture_model_proportions=True)
>>> at.inputs.initialization = 'PriorLabelImage'
>>> at.inputs.prior_image = 'segmentation0.nii.gz'
>>> at.inputs.number_of_tissue_classes = 2
>>> at.inputs.prior_weighting = 0.8
>>> at.cmdline
'Atropos --image-dimensionality 3 --icm [1,1]
--initialization PriorLabelImage[2,segmentation0.nii.gz,0.8] --intensity-image structural.nii
--likelihood-model Gaussian --mask-image mask.nii --mrf [0.2,1x1x1] --convergence [5,1e-06]
--output [structural_labeled.nii,POSTERIOR_%02d.nii.gz] --posterior-formulation Socrates[1]
--use-random-seed 1'
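
An illustrative wiring sketch (not from the upstream docstring): the skull-stripped image and mask produced by BrainExtraction can feed Atropos directly. The output names come from the BrainExtraction interface documented further down this page.

>>> from nipype import Node, Workflow
>>> from nipype.interfaces.ants import BrainExtraction, Atropos
>>> be = Node(BrainExtraction(), name='brain_extraction')
>>> seg = Node(Atropos(initialization='KMeans', number_of_tissue_classes=3),
...            name='segment')
>>> wf = Workflow(name='extract_then_segment')
>>> wf.connect([(be, seg, [('BrainExtractionBrain', 'intensity_images'),
...                        ('BrainExtractionMask', 'mask_image')])])
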
initialization‘Random’ or ‘Otsu’ or ‘KMeans’ or ‘PriorProbabilityImages’ or ‘PriorLabelImage’

Maps to a command-line argument: %s. Requires inputs: number_of_tissue_classes.

intensity_imagesa list of items which are a pathlike object or string representing an existing file

Maps to a command-line argument: --intensity-image %s....

mask_imagea pathlike object or string representing an existing file

Maps to a command-line argument: --mask-image %s.

number_of_tissue_classes : an integer (int or long)

argsa unicode string

Additional parameters to the command. Maps to a command-line argument: %s.

convergence_thresholda float

Requires inputs: n_iterations.

dimension3 or 2 or 4

Image dimension (2, 3, or 4). Maps to a command-line argument: --image-dimensionality %d. (Nipype default value: 3)

environa dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

icm_use_synchronous_updatea boolean

Maps to a command-line argument: %s.

kmeans_init_centers : a list of at least 1 items which are an integer (int or long) or a float

likelihood_model : a unicode string

Maps to a command-line argument: --likelihood-model %s.

maximum_number_of_icm_terationsan integer (int or long)

Requires inputs: icm_use_synchronous_update.

mrf_radiusa list of items which are an integer (int or long)

Requires inputs: mrf_smoothing_factor.

mrf_smoothing_factora float

Maps to a command-line argument: %s.

n_iterationsan integer (int or long)

Maps to a command-line argument: %s.

num_threadsan integer (int or long)

Number of ITK threads to use. (Nipype default value: 1)

out_classified_image_namea pathlike object or string representing a file

Maps to a command-line argument: %s.

output_posteriors_name_templatea unicode string

(Nipype default value: POSTERIOR_%02d.nii.gz)

posterior_formulationa unicode string

Maps to a command-line argument: %s.

prior_imagea pathlike object or string representing an existing file or a unicode string

Either a string pattern (e.g., ‘prior%02d.nii’) or an existing vector-image file.

prior_probability_thresholda float

Requires inputs: prior_weighting.

prior_weighting : a float

save_posteriors : a boolean

use_mixture_model_proportions : a boolean

Requires inputs: posterior_formulation.

use_random_seeda boolean

Use random seed value over constant. Maps to a command-line argument: --use-random-seed %d. (Nipype default value: True)

classified_image : a pathlike object or string representing an existing file

posteriors : a list of items which are a pathlike object or string representing a file

AverageAffineTransform

Link to code

Bases: ANTSCommand

Wrapped executable: AverageAffineTransform.

Examples

>>> from nipype.interfaces.ants import AverageAffineTransform
>>> avg = AverageAffineTransform()
>>> avg.inputs.dimension = 3
>>> avg.inputs.transforms = ['trans.mat', 'func_to_struct.mat']
>>> avg.inputs.output_affine_transform = 'MYtemplatewarp.mat'
>>> avg.cmdline
'AverageAffineTransform 3 MYtemplatewarp.mat trans.mat func_to_struct.mat'
dimension3 or 2

Image dimension (2 or 3). Maps to a command-line argument: %d (position: 0).

output_affine_transforma pathlike object or string representing a file

The name of the resulting transform file (e.g. outputfname.txt). Maps to a command-line argument: %s (position: 1).

transformsa list of items which are a pathlike object or string representing an existing file

Transforms to average. Maps to a command-line argument: %s (position: 3).

argsa unicode string

Additional parameters to the command. Maps to a command-line argument: %s.

environa dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

num_threadsan integer (int or long)

Number of ITK threads to use. (Nipype default value: 1)

affine_transforma pathlike object or string representing an existing file

Average transform file.

AverageImages

Link to code

Bases: ANTSCommand

Wrapped executable: AverageImages.

Examples

>>> from nipype.interfaces.ants import AverageImages
>>> avg = AverageImages()
>>> avg.inputs.dimension = 3
>>> avg.inputs.output_average_image = "average.nii.gz"
>>> avg.inputs.normalize = True
>>> avg.inputs.images = ['rc1s1.nii', 'rc1s1.nii']
>>> avg.cmdline
'AverageImages 3 average.nii.gz 1 rc1s1.nii rc1s1.nii'
dimension3 or 2

Image dimension (2 or 3). Maps to a command-line argument: %d (position: 0).

imagesa list of items which are a pathlike object or string representing an existing file

Image to apply transformation to (generally a coregistered functional). Maps to a command-line argument: %s (position: 3).

normalizea boolean

Normalize: if true, the 2nd image is divided by its mean. This will select the largest image to average into. Maps to a command-line argument: %d (position: 2).

argsa unicode string

Additional parameters to the command. Maps to a command-line argument: %s.

environa dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

num_threadsan integer (int or long)

Number of ITK threads to use. (Nipype default value: 1)

output_average_imagea pathlike object or string representing a file

The name of the resulting image. Maps to a command-line argument: %s (position: 1). (Nipype default value: average.nii)

output_average_imagea pathlike object or string representing an existing file

Average image file.

BrainExtraction

Link to code

Bases: ANTSCommand

Wrapped executable: antsBrainExtraction.sh.

Atlas-based brain extraction.

Examples

>>> from nipype.interfaces.ants.segmentation import BrainExtraction
>>> brainextraction = BrainExtraction()
>>> brainextraction.inputs.dimension = 3
>>> brainextraction.inputs.anatomical_image ='T1.nii.gz'
>>> brainextraction.inputs.brain_template = 'study_template.nii.gz'
>>> brainextraction.inputs.brain_probability_mask ='ProbabilityMaskOfStudyTemplate.nii.gz'
>>> brainextraction.cmdline
'antsBrainExtraction.sh -a T1.nii.gz -m ProbabilityMaskOfStudyTemplate.nii.gz
-e study_template.nii.gz -d 3 -s nii.gz -o highres001_'
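
A minimal Node-based sketch, not part of the upstream docstring, reusing the example data above and enabling multi-threading through the num_threads input documented below.

>>> from nipype import Node
>>> be_node = Node(BrainExtraction(dimension=3,
...                                anatomical_image='T1.nii.gz',
...                                brain_template='study_template.nii.gz',
...                                brain_probability_mask='ProbabilityMaskOfStudyTemplate.nii.gz',
...                                num_threads=4),
...                name='brain_extraction')
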
anatomical_imagea pathlike object or string representing an existing file

Structural image, typically T1. If more than one anatomical image is specified, subsequently specified images are used during the segmentation process. However, only the first image is used in the registration of priors. Our suggestion would be to specify the T1 as the first image. Anatomical template created using e.g. LPBA40 data set with buildtemplateparallel.sh in ANTs. Maps to a command-line argument: -a %s.

brain_probability_maska pathlike object or string representing an existing file

Brain probability mask created using e.g. LPBA40 data set which have brain masks defined, and warped to anatomical template and averaged resulting in a probability image. Maps to a command-line argument: -m %s.

brain_templatea pathlike object or string representing an existing file

Anatomical template created using e.g. LPBA40 data set with buildtemplateparallel.sh in ANTs. Maps to a command-line argument: -e %s.

argsa unicode string

Additional parameters to the command. Maps to a command-line argument: %s.

debuga boolean

If > 0, runs a faster version of the script. Only for testing. Implies -u 0. Requires single thread computation for complete reproducibility. Maps to a command-line argument: -z 1.

dimension3 or 2

Image dimension (2 or 3). Maps to a command-line argument: -d %d. (Nipype default value: 3)

environa dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

extraction_registration_maska pathlike object or string representing an existing file

Mask (defined in the template space) used during registration for brain extraction. To limit the metric computation to a specific region. Maps to a command-line argument: -f %s.

image_suffixa unicode string

Any of standard ITK formats, nii.gz is default. Maps to a command-line argument: -s %s. (Nipype default value: nii.gz)

keep_temporary_filesan integer (int or long)

Keep brain extraction/segmentation warps, etc (default = 0). Maps to a command-line argument: -k %d.

num_threadsan integer (int or long)

Number of ITK threads to use. (Nipype default value: 1)

out_prefixa unicode string

Prefix that is prepended to all output files. Maps to a command-line argument: -o %s. (Nipype default value: highres001_)

use_floatingpoint_precision0 or 1

Use floating point precision in registrations (default = 0). Maps to a command-line argument: -q %d.

use_random_seeding0 or 1

Use random number generated from system clock in Atropos (default = 1). Maps to a command-line argument: -u %d.

BrainExtractionBraina pathlike object or string representing an existing file

Brain extraction image.

BrainExtractionCSFa pathlike object or string representing an existing file

Segmentation mask with only CSF.

BrainExtractionGMa pathlike object or string representing an existing file

Segmentation mask with only grey matter.

BrainExtractionInitialAffine : a pathlike object or string representing an existing file

BrainExtractionInitialAffineFixed : a pathlike object or string representing an existing file

BrainExtractionInitialAffineMoving : a pathlike object or string representing an existing file

BrainExtractionLaplacian : a pathlike object or string representing an existing file

BrainExtractionMask : a pathlike object or string representing an existing file

Brain extraction mask.

BrainExtractionPrior0GenericAffine : a pathlike object or string representing an existing file

BrainExtractionPrior1InverseWarp : a pathlike object or string representing an existing file

BrainExtractionPrior1Warp : a pathlike object or string representing an existing file

BrainExtractionPriorWarped : a pathlike object or string representing an existing file

BrainExtractionSegmentation : a pathlike object or string representing an existing file

Segmentation mask with CSF, GM, and WM.

BrainExtractionTemplateLaplacian : a pathlike object or string representing an existing file

BrainExtractionTmp : a pathlike object or string representing an existing file

BrainExtractionWM : a pathlike object or string representing an existing file

Segmentation mask with only white matter.

N4Corrected0a pathlike object or string representing an existing file

N4 bias field corrected image.

N4Truncated0 : a pathlike object or string representing an existing file

ComposeMultiTransform

Link to code

Bases: ANTSCommand

Wrapped executable: ComposeMultiTransform.

Take a set of transformations and convert them to a single transformation matrix/warpfield.

Examples

>>> from nipype.interfaces.ants import ComposeMultiTransform
>>> compose_transform = ComposeMultiTransform()
>>> compose_transform.inputs.dimension = 3
>>> compose_transform.inputs.transforms = ['struct_to_template.mat', 'func_to_struct.mat']
>>> compose_transform.cmdline
'ComposeMultiTransform 3 struct_to_template_composed.mat
struct_to_template.mat func_to_struct.mat'
transformsa list of items which are a pathlike object or string representing an existing file

Transforms to average. Maps to a command-line argument: %s (position: 3).

argsa unicode string

Additional parameters to the command. Maps to a command-line argument: %s.

dimension3 or 2

Image dimension (2 or 3). Maps to a command-line argument: %d (position: 0). (Nipype default value: 3)

environa dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

num_threadsan integer (int or long)

Number of ITK threads to use. (Nipype default value: 1)

output_transforma pathlike object or string representing a file

The name of the resulting transform. Maps to a command-line argument: %s (position: 1).

reference_imagea pathlike object or string representing a file

Reference image (only necessary when output is warpfield). Maps to a command-line argument: %s (position: 2).

output_transforma pathlike object or string representing an existing file

Composed transform file.

CompositeTransformUtil

Link to code

Bases: ANTSCommand

Wrapped executable: CompositeTransformUtil.

ANTs utility which can combine or break apart transform files into their individual constituent components.

Examples

>>> from nipype.interfaces.ants import CompositeTransformUtil
>>> tran = CompositeTransformUtil()
>>> tran.inputs.process = 'disassemble'
>>> tran.inputs.in_file = 'output_Composite.h5'
>>> tran.cmdline
'CompositeTransformUtil --disassemble output_Composite.h5 transform'
>>> tran.run()  

Example of assembling transformation files:

>>> from nipype.interfaces.ants import CompositeTransformUtil
>>> tran = CompositeTransformUtil()
>>> tran.inputs.process = 'assemble'
>>> tran.inputs.out_file = 'my.h5'
>>> tran.inputs.in_file = ['AffineTransform.mat', 'DisplacementFieldTransform.nii.gz']
>>> tran.cmdline
'CompositeTransformUtil --assemble my.h5 AffineTransform.mat DisplacementFieldTransform.nii.gz '
>>> tran.run()  
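
A hedged sketch (not from the upstream docstring): after disassembly, the affine_transform and displacement_field outputs listed below can be passed to downstream interfaces such as ApplyTransforms.

>>> from nipype import Node
>>> split = Node(CompositeTransformUtil(process='disassemble'),
...              name='split_transform')
>>> split.inputs.in_file = 'output_Composite.h5'
>>> # The affine_transform and displacement_field outputs can then be connected
>>> # downstream, e.g. wf.connect(split, 'displacement_field', apply, 'transforms').
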
in_filea list of items which are a pathlike object or string representing an existing file

Input transform file(s). Maps to a command-line argument: %s... (position: 3).

argsa unicode string

Additional parameters to the command. Maps to a command-line argument: %s.

environa dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

num_threadsan integer (int or long)

Number of ITK threads to use. (Nipype default value: 1)

out_filea pathlike object or string representing a file

Output file path (only used for disassembly). Maps to a command-line argument: %s (position: 2).

output_prefixa unicode string

A prefix that is prepended to all output files (only used for assembly). Maps to a command-line argument: %s (position: 4). (Nipype default value: transform)

process‘assemble’ or ‘disassemble’

What to do with the transform inputs (assemble or disassemble). Maps to a command-line argument: --%s (position: 1). (Nipype default value: assemble)

affine_transforma pathlike object or string representing a file

Affine transform component.

displacement_fielda pathlike object or string representing a file

Displacement field component.

out_filea pathlike object or string representing a file

Compound transformation file.

ConvertScalarImageToRGB

Link to code

Bases: ANTSCommand

Wrapped executable: ConvertScalarImageToRGB.

Convert scalar images to RGB.

Examples

>>> from nipype.interfaces.ants.visualization import ConvertScalarImageToRGB
>>> converter = ConvertScalarImageToRGB()
>>> converter.inputs.dimension = 3
>>> converter.inputs.input_image = 'T1.nii.gz'
>>> converter.inputs.colormap = 'jet'
>>> converter.inputs.minimum_input = 0
>>> converter.inputs.maximum_input = 6
>>> converter.cmdline
'ConvertScalarImageToRGB 3 T1.nii.gz rgb.nii.gz none jet none 0 6 0 255'
colormap‘grey’ or ‘red’ or ‘green’ or ‘blue’ or ‘copper’ or ‘jet’ or ‘hsv’ or ‘spring’ or ‘summer’ or ‘autumn’ or ‘winter’ or ‘hot’ or ‘cool’ or ‘overunder’ or ‘custom’

Select a colormap. Maps to a command-line argument: %s (position: 4).

dimension3 or 2

Image dimension (2 or 3). Maps to a command-line argument: %d (position: 0). (Nipype default value: 3)

input_imagea pathlike object or string representing an existing file

Main input is a 3-D grayscale image. Maps to a command-line argument: %s (position: 1).

maximum_inputan integer (int or long)

Maximum input. Maps to a command-line argument: %d (position: 7).

minimum_inputan integer (int or long)

Minimum input. Maps to a command-line argument: %d (position: 6).

argsa unicode string

Additional parameters to the command. Maps to a command-line argument: %s.

custom_color_map_filea unicode string

Custom color map file. Maps to a command-line argument: %s (position: 5). (Nipype default value: none)

environa dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

mask_imagea pathlike object or string representing an existing file

Mask image. Maps to a command-line argument: %s (position: 3). (Nipype default value: none)

maximum_RGB_outputan integer (int or long)

Maps to a command-line argument: %d (position: 9). (Nipype default value: 255)

minimum_RGB_outputan integer (int or long)

Maps to a command-line argument: %d (position: 8). (Nipype default value: 0)

num_threadsan integer (int or long)

Number of ITK threads to use. (Nipype default value: 1)

output_imagea unicode string

Rgb output image. Maps to a command-line argument: %s (position: 2). (Nipype default value: rgb.nii.gz)

output_imagea pathlike object or string representing an existing file

Converted RGB image.

CorticalThickness

Link to code

Bases: ANTSCommand

Wrapped executable: antsCorticalThickness.sh.

Examples

>>> from nipype.interfaces.ants.segmentation import CorticalThickness
>>> corticalthickness = CorticalThickness()
>>> corticalthickness.inputs.dimension = 3
>>> corticalthickness.inputs.anatomical_image ='T1.nii.gz'
>>> corticalthickness.inputs.brain_template = 'study_template.nii.gz'
>>> corticalthickness.inputs.brain_probability_mask ='ProbabilityMaskOfStudyTemplate.nii.gz'
>>> corticalthickness.inputs.segmentation_priors = ['BrainSegmentationPrior01.nii.gz',
...                                                 'BrainSegmentationPrior02.nii.gz',
...                                                 'BrainSegmentationPrior03.nii.gz',
...                                                 'BrainSegmentationPrior04.nii.gz']
>>> corticalthickness.inputs.t1_registration_template = 'brain_study_template.nii.gz'
>>> corticalthickness.cmdline
'antsCorticalThickness.sh -a T1.nii.gz -m ProbabilityMaskOfStudyTemplate.nii.gz
-e study_template.nii.gz -d 3 -s nii.gz -o antsCT_
-p nipype_priors/BrainSegmentationPrior%02d.nii.gz -t brain_study_template.nii.gz'
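
An illustrative continuation, not part of the upstream docstring: running the configured interface and reading the main outputs listed below; run() assumes ANTs and the input data are available.

>>> res = corticalthickness.run()               # doctest: +SKIP
>>> res.outputs.CorticalThickness               # doctest: +SKIP
>>> res.outputs.BrainSegmentationPosteriors     # doctest: +SKIP
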
anatomical_imagea pathlike object or string representing an existing file

Structural intensity image, typically T1. If more than one anatomical image is specified, subsequently specified images are used during the segmentation process. However, only the first image is used in the registration of priors. Our suggestion would be to specify the T1 as the first image. Maps to a command-line argument: -a %s.

brain_probability_maska pathlike object or string representing an existing file

Brain probability mask in template space. Maps to a command-line argument: -m %s.

brain_templatea pathlike object or string representing an existing file

Anatomical intensity template (possibly created using a population data set with buildtemplateparallel.sh in ANTs). This template is not skull-stripped. Maps to a command-line argument: -e %s.

segmentation_priorsa list of items which are a pathlike object or string representing an existing file

Maps to a command-line argument: -p %s.

t1_registration_templatea pathlike object or string representing an existing file

Anatomical intensity template (assumed to be skull-stripped). A common case would be where this would be the same template as specified in the -e option which is not skull stripped. Maps to a command-line argument: -t %s.

argsa unicode string

Additional parameters to the command. Maps to a command-line argument: %s.

b_spline_smoothinga boolean

Use B-spline SyN for registrations and B-spline exponential mapping in DiReCT. Maps to a command-line argument: -v.

cortical_label_imagea pathlike object or string representing an existing file

Cortical ROI labels to use as a prior for ATITH.

debuga boolean

If > 0, runs a faster version of the script. Only for testing. Implies -u 0. Requires single thread computation for complete reproducibility. Maps to a command-line argument: -z 1.

dimension3 or 2

Image dimension (2 or 3). Maps to a command-line argument: -d %d. (Nipype default value: 3)

environa dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

extraction_registration_maska pathlike object or string representing an existing file

Mask (defined in the template space) used during registration for brain extraction. Maps to a command-line argument: -f %s.

image_suffixa unicode string

Any of standard ITK formats, nii.gz is default. Maps to a command-line argument: -s %s. (Nipype default value: nii.gz)

keep_temporary_filesan integer (int or long)

Keep brain extraction/segmentation warps, etc (default = 0). Maps to a command-line argument: -k %d.

label_propagationa unicode string

Incorporate a distance prior on the posterior formulation. Should be of the form ‘label[lambda,boundaryProbability]’ where label is a value of 1,2,3,… denoting label ID. The label probability for anything outside the current label = boundaryProbability * exp( -lambda * distanceFromBoundary ). Intuitively, smaller lambda values will increase the spatial capture range of the distance prior. To apply to all label values, simply omit specifying the label, i.e. -l [lambda,boundaryProbability]. Maps to a command-line argument: -l %s.

max_iterationsan integer (int or long)

ANTS registration max iterations (default = 100x100x70x20). Maps to a command-line argument: -i %d.

num_threadsan integer (int or long)

Number of ITK threads to use. (Nipype default value: 1)

out_prefixa unicode string

Prefix that is prepended to all output files. Maps to a command-line argument: -o %s. (Nipype default value: antsCT_)

posterior_formulationa unicode string

Atropos posterior formulation and whether or not to use mixture model proportions. e.g ‘Socrates[1]’ (default) or ‘Aristotle[1]’. Choose the latter if you want use the distance priors (see also the -l option for label propagation control). Maps to a command-line argument: -b %s.

prior_segmentation_weighta float

Atropos spatial prior probability weight for the segmentation. Maps to a command-line argument: -w %f.

quick_registrationa boolean

If = 1, use antsRegistrationSyNQuick.sh as the basis for registration during brain extraction, brain segmentation, and (optional) normalization to a template. Otherwise use antsRegistrationSyN.sh (default = 0). Maps to a command-line argument: -q 1.

segmentation_iterationsan integer (int or long)

N4 -> Atropos -> N4 iterations during segmentation (default = 3). Maps to a command-line argument: -n %d.

use_floatingpoint_precision0 or 1

Use floating point precision in registrations (default = 0). Maps to a command-line argument: -j %d.

use_random_seeding0 or 1

Use random number generated from system clock in Atropos (default = 1). Maps to a command-line argument: -u %d.

BrainExtractionMaska pathlike object or string representing an existing file

Brain extraction mask.

BrainSegmentationa pathlike object or string representing an existing file

Brain segmentation image.

BrainSegmentationN4a pathlike object or string representing an existing file

N4 corrected image.

BrainSegmentationPosteriorsa list of items which are a pathlike object or string representing an existing file

Posterior probability images.

BrainVolumesa pathlike object or string representing an existing file

Brain volumes as text.

CorticalThicknessa pathlike object or string representing an existing file

Cortical thickness file.

CorticalThicknessNormedToTemplatea pathlike object or string representing an existing file

Normalized cortical thickness.

ExtractedBrainN4a pathlike object or string representing an existing file

Extracted brain from N4 image.

SubjectToTemplate0GenericAffinea pathlike object or string representing an existing file

Template to subject inverse affine.

SubjectToTemplate1Warpa pathlike object or string representing an existing file

Template to subject inverse warp.

SubjectToTemplateLogJacobiana pathlike object or string representing an existing file

Template to subject log jacobian.

TemplateToSubject0Warpa pathlike object or string representing an existing file

Template to subject warp.

TemplateToSubject1GenericAffinea pathlike object or string representing an existing file

Template to subject affine.

CreateJacobianDeterminantImage

Link to code

Bases: ANTSCommand

Wrapped executable: CreateJacobianDeterminantImage.

Examples

>>> from nipype.interfaces.ants import CreateJacobianDeterminantImage
>>> jacobian = CreateJacobianDeterminantImage()
>>> jacobian.inputs.imageDimension = 3
>>> jacobian.inputs.deformationField = 'ants_Warp.nii.gz'
>>> jacobian.inputs.outputImage = 'out_name.nii.gz'
>>> jacobian.cmdline
'CreateJacobianDeterminantImage 3 ants_Warp.nii.gz out_name.nii.gz'
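
A small, hedged extension of the example above (not from the upstream docstring): requesting the log of the Jacobian determinant via the doLogJacobian input documented below.

>>> jacobian.inputs.doLogJacobian = 1
>>> jacobian.cmdline                            # doctest: +SKIP
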
deformationFielda pathlike object or string representing an existing file

Deformation transformation file. Maps to a command-line argument: %s (position: 1).

imageDimension3 or 2

Image dimension (2 or 3). Maps to a command-line argument: %d (position: 0).

outputImagea pathlike object or string representing a file

Output filename. Maps to a command-line argument: %s (position: 2).

argsa unicode string

Additional parameters to the command. Maps to a command-line argument: %s.

doLogJacobian0 or 1

Return the log jacobian. Maps to a command-line argument: %d (position: 3).

environa dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

num_threadsan integer (int or long)

Number of ITK threads to use. (Nipype default value: 1)

useGeometric0 or 1

Return the geometric jacobian. Maps to a command-line argument: %d (position: 4).

jacobian_imagea pathlike object or string representing an existing file

Jacobian image.

CreateTiledMosaic

Link to code

Bases: ANTSCommand

Wrapped executable: CreateTiledMosaic.

The program CreateTiledMosaic in conjunction with ConvertScalarImageToRGB provides useful functionality for common image analysis tasks. The basic usage of CreateTiledMosaic is to tile a 3-D image volume slice-wise into a 2-D image.

Examples

>>> from nipype.interfaces.ants.visualization import CreateTiledMosaic
>>> mosaic_slicer = CreateTiledMosaic()
>>> mosaic_slicer.inputs.input_image = 'T1.nii.gz'
>>> mosaic_slicer.inputs.rgb_image = 'rgb.nii.gz'
>>> mosaic_slicer.inputs.mask_image = 'mask.nii.gz'
>>> mosaic_slicer.inputs.output_image = 'output.png'
>>> mosaic_slicer.inputs.alpha_value = 0.5
>>> mosaic_slicer.inputs.direction = 2
>>> mosaic_slicer.inputs.pad_or_crop = '[ -15x -50 , -15x -30 ,0]'
>>> mosaic_slicer.inputs.slices = '[2 ,100 ,160]'
>>> mosaic_slicer.cmdline
'CreateTiledMosaic -a 0.50 -d 2 -i T1.nii.gz -x mask.nii.gz -o output.png -p [ -15x -50 , -15x -30 ,0] -r rgb.nii.gz -s [2 ,100 ,160]'
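
Because CreateTiledMosaic is designed to work in conjunction with ConvertScalarImageToRGB, a Workflow can chain the two; the sketch below is illustrative and not part of the upstream docstring (the input_image of each node would still need to be set).

>>> from nipype import Node, Workflow
>>> from nipype.interfaces.ants.visualization import (ConvertScalarImageToRGB,
...                                                   CreateTiledMosaic)
>>> to_rgb = Node(ConvertScalarImageToRGB(dimension=3, colormap='jet',
...                                       minimum_input=0, maximum_input=6),
...               name='to_rgb')
>>> mosaic = Node(CreateTiledMosaic(direction=2), name='mosaic')
>>> wf = Workflow(name='qc_mosaic')
>>> wf.connect(to_rgb, 'output_image', mosaic, 'rgb_image')
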
input_imagea pathlike object or string representing an existing file

Main input is a 3-D grayscale image. Maps to a command-line argument: -i %s.

rgb_imagea pathlike object or string representing an existing file

An optional RGB image can be added as an overlay. It must have the same image geometry as the input grayscale image. Maps to a command-line argument: -r %s.

alpha_valuea float

If an Rgb image is provided, render the overlay using the specified alpha parameter. Maps to a command-line argument: -a %.2f.

argsa unicode string

Additional parameters to the command. Maps to a command-line argument: %s.

directionan integer (int or long)

Specifies the direction of the slices. If no direction is specified, the direction with the coarsest spacing is chosen. Maps to a command-line argument: -d %d.

environa dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

flip_slicea unicode string

FlipXxflipY. Maps to a command-line argument: -f %s.

mask_imagea pathlike object or string representing an existing file

Specifies the ROI of the RGB voxels used. Maps to a command-line argument: -x %s.

num_threadsan integer (int or long)

Number of ITK threads to use. (Nipype default value: 1)

output_imagea unicode string

The output consists of the tiled mosaic image. Maps to a command-line argument: -o %s. (Nipype default value: output.png)

pad_or_cropa unicode string

Argument passed to -p flag: [padVoxelWidth,<constantValue=0>] or [lowerPadding[0]xlowerPadding[1],upperPadding[0]xupperPadding[1],constantValue]. The user can specify whether to pad or crop a specified voxel-width boundary of each individual slice. For this program, cropping is simply padding with negative voxel-widths. If one pads (+), the user can also specify a constant pad value (default = 0). If a mask is specified, the user can use the mask to define the region, by using the keyword “mask” plus an offset, e.g. “-p mask+3”. Maps to a command-line argument: -p %s.

permute_axesa boolean

DoPermute. Maps to a command-line argument: -g.

slicesa unicode string

Number of slices to increment Slice1xSlice2xSlice3[numberOfSlicesToIncrement,<minSlice=0>,<maxSlice=lastSlice>]. Maps to a command-line argument: -s %s.

tile_geometrya unicode string

The tile geometry specifies the number of rows and columns in the output image. For example, if the user specifies “5x10”, then 5 rows by 10 columns of slices are rendered. If R < 0 and C > 0 (or vice versa), the negative value is selected based on direction. Maps to a command-line argument: -t %s.

output_imagea pathlike object or string representing an existing file

Image file.

DenoiseImage

Link to code

Bases: ANTSCommand

Wrapped executable: DenoiseImage.

Examples

>>> import copy
>>> from nipype.interfaces.ants import DenoiseImage
>>> denoise = DenoiseImage()
>>> denoise.inputs.dimension = 3
>>> denoise.inputs.input_image = 'im1.nii'
>>> denoise.cmdline
'DenoiseImage -d 3 -i im1.nii -n Gaussian -o im1_noise_corrected.nii -s 1'
>>> denoise_2 = copy.deepcopy(denoise)
>>> denoise_2.inputs.output_image = 'output_corrected_image.nii.gz'
>>> denoise_2.inputs.noise_model = 'Rician'
>>> denoise_2.inputs.shrink_factor = 2
>>> denoise_2.cmdline
'DenoiseImage -d 3 -i im1.nii -n Rician -o output_corrected_image.nii.gz -s 2'
>>> denoise_3 = DenoiseImage()
>>> denoise_3.inputs.input_image = 'im1.nii'
>>> denoise_3.inputs.save_noise = True
>>> denoise_3.cmdline
'DenoiseImage -i im1.nii -n Gaussian -o [ im1_noise_corrected.nii, im1_noise.nii ] -s 1'
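
A hedged sketch (not from the upstream docstring) of denoising several images at once with a MapNode iterating over input_image; the filenames reuse example data from this page.

>>> from nipype import MapNode
>>> denoise_all = MapNode(DenoiseImage(noise_model='Rician'),
...                       iterfield=['input_image'], name='denoise_all')
>>> denoise_all.inputs.input_image = ['im1.nii', 'im2.nii']
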
input_imagea pathlike object or string representing an existing file

A scalar image is expected as input for noise correction. Maps to a command-line argument: -i %s.

save_noisea boolean

True if the estimated noise should be saved to file. Mutually exclusive with inputs: noise_image. (Nipype default value: False)

argsa unicode string

Additional parameters to the command. Maps to a command-line argument: %s.

dimension2 or 3 or 4

This option forces the image to be treated as a specified-dimensional image. If not specified, the program tries to infer the dimensionality from the input image. Maps to a command-line argument: -d %d.

environa dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

noise_imagea pathlike object or string representing a file

Filename for the estimated noise.

noise_model‘Gaussian’ or ‘Rician’

Employ a Rician or Gaussian noise model. Maps to a command-line argument: -n %s. (Nipype default value: Gaussian)

num_threadsan integer (int or long)

Number of ITK threads to use. (Nipype default value: 1)

output_imagea pathlike object or string representing a file

The output consists of the noise corrected version of the input image. Maps to a command-line argument: -o %s.

shrink_factoran integer (int or long)

Running noise correction on large images can be time consuming. To lessen computation time, the input image can be resampled. The shrink factor, specified as a single integer, describes this resampling. Shrink factor = 1 is the default. Maps to a command-line argument: -s %s. (Nipype default value: 1)

verbosea boolean

Verbose output. Maps to a command-line argument: -v.

noise_image : a pathlike object or string representing a file

output_image : a pathlike object or string representing an existing file

ImageMath

Link to code

Bases: ANTSCommand, CopyHeaderInterface

Wrapped executable: ImageMath.

Operations over images.

Examples

>>> ImageMath(
...     op1='structural.nii',
...     operation='+',
...     op2='2').cmdline
'ImageMath 3 structural_maths.nii + structural.nii 2'
>>> ImageMath(
...     op1='structural.nii',
...     operation='Project',
...     op2='1 2').cmdline
'ImageMath 3 structural_maths.nii Project structural.nii 1 2'
>>> ImageMath(
...     op1='structural.nii',
...     operation='G',
...     op2='4').cmdline
'ImageMath 3 structural_maths.nii G structural.nii 4'
>>> ImageMath(
...     op1='structural.nii',
...     operation='TruncateImageIntensity',
...     op2='0.005 0.999 256').cmdline
'ImageMath 3 structural_maths.nii TruncateImageIntensity structural.nii 0.005 0.999 256'

By default, Nipype copies headers from the first input image (op1) to the output image. For the PadImage operation, the header cannot be copied from inputs to outputs, and so copy_header option is automatically set to False.

>>> pad = ImageMath(
...     op1='structural.nii',
...     operation='PadImage')
>>> pad.inputs.copy_header
False

While the operation is set to PadImage, setting copy_header = True will have no effect.

>>> pad.inputs.copy_header = True
>>> pad.inputs.copy_header
False

For any other operation, copy_header can be enabled/disabled normally:

>>> pad.inputs.operation = "ME"
>>> pad.inputs.copy_header = True
>>> pad.inputs.copy_header
True
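
As an illustrative sketch (not part of the upstream docstring), successive operations can be chained in a Workflow by feeding one node's output_image into the next node's op1.

>>> from nipype import Node, Workflow
>>> from nipype.interfaces.ants import ImageMath
>>> truncate = Node(ImageMath(operation='TruncateImageIntensity',
...                           op2='0.005 0.999 256'), name='truncate')
>>> smooth = Node(ImageMath(operation='G', op2='2'), name='smooth')
>>> wf = Workflow(name='clean_image')
>>> wf.connect(truncate, 'output_image', smooth, 'op1')
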
op1a pathlike object or string representing an existing file

First operator. Maps to a command-line argument: %s (position: -2).

operation‘m’ or ‘vm’ or ‘+’ or ‘v+’ or ‘-’ or ‘v-’ or ‘/’ or ‘^’ or ‘max’ or ‘exp’ or ‘addtozero’ or ‘overadd’ or ‘abs’ or ‘total’ or ‘mean’ or ‘vtotal’ or ‘Decision’ or ‘Neg’ or ‘Project’ or ‘G’ or ‘MD’ or ‘ME’ or ‘MO’ or ‘MC’ or ‘GD’ or ‘GE’ or ‘GO’ or ‘GC’ or ‘TruncateImageIntensity’ or ‘Laplacian’ or ‘GetLargestComponent’ or ‘FillHoles’ or ‘PadImage’

Mathematical operations. Maps to a command-line argument: %s (position: 3).

argsa unicode string

Additional parameters to the command. Maps to a command-line argument: %s.

copy_headera boolean

Copy headers of the original image into the output (corrected) file. (Nipype default value: True)

dimensionan integer (int or long)

Dimension of output image. Maps to a command-line argument: %d (position: 1). (Nipype default value: 3)

environa dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

num_threadsan integer (int or long)

Number of ITK threads to use. (Nipype default value: 1)

op2a pathlike object or string representing an existing file or a unicode string

Second operator. Maps to a command-line argument: %s (position: -1).

output_imagea pathlike object or string representing a file

Output image file. Maps to a command-line argument: %s (position: 2).

output_imagea pathlike object or string representing an existing file

Output image file.
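
The output filename can also be set explicitly via output_image. A sketch with illustrative operand values; the expected command line follows from the positional arguments documented above:

>>> trunc = ImageMath(
...     op1='structural.nii',
...     operation='TruncateImageIntensity',
...     op2='0.025 0.975 256',
...     output_image='structural_trunc.nii')
>>> trunc.cmdline
'ImageMath 3 structural_trunc.nii TruncateImageIntensity structural.nii 0.025 0.975 256'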

JointFusion

Link to code

Bases: ANTSCommand

Wrapped executable: antsJointFusion.

An image fusion algorithm.

Developed by Hongzhi Wang and Paul Yushkevich, it won segmentation challenges at MICCAI 2012 and MICCAI 2013. The original label fusion framework was extended to accommodate intensities by Brian Avants. This implementation is based on Paul’s original ITK-style implementation and Brian’s ANTsR implementation.

References include 1) H. Wang, J. W. Suh, S. Das, J. Pluta, C. Craige, P. Yushkevich, Multi-atlas segmentation with joint label fusion IEEE Trans. on Pattern Analysis and Machine Intelligence, 35(3), 611-623, 2013. and 2) H. Wang and P. A. Yushkevich, Multi-atlas segmentation with joint label fusion and corrective learning–an open source implementation, Front. Neuroinform., 2013.

Examples

>>> from nipype.interfaces.ants import JointFusion
>>> jf = JointFusion()
>>> jf.inputs.out_label_fusion = 'ants_fusion_label_output.nii'
>>> jf.inputs.atlas_image = [ ['rc1s1.nii','rc1s2.nii'] ]
>>> jf.inputs.atlas_segmentation_image = ['segmentation0.nii.gz']
>>> jf.inputs.target_image = ['im1.nii']
>>> jf.cmdline
"antsJointFusion -a 0.1 -g ['rc1s1.nii', 'rc1s2.nii'] -l segmentation0.nii.gz
-b 2.0 -o ants_fusion_label_output.nii -s 3x3x3 -t ['im1.nii']"
>>> jf.inputs.target_image = [ ['im1.nii', 'im2.nii'] ]
>>> jf.cmdline
"antsJointFusion -a 0.1 -g ['rc1s1.nii', 'rc1s2.nii'] -l segmentation0.nii.gz
-b 2.0 -o ants_fusion_label_output.nii -s 3x3x3 -t ['im1.nii', 'im2.nii']"
>>> jf.inputs.atlas_image = [ ['rc1s1.nii','rc1s2.nii'],
...                                        ['rc2s1.nii','rc2s2.nii'] ]
>>> jf.inputs.atlas_segmentation_image = ['segmentation0.nii.gz',
...                                                    'segmentation1.nii.gz']
>>> jf.cmdline
"antsJointFusion -a 0.1 -g ['rc1s1.nii', 'rc1s2.nii'] -g ['rc2s1.nii', 'rc2s2.nii']
-l segmentation0.nii.gz -l segmentation1.nii.gz -b 2.0 -o ants_fusion_label_output.nii
-s 3x3x3 -t ['im1.nii', 'im2.nii']"
>>> jf.inputs.dimension = 3
>>> jf.inputs.alpha = 0.5
>>> jf.inputs.beta = 1.0
>>> jf.inputs.patch_radius = [3,2,1]
>>> jf.inputs.search_radius = [3]
>>> jf.cmdline
"antsJointFusion -a 0.5 -g ['rc1s1.nii', 'rc1s2.nii'] -g ['rc2s1.nii', 'rc2s2.nii']
-l segmentation0.nii.gz -l segmentation1.nii.gz -b 1.0 -d 3 -o ants_fusion_label_output.nii
-p 3x2x1 -s 3 -t ['im1.nii', 'im2.nii']"
>>> jf.inputs.search_radius = ['mask.nii']
>>> jf.inputs.verbose = True
>>> jf.inputs.exclusion_image = ['roi01.nii', 'roi02.nii']
>>> jf.inputs.exclusion_image_label = ['1','2']
>>> jf.cmdline
"antsJointFusion -a 0.5 -g ['rc1s1.nii', 'rc1s2.nii'] -g ['rc2s1.nii', 'rc2s2.nii']
-l segmentation0.nii.gz -l segmentation1.nii.gz -b 1.0 -d 3 -e 1[roi01.nii] -e 2[roi02.nii]
-o ants_fusion_label_output.nii -p 3x2x1 -s mask.nii -t ['im1.nii', 'im2.nii'] -v"
>>> jf.inputs.out_label_fusion = 'ants_fusion_label_output.nii'
>>> jf.inputs.out_intensity_fusion_name_format = 'ants_joint_fusion_intensity_%d.nii.gz'
>>> jf.inputs.out_label_post_prob_name_format = 'ants_joint_fusion_posterior_%d.nii.gz'
>>> jf.inputs.out_atlas_voting_weight_name_format = 'ants_joint_fusion_voting_weight_%d.nii.gz'
>>> jf.cmdline
"antsJointFusion -a 0.5 -g ['rc1s1.nii', 'rc1s2.nii'] -g ['rc2s1.nii', 'rc2s2.nii']
-l segmentation0.nii.gz -l segmentation1.nii.gz -b 1.0 -d 3 -e 1[roi01.nii] -e 2[roi02.nii]
-o [ants_fusion_label_output.nii, ants_joint_fusion_intensity_%d.nii.gz,
ants_joint_fusion_posterior_%d.nii.gz, ants_joint_fusion_voting_weight_%d.nii.gz]
-p 3x2x1 -s mask.nii -t ['im1.nii', 'im2.nii'] -v"
atlas_imagea list of items which are a list of items which are a pathlike object or string representing an existing file

The atlas image (or multimodal atlas images) assumed to be aligned to a common image domain. Maps to a command-line argument: -g %s....

atlas_segmentation_imagea list of items which are a pathlike object or string representing an existing file

The atlas segmentation images. For performing label fusion the number of specified segmentations should be identical to the number of atlas image sets. Maps to a command-line argument: -l %s....

target_imagea list of items which are a list of items which are a pathlike object or string representing an existing file

The target image (or multimodal target images) assumed to be aligned to a common image domain. Maps to a command-line argument: -t %s.

alphaa float

Regularization term added to matrix Mx for calculating the inverse. Default = 0.1. Maps to a command-line argument: -a %s. (Nipype default value: 0.1)

argsa unicode string

Additional parameters to the command. Maps to a command-line argument: %s.

betaa float

Exponent for mapping intensity difference to the joint error. Default = 2.0. Maps to a command-line argument: -b %s. (Nipype default value: 2.0)

constrain_nonnegativea boolean

Constrain solution to non-negative weights. Maps to a command-line argument: -c. (Nipype default value: False)

dimension3 or 2 or 4

This option forces the image to be treated as a specified-dimensional image. If not specified, the program tries to infer the dimensionality from the input image. Maps to a command-line argument: -d %d.

environa dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

exclusion_imagea list of items which are a pathlike object or string representing an existing file

Specify an exclusion region for the given label.

exclusion_image_labela list of items which are a unicode string

Specify a label for the exclusion region. Maps to a command-line argument: -e %s. Requires inputs: exclusion_image.

mask_imagea pathlike object or string representing an existing file

If a mask image is specified, fusion is only performed in the mask region. Maps to a command-line argument: -x %s.

num_threadsan integer (int or long)

Number of ITK threads to use. (Nipype default value: 1)

out_atlas_voting_weight_name_formata unicode string

Optional atlas voting weight image file name format. Requires inputs: out_label_fusion, out_intensity_fusion_name_format, out_label_post_prob_name_format.

out_intensity_fusion_name_formata unicode string

Optional intensity fusion image file name format. (e.g. “antsJointFusionIntensity_%d.nii.gz”).

out_label_fusiona pathlike object or string representing a file

The output label fusion image. Maps to a command-line argument: %s.

out_label_post_prob_name_formata unicode string

Optional label posterior probability image file name format. Requires inputs: out_label_fusion, out_intensity_fusion_name_format.

patch_metric‘PC’ or ‘MSQ’

Metric to be used in determining the most similar neighborhood patch. Options include Pearson’s correlation (PC) and mean squares (MSQ). Default = PC (Pearson correlation). Maps to a command-line argument: -m %s.

patch_radiusa list of items which are a value of class ‘int’

Patch radius for similarity measures. Default: 2x2x2. Maps to a command-line argument: -p %s.

retain_atlas_voting_imagesa boolean

Retain atlas voting images. Default = false. Maps to a command-line argument: -f. (Nipype default value: False)

retain_label_posterior_imagesa boolean

Retain label posterior probability images. Requires atlas segmentations to be specified. Default = false. Maps to a command-line argument: -r. Requires inputs: atlas_segmentation_image. (Nipype default value: False)

search_radiusa list of from 1 to 3 items which are any value

Search radius for similarity measures. Default = 3x3x3. One can also specify an image where the value at the voxel specifies the isotropic search radius at that voxel. Maps to a command-line argument: -s %s. (Nipype default value: [3, 3, 3])

verbosea boolean

Verbose output. Maps to a command-line argument: -v.

out_atlas_voting_weight : a list of items which are a pathlike object or string representing an existing file

out_intensity_fusion : a list of items which are a pathlike object or string representing an existing file

out_label_fusion : a pathlike object or string representing an existing file

out_label_post_prob : a list of items which are a pathlike object or string representing an existing file

LabelGeometry

Link to code

Bases: ANTSCommand

Wrapped executable: LabelGeometryMeasures.

Extracts geometry measures using a label file and an optional image file

Examples

>>> from nipype.interfaces.ants import LabelGeometry
>>> label_extract = LabelGeometry()
>>> label_extract.inputs.dimension = 3
>>> label_extract.inputs.label_image = 'atlas.nii.gz'
>>> label_extract.cmdline
'LabelGeometryMeasures 3 atlas.nii.gz [] atlas.csv'
>>> label_extract.inputs.intensity_image = 'ants_Warp.nii.gz'
>>> label_extract.cmdline
'LabelGeometryMeasures 3 atlas.nii.gz ants_Warp.nii.gz atlas.csv'
intensity_imagea pathlike object or string representing an existing file

Intensity image to extract values from. This is an optional input. Maps to a command-line argument: %s (position: 2). (Nipype default value: [])

label_imagea pathlike object or string representing a file

Label image to use for extracting geometry measures. Maps to a command-line argument: %s (position: 1).

argsa unicode string

Additional parameters to the command. Maps to a command-line argument: %s.

dimension3 or 2

Image dimension (2 or 3). Maps to a command-line argument: %d (position: 0). (Nipype default value: 3)

environa dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

num_threadsan integer (int or long)

Number of ITK threads to use. (Nipype default value: 1)

output_filea unicode string

Name of output file. Maps to a command-line argument: %s (position: 3).

output_filea pathlike object or string representing an existing file

CSV file of geometry measures.

LaplacianThickness

Link to code

Bases: ANTSCommand

Wrapped executable: LaplacianThickness.

Calculates the cortical thickness from an anatomical image

Examples

>>> from nipype.interfaces.ants import LaplacianThickness
>>> cort_thick = LaplacianThickness()
>>> cort_thick.inputs.input_wm = 'white_matter.nii.gz'
>>> cort_thick.inputs.input_gm = 'gray_matter.nii.gz'
>>> cort_thick.cmdline
'LaplacianThickness white_matter.nii.gz gray_matter.nii.gz white_matter_thickness.nii.gz'
>>> cort_thick.inputs.output_image = 'output_thickness.nii.gz'
>>> cort_thick.cmdline
'LaplacianThickness white_matter.nii.gz gray_matter.nii.gz output_thickness.nii.gz'
input_gma pathlike object or string representing a file

Gray matter segmentation image. Maps to a command-line argument: %s (position: 2).

input_wma pathlike object or string representing a file

White matter segmentation image. Maps to a command-line argument: %s (position: 1).

argsa unicode string

Additional parameters to the command. Maps to a command-line argument: %s.

dTa float

Time delta used during integration (defaults to 0.01). Maps to a command-line argument: %s (position: 6). Requires inputs: prior_thickness.

environa dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

num_threadsan integer (int or long)

Number of ITK threads to use. (Nipype default value: 1)

output_imagea unicode string

Name of output file. Maps to a command-line argument: %s (position: 3).

prior_thicknessa float

Prior thickness (defaults to 500). Maps to a command-line argument: %s (position: 5). Requires inputs: smooth_param.

smooth_parama float

Sigma of the Laplacian Recursive Image Filter (defaults to 1). Maps to a command-line argument: %s (position: 4).

sulcus_priora float

Positive floating point number for sulcus prior. Authors said that 0.15 might be a reasonable value. Maps to a command-line argument: %s (position: 7). Requires inputs: dT.

tolerancea float

Tolerance to reach during optimization (defaults to 0.001). Maps to a command-line argument: %s (position: 8). Requires inputs: sulcus_prior.

output_imagea pathlike object or string representing an existing file

Cortical thickness.
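
Because the trailing positional arguments are chained (dT requires prior_thickness, which in turn requires smooth_param), they must be supplied together. A sketch with illustrative values; the expected command line follows from the argument positions documented above:

>>> from nipype.interfaces.ants import LaplacianThickness
>>> cort_thick = LaplacianThickness()
>>> cort_thick.inputs.input_wm = 'white_matter.nii.gz'
>>> cort_thick.inputs.input_gm = 'gray_matter.nii.gz'
>>> cort_thick.inputs.smooth_param = 4.5
>>> cort_thick.inputs.prior_thickness = 5.9
>>> cort_thick.inputs.dT = 0.01
>>> cort_thick.cmdline
'LaplacianThickness white_matter.nii.gz gray_matter.nii.gz white_matter_thickness.nii.gz 4.5 5.9 0.01'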

MeasureImageSimilarity

Link to code

Bases: ANTSCommand

Wrapped executable: MeasureImageSimilarity.

Examples

>>> from nipype.interfaces.ants import MeasureImageSimilarity
>>> sim = MeasureImageSimilarity()
>>> sim.inputs.dimension = 3
>>> sim.inputs.metric = 'MI'
>>> sim.inputs.fixed_image = 'T1.nii'
>>> sim.inputs.moving_image = 'resting.nii'
>>> sim.inputs.metric_weight = 1.0
>>> sim.inputs.radius_or_number_of_bins = 5
>>> sim.inputs.sampling_strategy = 'Regular'
>>> sim.inputs.sampling_percentage = 1.0
>>> sim.inputs.fixed_image_mask = 'mask.nii'
>>> sim.inputs.moving_image_mask = 'mask.nii.gz'
>>> sim.cmdline
'MeasureImageSimilarity --dimensionality 3 --masks ["mask.nii","mask.nii.gz"] --metric MI["T1.nii","resting.nii",1.0,5,Regular,1.0]'
fixed_imagea pathlike object or string representing an existing file

Image to which the moving image is warped.

metric‘CC’ or ‘MI’ or ‘Mattes’ or ‘MeanSquares’ or ‘Demons’ or ‘GC’

Maps to a command-line argument: %s.

moving_imagea pathlike object or string representing an existing file

Image to apply transformation to (generally a coregistered functional).

radius_or_number_of_binsan integer (int or long)

The number of bins in each stage for the MI and Mattes metric, or the radius for other metrics. Requires inputs: metric.

sampling_percentage0.0 <= a floating point number <= 1.0

Percentage of points accessible to the sampling strategy over which to optimize the metric. Requires inputs: metric.

argsa unicode string

Additional parameters to the command. Maps to a command-line argument: %s.

dimension2 or 3 or 4

Dimensionality of the fixed/moving image pair. Maps to a command-line argument: --dimensionality %d (position: 1).

environa dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

fixed_image_maska pathlike object or string representing an existing file

Mask used to limit metric sampling region of the fixed image. Maps to a command-line argument: %s.

metric_weighta float

The “metricWeight” variable is not used. Requires inputs: metric. (Nipype default value: 1.0)

moving_image_maska pathlike object or string representing an existing file

Mask used to limit metric sampling region of the moving image. Requires inputs: fixed_image_mask.

num_threadsan integer (int or long)

Number of ITK threads to use. (Nipype default value: 1)

sampling_strategy‘None’ or ‘Regular’ or ‘Random’

Manner of choosing point set over which to optimize the metric. Defaults to “None” (i.e. a dense sampling of one sample per voxel). Requires inputs: metric. (Nipype default value: None)

similarity : a float

MeasureImageSimilarity.aggregate_outputs(runtime=None, needed_outputs=None)

Collate expected outputs and apply output traits validation.
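
The similarity value is not written to an image file; it is collected by aggregate_outputs after the command has run. Continuing the node configured above (a sketch, assuming the input files exist on disk):

>>> res = sim.run()  
>>> similarity_value = res.outputs.similarity  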

MultiplyImages

Link to code

Bases: ANTSCommand

Wrapped executable: MultiplyImages.

Examples

>>> from nipype.interfaces.ants import MultiplyImages
>>> test = MultiplyImages()
>>> test.inputs.dimension = 3
>>> test.inputs.first_input = 'moving2.nii'
>>> test.inputs.second_input = 0.25
>>> test.inputs.output_product_image = "out.nii"
>>> test.cmdline
'MultiplyImages 3 moving2.nii 0.25 out.nii'
dimension3 or 2

Image dimension (2 or 3). Maps to a command-line argument: %d (position: 0).

first_inputa pathlike object or string representing an existing file

Image 1. Maps to a command-line argument: %s (position: 1).

output_product_imagea pathlike object or string representing a file

The name of the resulting output image (e.g. out.nii.gz). Maps to a command-line argument: %s (position: 3).

second_inputa pathlike object or string representing an existing file or a float

Image 2 or multiplication weight. Maps to a command-line argument: %s (position: 2).

argsa unicode string

Additional parameters to the command. Maps to a command-line argument: %s.

environa dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

num_threadsan integer (int or long)

Number of ITK threads to use. (Nipype default value: 1)

output_product_imagea pathlike object or string representing an existing file

The product image file.

N4BiasFieldCorrection

Link to code

Bases: ANTSCommand, CopyHeaderInterface

Wrapped executable: N4BiasFieldCorrection.

Bias field correction.

N4 is a variant of the popular N3 (nonparametric nonuniform intensity normalization) retrospective bias correction algorithm. Based on the assumption that the corruption of the low frequency bias field can be modeled as a convolution of the intensity histogram by a Gaussian, the basic algorithmic protocol is to iterate between deconvolving the intensity histogram by a Gaussian, remapping the intensities, and then spatially smoothing this result by a B-spline modeling of the bias field itself. The modifications from and improvements obtained over the original N3 algorithm are described in [Tustison2010].

Tustison2010

N. Tustison et al., N4ITK: Improved N3 Bias Correction, IEEE Transactions on Medical Imaging, 29(6):1310-1320, June 2010.

Examples

>>> import copy
>>> from nipype.interfaces.ants import N4BiasFieldCorrection
>>> n4 = N4BiasFieldCorrection()
>>> n4.inputs.dimension = 3
>>> n4.inputs.input_image = 'structural.nii'
>>> n4.inputs.bspline_fitting_distance = 300
>>> n4.inputs.shrink_factor = 3
>>> n4.inputs.n_iterations = [50,50,30,20]
>>> n4.cmdline
'N4BiasFieldCorrection --bspline-fitting [ 300 ]
-d 3 --input-image structural.nii
--convergence [ 50x50x30x20 ] --output structural_corrected.nii
--shrink-factor 3'
>>> n4_2 = copy.deepcopy(n4)
>>> n4_2.inputs.convergence_threshold = 1e-6
>>> n4_2.cmdline
'N4BiasFieldCorrection --bspline-fitting [ 300 ]
-d 3 --input-image structural.nii
--convergence [ 50x50x30x20, 1e-06 ] --output structural_corrected.nii
--shrink-factor 3'
>>> n4_3 = copy.deepcopy(n4_2)
>>> n4_3.inputs.bspline_order = 5
>>> n4_3.cmdline
'N4BiasFieldCorrection --bspline-fitting [ 300, 5 ]
-d 3 --input-image structural.nii
--convergence [ 50x50x30x20, 1e-06 ] --output structural_corrected.nii
--shrink-factor 3'
>>> n4_4 = N4BiasFieldCorrection()
>>> n4_4.inputs.input_image = 'structural.nii'
>>> n4_4.inputs.save_bias = True
>>> n4_4.inputs.dimension = 3
>>> n4_4.cmdline
'N4BiasFieldCorrection -d 3 --input-image structural.nii
--output [ structural_corrected.nii, structural_bias.nii ]'
>>> n4_5 = N4BiasFieldCorrection()
>>> n4_5.inputs.input_image = 'structural.nii'
>>> n4_5.inputs.dimension = 3
>>> n4_5.inputs.histogram_sharpening = (0.12, 0.02, 200)
>>> n4_5.cmdline
'N4BiasFieldCorrection -d 3  --histogram-sharpening [0.12,0.02,200]
--input-image structural.nii --output structural_corrected.nii'
copy_headera boolean

Copy headers of the original image into the output (corrected) file. (Nipype default value: False)

input_imagea pathlike object or string representing a file

Input for bias correction. Negative values or values close to zero should be processed prior to correction. Maps to a command-line argument: --input-image %s.

save_biasa boolean

True if the estimated bias should be saved to file. Mutually exclusive with inputs: bias_image. (Nipype default value: False)

argsa unicode string

Additional parameters to the command. Maps to a command-line argument: %s.

bias_imagea pathlike object or string representing a file

Filename for the estimated bias.

bspline_fitting_distancea float

Maps to a command-line argument: --bspline-fitting %s.

bspline_orderan integer (int or long)

Requires inputs: bspline_fitting_distance.

convergence_thresholda float

Requires inputs: n_iterations.

dimension3 or 2 or 4

Image dimension (2, 3 or 4). Maps to a command-line argument: -d %d. (Nipype default value: 3)

environa dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

histogram_sharpeninga tuple of the form: (a float, a float, an integer (int or long))

Three-value tuple of histogram sharpening parameters (FWHM, wienerNoise, numberOfHistogramBins). These options describe the histogram sharpening parameters, i.e. the deconvolution step parameters described in the original N3 algorithm. The default values have been shown to work fairly well. Maps to a command-line argument: --histogram-sharpening [%g,%g,%d].

mask_imagea pathlike object or string representing a file

Image to specify region to perform final bias correction in. Maps to a command-line argument: --mask-image %s.

n_iterationsa list of items which are an integer (int or long)

Maps to a command-line argument: --convergence %s.

num_threadsan integer (int or long)

Number of ITK threads to use. (Nipype default value: 1)

output_imagea unicode string

Output file name. Maps to a command-line argument: --output %s.

rescale_intensitiesa boolean

[NOTE: Only ANTs>=2.1.0] At each iteration, a new intensity mapping is calculated and applied but there is nothing which constrains the new intensity range to be within certain values. The result is that the range can “drift” from the original at each iteration. This option rescales to the [min,max] range of the original image intensities within the user-specified mask. Maps to a command-line argument: -r. (Nipype default value: False)

shrink_factoran integer (int or long)

Maps to a command-line argument: --shrink-factor %d.

weight_imagea pathlike object or string representing a file

Image for relative weighting (e.g. probability map of the white matter) of voxels during the B-spline fitting. Maps to a command-line argument: --weight-image %s.

bias_imagea pathlike object or string representing an existing file

Estimated bias.

output_imagea pathlike object or string representing an existing file

Bias-corrected image.
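
When a brain mask is available, restricting the correction to the mask and enabling rescale_intensities (ANTs >= 2.1.0) keeps the corrected intensities within the original range. A minimal sketch with illustrative filenames and iteration settings:

>>> n4_masked = N4BiasFieldCorrection()
>>> n4_masked.inputs.input_image = 'structural.nii'
>>> n4_masked.inputs.mask_image = 'mask.nii'
>>> n4_masked.inputs.rescale_intensities = True
>>> n4_masked.inputs.n_iterations = [50, 50, 30, 20]
>>> n4_masked.inputs.convergence_threshold = 1e-6
>>> n4_masked.run()  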

Registration

Link to code

Bases: ANTSCommand

Wrapped executable: antsRegistration.

ANTs Registration command for registration of images

antsRegistration registers a moving_image to a fixed_image, using a predefined (sequence of) cost function(s) and transformation operations. The cost function is defined using one or more ‘metrics’, specifically local cross-correlation (CC), Mean Squares (MeanSquares), Demons (Demons), global correlation (GC), or Mutual Information (Mattes or MI).

ANTS can use both linear (Translation, Rigid, Affine, CompositeAffine, or Similarity) and non-linear transformations (BSpline, GaussianDisplacementField, TimeVaryingVelocityField, TimeVaryingBSplineVelocityField, SyN, BSplineSyN, Exponential, or BSplineExponential). Usually, registration is done in multiple stages, for example first a Rigid, then an Affine, and finally a non-linear (SyN) transformation.

antsRegistration can be initialized using one or more transforms from moving_image to fixed_image with the initial_moving_transform input. For example, if you already have a warpfield that corrects for geometric distortions in an EPI (functional) image and you want to apply it before an affine registration to a structural image, you can pass that transform via initial_moving_transform.

The Registration interface can output the resulting transform(s) that map moving_image to fixed_image in a single file as a composite_transform (if write_composite_transform is set to True), or as a list of transforms in forward_transforms. It can also output inverse transforms (from fixed_image to moving_image) in a similar fashion using inverse_composite_transform. Note that the order of forward_transforms is ‘natural’: the first element should be applied first, the last element should be applied last.

Note, however, that ANTS tools always apply lists of transformations in reverse order (the last transformation in the list is applied first). Therefore, if the output forward_transforms is a list, one cannot feed it directly into, for example, ants.ApplyTransforms. To make ants.ApplyTransforms apply the transformations in the same order as ants.Registration, you have to provide the list of transformations in reverse order from forward_transforms. The reverse_forward_transforms output provides forward_transforms in reverse order and can be used for this purpose (see the sketch below). Note also that, because composite_transform is always a single file, that output is preferred for most use cases.
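
A minimal sketch of the pattern described above using ants.ApplyTransforms (filenames are illustrative, and reg is assumed to be a configured Registration instance such as the one set up in the Examples below):

>>> from nipype.interfaces.ants import ApplyTransforms
>>> res = reg.run()  
>>> at = ApplyTransforms()
>>> at.inputs.input_image = 'moving1.nii'
>>> at.inputs.reference_image = 'fixed1.nii'
>>> at.inputs.transforms = res.outputs.reverse_forward_transforms
>>> at.run()  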

More information can be found in the ANTS manual.

See below for some useful examples.

Examples

Set up a Registration node with some default settings. This node registers ‘moving1.nii’ to ‘fixed1.nii’ by first fitting a linear ‘Affine’ transformation and then a non-linear ‘SyN’ transformation, both using the mutual information (‘Mattes’) cost metric.

The registration is initialized by first applying the (linear) transform trans.mat.

>>> import copy, pprint
>>> from nipype.interfaces.ants import Registration
>>> reg = Registration()
>>> reg.inputs.fixed_image = 'fixed1.nii'
>>> reg.inputs.moving_image = 'moving1.nii'
>>> reg.inputs.output_transform_prefix = "output_"
>>> reg.inputs.initial_moving_transform = 'trans.mat'
>>> reg.inputs.transforms = ['Affine', 'SyN']
>>> reg.inputs.transform_parameters = [(2.0,), (0.25, 3.0, 0.0)]
>>> reg.inputs.number_of_iterations = [[1500, 200], [100, 50, 30]]
>>> reg.inputs.dimension = 3
>>> reg.inputs.write_composite_transform = True
>>> reg.inputs.collapse_output_transforms = False
>>> reg.inputs.initialize_transforms_per_stage = False
>>> reg.inputs.metric = ['Mattes']*2
>>> reg.inputs.metric_weight = [1]*2 # Default (value ignored currently by ANTs)
>>> reg.inputs.radius_or_number_of_bins = [32]*2
>>> reg.inputs.sampling_strategy = ['Random', None]
>>> reg.inputs.sampling_percentage = [0.05, None]
>>> reg.inputs.convergence_threshold = [1.e-8, 1.e-9]
>>> reg.inputs.convergence_window_size = [20]*2
>>> reg.inputs.smoothing_sigmas = [[1,0], [2,1,0]]
>>> reg.inputs.sigma_units = ['vox'] * 2
>>> reg.inputs.shrink_factors = [[2,1], [3,2,1]]
>>> reg.inputs.use_estimate_learning_rate_once = [True, True]
>>> reg.inputs.use_histogram_matching = [True, True] # This is the default
>>> reg.inputs.output_warped_image = 'output_warped_image.nii.gz'
>>> reg.cmdline
'antsRegistration --collapse-output-transforms 0 --dimensionality 3 --initial-moving-transform [ trans.mat, 0 ] --initialize-transforms-per-stage 0 --interpolation Linear --output [ output_, output_warped_image.nii.gz ] --transform Affine[ 2.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32, Random, 0.05 ] --convergence [ 1500x200, 1e-08, 20 ] --smoothing-sigmas 1.0x0.0vox --shrink-factors 2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --transform SyN[ 0.25, 3.0, 0.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32 ] --convergence [ 100x50x30, 1e-09, 20 ] --smoothing-sigmas 2.0x1.0x0.0vox --shrink-factors 3x2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --winsorize-image-intensities [ 0.0, 1.0 ]  --write-composite-transform 1'
>>> reg.run()  

Same as reg, but first invert the initial transform (‘trans.mat’) before applying it.

>>> reg.inputs.invert_initial_moving_transform = True
>>> reg1 = copy.deepcopy(reg)
>>> reg1.inputs.winsorize_lower_quantile = 0.025
>>> reg1.cmdline
'antsRegistration --collapse-output-transforms 0 --dimensionality 3 --initial-moving-transform [ trans.mat, 1 ] --initialize-transforms-per-stage 0 --interpolation Linear --output [ output_, output_warped_image.nii.gz ] --transform Affine[ 2.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32, Random, 0.05 ] --convergence [ 1500x200, 1e-08, 20 ] --smoothing-sigmas 1.0x0.0vox --shrink-factors 2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --transform SyN[ 0.25, 3.0, 0.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32 ] --convergence [ 100x50x30, 1e-09, 20 ] --smoothing-sigmas 2.0x1.0x0.0vox --shrink-factors 3x2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --winsorize-image-intensities [ 0.025, 1.0 ]  --write-composite-transform 1'
>>> reg1.run()  

Clip extremely high intensity data points using winsorize_upper_quantile. All data points higher than the 0.975 quantile are set to the value of the 0.975 quantile.

>>> reg2 = copy.deepcopy(reg)
>>> reg2.inputs.winsorize_upper_quantile = 0.975
>>> reg2.cmdline
'antsRegistration --collapse-output-transforms 0 --dimensionality 3 --initial-moving-transform [ trans.mat, 1 ] --initialize-transforms-per-stage 0 --interpolation Linear --output [ output_, output_warped_image.nii.gz ] --transform Affine[ 2.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32, Random, 0.05 ] --convergence [ 1500x200, 1e-08, 20 ] --smoothing-sigmas 1.0x0.0vox --shrink-factors 2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --transform SyN[ 0.25, 3.0, 0.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32 ] --convergence [ 100x50x30, 1e-09, 20 ] --smoothing-sigmas 2.0x1.0x0.0vox --shrink-factors 3x2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --winsorize-image-intensities [ 0.0, 0.975 ]  --write-composite-transform 1'

Clip extremely low intensity data points using winsorize_lower_quantile. All data points lower than the 0.025 quantile are set to the original value at the 0.025 quantile.

>>> reg3 = copy.deepcopy(reg)
>>> reg3.inputs.winsorize_lower_quantile = 0.025
>>> reg3.inputs.winsorize_upper_quantile = 0.975
>>> reg3.cmdline
'antsRegistration --collapse-output-transforms 0 --dimensionality 3 --initial-moving-transform [ trans.mat, 1 ] --initialize-transforms-per-stage 0 --interpolation Linear --output [ output_, output_warped_image.nii.gz ] --transform Affine[ 2.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32, Random, 0.05 ] --convergence [ 1500x200, 1e-08, 20 ] --smoothing-sigmas 1.0x0.0vox --shrink-factors 2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --transform SyN[ 0.25, 3.0, 0.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32 ] --convergence [ 100x50x30, 1e-09, 20 ] --smoothing-sigmas 2.0x1.0x0.0vox --shrink-factors 3x2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --winsorize-image-intensities [ 0.025, 0.975 ]  --write-composite-transform 1'

Use float instead of double for computations (saves memory usage)

>>> reg3a = copy.deepcopy(reg)
>>> reg3a.inputs.float = True
>>> reg3a.cmdline
'antsRegistration --collapse-output-transforms 0 --dimensionality 3 --float 1 --initial-moving-transform [ trans.mat, 1 ] --initialize-transforms-per-stage 0 --interpolation Linear --output [ output_, output_warped_image.nii.gz ] --transform Affine[ 2.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32, Random, 0.05 ] --convergence [ 1500x200, 1e-08, 20 ] --smoothing-sigmas 1.0x0.0vox --shrink-factors 2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --transform SyN[ 0.25, 3.0, 0.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32 ] --convergence [ 100x50x30, 1e-09, 20 ] --smoothing-sigmas 2.0x1.0x0.0vox --shrink-factors 3x2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --winsorize-image-intensities [ 0.0, 1.0 ]  --write-composite-transform 1'

Force the use of double instead of float for computations (more precision, higher memory usage).

>>> reg3b = copy.deepcopy(reg)
>>> reg3b.inputs.float = False
>>> reg3b.cmdline
'antsRegistration --collapse-output-transforms 0 --dimensionality 3 --float 0 --initial-moving-transform [ trans.mat, 1 ] --initialize-transforms-per-stage 0 --interpolation Linear --output [ output_, output_warped_image.nii.gz ] --transform Affine[ 2.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32, Random, 0.05 ] --convergence [ 1500x200, 1e-08, 20 ] --smoothing-sigmas 1.0x0.0vox --shrink-factors 2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --transform SyN[ 0.25, 3.0, 0.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32 ] --convergence [ 100x50x30, 1e-09, 20 ] --smoothing-sigmas 2.0x1.0x0.0vox --shrink-factors 3x2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --winsorize-image-intensities [ 0.0, 1.0 ]  --write-composite-transform 1'

‘collapse_output_transforms’ can be used to put all transformations into a single ‘composite_transform’ file. Note that forward_transforms will then be an empty list.

>>> # Test collapse transforms flag
>>> reg4 = copy.deepcopy(reg)
>>> reg4.inputs.save_state = 'trans.mat'
>>> reg4.inputs.restore_state = 'trans.mat'
>>> reg4.inputs.initialize_transforms_per_stage = True
>>> reg4.inputs.collapse_output_transforms = True
>>> outputs = reg4._list_outputs()
>>> pprint.pprint(outputs)  
{'composite_transform': '...data/output_Composite.h5',
 'elapsed_time': <undefined>,
 'forward_invert_flags': [],
 'forward_transforms': [],
 'inverse_composite_transform': '...data/output_InverseComposite.h5',
 'inverse_warped_image': <undefined>,
 'metric_value': <undefined>,
 'reverse_forward_invert_flags': [],
 'reverse_forward_transforms': [],
 'reverse_invert_flags': [],
 'reverse_transforms': [],
 'save_state': '...data/trans.mat',
 'warped_image': '...data/output_warped_image.nii.gz'}
>>> reg4.cmdline
'antsRegistration --collapse-output-transforms 1 --dimensionality 3 --initial-moving-transform [ trans.mat, 1 ] --initialize-transforms-per-stage 1 --interpolation Linear --output [ output_, output_warped_image.nii.gz ] --restore-state trans.mat --save-state trans.mat --transform Affine[ 2.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32, Random, 0.05 ] --convergence [ 1500x200, 1e-08, 20 ] --smoothing-sigmas 1.0x0.0vox --shrink-factors 2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --transform SyN[ 0.25, 3.0, 0.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32 ] --convergence [ 100x50x30, 1e-09, 20 ] --smoothing-sigmas 2.0x1.0x0.0vox --shrink-factors 3x2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --winsorize-image-intensities [ 0.0, 1.0 ]  --write-composite-transform 1'
>>> # Test collapse transforms flag
>>> reg4b = copy.deepcopy(reg4)
>>> reg4b.inputs.write_composite_transform = False
>>> outputs = reg4b._list_outputs()
>>> pprint.pprint(outputs)  
{'composite_transform': <undefined>,
 'elapsed_time': <undefined>,
 'forward_invert_flags': [False, False],
 'forward_transforms': ['...data/output_0GenericAffine.mat',
 '...data/output_1Warp.nii.gz'],
 'inverse_composite_transform': <undefined>,
 'inverse_warped_image': <undefined>,
 'metric_value': <undefined>,
 'reverse_forward_invert_flags': [False, False],
 'reverse_forward_transforms': ['...data/output_1Warp.nii.gz',
 '...data/output_0GenericAffine.mat'],
 'reverse_invert_flags': [True, False],
 'reverse_transforms': ['...data/output_0GenericAffine.mat',     '...data/output_1InverseWarp.nii.gz'],
 'save_state': '...data/trans.mat',
 'warped_image': '...data/output_warped_image.nii.gz'}
>>> reg4b.aggregate_outputs()  
>>> reg4b.cmdline
'antsRegistration --collapse-output-transforms 1 --dimensionality 3 --initial-moving-transform [ trans.mat, 1 ] --initialize-transforms-per-stage 1 --interpolation Linear --output [ output_, output_warped_image.nii.gz ] --restore-state trans.mat --save-state trans.mat --transform Affine[ 2.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32, Random, 0.05 ] --convergence [ 1500x200, 1e-08, 20 ] --smoothing-sigmas 1.0x0.0vox --shrink-factors 2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --transform SyN[ 0.25, 3.0, 0.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32 ] --convergence [ 100x50x30, 1e-09, 20 ] --smoothing-sigmas 2.0x1.0x0.0vox --shrink-factors 3x2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --winsorize-image-intensities [ 0.0, 1.0 ]  --write-composite-transform 0'

One can use multiple similarity metrics in a single registration stage. The node below first performs a linear registration using only the mutual information (‘Mattes’) metric. In a second stage, it performs a non-linear registration (‘SyN’) using both a mutual information and a local cross-correlation (‘CC’) metric. Both metrics are weighted equally (‘metric_weight’ is .5 for both). The mutual information metric uses 32 bins, and the local cross-correlation (correlation between every voxel’s neighborhoods) is computed with a radius of 4.

>>> # Test multiple metrics per stage
>>> reg5 = copy.deepcopy(reg)
>>> reg5.inputs.fixed_image = 'fixed1.nii'
>>> reg5.inputs.moving_image = 'moving1.nii'
>>> reg5.inputs.metric = ['Mattes', ['Mattes', 'CC']]
>>> reg5.inputs.metric_weight = [1, [.5,.5]]
>>> reg5.inputs.radius_or_number_of_bins = [32, [32, 4] ]
>>> reg5.inputs.sampling_strategy = ['Random', None] # use default strategy in second stage
>>> reg5.inputs.sampling_percentage = [0.05, [0.05, 0.10]]
>>> reg5.cmdline
'antsRegistration --collapse-output-transforms 0 --dimensionality 3 --initial-moving-transform [ trans.mat, 1 ] --initialize-transforms-per-stage 0 --interpolation Linear --output [ output_, output_warped_image.nii.gz ] --transform Affine[ 2.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32, Random, 0.05 ] --convergence [ 1500x200, 1e-08, 20 ] --smoothing-sigmas 1.0x0.0vox --shrink-factors 2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --transform SyN[ 0.25, 3.0, 0.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 0.5, 32, None, 0.05 ] --metric CC[ fixed1.nii, moving1.nii, 0.5, 4, None, 0.1 ] --convergence [ 100x50x30, 1e-09, 20 ] --smoothing-sigmas 2.0x1.0x0.0vox --shrink-factors 3x2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --winsorize-image-intensities [ 0.0, 1.0 ]  --write-composite-transform 1'

ANTS Registration can also use multiple modalities to perform the registration. Here it is assumed that fixed1.nii and fixed2.nii are in the same space, and so are moving1.nii and moving2.nii. First, a linear registration is performed matching fixed1.nii to moving1.nii, then a non-linear registration is performed to match fixed2.nii to moving2.nii, starting from the transformation of the first step.

>>> # Test multiple inputs
>>> reg6 = copy.deepcopy(reg5)
>>> reg6.inputs.fixed_image = ['fixed1.nii', 'fixed2.nii']
>>> reg6.inputs.moving_image = ['moving1.nii', 'moving2.nii']
>>> reg6.cmdline
'antsRegistration --collapse-output-transforms 0 --dimensionality 3 --initial-moving-transform [ trans.mat, 1 ] --initialize-transforms-per-stage 0 --interpolation Linear --output [ output_, output_warped_image.nii.gz ] --transform Affine[ 2.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32, Random, 0.05 ] --convergence [ 1500x200, 1e-08, 20 ] --smoothing-sigmas 1.0x0.0vox --shrink-factors 2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --transform SyN[ 0.25, 3.0, 0.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 0.5, 32, None, 0.05 ] --metric CC[ fixed2.nii, moving2.nii, 0.5, 4, None, 0.1 ] --convergence [ 100x50x30, 1e-09, 20 ] --smoothing-sigmas 2.0x1.0x0.0vox --shrink-factors 3x2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --winsorize-image-intensities [ 0.0, 1.0 ]  --write-composite-transform 1'

Different methods can be used for the interpolation when applying transformations.

>>> # Test Interpolation Parameters (BSpline)
>>> reg7a = copy.deepcopy(reg)
>>> reg7a.inputs.interpolation = 'BSpline'
>>> reg7a.inputs.interpolation_parameters = (3,)
>>> reg7a.cmdline
'antsRegistration --collapse-output-transforms 0 --dimensionality 3 --initial-moving-transform [ trans.mat, 1 ] --initialize-transforms-per-stage 0 --interpolation BSpline[ 3 ] --output [ output_, output_warped_image.nii.gz ] --transform Affine[ 2.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32, Random, 0.05 ] --convergence [ 1500x200, 1e-08, 20 ] --smoothing-sigmas 1.0x0.0vox --shrink-factors 2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --transform SyN[ 0.25, 3.0, 0.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32 ] --convergence [ 100x50x30, 1e-09, 20 ] --smoothing-sigmas 2.0x1.0x0.0vox --shrink-factors 3x2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --winsorize-image-intensities [ 0.0, 1.0 ]  --write-composite-transform 1'
>>> # Test Interpolation Parameters (MultiLabel/Gaussian)
>>> reg7b = copy.deepcopy(reg)
>>> reg7b.inputs.interpolation = 'Gaussian'
>>> reg7b.inputs.interpolation_parameters = (1.0, 1.0)
>>> reg7b.cmdline
'antsRegistration --collapse-output-transforms 0 --dimensionality 3 --initial-moving-transform [ trans.mat, 1 ] --initialize-transforms-per-stage 0 --interpolation Gaussian[ 1.0, 1.0 ] --output [ output_, output_warped_image.nii.gz ] --transform Affine[ 2.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32, Random, 0.05 ] --convergence [ 1500x200, 1e-08, 20 ] --smoothing-sigmas 1.0x0.0vox --shrink-factors 2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --transform SyN[ 0.25, 3.0, 0.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32 ] --convergence [ 100x50x30, 1e-09, 20 ] --smoothing-sigmas 2.0x1.0x0.0vox --shrink-factors 3x2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --winsorize-image-intensities [ 0.0, 1.0 ]  --write-composite-transform 1'

BSplineSyN non-linear registration with custom parameters.

>>> # Test Extended Transform Parameters
>>> reg8 = copy.deepcopy(reg)
>>> reg8.inputs.transforms = ['Affine', 'BSplineSyN']
>>> reg8.inputs.transform_parameters = [(2.0,), (0.25, 26, 0, 3)]
>>> reg8.cmdline
'antsRegistration --collapse-output-transforms 0 --dimensionality 3 --initial-moving-transform [ trans.mat, 1 ] --initialize-transforms-per-stage 0 --interpolation Linear --output [ output_, output_warped_image.nii.gz ] --transform Affine[ 2.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32, Random, 0.05 ] --convergence [ 1500x200, 1e-08, 20 ] --smoothing-sigmas 1.0x0.0vox --shrink-factors 2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --transform BSplineSyN[ 0.25, 26, 0, 3 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32 ] --convergence [ 100x50x30, 1e-09, 20 ] --smoothing-sigmas 2.0x1.0x0.0vox --shrink-factors 3x2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --winsorize-image-intensities [ 0.0, 1.0 ]  --write-composite-transform 1'

Mask the fixed image in the second stage of the registration (but not the first).

>>> # Test masking
>>> reg9 = copy.deepcopy(reg)
>>> reg9.inputs.fixed_image_masks = ['NULL', 'fixed1.nii']
>>> reg9.cmdline
'antsRegistration --collapse-output-transforms 0 --dimensionality 3 --initial-moving-transform [ trans.mat, 1 ] --initialize-transforms-per-stage 0 --interpolation Linear --output [ output_, output_warped_image.nii.gz ] --transform Affine[ 2.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32, Random, 0.05 ] --convergence [ 1500x200, 1e-08, 20 ] --smoothing-sigmas 1.0x0.0vox --shrink-factors 2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --masks [ NULL, NULL ] --transform SyN[ 0.25, 3.0, 0.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32 ] --convergence [ 100x50x30, 1e-09, 20 ] --smoothing-sigmas 2.0x1.0x0.0vox --shrink-factors 3x2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --masks [ fixed1.nii, NULL ] --winsorize-image-intensities [ 0.0, 1.0 ]  --write-composite-transform 1'

Here we use both a warpfield and a linear transformation, before registration commences. Note that the first transformation that needs to be applied (‘ants_Warp.nii.gz’) is last in the list of ‘initial_moving_transform’.

>>> # Test initialization with multiple transforms matrices (e.g., unwarp and affine transform)
>>> reg10 = copy.deepcopy(reg)
>>> reg10.inputs.initial_moving_transform = ['func_to_struct.mat', 'ants_Warp.nii.gz']
>>> reg10.inputs.invert_initial_moving_transform = [False, False]
>>> reg10.cmdline
'antsRegistration --collapse-output-transforms 0 --dimensionality 3 --initial-moving-transform [ func_to_struct.mat, 0 ] [ ants_Warp.nii.gz, 0 ] --initialize-transforms-per-stage 0 --interpolation Linear --output [ output_, output_warped_image.nii.gz ] --transform Affine[ 2.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32, Random, 0.05 ] --convergence [ 1500x200, 1e-08, 20 ] --smoothing-sigmas 1.0x0.0vox --shrink-factors 2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --transform SyN[ 0.25, 3.0, 0.0 ] --metric Mattes[ fixed1.nii, moving1.nii, 1, 32 ] --convergence [ 100x50x30, 1e-09, 20 ] --smoothing-sigmas 2.0x1.0x0.0vox --shrink-factors 3x2x1 --use-estimate-learning-rate-once 1 --use-histogram-matching 1 --winsorize-image-intensities [ 0.0, 1.0 ]  --write-composite-transform 1'
fixed_imagea list of items which are a pathlike object or string representing an existing file

Image to which the moving_image should be transformed (usually a structural image).

metrica list of items which are ‘CC’ or ‘MeanSquares’ or ‘Demons’ or ‘GC’ or ‘MI’ or ‘Mattes’ or a list of items which are ‘CC’ or ‘MeanSquares’ or ‘Demons’ or ‘GC’ or ‘MI’ or ‘Mattes’

The metric(s) to use for each stage. Note that multiple metrics per stage are not supported in ANTS 1.9.1 and earlier.

metric_weighta list of items which are a float or a list of items which are a float

The metric weight(s) for each stage. The weights must sum to 1 per stage. Requires inputs: metric. (Nipype default value: [1.0])

moving_imagea list of items which are a pathlike object or string representing an existing file

Image that will be registered to the space of fixed_image. This is the image to which the transformations will be applied.

shrink_factors : a list of items which are a list of items which are an integer (int or long)

smoothing_sigmas : a list of items which are a list of items which are a float

transforms : a list of items which are ‘Rigid’ or ‘Affine’ or ‘CompositeAffine’ or ‘Similarity’ or ‘Translation’ or ‘BSpline’ or ‘GaussianDisplacementField’ or ‘TimeVaryingVelocityField’ or ‘TimeVaryingBSplineVelocityField’ or ‘SyN’ or ‘BSplineSyN’ or ‘Exponential’ or ‘BSplineExponential’

Maps to a command-line argument: %s.

argsa unicode string

Additional parameters to the command. Maps to a command-line argument: %s.

collapse_output_transformsa boolean

Collapse output transforms. Specifically, enabling this option combines all adjacent linear transforms and composes all adjacent displacement field transforms before writing the results to disk. Maps to a command-line argument: --collapse-output-transforms %d. (Nipype default value: True)

convergence_thresholda list of at least 1 items which are a float

Requires inputs: number_of_iterations. (Nipype default value: [1e-06])

convergence_window_sizea list of at least 1 items which are an integer (int or long)

Requires inputs: convergence_threshold. (Nipype default value: [10])

dimension3 or 2

Image dimension (2 or 3). Maps to a command-line argument: --dimensionality %d. (Nipype default value: 3)

environa dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

fixed_image_maska pathlike object or string representing an existing file

Mask used to limit metric sampling region of the fixed image in all stages. Maps to a command-line argument: %s. Mutually exclusive with inputs: fixed_image_masks.

fixed_image_masksa list of items which are a pathlike object or string representing an existing file or ‘NULL’

Masks used to limit metric sampling region of the fixed image, defined per registration stage (use “NULL” to omit a mask at a given stage). Mutually exclusive with inputs: fixed_image_mask.

floata boolean

Use float instead of double for computations. Maps to a command-line argument: --float %d.

initial_moving_transforma list of items which are a pathlike object or string representing an existing file

A transform or a list of transforms that should be applied before the registration begins. Note that, when a list is given, the transformations are applied in reverse order. Maps to a command-line argument: %s. Mutually exclusive with inputs: initial_moving_transform_com.

initial_moving_transform_com0 or 1 or 2

Align the moving_image and fixed_image before registration using the geometric center of the images (=0), the image intensities (=1), or the origin of the images (=2). Maps to a command-line argument: %s. Mutually exclusive with inputs: initial_moving_transform.

initialize_transforms_per_stagea boolean

Initialize linear transforms from the previous stage. By enabling this option, the current linear stage transform is directly initialized from the previous stage’s linear transform; this allows multiple linear stages to be run where each stage directly updates the estimated linear transform from the previous stage (e.g. Translation -> Rigid -> Affine). Maps to a command-line argument: --initialize-transforms-per-stage %d. (Nipype default value: False)

interpolation‘Linear’ or ‘NearestNeighbor’ or ‘CosineWindowedSinc’ or ‘WelchWindowedSinc’ or ‘HammingWindowedSinc’ or ‘LanczosWindowedSinc’ or ‘BSpline’ or ‘MultiLabel’ or ‘Gaussian’

Maps to a command-line argument: %s. (Nipype default value: Linear)

interpolation_parameters : a tuple of the form: (an integer (int or long)) or a tuple of the form: (a float, a float)

invert_initial_moving_transform : a list of items which are a boolean

One boolean or a list of booleans indicating whether the inverse(s) of the transform(s) defined in initial_moving_transform should be used. Mutually exclusive with inputs: initial_moving_transform_com. Requires inputs: initial_moving_transform.

metric_item_trait : ‘CC’ or ‘MeanSquares’ or ‘Demons’ or ‘GC’ or ‘MI’ or ‘Mattes’

metric_stage_trait : ‘CC’ or ‘MeanSquares’ or ‘Demons’ or ‘GC’ or ‘MI’ or ‘Mattes’ or a list of items which are ‘CC’ or ‘MeanSquares’ or ‘Demons’ or ‘GC’ or ‘MI’ or ‘Mattes’

metric_weight_item_trait : a float

(Nipype default value: 1.0)

metric_weight_stage_trait : a float or a list of items which are a float

moving_image_mask : a pathlike object or string representing an existing file

Mask used to limit metric sampling region of the moving image in all stages. Mutually exclusive with inputs: moving_image_masks. Requires inputs: fixed_image_mask.

moving_image_masksa list of items which are a pathlike object or string representing an existing file or ‘NULL’

Masks used to limit metric sampling region of the moving image, defined per registration stage (use “NULL” to omit a mask at a given stage). Mutually exclusive with inputs: moving_image_mask.

num_threadsan integer (int or long)

Number of ITK threads to use. (Nipype default value: 1)

number_of_iterations : a list of items which are a list of items which are an integer (int or long)

output_inverse_warped_image : a boolean or a pathlike object or string representing a file

Requires inputs: output_warped_image.

output_transform_prefixa unicode string

Maps to a command-line argument: %s. (Nipype default value: transform)

output_warped_image : a boolean or a pathlike object or string representing a file

radius_bins_item_trait : an integer (int or long)

(Nipype default value: 5)

radius_bins_stage_trait : an integer (int or long) or a list of items which are an integer (int or long)

radius_or_number_of_bins : a list of items which are an integer (int or long) or a list of items which are an integer (int or long)

The number of bins in each stage for the MI and Mattes metric, or the radius for other metrics. Requires inputs: metric_weight. (Nipype default value: [5])

restore_statea pathlike object or string representing an existing file

Filename for restoring the internal restorable state of the registration. Maps to a command-line argument: --restore-state %s.

restrict_deformationa list of items which are a list of items which are 0.0 <= a floating point number <= 1.0

This option allows the user to restrict the optimization of the displacement field, translation, rigid or affine transform on a per-component basis. For example, if one wants to limit the deformation or rotation of 3-D volume to the first two dimensions, this is possible by specifying a weight vector of ‘1x1x0’ for a deformation field or ‘1x1x0x1x1x0’ for a rigid transformation. Low-dimensional restriction only works if there are no preceding transformations.

sampling_percentagea list of items which are 0.0 <= a floating point number <= 1.0 or None or a list of items which are 0.0 <= a floating point number <= 1.0 or None

The metric sampling percentage(s) to use for each stage. Requires inputs: sampling_strategy.

sampling_percentage_item_trait : 0.0 <= a floating point number <= 1.0 or None

sampling_percentage_stage_trait : 0.0 <= a floating point number <= 1.0 or None or a list of items which are 0.0 <= a floating point number <= 1.0 or None

sampling_strategy : a list of items which are ‘None’ or ‘Regular’ or ‘Random’ or None or a list of items which are ‘None’ or ‘Regular’ or ‘Random’ or None

The metric sampling strategy (strategies) for each stage. Requires inputs: metric_weight.

sampling_strategy_item_trait : ‘None’ or ‘Regular’ or ‘Random’ or None

sampling_strategy_stage_trait : ‘None’ or ‘Regular’ or ‘Random’ or None or a list of items which are ‘None’ or ‘Regular’ or ‘Random’ or None

save_state : a pathlike object or string representing a file

Filename for saving the internal restorable state of the registration. Maps to a command-line argument: --save-state %s.

sigma_unitsa list of items which are ‘mm’ or ‘vox’

Units for smoothing sigmas. Requires inputs: smoothing_sigmas.

transform_parameters : a list of items which are a tuple of the form: (a float) or a tuple of the form: (a float, a float, a float) or a tuple of the form: (a float, an integer (int or long), an integer (int or long), an integer (int or long)) or a tuple of the form: (a float, an integer (int or long), a float, a float, a float, a float) or a tuple of the form: (a float, a float, a float, an integer (int or long)) or a tuple of the form: (a float, an integer (int or long), an integer (int or long), an integer (int or long), an integer (int or long))

use_estimate_learning_rate_once : a list of items which are a boolean

use_histogram_matching : a boolean or a list of items which are a boolean

Histogram match the images before registration. (Nipype default value: True)

verbosea boolean

Maps to a command-line argument: -v. (Nipype default value: False)

winsorize_lower_quantile0.0 <= a floating point number <= 1.0

The lower quantile to clip image ranges. Maps to a command-line argument: %s. (Nipype default value: 0.0)

winsorize_upper_quantile0.0 <= a floating point number <= 1.0

The upper quantile to clip image ranges. Maps to a command-line argument: %s. (Nipype default value: 1.0)

write_composite_transforma boolean

Maps to a command-line argument: --write-composite-transform %d. (Nipype default value: False)

composite_transforma pathlike object or string representing an existing file

Composite transform file.

elapsed_timea float

The total elapsed time as reported by ANTs.

forward_invert_flagsa list of items which are a boolean

List of flags corresponding to the forward transforms.

forward_transformsa list of items which are a pathlike object or string representing an existing file

List of output transforms for forward registration.

inverse_composite_transforma pathlike object or string representing a file

Inverse composite transform file.

inverse_warped_imagea pathlike object or string representing a file

Outputs the inverse of the warped image.

metric_valuea float

The final value of metric.

reverse_forward_invert_flagsa list of items which are a boolean

List of flags corresponding to the forward transforms reversed for antsApplyTransform.

reverse_forward_transformsa list of items which are a pathlike object or string representing an existing file

List of output transforms for forward registration reversed for antsApplyTransform.

reverse_invert_flagsa list of items which are a boolean

List of flags corresponding to the reverse transforms.

reverse_transformsa list of items which are a pathlike object or string representing an existing file

List of output transforms for reverse registration.

save_statea pathlike object or string representing a file

The saved registration state to be restored.

warped_imagea pathlike object or string representing a file

Outputs warped image.

Registration.DEF_SAMPLING_STRATEGY = 'None'

The default sampling strategy argument.

RegistrationSynQuick

Link to code

Bases: ANTSCommand

Wrapped executable: antsRegistrationSyNQuick.sh.

Registration using a symmetric image normalization method (SyN). You can read more in Avants et al., Med Image Anal., 2008 (https://www.ncbi.nlm.nih.gov/pubmed/17659998).

Examples

>>> from nipype.interfaces.ants import RegistrationSynQuick
>>> reg = RegistrationSynQuick()
>>> reg.inputs.fixed_image = 'fixed1.nii'
>>> reg.inputs.moving_image = 'moving1.nii'
>>> reg.inputs.num_threads = 2
>>> reg.cmdline
'antsRegistrationSyNQuick.sh -d 3 -f fixed1.nii -r 32 -m moving1.nii -n 2 -o transform -p d -s 26 -t s'
>>> reg.run()  

Example for multiple images:

>>> from nipype.interfaces.ants import RegistrationSynQuick
>>> reg = RegistrationSynQuick()
>>> reg.inputs.fixed_image = ['fixed1.nii', 'fixed2.nii']
>>> reg.inputs.moving_image = ['moving1.nii', 'moving2.nii']
>>> reg.inputs.num_threads = 2
>>> reg.cmdline
'antsRegistrationSyNQuick.sh -d 3 -f fixed1.nii -f fixed2.nii -r 32 -m moving1.nii -m moving2.nii -n 2 -o transform -p d -s 26 -t s'
>>> reg.run()  
fixed_imagea list of items which are a pathlike object or string representing an existing file

Fixed image or source image or reference image. Maps to a command-line argument: -f %s....

moving_imagea list of items which are a pathlike object or string representing an existing file

Moving image or target image. Maps to a command-line argument: -m %s....

argsa unicode string

Additional parameters to the command. Maps to a command-line argument: %s.

dimension3 or 2

Image dimension (2 or 3). Maps to a command-line argument: -d %d. (Nipype default value: 3)

environa dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

histogram_binsan integer (int or long)

Histogram bins for mutual information in SyN stage (default = 32). Maps to a command-line argument: -r %d. (Nipype default value: 32)

num_threadsan integer (int or long)

Number of threads (default = 1). Maps to a command-line argument: -n %d. (Nipype default value: 1)

output_prefixa unicode string

A prefix that is prepended to all output files. Maps to a command-line argument: -o %s. (Nipype default value: transform)

precision_type‘double’ or ‘float’

Precision type (default = double). Maps to a command-line argument: -p %s. (Nipype default value: double)

spline_distancean integer (int or long)

Spline distance for deformable B-spline SyN transform (default = 26). Maps to a command-line argument: -s %d. (Nipype default value: 26)

transform_type‘s’ or ‘t’ or ‘r’ or ‘a’ or ‘sr’ or ‘b’ or ‘br’

Transform type

  • t: translation

  • r: rigid

  • a: rigid + affine

  • s: rigid + affine + deformable syn (default)

  • sr: rigid + deformable syn

  • b: rigid + affine + deformable b-spline syn

  • br: rigid + deformable b-spline syn

Maps to a command-line argument: -t %s. (Nipype default value: s)
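
For instance, a rigid-only alignment could be requested as in the hedged sketch below; the file names are the same placeholder test images used in the examples above, and all other inputs keep their defaults.

>>> from nipype.interfaces.ants import RegistrationSynQuick
>>> reg = RegistrationSynQuick()
>>> reg.inputs.fixed_image = 'fixed1.nii'
>>> reg.inputs.moving_image = 'moving1.nii'
>>> reg.inputs.transform_type = 'r'  # rigid only; the default 's' runs rigid + affine + deformable SyN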

use_histogram_matchinga boolean

Use histogram matching. Maps to a command-line argument: -j %d.

forward_warp_fielda pathlike object or string representing an existing file

Forward warp field.

inverse_warp_fielda pathlike object or string representing an existing file

Inverse warp field.

inverse_warped_imagea pathlike object or string representing an existing file

Inverse warped image.

out_matrixa pathlike object or string representing an existing file

Affine matrix.

warped_imagea pathlike object or string representing an existing file

Warped image.

ResampleImageBySpacing

Link to code

Bases: ANTSCommand

Wrapped executable: ResampleImageBySpacing.

Resample an image with a given spacing.

Examples

>>> res = ResampleImageBySpacing(dimension=3)
>>> res.inputs.input_image = 'structural.nii'
>>> res.inputs.output_image = 'output.nii.gz'
>>> res.inputs.out_spacing = (4, 4, 4)
>>> res.cmdline  
'ResampleImageBySpacing 3 structural.nii output.nii.gz 4 4 4'
>>> res = ResampleImageBySpacing(dimension=3)
>>> res.inputs.input_image = 'structural.nii'
>>> res.inputs.output_image = 'output.nii.gz'
>>> res.inputs.out_spacing = (4, 4, 4)
>>> res.inputs.apply_smoothing = True
>>> res.cmdline  
'ResampleImageBySpacing 3 structural.nii output.nii.gz 4 4 4 1'
>>> res = ResampleImageBySpacing(dimension=3)
>>> res.inputs.input_image = 'structural.nii'
>>> res.inputs.output_image = 'output.nii.gz'
>>> res.inputs.out_spacing = (0.4, 0.4, 0.4)
>>> res.inputs.apply_smoothing = True
>>> res.inputs.addvox = 2
>>> res.inputs.nn_interp = False
>>> res.cmdline  
'ResampleImageBySpacing 3 structural.nii output.nii.gz 0.4 0.4 0.4 1 2 0'
input_imagea pathlike object or string representing an existing file

Input image file. Maps to a command-line argument: %s (position: 2).

out_spacinga list of from 2 to 3 items which are a float or a tuple of the form: (a float, a float, a float) or a tuple of the form: (a float, a float)

Output spacing. Maps to a command-line argument: %s (position: 4).

addvoxan integer (int or long)

Addvox pads each dimension by addvox. Maps to a command-line argument: %d (position: 6). Requires inputs: apply_smoothing.

apply_smoothinga boolean

Smooth before resampling. Maps to a command-line argument: %d (position: 5).

argsa unicode string

Additional parameters to the command. Maps to a command-line argument: %s.

dimensionan integer (int or long)

Dimension of output image. Maps to a command-line argument: %d (position: 1). (Nipype default value: 3)

environa dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

nn_interpa boolean

Nearest-neighbor (NN) interpolation. Maps to a command-line argument: %d (position: -1). Requires inputs: addvox.

num_threadsan integer (int or long)

Number of ITK threads to use. (Nipype default value: 1)

output_imagea pathlike object or string representing a file

Output image file. Maps to a command-line argument: %s (position: 3).

output_imagea pathlike object or string representing an existing file

Resampled file.

ThresholdImage

Link to code

Bases: ANTSCommand, CopyHeaderInterface

Wrapped executable: ThresholdImage.

Apply thresholds on images.

Examples

>>> thres = ThresholdImage(dimension=3)
>>> thres.inputs.input_image = 'structural.nii'
>>> thres.inputs.output_image = 'output.nii.gz'
>>> thres.inputs.th_low = 0.5
>>> thres.inputs.th_high = 1.0
>>> thres.inputs.inside_value = 1.0
>>> thres.inputs.outside_value = 0.0
>>> thres.cmdline  
'ThresholdImage 3 structural.nii output.nii.gz 0.500000 1.000000 1.000000 0.000000'
>>> thres = ThresholdImage(dimension=3)
>>> thres.inputs.input_image = 'structural.nii'
>>> thres.inputs.output_image = 'output.nii.gz'
>>> thres.inputs.mode = 'Kmeans'
>>> thres.inputs.num_thresholds = 4
>>> thres.cmdline  
'ThresholdImage 3 structural.nii output.nii.gz Kmeans 4'
copy_headera boolean

Copy headers of the original image into the output (corrected) file. (Nipype default value: True)

input_imagea pathlike object or string representing an existing file

Input image file. Maps to a command-line argument: %s (position: 2).

argsa unicode string

Additional parameters to the command. Maps to a command-line argument: %s.

dimensionan integer (int or long)

Dimension of output image. Maps to a command-line argument: %d (position: 1). (Nipype default value: 3)

environa dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

input_maska pathlike object or string representing an existing file

Input mask for Otsu, Kmeans. Maps to a command-line argument: %s. Requires inputs: num_thresholds.

inside_valuea float

Inside value. Maps to a command-line argument: %f (position: 6). Requires inputs: th_low.

mode‘Otsu’ or ‘Kmeans’

Whether to run Otsu / Kmeans thresholding. Maps to a command-line argument: %s (position: 4). Mutually exclusive with inputs: th_low, th_high. Requires inputs: num_thresholds.
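
A hedged sketch of histogram-based thresholding restricted to a mask: mode and input_mask both require num_thresholds, and mode excludes th_low/th_high. The mask file name below is a hypothetical placeholder, not one of the test images used elsewhere on this page.

>>> from nipype.interfaces.ants import ThresholdImage
>>> thres = ThresholdImage(dimension=3)
>>> thres.inputs.input_image = 'structural.nii'
>>> thres.inputs.output_image = 'output.nii.gz'
>>> thres.inputs.mode = 'Otsu'
>>> thres.inputs.num_thresholds = 3
>>> thres.inputs.input_mask = 'mask.nii'  # hypothetical mask file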

num_threadsan integer (int or long)

Number of ITK threads to use. (Nipype default value: 1)

num_thresholdsan integer (int or long)

Number of thresholds. Maps to a command-line argument: %d (position: 5).

output_imagea pathlike object or string representing a file

Output image file. Maps to a command-line argument: %s (position: 3).

outside_valuea float

Outside value. Maps to a command-line argument: %f (position: 7). Requires inputs: th_low.

th_higha float

Upper threshold. Maps to a command-line argument: %f (position: 5). Mutually exclusive with inputs: mode.

th_lowa float

Lower threshold. Maps to a command-line argument: %f (position: 4). Mutually exclusive with inputs: mode.

output_imagea pathlike object or string representing an existing file

Thresholded image file.

WarpImageMultiTransform

Link to code

Bases: ANTSCommand

Wrapped executable: WarpImageMultiTransform.

Warps an image from one space to another.

Examples

>>> from nipype.interfaces.ants import WarpImageMultiTransform
>>> wimt = WarpImageMultiTransform()
>>> wimt.inputs.input_image = 'structural.nii'
>>> wimt.inputs.reference_image = 'ants_deformed.nii.gz'
>>> wimt.inputs.transformation_series = ['ants_Warp.nii.gz','ants_Affine.txt']
>>> wimt.cmdline
'WarpImageMultiTransform 3 structural.nii structural_wimt.nii -R ants_deformed.nii.gz ants_Warp.nii.gz ants_Affine.txt'
>>> wimt = WarpImageMultiTransform()
>>> wimt.inputs.input_image = 'diffusion_weighted.nii'
>>> wimt.inputs.reference_image = 'functional.nii'
>>> wimt.inputs.transformation_series = ['func2anat_coreg_Affine.txt', 'func2anat_InverseWarp.nii.gz', 'dwi2anat_Warp.nii.gz', 'dwi2anat_coreg_Affine.txt']
>>> wimt.inputs.invert_affine = [1]  # this will invert the 1st Affine file: 'func2anat_coreg_Affine.txt'
>>> wimt.cmdline
'WarpImageMultiTransform 3 diffusion_weighted.nii diffusion_weighted_wimt.nii -R functional.nii -i func2anat_coreg_Affine.txt func2anat_InverseWarp.nii.gz dwi2anat_Warp.nii.gz dwi2anat_coreg_Affine.txt'
input_imagea pathlike object or string representing a file

Image to apply transformation to (generally a coregistered functional). Maps to a command-line argument: %s (position: 2).

transformation_seriesa list of items which are a pathlike object or string representing an existing file

Transformation file(s) to be applied. Maps to a command-line argument: %s (position: -1).

argsa unicode string

Additional parameters to the command. Maps to a command-line argument: %s.

dimension3 or 2

Image dimension (2 or 3). Maps to a command-line argument: %d (position: 1). (Nipype default value: 3)

environa dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

invert_affinea list of items which are an integer (int or long)

List of Affine transformations to invert. E.g.: [1,4,5] inverts the 1st, 4th, and 5th Affines found in transformation_series. Note that indexing starts with 1 and does not include warp fields. Affine transformations are distinguished from warp fields by the word “affine” included in their filenames.
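
To make the 1-based, affine-only indexing concrete, here is a small hedged sketch; all file names are the placeholder test files used in the examples above, and the pipeline itself is illustrative.

>>> from nipype.interfaces.ants import WarpImageMultiTransform
>>> wimt = WarpImageMultiTransform()
>>> wimt.inputs.input_image = 'diffusion_weighted.nii'
>>> wimt.inputs.reference_image = 'functional.nii'
>>> wimt.inputs.transformation_series = ['dwi2anat_Warp.nii.gz', 'func2anat_coreg_Affine.txt', 'dwi2anat_coreg_Affine.txt']
>>> wimt.inputs.invert_affine = [2]  # inverts the 2nd Affine, 'dwi2anat_coreg_Affine.txt'; the warp field is not counted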

num_threadsan integer (int or long)

Number of ITK threads to use. (Nipype default value: 1)

out_postfixa pathlike object or string representing a file

Postfix that is appended to all output files (default = _wimt). Mutually exclusive with inputs: output_image. (Nipype default value: _wimt)

output_imagea pathlike object or string representing a file

Name of the output warped image. Maps to a command-line argument: %s (position: 3). Mutually exclusive with inputs: out_postfix.

reference_imagea pathlike object or string representing a file

Reference image space that you wish to warp INTO. Maps to a command-line argument: -R %s. Mutually exclusive with inputs: tightest_box.

reslice_by_headera boolean

Uses orientation matrix and origin encoded in reference image file header. Not typically used with additional transforms. Maps to a command-line argument: --reslice-by-header.

tightest_boxa boolean

Computes the tightest bounding box (overridden by reference_image if given). Maps to a command-line argument: --tightest-bounding-box. Mutually exclusive with inputs: reference_image.

use_bsplinea boolean

Use 3rd order B-Spline interpolation. Maps to a command-line argument: --use-BSpline.

use_nearesta boolean

Use nearest neighbor interpolation. Maps to a command-line argument: --use-NN.

output_imagea pathlike object or string representing an existing file

Warped image.

WarpTimeSeriesImageMultiTransform

Link to code

Bases: ANTSCommand

Wrapped executable: WarpTimeSeriesImageMultiTransform.

Warps a time-series from one space to another.

Examples

>>> from nipype.interfaces.ants import WarpTimeSeriesImageMultiTransform
>>> wtsimt = WarpTimeSeriesImageMultiTransform()
>>> wtsimt.inputs.input_image = 'resting.nii'
>>> wtsimt.inputs.reference_image = 'ants_deformed.nii.gz'
>>> wtsimt.inputs.transformation_series = ['ants_Warp.nii.gz','ants_Affine.txt']
>>> wtsimt.cmdline
'WarpTimeSeriesImageMultiTransform 4 resting.nii resting_wtsimt.nii -R ants_deformed.nii.gz ants_Warp.nii.gz ants_Affine.txt'
>>> wtsimt = WarpTimeSeriesImageMultiTransform()
>>> wtsimt.inputs.input_image = 'resting.nii'
>>> wtsimt.inputs.reference_image = 'ants_deformed.nii.gz'
>>> wtsimt.inputs.transformation_series = ['ants_Warp.nii.gz','ants_Affine.txt']
>>> wtsimt.inputs.invert_affine = [1]  # this will invert the 1st Affine file: ants_Affine.txt
>>> wtsimt.cmdline
'WarpTimeSeriesImageMultiTransform 4 resting.nii resting_wtsimt.nii -R ants_deformed.nii.gz ants_Warp.nii.gz -i ants_Affine.txt'
input_imagea pathlike object or string representing a file

Image to apply transformation to (generally a coregistered functional). Maps to a command-line argument: %s.

transformation_seriesa list of items which are a pathlike object or string representing an existing file

Transformation file(s) to be applied. Maps to a command-line argument: %s.

argsa unicode string

Additional parameters to the command. Maps to a command-line argument: %s.

dimension4 or 3

Image dimension (3 or 4). Maps to a command-line argument: %d (position: 1). (Nipype default value: 4)

environa dictionary with keys which are a bytes or None or a value of class ‘str’ and with values which are a bytes or None or a value of class ‘str’

Environment variables. (Nipype default value: {})

invert_affinea list of items which are an integer (int or long)

List of Affine transformations to invert. E.g.: [1,4,5] inverts the 1st, 4th, and 5th Affines found in transformation_series. Note that indexing starts with 1 and does not include warp fields. Affine transformations are distinguished from warp fields by the word “affine” included in their filenames.

num_threadsan integer (int or long)

Number of ITK threads to use. (Nipype default value: 1)

out_postfixa unicode string

Postfix that is appended to all output files (default = _wtsimt). Maps to a command-line argument: %s. (Nipype default value: _wtsimt)

reference_imagea pathlike object or string representing a file

Reference image space that you wish to warp INTO. Maps to a command-line argument: -R %s. Mutually exclusive with inputs: tightest_box.

reslice_by_headera boolean

Uses orientation matrix and origin encoded in reference image file header. Not typically used with additional transforms. Maps to a command-line argument: --reslice-by-header.

tightest_boxa boolean

Computes the tightest bounding box (overridden by reference_image if given). Maps to a command-line argument: --tightest-bounding-box. Mutually exclusive with inputs: reference_image.

use_bsplinea boolean

Use 3rd order B-Spline interpolation. Maps to a command-line argument: --use-Bspline.

use_nearesta boolean

Use nearest neighbor interpolation. Maps to a command-line argument: --use-NN.

output_imagea pathlike object or string representing an existing file

Warped image.

Submodules