diff --git a/Changelog b/Changelog index e352b52b44..1ce3531511 100644 --- a/Changelog +++ b/Changelog @@ -342,7 +342,7 @@ Bug fixes reviewed by PM) * Safer warning registry manipulation when checking for overflows (pr/753) (CM, reviewed by MB) -* Correctly write .annot files with duplicate lables (pr/763) (Richard Nemec +* Correctly write .annot files with duplicate labels (pr/763) (Richard Nemec with CM) Maintenance @@ -997,7 +997,7 @@ visiting the URL:: * Bugfix: Removed left-over print statement in extension code. * Bugfix: Prevent saving of bogus 'None.nii' images when the filename was previously assign, before calling NiftiImage.save() (Closes: #517920). -* Bugfix: Extension length was to short for all `edata` whos length matches +* Bugfix: Extension length was to short for all `edata` whose length matches n*16-8, for all integer n. 0.20090205.1 (Thu, 5 Feb 2009) @@ -1017,7 +1017,7 @@ visiting the URL:: automatically dumped into this extension. Embedded meta data is not loaded automatically, since this has security implications, because code from the file header is actually executed. - The documentation explicitely mentions this risk. + The documentation explicitly mentions this risk. * Added :class:`~nifti.extensions.NiftiExtensions`. This is a container-like handler to access and manipulate NIfTI1 header extensions. * Exposed :class:`~nifti.image.MemMappedNiftiImage` in the root module. @@ -1223,7 +1223,7 @@ visiting the URL:: * Does not depend on libfslio anymore. * Up to seven-dimensional dataset are supported (as much as NIfTI can do). * The complete NIfTI header dataset is modifiable. -* Most image properties are accessable via class attributes and accessor +* Most image properties are accessible via class attributes and accessor methods. * Improved documentation (but still a long way to go). diff --git a/doc/misc/pylintrc b/doc/misc/pylintrc index efa0487b09..61439e3d20 100644 --- a/doc/misc/pylintrc +++ b/doc/misc/pylintrc @@ -79,7 +79,7 @@ output-format=colorized # Include message's id in output include-ids=yes -# Tells wether to display a full report or only the messages +# Tells whether to display a full report or only the messages reports=yes [MISCELLANEOUS] diff --git a/doc/source/coordinate_systems.rst b/doc/source/coordinate_systems.rst index 9541dc6f82..3c68d71271 100644 --- a/doc/source/coordinate_systems.rst +++ b/doc/source/coordinate_systems.rst @@ -530,7 +530,7 @@ then the image affine matrix $A$ is: Why the extra row of $[0, 0, 0, 1]$? We need this row because we have rephrased the combination of rotations / zooms and translations as a -transformation in *homogenous coordinates* (see `wikipedia homogenous +transformation in *homogeneous coordinates* (see `wikipedia homogeneous coordinates`_). This is a trick that allows us to put the translation part into the same matrix as the rotations / zooms, so that both translations and rotations / zooms can be applied by matrix multiplication. In order to make diff --git a/doc/source/devel/data_pkg_discuss.rst b/doc/source/devel/data_pkg_discuss.rst index 08765743ce..dbd8cca88a 100644 --- a/doc/source/devel/data_pkg_discuss.rst +++ b/doc/source/devel/data_pkg_discuss.rst @@ -369,7 +369,7 @@ Discovery Revsion ids could for example be hashes of the package instantiation (package contents), so they could be globally unique to the contents, - whereever the contents was when the identifier was made. However, *tags* + wherever the contents was when the identifier was made. 
However, *tags* are just names that someone has attached to a particular revsion id. If there is more than one person providing versions of a particular package, there may not be agreement on the revsion that a particular tag is attached diff --git a/doc/source/devel/devguide.rst b/doc/source/devel/devguide.rst index 9d97543cec..2747564dbf 100644 --- a/doc/source/devel/devguide.rst +++ b/doc/source/devel/devguide.rst @@ -64,7 +64,7 @@ necessary and the branch gets tagged when a package version is released. Maintenance (as well as backport) releases or branches off from the respective packaging tag. -There might be additonal branches for each developer, prefixed with intials. +There might be additional branches for each developer, prefixed with initials. Alternatively, several GitHub (or elsewhere) clones might be used. @@ -99,7 +99,7 @@ Changelog ========= The changelog is located in the toplevel directory of the source tree in the -`Changelog` file. The content of this file should be formated as restructured +`Changelog` file. The content of this file should be formatted as restructured text to make it easy to put it into manual appendix and on the website. This changelog should neither replicate the VCS commit log nor the diff --git a/doc/source/dicom/dicom_intro.rst b/doc/source/dicom/dicom_intro.rst index 12c187d652..f1508932c6 100644 --- a/doc/source/dicom/dicom_intro.rst +++ b/doc/source/dicom/dicom_intro.rst @@ -686,7 +686,7 @@ For example, there is a DIMSE service called "C-ECHO" that requests confirmation from the responding application that the echo message arrived. The definition of the DIMSE services specifies, for a particular DIMSE service, -whether the DIMSE commend set should be followed by a data set. +whether the DIMSE command set should be followed by a data set. In particular, the data set will be a full Information Object Definition's worth of data. diff --git a/doc/source/dicom/dicom_mosaic.rst b/doc/source/dicom/dicom_mosaic.rst index 7e5a157a94..5ff0f1fcf7 100644 --- a/doc/source/dicom/dicom_mosaic.rst +++ b/doc/source/dicom/dicom_mosaic.rst @@ -124,7 +124,7 @@ Data scaling SPM gets the DICOM scaling, offset for the image ('RescaleSlope', 'RescaleIntercept'). It writes these scalings into the nifti_ header. Then it writes the raw image data (unscaled) to disk. Obviously these -will have the corrent scalings applied when the nifti image is read again. +will have the current scalings applied when the nifti image is read again. A comment in the code here says that the data are not scaled by the maximum amount. I assume by this they mean that the DICOM scaling may diff --git a/doc/source/dicom/dicom_orientation.rst b/doc/source/dicom/dicom_orientation.rst index f4d66ae979..dae0ea5c60 100644 --- a/doc/source/dicom/dicom_orientation.rst +++ b/doc/source/dicom/dicom_orientation.rst @@ -350,7 +350,7 @@ constant, to the voxel coordinate for the slice (the value for the slice index). Our DICOM might have the 'SliceLocation' field (0020,1041). -'SliceLocation' seems to be proportianal to slice location, at least for +'SliceLocation' seems to be proportional to slice location, at least for some GE and Philips DICOMs I was looking at. But, there is a more reliable way (that doesn't depend on this field), and uses only the very standard 'ImageOrientationPatient' and 'ImagePositionPatient' fields. @@ -385,7 +385,7 @@ unit change in the slice voxel coordinate. 
So, the addition of two vectors $T^j = \mathbf{a} + \mathbf{b}$, where $\mathbf{a}$ is the position of the first voxel in some slice (here slice 1, therefore $\mathbf{a} = T^1$) and $\mathbf{b}$ is $d$ times the -third colum of $A$. Obviously $d$ can be negative or positive. This +third column of $A$. Obviously $d$ can be negative or positive. This leads to various ways of recovering something that is proportional to $d$ plus a constant. The algorithm suggested in this `ITK post on ordering slices`_ - and the one used by SPM - is to take the inner diff --git a/doc/source/dicom/siemens_csa.rst b/doc/source/dicom/siemens_csa.rst index 3ec6c09914..7807f7b89f 100644 --- a/doc/source/dicom/siemens_csa.rst +++ b/doc/source/dicom/siemens_csa.rst @@ -124,7 +124,7 @@ Each item Now there's a different length check from CSA1. ``item_len`` is given just by ``xx[1]``. If ``item_len`` > ``csa_max_pos - csa_position`` -(the remaining bytes in the header), then we just read the remaning +(the remaining bytes in the header), then we just read the remaining bytes in the header (as above) into ``value`` below, as uint8, move the filepointer to the next 4 byte boundary, and give up reading. diff --git a/doc/source/gettingstarted.rst b/doc/source/gettingstarted.rst index 31c20fd4e1..9b3026bcc7 100644 --- a/doc/source/gettingstarted.rst +++ b/doc/source/gettingstarted.rst @@ -14,7 +14,7 @@ Getting Started *************** NiBabel supports an ever growing collection of neuroimaging file formats. Every -file format format has its own features and pecularities that need to be taken +file format format has its own features and peculiarities that need to be taken care of to get the most out of it. To this end, NiBabel offers both high-level format-independent access to neuroimages, as well as an API with various levels of format-specific access to all available information in a particular file @@ -109,7 +109,7 @@ True In this case, we used the identity matrix as the affine transformation. The image header is initialized from the provided data array (i.e. shape, dtype) -and all other values are set to resonable defaults. +and all other values are set to reasonable defaults. Saving this new image to a file is trivial: diff --git a/doc/source/gitwash/configure_git.rst b/doc/source/gitwash/configure_git.rst index 0e18b666d0..9911d7cbb1 100644 --- a/doc/source/gitwash/configure_git.rst +++ b/doc/source/gitwash/configure_git.rst @@ -142,7 +142,7 @@ and it gives graph / text output something like this (but with color!):: | * 4aff2a8 - fixed bug 35, and added a test in test_bugfixes (2 weeks ago) [Hugo] |/ * a7ff2e5 - Added notes on discussion/proposal made during Data Array Summit. (2 weeks ago) [Corran Webster] - * 68f6752 - Initial implimentation of AxisIndexer - uses 'index_by' which needs to be changed to a call on an Axes object - this is all very sketchy right now. (2 weeks ago) [Corr + * 68f6752 - Initial implementation of AxisIndexer - uses 'index_by' which needs to be changed to a call on an Axes object - this is all very sketchy right now. (2 weeks ago) [Corr * 376adbd - Merge pull request #46 from terhorst/master (2 weeks ago) [Jonathan Terhorst] |\ | * b605216 - updated joshu example to current api (3 weeks ago) [Jonathan Terhorst] diff --git a/doc/source/links_names.txt b/doc/source/links_names.txt index 1a1b688cd4..7fbb27b12e 100644 --- a/doc/source/links_names.txt +++ b/doc/source/links_names.txt @@ -217,7 +217,7 @@ .. _`wikipedia affine transform`: https://en.wikipedia.org/wiki/Affine_transformation .. 
_`wikipedia linear transform`: https://en.wikipedia.org/wiki/Linear_transformation .. _`wikipedia rotation matrix`: https://en.wikipedia.org/wiki/Rotation_matrix -.. _`wikipedia homogenous coordinates`: https://en.wikipedia.org/wiki/Homogeneous_coordinates +.. _`wikipedia homogeneous coordinates`: https://en.wikipedia.org/wiki/Homogeneous_coordinates .. _`wikipedia axis angle`: https://en.wikipedia.org/wiki/Axis_angle .. _`wikipedia Euler angles`: https://en.wikipedia.org/wiki/Euler_angles .. _`Mathworld Euler angles`: http://mathworld.wolfram.com/EulerAngles.html diff --git a/doc/source/old/design.txt b/doc/source/old/design.txt index 9d424475bc..f2d30ddf56 100644 --- a/doc/source/old/design.txt +++ b/doc/source/old/design.txt @@ -71,7 +71,7 @@ We think of an image as being the association of: For simplicity, we want the transformation (above) to be spatial. Because the images are always at least 3D, and the transform is - spatial, this means that the tranformation is always exactly 3D. We + spatial, this means that the transformation is always exactly 3D. We have to know which of the N image dimensions are spatial. For example, if we have a 4D (space and time) image, we need to know which of the 4 dimensions are spatial. We could ask the image to diff --git a/doc/source/old/orientation.txt b/doc/source/old/orientation.txt index ef231f7e95..e74d65517f 100644 --- a/doc/source/old/orientation.txt +++ b/doc/source/old/orientation.txt @@ -12,7 +12,7 @@ Affines as orientation ---------------------- Orientations are expressed by 4 by 4 affine arrays. 4x4 affine arrays -give, in homogenous coordinates, the relationship between the +give, in homogeneous coordinates, the relationship between the coordinates in the voxel array, and millimeters. Let is say that I have a simple affine like this: @@ -26,7 +26,7 @@ And I have a voxel coordinate: then the millimeter coordinate for that voxel is given by: ->>> # add extra 1 for homogenous coordinates +>>> # add extra 1 for homogeneous coordinates >>> homogenous_coord = np.concatenate((coord, [1])) >>> mm_coord = np.dot(aff, homogenous_coord)[:3] >>> mm_coord diff --git a/nibabel/affines.py b/nibabel/affines.py index 915167ab7a..2ec3c47c9d 100644 --- a/nibabel/affines.py +++ b/nibabel/affines.py @@ -34,7 +34,7 @@ def apply_affine(aff, pts): Parameters ---------- aff : (N, N) array-like - Homogenous affine, for 3D points, will be 4 by 4. Contrary to first + Homogeneous affine, for 3D points, will be 4 by 4. Contrary to first appearance, the affine will be applied on the left of `pts`. pts : (..., N-1) array-like Points, where the last dimension contains the coordinates of each @@ -87,7 +87,7 @@ def apply_affine(aff, pts): def to_matvec(transform): """Split a transform into its matrix and vector components. - The tranformation must be represented in homogeneous coordinates and is + The transformation must be represented in homogeneous coordinates and is split into its rotation matrix and translation vector components. Parameters @@ -104,7 +104,7 @@ def to_matvec(transform): matrix : (N-1, M-1) array Matrix component of `transform` vector : (M-1,) array - Vector compoent of `transform` + Vector component of `transform` See Also -------- @@ -145,7 +145,7 @@ def from_matvec(matrix, vector=None): Returns ------- xform : array - An (N+1, M+1) homogenous transform matrix. + An (N+1, M+1) homogeneous transform matrix. See Also -------- @@ -269,7 +269,7 @@ def voxel_sizes(affine): 1)[:3]``. 
The world coordinate vector of voxel vector (1, 0, 0) is ``v1_ax1 = affine.dot((1, 0, 0, 1))[:3]``. The final 1 in the voxel vectors and the ``[:3]`` at the end are because the affine works on - homogenous coodinates. The translations part of the affine is ``trans = + homogeneous coordinates. The translations part of the affine is ``trans = affine[:3, 3]``, and the rotations, zooms and shearing part of the affine is ``rzs = affine[:3, :3]``. Because of the final 1 in the input voxel vector, ``v0 == rzs.dot((0, 0, 0)) + trans``, and ``v1_ax1 == rzs.dot((1, diff --git a/nibabel/analyze.py b/nibabel/analyze.py index cb7e032c9e..3daaaf1175 100644 --- a/nibabel/analyze.py +++ b/nibabel/analyze.py @@ -385,7 +385,7 @@ def from_header(klass, header=None, check=True): # safely discard fields with names not known to this header # type on the basis they are from the wrong Analyze dialect pass - # set any fields etc that are specific to this format (overriden by + # set any fields etc that are specific to this format (overridden by # sub-classes) obj._clean_after_mapping() # Fallback basic conversion always done. diff --git a/nibabel/batteryrunners.py b/nibabel/batteryrunners.py index 67af4f4a8b..882c1814ef 100644 --- a/nibabel/batteryrunners.py +++ b/nibabel/batteryrunners.py @@ -37,7 +37,7 @@ (very bad problem). The levels follow the log levels from the logging module (e.g 40 equivalent to "error" level, 50 to "critical"). The ``error`` can be one of ``None`` if no error to suggest, or an Exception -class that the user might consider raising for this sitation. The +class that the user might consider raising for this situation. The ``problem_msg`` and ``fix_msg`` are human readable strings that should explain what happened. diff --git a/nibabel/casting.py b/nibabel/casting.py index 38c4ea7bda..45c2c5bd36 100644 --- a/nibabel/casting.py +++ b/nibabel/casting.py @@ -1,4 +1,4 @@ -""" Utilties for casting numpy values in various ways +""" Utilities for casting numpy values in various ways Most routines work round some numpy oddities in floating point precision and casting. 
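
The ``nibabel.affines`` helpers touched above (``apply_affine``, ``to_matvec``, ``from_matvec``) all revolve around the same homogeneous-coordinate trick described in ``coordinate_systems.rst``. A minimal sketch of how they fit together, assuming only ``numpy`` and the functions named in the docstrings above, with made-up zooms and translation::

    import numpy as np
    from nibabel.affines import apply_affine, from_matvec, to_matvec

    # Build a 4x4 homogeneous affine from a 3x3 zoom matrix and a translation;
    # the last row is [0, 0, 0, 1], as explained in coordinate_systems.rst.
    aff = from_matvec(np.diag([3.0, 4.0, 5.0]), [10, 11, 12])

    # to_matvec splits it back into its matrix and vector components.
    rzs, trans = to_matvec(aff)

    # apply_affine applies the affine on the left of the points, which is the
    # same as appending a 1 to the voxel coordinate and dropping it afterwards.
    pt = np.array([1.0, 2.0, 3.0])
    assert np.allclose(apply_affine(aff, pt), aff.dot(np.append(pt, 1))[:3])
    # both give [13., 19., 27.]
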
Others work round numpy casting to and from python ints @@ -132,7 +132,7 @@ def shared_range(flt_type, int_type): Returns ------- mn : object - Number of type `flt_type` that is the minumum value in the range of + Number of type `flt_type` that is the minimum value in the range of `int_type`, such that ``mn.astype(int_type)`` >= min of `int_type` mx : object Number of type `flt_type` that is the maximum value in the range of diff --git a/nibabel/cifti2/tests/test_cifti2io_axes.py b/nibabel/cifti2/tests/test_cifti2io_axes.py index c237e3c61a..fb5a485d98 100644 --- a/nibabel/cifti2/tests/test_cifti2io_axes.py +++ b/nibabel/cifti2/tests/test_cifti2io_axes.py @@ -79,7 +79,7 @@ def check_Conte69(brain_model): def check_rewrite(arr, axes, extension='.nii'): """ - Checks wheter writing the Cifti2 array to disc and reading it back in gives the same object + Checks whether writing the Cifti2 array to disc and reading it back in gives the same object Parameters ---------- diff --git a/nibabel/cmdline/diff.py b/nibabel/cmdline/diff.py index f1e4958e8f..b48033eb45 100755 --- a/nibabel/cmdline/diff.py +++ b/nibabel/cmdline/diff.py @@ -73,7 +73,7 @@ def get_opt_parser(): def are_values_different(*values): """Generically compare values, return True if different - Note that comparison is targetting reporting of comparison of the headers + Note that comparison is targeting reporting of comparison of the headers so has following specifics: - even a difference in data types is considered a difference, i.e. 1 != 1.0 - nans are considered to be the "same", although generally nan != nan @@ -94,7 +94,7 @@ def are_values_different(*values): except TypeError as exc: str_exc = str(exc) # Not implemented in numpy 1.7.1 - if "not supported" in str_exc or "ot implemented" in str_exc: + if "not supported" in str_exc or "not implemented" in str_exc: value0_nans = None else: raise diff --git a/nibabel/cmdline/tck2trk.py b/nibabel/cmdline/tck2trk.py index b8d9ce12d8..3c25ea3266 100644 --- a/nibabel/cmdline/tck2trk.py +++ b/nibabel/cmdline/tck2trk.py @@ -30,7 +30,7 @@ def main(): try: nii = nib.load(args.anatomy) except Exception: - parser.error("Expecting anatomical image as first agument.") + parser.error("Expecting anatomical image as first argument.") for tractogram in args.tractograms: tractogram_format = nib.streamlines.detect_format(tractogram) diff --git a/nibabel/dataobj_images.py b/nibabel/dataobj_images.py index d072678568..7480a5cbfc 100644 --- a/nibabel/dataobj_images.py +++ b/nibabel/dataobj_images.py @@ -27,9 +27,9 @@ def __init__(self, dataobj, header=None, extra=None, file_map=None): Parameters ---------- dataobj : object - Object containg image data. It should be some object that retuns an - array from ``np.asanyarray``. It should have ``shape`` and ``ndim`` - attributes or properties + Object containing image data. It should be some object that returns + an array from ``np.asanyarray``. 
It should have ``shape`` and + ``ndim`` attributes or properties header : None or mapping or header instance, optional metadata for this image format extra : None or mapping, optional diff --git a/nibabel/ecat.py b/nibabel/ecat.py index a7dad5de0c..3af82e10f5 100644 --- a/nibabel/ecat.py +++ b/nibabel/ecat.py @@ -661,7 +661,7 @@ def data_from_fileobj(self, frame=0, orientation=None): class EcatImageArrayProxy(object): - """ Ecat implemention of array proxy protocol + """ Ecat implementation of array proxy protocol The array proxy allows us to freeze the passed fileobj and header such that it returns the expected data array. @@ -989,7 +989,7 @@ def to_file_map(self, file_map=None): # Write frame images self._write_data(image, imgf, pos + 2, endianness='>') - # Move to dictionnary offset and write dictionnary entry + # Move to dictionary offset and write dictionary entry self._write_data(mlist[index], imgf, entry_pos, endianness='>') entry_pos = entry_pos + 16 diff --git a/nibabel/filename_parser.py b/nibabel/filename_parser.py index 149cef06a9..d8ed87c38a 100644 --- a/nibabel/filename_parser.py +++ b/nibabel/filename_parser.py @@ -275,7 +275,7 @@ def splitext_addext(filename, Extension, where extension is not in `addexts` - e.g. ``.ext`` in example above addext : str - Any suffixes appearing in `addext` occuring at end of filename + Any suffixes appearing in `addext` occurring at end of filename Examples -------- diff --git a/nibabel/fileslice.py b/nibabel/fileslice.py index 0bb987c8be..d0bd3ca721 100644 --- a/nibabel/fileslice.py +++ b/nibabel/fileslice.py @@ -416,7 +416,7 @@ def optimize_slicer(slicer, dim_len, all_full, is_slowest, stride, # full, but reversed if slicer == slice(dim_len - 1, None, -1): return slice(None), slice(None, None, -1) - # Not full, mabye continuous + # Not full, maybe continuous is_int = False else: # int if slicer < 0: # make negative offsets positive diff --git a/nibabel/gifti/gifti.py b/nibabel/gifti/gifti.py index 23a1df2e60..ee72521525 100644 --- a/nibabel/gifti/gifti.py +++ b/nibabel/gifti/gifti.py @@ -331,7 +331,7 @@ class GiftiDataArray(xml.XmlSerializable): The Endianness to store the data array. Should correspond to the machine endianness. Default is system byteorder. coordsys : :class:`GiftiCoordSystem` instance - Input and output coordinate system with tranformation matrix between + Input and output coordinate system with transformation matrix between the two. ind_ord : int The ordering of the array. see util.array_index_order_codes. 
Default diff --git a/nibabel/gifti/tests/test_gifti.py b/nibabel/gifti/tests/test_gifti.py index 2d60482c59..eb6a87d673 100644 --- a/nibabel/gifti/tests/test_gifti.py +++ b/nibabel/gifti/tests/test_gifti.py @@ -146,7 +146,7 @@ def test_dataarray_init(): pytest.raises(KeyError, gda, datatype='not_datatype') # Float32 datatype comes from array if datatype not set assert gda(arr).datatype == 16 - # Can be overriden by init + # Can be overridden by init assert gda(arr, datatype='uint8').datatype == 2 # Encoding assert gda(encoding=1).encoding == 1 diff --git a/nibabel/imageclasses.py b/nibabel/imageclasses.py index 1d33db8ed1..f657822977 100644 --- a/nibabel/imageclasses.py +++ b/nibabel/imageclasses.py @@ -125,7 +125,7 @@ def __getitem__(self, *args, **kwargs): def spatial_axes_first(img): - """ True if spatial image axes for `img` always preceed other axes + """ True if spatial image axes for `img` always precede other axes Parameters ---------- @@ -136,7 +136,7 @@ def spatial_axes_first(img): ------- spatial_axes_first : bool True if image only has spatial axes (number of axes < 4) or image type - known to have spatial axes preceeding other axes. + known to have spatial axes preceding other axes. """ if len(img.shape) < 4: return True diff --git a/nibabel/loadsave.py b/nibabel/loadsave.py index 4eb206ba4e..1f736409dd 100644 --- a/nibabel/loadsave.py +++ b/nibabel/loadsave.py @@ -243,7 +243,7 @@ def read_img_data(img, prefer='scaled'): array is given by the raw data on disk, multiplied by a scalefactor and maybe with the addition of a constant. This function, with ``unscaled`` returns the data on the disk, without these - format-specific scalings applied. Please use this funciton only if + format-specific scalings applied. Please use this function only if you absolutely need the unscaled data, and the magnitude of the data, as given by the scalefactor, is not relevant to your application. The Analyze-type formats have a single scalefactor +/- diff --git a/nibabel/nicom/csareader.py b/nibabel/nicom/csareader.py index 8082608b73..7e465ff19a 100644 --- a/nibabel/nicom/csareader.py +++ b/nibabel/nicom/csareader.py @@ -252,7 +252,7 @@ def nt_str(s): Returns ------- sdash : str - s stripped to first occurence of null (0) + s stripped to first occurrence of null (0) """ zero_pos = s.find(b'\x00') if zero_pos == -1: diff --git a/nibabel/nicom/dicomreaders.py b/nibabel/nicom/dicomreaders.py index 34e7d19527..e4fbc625ab 100644 --- a/nibabel/nicom/dicomreaders.py +++ b/nibabel/nicom/dicomreaders.py @@ -195,7 +195,7 @@ def _third_pass(wrappers): vol_list.append(dw) these_zs.append(z) continue - # new volumne + # new volume vol_list.sort(_slice_sorter) vol_list = [dw] these_zs = [z] diff --git a/nibabel/nicom/dicomwrappers.py b/nibabel/nicom/dicomwrappers.py index 2f494893f6..9b15dd18c3 100755 --- a/nibabel/nicom/dicomwrappers.py +++ b/nibabel/nicom/dicomwrappers.py @@ -494,7 +494,7 @@ def image_shape(self): What each axis in the frame indices refers to is given by the corresponding entry in the *DimensionIndexSequence* DICOM attribute. - **WARNING**: Any axis refering to the *StackID* DICOM attribute will + **WARNING**: Any axis referring to the *StackID* DICOM attribute will have been removed from the frame indices in determining the shape. This is because only a file containing a single stack is currently allowed by this wrapper. 
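
The scaled/unscaled distinction in the ``read_img_data`` docstring above (and the 'RescaleSlope' / 'RescaleIntercept' discussion in ``dicom_mosaic.rst`` earlier in this patch) is an element-wise linear map: the array you normally see is the raw on-disk data times a scalefactor, plus a constant. A rough sketch with invented slope and intercept values::

    import numpy as np

    raw = np.array([0, 1, 2, 3], dtype=np.int16)  # raw data as stored on disk
    slope, inter = 2.5, -1.0                      # format-specific scalefactor and constant

    scaled = raw * slope + inter                  # what a normal ('scaled') read returns
    # -> array([-1. ,  1.5,  4. ,  6.5]); an 'unscaled' read returns ``raw`` itself
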
diff --git a/nibabel/parrec.py b/nibabel/parrec.py index 18a557733b..046e1ec704 100644 --- a/nibabel/parrec.py +++ b/nibabel/parrec.py @@ -868,7 +868,7 @@ def _get_unique_image_prop(self, name): props = self.image_defs[name] if np.any(np.diff(props, axis=0)): raise PARRECError(f'Varying {name} in image sequence ' - f'({props}). This is not suppported.') + f'({props}). This is not supported.') return props[0] @deprecate_with_version('get_voxel_size deprecated. ' diff --git a/nibabel/processing.py b/nibabel/processing.py index 4f3e512773..b7abfb8c75 100644 --- a/nibabel/processing.py +++ b/nibabel/processing.py @@ -197,7 +197,7 @@ def resample_to_output(in_img, an image from data, affine and header. voxel_sizes : None or sequence Gives the diagonal entries of ``out_img.affine` (except the trailing 1 - for the homogenous coordinates) (``out_img.affine == + for the homogeneous coordinates) (``out_img.affine == np.diag(voxel_sizes + [1])``). If None, return identity `out_img.affine`. If scalar, interpret as vector ``[voxel_sizes] * len(in_img.shape)``. diff --git a/nibabel/spaces.py b/nibabel/spaces.py index 094f43dc77..dac8fdd049 100644 --- a/nibabel/spaces.py +++ b/nibabel/spaces.py @@ -51,7 +51,7 @@ def vox2out_vox(mapped_voxels, voxel_sizes=None): affine is a (4, 4) array-like. voxel_sizes : None or sequence Gives the diagonal entries of `output_affine` (except the trailing 1 - for the homogenous coordinates) (``output_affine == np.diag(voxel_sizes + for the homogeneous coordinates) (``output_affine == np.diag(voxel_sizes + [1])``). If None, return identity `output_affine`. Returns diff --git a/nibabel/spatialimages.py b/nibabel/spatialimages.py index a85667e5c1..9a2dc76db7 100644 --- a/nibabel/spatialimages.py +++ b/nibabel/spatialimages.py @@ -431,11 +431,11 @@ def __init__(self, dataobj, affine, header=None, Parameters ---------- dataobj : object - Object containg image data. It should be some object that retuns an + Object containing image data. It should be some object that returns an array from ``np.asanyarray``. It should have a ``shape`` attribute or property affine : None or (4,4) array-like - homogenous affine giving relationship between voxel coordinates and + homogeneous affine giving relationship between voxel coordinates and world coordinates. Affine can also be None. In this case, ``obj.affine`` also returns None, and the affine as written to disk will depend on the file format. diff --git a/nibabel/streamlines/trk.py b/nibabel/streamlines/trk.py index c602937928..6f9987e4ee 100644 --- a/nibabel/streamlines/trk.py +++ b/nibabel/streamlines/trk.py @@ -450,7 +450,7 @@ def save(self, fileobj): affine_to_trackvis = get_affine_rasmm_to_trackvis(header) tractogram = tractogram.apply_affine(affine_to_trackvis, lazy=True) - # Create the iterator we'll be using for the rest of the funciton. + # Create the iterator we'll be using for the rest of the function. 
tractogram = iter(tractogram) try: @@ -599,7 +599,7 @@ def _read_header(fileobj): raise HeaderError('NiBabel only supports versions 1 and 2 of ' 'the Trackvis file format') - # Convert the first record of `header_rec` into a dictionnary + # Convert the first record of `header_rec` into a dictionary header = dict(zip(header_rec.dtype.names, header_rec[0])) header[Field.ENDIANNESS] = endianness diff --git a/nibabel/tests/test_analyze.py b/nibabel/tests/test_analyze.py index 1a87f23e1d..95b33f5069 100644 --- a/nibabel/tests/test_analyze.py +++ b/nibabel/tests/test_analyze.py @@ -83,7 +83,7 @@ def test_general_init(self): # an empty header has shape (0,) - like an empty array # (np.array([])) assert hdr.get_data_shape() == (0,) - # The affine is always homogenous 3D regardless of shape. The + # The affine is always homogeneous 3D regardless of shape. The # default affine will have -1 as the X zoom iff default_x_flip # is True (which it is by default). We have to be careful of the # translations though - these arise from SPM's use of the origin diff --git a/nibabel/tests/test_arraywriters.py b/nibabel/tests/test_arraywriters.py index 22684ac955..50a250d17c 100644 --- a/nibabel/tests/test_arraywriters.py +++ b/nibabel/tests/test_arraywriters.py @@ -234,7 +234,7 @@ def test_scaling_needed(): def test_special_rt(): - # Test that zeros; none finite - round trip to zeros for scaleable types + # Test that zeros; none finite - round trip to zeros for scalable types # For ArrayWriter, these error for default creation, when forced to create # the writer, they round trip to out_dtype max arr = np.array([np.inf, np.nan, -np.inf]) @@ -790,7 +790,7 @@ def test_nan2zero_scaling(): # Use fixed-up type information to avoid bugs, especially on PPC in_info = type_info(in_dt) out_info = type_info(out_dt) - # Skip inpossible combinations + # Skip impossible combinations if in_info['min'] == 0 and sign == -1: continue mx = min(in_info['max'], out_info['max'] * 2., 2**32) diff --git a/nibabel/tests/test_data.py b/nibabel/tests/test_data.py index 56671cdf7d..0c1671dfbf 100644 --- a/nibabel/tests/test_data.py +++ b/nibabel/tests/test_data.py @@ -166,7 +166,7 @@ def test_find_data_dir(): under_here, subhere = os.path.split(here) # under_here == '/nipy/utils' # subhere = 'tests' - # fails with non-existant path + # fails with non-existent path with pytest.raises(DataError): find_data_dir([here], 'implausible', 'directory') # fails with file, when directory expected diff --git a/nibabel/tests/test_filename_parser.py b/nibabel/tests/test_filename_parser.py index e53d6ebd29..49112036d9 100644 --- a/nibabel/tests/test_filename_parser.py +++ b/nibabel/tests/test_filename_parser.py @@ -35,7 +35,7 @@ def test_filenames(): tfns = types_filenames('test.img.bz2', types_exts) assert tfns == {'header': 'test.hdr.bz2', 'image': 'test.img.bz2'} # of course, if we don't know about e.g. 
gz, and enforce_extensions - # is on, we get an errror + # is on, we get an error with pytest.raises(TypesFilenamesError): types_filenames('test.img.gz', types_exts, ()) # if we don't know about .gz extension, and not enforcing, then we diff --git a/nibabel/tests/test_nifti1.py b/nibabel/tests/test_nifti1.py index 562c09c15b..895bc4a855 100644 --- a/nibabel/tests/test_nifti1.py +++ b/nibabel/tests/test_nifti1.py @@ -439,7 +439,7 @@ def test_qform_sform(self): hdr.set_qform(nice_aff, 1) # Check sform unchanged by setting qform assert hdr.get_sform(coded=True) == (None, 0) - # Setting does change the sform ouput + # Setting does change the sform output hdr.set_sform(nasty_aff, 1) aff, code = hdr.get_sform(coded=True) assert_array_equal(aff, nasty_aff) diff --git a/nibabel/tests/test_parrec_data.py b/nibabel/tests/test_parrec_data.py index e626068ca1..56ff7f035b 100644 --- a/nibabel/tests/test_parrec_data.py +++ b/nibabel/tests/test_parrec_data.py @@ -77,7 +77,7 @@ def test_oblique_loading(): nimg = top_load(nifti_fname) assert_almost_equal(nimg.affine[:3, :3], pimg.affine[:3, :3], 3) # The translation part is always off - # The ammount differs by rotation + # The amount differs by rotation aff_off = pimg.affine[:3, 3] - nimg.affine[:3, 3] # The difference is max in the order of 0.5 voxel vox_sizes = voxel_sizes(nimg.affine) diff --git a/nibabel/tests/test_processing.py b/nibabel/tests/test_processing.py index 4f62f76abf..633502ffd9 100644 --- a/nibabel/tests/test_processing.py +++ b/nibabel/tests/test_processing.py @@ -142,7 +142,7 @@ def test_resample_from_to(caplog): # Test order trans_p_25_aff = from_matvec(np.diag([-4, 5, 6]), [1, 0, 0]) trans_p_25_img = Nifti1Image(data, trans_p_25_aff) - # Suprising to me, but all points outside are set to 0, even with NN + # Surprising to me, but all points outside are set to 0, even with NN out = resample_from_to(img, trans_p_25_img, order=0) exp_out = np.zeros_like(data) exp_out[1:, :, :] = data[1, :, :] @@ -182,7 +182,7 @@ def test_resample_from_to(caplog): # 3D to 2D, we don't need to invert the fixed matrix out = resample_from_to(img, img_2d) assert_array_equal(out.dataobj, data[:, :, 0]) - # Same for tuple as to_img imput + # Same for tuple as to_img input out = resample_from_to(img, (img_2d.shape, img_2d.affine)) assert_array_equal(out.dataobj, data[:, :, 0]) # 4D input and output also OK @@ -295,7 +295,7 @@ def test_resample_to_output(caplog): # Default is Nifti1Image with caplog.at_level(logging.CRITICAL): # Here and below, suppress logs when changing classes assert resample_to_output(img_ni2).__class__ == Nifti1Image - # Can be overriden + # Can be overridden with caplog.at_level(logging.CRITICAL): assert resample_to_output(img_ni1, out_class=Nifti2Image).__class__ == Nifti2Image # None specifies out_class from input @@ -351,7 +351,7 @@ def test_smooth_image(caplog): # Default is Nifti1Image with caplog.at_level(logging.CRITICAL): # Here and below, suppress logs when changing classes assert smooth_image(img_ni2, 0).__class__ == Nifti1Image - # Can be overriden + # Can be overridden with caplog.at_level(logging.CRITICAL): assert smooth_image(img_ni1, 0, out_class=Nifti2Image).__class__ == Nifti2Image # None specifies out_class from input diff --git a/nibabel/tests/test_scripts.py b/nibabel/tests/test_scripts.py index 143e6ba608..61a41f54ad 100644 --- a/nibabel/tests/test_scripts.py +++ b/nibabel/tests/test_scripts.py @@ -473,7 +473,7 @@ def test_nib_tck2trk(): cmd = ["nib-tck2trk", standard_tck, anat] code, stdout, stderr = run_command(cmd, 
check_code=False) assert code == 2 # Parser error. - assert "Expecting anatomical image as first agument" in stderr + assert "Expecting anatomical image as first argument" in stderr # Convert one file. cmd = ["nib-tck2trk", anat, standard_tck] diff --git a/nibabel/tmpdirs.py b/nibabel/tmpdirs.py index 5ae4097c29..2510faf43c 100644 --- a/nibabel/tmpdirs.py +++ b/nibabel/tmpdirs.py @@ -17,7 +17,7 @@ class TemporaryDirectory(object): """Create and return a temporary directory. This has the same behavior as mkdtemp but can be used as a context manager. - Upon exiting the context, the directory and everthing contained + Upon exiting the context, the directory and everything contained in it are removed. Examples diff --git a/nibabel/volumeutils.py b/nibabel/volumeutils.py index ff6c5b913b..abbcfc1afd 100644 --- a/nibabel/volumeutils.py +++ b/nibabel/volumeutils.py @@ -529,7 +529,7 @@ def array_to_file(data, fileobj, out_dtype=None, offset=0, casting; this depends on the underlying C library and is undefined. In practice `nan2zero` == False might be a good choice when you completely sure there will be no NaNs in the data. This value ignored for float - outut types. NaNs are treated as zero *before* applying `intercept` + output types. NaNs are treated as zero *before* applying `intercept` and `divslope` - so an array ``[np.nan]`` with an `intercept` of 10 becomes ``[-10]`` after conversion to integer `out_dtype` with `nan2zero` set. That is because you will likely apply `divslope` and @@ -616,7 +616,7 @@ def array_to_file(data, fileobj, out_dtype=None, offset=0, pre_clips = max(mn, mn_out), min(mx, mx_out) return _write_data(data, fileobj, out_dtype, order, pre_clips=pre_clips) - # In any case, we do not want to check for nans beause we've already + # In any case, we do not want to check for nans because we've already # disallowed scaling that generates nans nan2zero = False # We are either scaling into c/floats or starting with c/floats, then we're @@ -723,7 +723,7 @@ def _write_data(data, order : {'F', 'C'} memory layout of array in fileobj after writing in_cast : None or numpy type, optional - If not None, inital cast to do on `data` slices before further + If not None, initial cast to do on `data` slices before further processing pre_clips : None or 2-sequence, optional If not None, minimum and maximum of input values at which to clip. @@ -1261,7 +1261,7 @@ def shape_zoom_affine(shape, zooms, x_flip=True): Returns ------- aff : (4,4) array - affine giving correspondance of voxel coordinates to mm + affine giving correspondence of voxel coordinates to mm coordinates, taking the center of the image as origin Examples diff --git a/tools/mpkg_wrapper.py b/tools/mpkg_wrapper.py index ad956696c8..c050bd0f8c 100644 --- a/tools/mpkg_wrapper.py +++ b/tools/mpkg_wrapper.py @@ -10,7 +10,7 @@ distutils setup.py. This script is a minimal version of a wrapper script shipped with the -bdist_mpkg packge. +bdist_mpkg package. """ __docformat__ = 'restructuredtext'
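
The ``[np.nan]`` to ``[-10]`` example in the ``array_to_file`` docstring above is worth spelling out. A hedged sketch of the arithmetic, assuming (as the surrounding text describes) that writing stores ``(data - intercept) / divslope`` and reading re-applies ``stored * divslope + intercept``::

    import numpy as np

    data = np.array([np.nan])
    intercept, divslope = 10.0, 1.0

    # nan2zero replaces NaN with 0 *before* the scaling is applied ...
    on_disk = (np.nan_to_num(data) - intercept) / divslope
    # -> array([-10.]), the ``[-10]`` mentioned in the docstring

    # ... so re-applying the scaling when reading the data back recovers 0, not 10.
    read_back = on_disk * divslope + intercept
    # -> array([0.])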