Sphinx build: catch warnings for style errors (#208)
* See if adding fail_on_warning in a readthedocs yml file does the trick

* Change to 4 spaces indent for .readthedocs.yml

* Update documentation: remove extraneous md files and change Makefile to fail if warning caught

* Intentionally put a mistake in the documentation to try and force error

* Get configuration right to ensure we fail on warning

* Fix version error (removed)

* Add an intentional indentation error (appeared on local make html) to see if readthedocs fails

* Fix error when trying to break spatial_analysis.py

* Change .readthedocs.yml

* Add more configuration

* See if changing to version 2 changes anything

* Now fix intentional issue and see if readthedocs passes now

* This should fix it...

* Fix docstring in test_utils and configure conf.py to ignore ark.md

* Try out new conf.py settings to see if Sphinx makes the documentation for us

* Re-add _static and _templates and update params in conf.py to build documentation in correct locations

* Fix regex to ignore testing files

* Update .gitignore to ensure people do not try to add the _markdown files generated if testing locally

* Configure conf.py to catch docstring errors, and intentionally push a bad build to see if ReadTheDocs actually catches them

* Fix argument-list errors caught by autodoc-process-docstring

* conf.py now works to catch basic return errors not caught by arglist checking

* conf.py now handles argument-less function docstring checks, and fixed bulleted list issue in test_utils.py

* Remove TODOs in conf.py

* Remove extra arguments checks, leave just 1 all-encompassing one
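
The conf.py work described in the bullets above hooks Sphinx's autodoc-process-docstring event so that malformed docstrings surface as warnings, which the -W flag added to docs/Makefile and the fail_on_warning option in .readthedocs.yml then promote to build failures. Below is a minimal sketch of that mechanism, assuming Google-style docstrings reach the handler unprocessed; the handler name, the regex, and the specific checks are illustrative assumptions, not the code from this commit.

```python
# docs/conf.py (sketch) -- check_docstring_format is a hypothetical name used
# here only to illustrate the autodoc-process-docstring mechanism.
import re

from sphinx.util import logging

logger = logging.getLogger(__name__)


def check_docstring_format(app, what, name, obj, options, lines):
    """Warn when an Args entry does not look like 'arg_name (type):'."""
    if what not in ('function', 'method'):
        return
    try:
        start = lines.index('Args:') + 1
    except ValueError:
        return  # nothing to validate if there is no argument list

    for line in lines[start:]:
        stripped = line.strip()
        # a blank line or the next section header ends the Args block
        if not stripped or stripped in ('Returns:', 'Raises:', 'Yields:'):
            break
        # argument entries are expected to look like "arg_name (type):"
        if stripped.endswith(':') and not re.match(r'\w+ \(.+\):$', stripped):
            logger.warning('%s: malformed argument entry: %r', name, stripped)


def setup(app):
    # note: if napoleon rewrites the docstring before this handler runs,
    # the 'Args:' header may already have been converted to reST fields
    app.connect('autodoc-process-docstring', check_docstring_format)
```

With SPHINXOPTS = -W --keep-going in docs/Makefile, any warning emitted this way fails `make html` after reporting every offender, and ReadTheDocs behaves the same once fail_on_warning is set.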
alex-l-kong committed Sep 6, 2020
1 parent e567fef commit 5f17430
Showing 19 changed files with 147 additions and 205 deletions.
1 change: 1 addition & 0 deletions .gitignore
@@ -16,3 +16,4 @@ env
/dist

*/_build
+*/_markdown
12 changes: 12 additions & 0 deletions .readthedocs.yml
@@ -0,0 +1,12 @@
+version: 2
+
+sphinx:
+    configuration: docs/conf.py
+
+sphinx:
+    fail_on_warning: true
+
+python:
+    version: 3.6
+    install:
+        - requirements: docs/rtd-requirements.txt
8 changes: 4 additions & 4 deletions ark/analysis/spatial_analysis.py
@@ -8,7 +8,7 @@ def calculate_channel_spatial_enrichment(dist_matrices_dict, marker_thresholds,
excluded_colnames=None, included_fovs=None,
dist_lim=100, bootstrap_num=1000, fov_col="SampleID"):
"""Spatial enrichment analysis to find significant interactions between cells expressing
-different markers. Uses bootstrapping to permute cell labels randomly.
+        different markers. Uses bootstrapping to permute cell labels randomly.
Args:
dist_matrices_dict (dict):
@@ -21,9 +21,9 @@ def calculate_channel_spatial_enrichment(dist_matrices_dict, marker_thresholds,
excluded_colnames (list):
all column names that are not markers. If argument is none, default is
["cell_size", "Background", "HH3",
"summed_channel", "label", "area",
"eccentricity", "major_axis_length",
"minor_axis_length", "perimeter", "fov"]
"summed_channel", "label", "area",
"eccentricity", "major_axis_length",
"minor_axis_length", "perimeter", "fov"]
included_fovs (list):
patient labels to include in analysis. If argument is none, default is all labels used.
dist_lim (int):
3 changes: 0 additions & 3 deletions ark/segmentation/marker_quantification.py
@@ -212,9 +212,6 @@ def compute_complete_expression_matrices(segmentation_labels, tiff_dir, img_sub_
a list of points we wish to analyze, if None will default to all points
is_mibitiff (bool):
a flag to indicate whether or not the base images are MIBItiffs
-mibitiff_suffix (str):
-if is_mibitiff is true, then needs to be specified to select which points to load from
-mibitiff
batch_size (int):
how large we want each of the batches of points to be when computing, adjust as
necessary for speed and memory considerations
1 change: 1 addition & 0 deletions ark/segmentation/signal_extraction.py
@@ -10,6 +10,7 @@ def positive_pixels_extraction(cell_coords, image_data, threshold=0):
Args:
cell_coords (numpy.ndarray): values representing pixels within one cell
image_data (xarray.DataArray): array containing channel counts
+threshold (int): where we want to set the cutoff for a positive pixel, default 0
Returns:
numpy.ndarray:
2 changes: 1 addition & 1 deletion ark/utils/data_utils.py
@@ -611,7 +611,7 @@ def split_img_stack(stack_dir, output_dir, stack_list, indices, names, channels_
the indices we want to pull data from
names (list):
the corresponding names of the channels
-channel_first (bool):
+channels_first (bool):
whether we index at the beginning or end of the array
"""

23 changes: 19 additions & 4 deletions ark/utils/plot_utils.py
@@ -139,10 +139,22 @@ def randomize_labels(label_map):
return label_map


# TODO: make documentation more specific here
def outline_objects(L_matrix, list_of_lists):
"""Takes in an L matrix generated by skimage.label, along with a
list of lists, and returns a mask that has the
-pixels for all cells from each list represented as integer values for easy plotting"""
+pixels for all cells from each list represented as integer values for easy plotting
+Args:
+    L_matrix (numpy.ndarray):
+        a label map indicating the label of each cell
+    list_of_lists (list):
+        each element is a list of cells we wish to plot separately
+Returns:
+    np.ndarray:
+        a binary mask indicating the regions of cells outlined
+"""

L_plot = copy.deepcopy(L_matrix).astype(float)

@@ -170,7 +182,7 @@ def plot_color_map(outline_matrix, names, plotting_colors=None, ground_truth=Non
list of names for each category to use for plotting
plotting_colors (list):
list of colors to use for plotting cell categories
-ground truth (numpy.ndarray):
+ground_truth (numpy.ndarray):
optional argument to supply label map of true segmentation to be plotted alongside
save_path (str):
optional argument to save plot as TIF
@@ -205,14 +217,17 @@ def plot_color_map(outline_matrix, names, plotting_colors=None, ground_truth=Non
fig.savefig(save_path, dpi=200)


# TODO: make documentation more specific here
def plot_barchart_errors(pd_array, contour_errors, predicted_errors, save_path=None):
"""Plot different error types in a barchart, along with cell-size correlation in a scatter plot
Args:
pd_array (pandas.array):
pandas cell array representing error types for each class of cell
-cell_category (list):
-list of error types to extract from array
+contour_errors (list):
+list of contour error types to extract from array
+predicted_errors (list):
+list of predictive error types to extract from the array
save_path (str):
optional file path to save generated TIF
"""
8 changes: 6 additions & 2 deletions ark/utils/synthetic_spatial_datagen.py
@@ -316,6 +316,8 @@ def generate_two_cell_test_nuclear_signal(segmentation_mask, cell_centers,
a list of cells we wish to generate nuclear signal for, if None assume just cell 1
nuc_radius (int):
the radius of the nucleus of each cell
+nuc_signal_strength (int):
+the value we want to assign for nuclear signal
nuc_uncertainty_length (int):
will extend nuc_radius by the specified length
@@ -365,6 +367,8 @@ def generate_two_cell_test_membrane_signal(segmentation_mask, cell_centers,
a list of cells we wish to generate nuclear signal for, if None assume just cell 2
memb_thickness (int):
the diameter of the membrane ring of each cell
+memb_signal_strength (int):
+the value we want to assign to membrane signal
memb_uncertainty_length (int):
will extend memb_radius by the specified length
@@ -416,8 +420,8 @@ def generate_two_cell_test_channel_synthetic_data(size_img=(1024, 1024), cell_ra
the radius of each cell
nuc_radius (int):
the radius of each nucleus
-memb_diameter (int):
-the diameter of each membrane
+memb_thickness (int):
+the thickness of each membrane
nuc_cell_ids (list):
a list of which cells we wish to generate nuclear signal for, if None assume just
cell 1
17 changes: 9 additions & 8 deletions ark/utils/test_utils.py
@@ -363,12 +363,13 @@ def create_paired_xarray_fovs(base_dir, fov_names, channel_names, img_shape=(10,
img_shape (tuple):
Single image shape (x pixels, y pixels)
mode (str):
-The type of data to generate. Current options are:
-    - 'tiff'
-    - 'multitiff'
-    - 'reverse_multitiff'
-    - 'mibitiff'
-    - 'labels'
+The type of data to generate. Current options are:
+    * 'tiff'
+    * 'multitiff'
+    * 'reverse_multitiff'
+    * 'mibitiff'
+    * 'labels'
delimiter (str or None):
Delimiting character or string separating fov_id from rest of file/folder name.
Default is None.
@@ -520,7 +521,7 @@ def make_segmented_csv(num_cells, extra_cols=None):
num_cells (int):
Number of rows (cells) in csv
extra_cols (dict):
-Extra columns to add in the format `{'Column_Name' : data_1D, ...}`_
+Extra columns to add in the format ``{'Column_Name' : data_1D, ...}``
Returns:
pandas.DataFrame:
@@ -538,7 +539,7 @@ def make_segmented_csv(num_cells, extra_cols=None):


def create_test_extraction_data():
""" Generate hardcoded extraction test data
"""Generate hardcoded extraction test data
"""
# first create segmentation masks
2 changes: 1 addition & 1 deletion docs/Makefile
@@ -1,4 +1,4 @@
-SPHINXOPTS =
+SPHINXOPTS = -W --keep-going
SPHINXBUILD = sphinx-build
SOURCEDIR = .
BUILDDIR = _build
Empty file removed docs/_build/.gitkeep
Empty file.
38 changes: 0 additions & 38 deletions docs/_markdown/ark.analysis.md

This file was deleted.

20 changes: 0 additions & 20 deletions docs/_markdown/ark.md

This file was deleted.

30 changes: 0 additions & 30 deletions docs/_markdown/ark.segmentation.md

This file was deleted.

70 changes: 0 additions & 70 deletions docs/_markdown/ark.utils.md

This file was deleted.

7 changes: 0 additions & 7 deletions docs/_markdown/conftest.md

This file was deleted.

7 changes: 0 additions & 7 deletions docs/_markdown/modules.md

This file was deleted.

7 changes: 0 additions & 7 deletions docs/_markdown/setup.md

This file was deleted.
