Commit
change factors for data pyramid
droumis committed Sep 28, 2024
1 parent b332094 commit 0a2b9c2
Showing 3 changed files with 8 additions and 28 deletions.
13 changes: 2 additions & 11 deletions multichannel_timeseries/index.ipynb
@@ -20,21 +20,12 @@
"___"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#TODO: Make Key Features list into a diagram or gif showing the feature-components of the viewer"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Key Features\n",
"Analyzing electrophysiological data often involves searching for patterns across time, channels, and covariates. Features that support this type of investigation for time-aligned, amplitude-diverse data include:\n",
"Analyzing electrophysiological data often involves searching for patterns across time, channels, and covariates. Target features that support this type of investigation for time-aligned, amplitude-diverse data include:\n",
"\n",
"- **Smooth Interactions at Scale:** Smooth zooming and panning across time and channels.\n",
"- **Subcoordinate Axes:** Independent amplitude dimension (y-axis) per channel.\n",
@@ -63,7 +54,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"<img src='./assets/240716b_multi-chan-ts.png' alt=\"Multichannel timeseries recommended workflows\" align=\"right\" width=75%>\n",
"<img src='./assets/240927_multichan.png' alt=\"Multichannel timeseries recommended workflows\" align=\"right\" width=75%>\n",
"\n",
"The [recommended multichannel timeseries notebook](./multichan.ipynb) provides a workflow for processing and analyzing multichannel timeseries datasets that fit in memory.\n",
"\n",
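As an aside on the "Subcoordinate Axes" feature listed in the first hunk above (an independent amplitude y-axis per channel): a minimal sketch of what that looks like in practice, assuming HoloViews >= 1.18 with the Bokeh backend and using synthetic data that is not part of this commit:

```python
import numpy as np
import holoviews as hv

hv.extension('bokeh')

# Synthetic stand-in for a few channels of a multichannel timeseries
time = np.linspace(0, 10, 1_000)
channels = {f"ch{i}": np.sin(2 * np.pi * (i + 1) * time) for i in range(4)}

# subcoordinate_y=True gives each labeled curve its own y sub-axis within one
# shared plot, so channels with different amplitude ranges stay readable when overlaid.
overlay = hv.Overlay([
    hv.Curve((time, data), "Time", "Amplitude", label=name).opts(subcoordinate_y=True)
    for name, data in channels.items()
])
overlay.opts(width=800, height=400)
```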
23 changes: 6 additions & 17 deletions multichannel_timeseries/large_multichan.ipynb
@@ -129,7 +129,7 @@
"PYRAMID_FILE = f\"{DATA_PATH.stem}.zarr\"\n",
"PYRAMID_PATH = DATA_DIR / PYRAMID_FILE\n",
"print(f'Local Original Data Path: {DATA_PATH}')\n",
"print(f'Pyramid Path To Be Created: {PYRAMID_PATH}')"
"print(f'Pyramid Path: {PYRAMID_PATH}')"
]
},
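For context on the pyramid file whose path is printed above: later cells access it as a datatree (`ts_dt`), so one plausible way to open it is sketched below. This assumes the xarray-datatree package and integer-named groups per level; neither detail is confirmed by the hunks shown here.

```python
# Sketch only: open the multi-resolution Zarr pyramid and pull out one level.
# Assumes the xarray-datatree package and levels stored as groups "0", "1", ...
from datatree import open_datatree

ts_dt = open_datatree(PYRAMID_PATH, engine="zarr")
level_2 = ts_dt["2"].ds  # one pyramid level as a plain xarray.Dataset
print(level_2)
```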
{
@@ -278,15 +278,6 @@
"</div>"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#TODO. Check with Andrew about creating a delayed version of this.. I know it's downsampling in a chunked manner, but I think it then lives entirely in memory prior to getting dumped to disk"
]
},
{
"cell_type": "code",
"execution_count": null,
@@ -295,10 +286,9 @@
"source": [
"%%time\n",
"\n",
"FACTORS = [1, 2, 4, 8, 16, 32, 64, 128, 256]\n",
"\n",
"# TODO: find better principled way to determine factors.. The following doesn't work as the number of channels scales\n",
"# FACTORS = list(np.array([1, 2, 4, 8, 16, 32, 64, 128, 256]) ** (len(ts_ds[\"channel\"]) // 4))\n",
"# The FACTORS will depend on the size of your dataset, available disk size, and the resolution needed\n",
"# For this demo, we are arbitrarily choosing 8 levels and scaling by quadrupling each preceding factor.\n",
"FACTORS = [4**i for i in range(8)]\n",
"\n",
"def _help_downsample(data, time, n_out):\n",
" \"\"\"\n",
@@ -482,14 +472,13 @@
" pyramid_level = num_levels - 1\n",
" size = time_da.size\n",
" else:\n",
" #TODO: explore a more efficient way to determine resulting size, avoiding\n",
" sizes = np.array([\n",
" _extract_ds(ts_dt, pyramid_level)[\"time\"].sel(time=time_slice).size\n",
" for pyramid_level in range(num_levels)\n",
" ])\n",
" diffs = sizes - width\n",
" pyramid_level = np.argmin(np.where(diffs >= 0, diffs, np.inf)) # nearest higher-resolution level\n",
" # pyramid_level = np.argmin(np.abs(np.array(sizes) - width)) # nearest, regardless of direction\n",
" # pyramid_level = np.argmin(np.abs(np.array(sizes) - width)) # nearest (higher or lower resolution) level\n",
" size = sizes[pyramid_level]\n",
" \n",
" title = (\n",
@@ -644,7 +633,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.9"
"version": "3.11.10"
}
},
"nbformat": 4,
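Taken together, the new FACTORS choice and the pyramid-level selection rule in the hunks above can be illustrated with a small standalone sketch. The `pyramid_factors` helper, the per-level sample counts, and the viewport width below are made up for illustration and are not taken from the notebook:

```python
import numpy as np

# Sketch of a more principled FACTORS choice than hard-coding: keep quadrupling
# until the coarsest level would drop below a target sample count.
def pyramid_factors(n_samples: int, base: int = 4, coarsest_target: int = 10_000) -> list[int]:
    factors = [1]
    while n_samples // (factors[-1] * base) >= coarsest_target:
        factors.append(factors[-1] * base)
    return factors

n_samples = 10_000_000                # hypothetical recording length
factors = pyramid_factors(n_samples)  # -> [1, 4, 16, 64, 256]

# Hypothetical samples per level within the current time window (here, 1% of the data)
sizes = np.array([n_samples // 100 // f for f in factors])
width = 2_000                         # viewport width in pixels

# Same selection rule as in the diff above: among levels with at least `width`
# samples in view, take the one with the smallest excess, i.e. the nearest
# higher-resolution level.
diffs = sizes - width
pyramid_level = int(np.argmin(np.where(diffs >= 0, diffs, np.inf)))
print(factors, pyramid_level, sizes[pyramid_level])  # -> [1, 4, 16, 64, 256] 2 6250
```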
