Commit

Merge branch 'master' of https://github.com/cbroz1/spyglass
CBroz1 committed Aug 23, 2023
2 parents cc48025 + 8c36f05 commit 4d83c13
Showing 11 changed files with 29 additions and 22 deletions.
2 changes: 1 addition & 1 deletion notebooks/01_Insert_Data.ipynb
@@ -692,7 +692,7 @@
 "- `minirec20230622.nwb`, .3 GB: minimal recording, on\n",
 "  [Box](https://ucsf.box.com/s/k3sgql6z475oia848q1rgms4zdh4rkjn)\n",
 "- `montague20200802.nwb`, 8 GB: full recording, on\n",
-"  [DropBox](https://www.dropbox.com/scl/fo/4i5b1z4iapetzxfps0grf/h?dl=0&preview=montague20200802_tutorial_.nwb&rlkey=ctahes9v0r7bxes8yceh86gzg)\n",
+"  DropBox (link coming soon)\n",
 "- For those in the UCSF network, these and many others on `/stelmo/nwb/raw`\n",
 "\n",
 "If you are connected to the Frank lab database, please rename any downloaded\n",
2 changes: 1 addition & 1 deletion notebooks/14_Theta.ipynb
@@ -594,7 +594,7 @@
 "\n",
 "We can overlay theta and detected phase for each electrode.\n",
 "\n",
-"_Note:_ The red horizontal line indicates phase 0, corresponding to the trough\n",
+"_Note:_ The red horizontal line indicates phase 0, corresponding to the through\n",
 "of theta."
 ]
 },
2 changes: 1 addition & 1 deletion notebooks/20_Position_Trodes.ipynb
@@ -142,7 +142,7 @@
 "available. To adjust the default, insert a new set into this table. The\n",
 "parameters are...\n",
 "\n",
-"- `max_separation`, default 9 cm: maximium acceptable distance between red and\n",
+"- `max_separation`, default 9 cm: maximum acceptable distance between red and\n",
 "  green LEDs.\n",
 "  - If exceeded, the times are marked as NaNs and inferred by interpolation.\n",
 "  - Useful when the inferred LED position tracks a reflection instead of the\n",
3 changes: 1 addition & 2 deletions notebooks/24_Linearization.ipynb
@@ -87,7 +87,6 @@
 "\n",
 "import spyglass.common as sgc\n",
 "import spyglass.position.v1 as sgp\n",
-"import spyglass as nd\n",
 "\n",
 "# ignore datajoint+jupyter async warnings\n",
 "import warnings\n",
@@ -1501,7 +1500,7 @@
 "    + 1\n",
 ")\n",
 "video_info = (\n",
-"    nd.common.common_behav.VideoFile()\n",
+"    sgc.common_behav.VideoFile()\n",
 "    & {\"nwb_file_name\": key[\"nwb_file_name\"], \"epoch\": epoch}\n",
 ").fetch1()\n",
 "\n",
8 changes: 4 additions & 4 deletions notebooks/33_Decoding_Clusterless.ipynb
@@ -32,8 +32,8 @@
 "  [extracted marks](./31_Extract_Mark_Indicators.ipynb), as well as loaded \n",
 "  position data. If 1D decoding, this data should also be\n",
 "  [linearized](./24_Linearization.ipynb).\n",
-"- Ths tutorial also assumes you're familiar with how to run processes on GPU, as\n",
-"  presented in [this notebook](./32_Decoding_with_GPUs.ipynb)\n",
+"- This tutorial also assumes you're familiar with how to run processes on GPU, \n",
+"  as presented in [this notebook](./32_Decoding_with_GPUs.ipynb)\n",
 "\n",
 "Clusterless decoding can be performed on either 1D or 2D data. A few steps in\n",
 "this notebook will refer to a `decode_1d` variable set in \n",
@@ -143,10 +143,10 @@
 "source": [
 "First, we'll fetch marks with `fetch_xarray`, which provides a labeled array of\n",
 "shape (n_time, n_mark_features, n_electrodes). Time is in 2 ms bins with either\n",
-"`NaN` if no spike occured or the value of the spike features.\n",
+"`NaN` if no spike occurred or the value of the spike features.\n",
 "\n",
 "If there is >1 spike per time bin per tetrode, we take an an average of the\n",
-"marks. Ideally, we would use all the marks, this is a rare occurance and\n",
+"marks. Ideally, we would use all the marks, this is a rare occurrence and\n",
 "decoding is generally robust to the averaging."
 ]
 },
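The 2 ms binning and mark averaging that this hunk documents can be sketched with plain numpy. Note this is a toy illustration, not the actual Spyglass `fetch_xarray` output: the `bin_marks` helper and all data below are hypothetical.

```python
import numpy as np

def bin_marks(spike_times, spike_marks, bin_edges):
    """Bin spike mark features into fixed time bins: NaN where no spike
    landed, and the per-bin average when more than one spike did."""
    n_bins = len(bin_edges) - 1
    binned = np.full((n_bins, spike_marks.shape[1]), np.nan)
    bin_idx = np.digitize(spike_times, bin_edges) - 1  # which bin each spike hits
    for b in range(n_bins):
        in_bin = spike_marks[bin_idx == b]
        if len(in_bin):
            binned[b] = in_bin.mean(axis=0)  # average marks if >1 spike per bin
    return binned

# Toy example: three spikes with 2 mark features each, 2 ms bins over 8 ms
times = np.array([0.001, 0.0015, 0.005])  # two spikes share the first bin
marks = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
edges = np.linspace(0, 0.008, 5)          # 4 bins of 2 ms
out = bin_marks(times, marks, edges)
```

Here the first bin holds the average of the two coincident spikes' marks and the empty bins stay NaN, mirroring the averaging behavior the notebook describes.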
2 changes: 1 addition & 1 deletion notebooks/py_scripts/01_Insert_Data.py
@@ -109,7 +109,7 @@
 # - `minirec20230622.nwb`, .3 GB: minimal recording, on
 #   [Box](https://ucsf.box.com/s/k3sgql6z475oia848q1rgms4zdh4rkjn)
 # - `montague20200802.nwb`, 8 GB: full recording, on
-#   [DropBox](https://www.dropbox.com/scl/fo/4i5b1z4iapetzxfps0grf/h?dl=0&preview=montague20200802_tutorial_.nwb&rlkey=ctahes9v0r7bxes8yceh86gzg)
+#   DropBox (link coming soon)
 # - For those in the UCSF network, these and many others on `/stelmo/nwb/raw`
 #
 # If you are connected to the Frank lab database, please rename any downloaded
2 changes: 1 addition & 1 deletion notebooks/py_scripts/14_Theta.py
@@ -128,7 +128,7 @@
 #
 # We can overlay theta and detected phase for each electrode.
 #
-# _Note:_ The red horizontal line indicates phase 0, corresponding to the trough
+# _Note:_ The red horizontal line indicates phase 0, corresponding to the through
 # of theta.
 
 # +
2 changes: 1 addition & 1 deletion notebooks/py_scripts/20_Position_Trodes.py
@@ -87,7 +87,7 @@
 # available. To adjust the default, insert a new set into this table. The
 # parameters are...
 #
-# - `max_separation`, default 9 cm: maximium acceptable distance between red and
+# - `max_separation`, default 9 cm: maximum acceptable distance between red and
 #   green LEDs.
 #   - If exceeded, the times are marked as NaNs and inferred by interpolation.
 #   - Useful when the inferred LED position tracks a reflection instead of the
3 changes: 1 addition & 2 deletions notebooks/py_scripts/24_Linearization.py
@@ -54,7 +54,6 @@
 
 import spyglass.common as sgc
 import spyglass.position.v1 as sgp
-import spyglass as nd
 
 # ignore datajoint+jupyter async warnings
 import warnings
@@ -335,7 +334,7 @@
     + 1
 )
 video_info = (
-    nd.common.common_behav.VideoFile()
+    sgc.common_behav.VideoFile()
     & {"nwb_file_name": key["nwb_file_name"], "epoch": epoch}
 ).fetch1()
 
8 changes: 4 additions & 4 deletions notebooks/py_scripts/33_Decoding_Clusterless.py
@@ -28,8 +28,8 @@
 #   [extracted marks](./31_Extract_Mark_Indicators.ipynb), as well as loaded
 #   position data. If 1D decoding, this data should also be
 #   [linearized](./24_Linearization.ipynb).
-# - Ths tutorial also assumes you're familiar with how to run processes on GPU, as
-#   presented in [this notebook](./32_Decoding_with_GPUs.ipynb)
+# - This tutorial also assumes you're familiar with how to run processes on GPU,
+#   as presented in [this notebook](./32_Decoding_with_GPUs.ipynb)
 #
 # Clusterless decoding can be performed on either 1D or 2D data. A few steps in
 # this notebook will refer to a `decode_1d` variable set in
@@ -87,10 +87,10 @@
 
 # First, we'll fetch marks with `fetch_xarray`, which provides a labeled array of
 # shape (n_time, n_mark_features, n_electrodes). Time is in 2 ms bins with either
-# `NaN` if no spike occured or the value of the spike features.
+# `NaN` if no spike occurred or the value of the spike features.
 #
 # If there is >1 spike per time bin per tetrode, we take an an average of the
-# marks. Ideally, we would use all the marks, this is a rare occurance and
+# marks. Ideally, we would use all the marks, this is a rare occurrence and
 # decoding is generally robust to the averaging.
 
 # +
17 changes: 13 additions & 4 deletions src/spyglass/utils/dj_merge_tables.py
@@ -445,10 +445,19 @@ def merge_delete_parent(
         if dry_run:
             return part_parents
 
-        with cls._safe_context():
-            super().delete(cls(), **kwargs)
-            for part_parent in part_parents:
-                super().delete(part_parent, **kwargs)
+        merge_ids = cls.merge_restrict(restriction).fetch(
+            RESERVED_PRIMARY_KEY, as_dict=True
+        )
+
+        # CB: Removed transaction protection here bc 'no' confirmation resp
+        # still resulted in deletes. If re-add, consider transaction=False
+        super().delete((cls & merge_ids), **kwargs)
+
+        if cls & merge_ids:  # If 'no' on del prompt from above, skip below
+            return  # User can still abort del below, but yes/no is unlikly
+
+        for part_parent in part_parents:
+            super().delete(part_parent, **kwargs)  # add safemode=False?
 
     @classmethod
     def fetch_nwb(
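The control flow this hunk introduces — delete the restricted merge entries first, then skip the cascading parent deletes if any of those entries survived (i.e., the user answered 'no' at the delete prompt) — can be sketched in plain Python. Everything below is a hypothetical stand-in using sets of ids; the real method operates on DataJoint tables with interactive confirmation.

```python
def merge_delete_parent(merge_table, part_parents, merge_ids, confirmed=True):
    """Sketch of the two-stage delete: merge entries first, parents second.

    merge_table and each element of part_parents are sets of ids standing in
    for tables; `confirmed` stands in for the user's yes/no at the prompt.
    """
    if confirmed:
        merge_table -= merge_ids  # stage 1: delete restricted merge entries
    if merge_table & merge_ids:
        # Entries survived stage 1: the user declined, so skip the cascade
        return
    for parent in part_parents:
        parent -= merge_ids  # stage 2: cascade the delete to each part parent
```

Dropping the transaction (as the commit comment explains) is what makes the stage-2 guard necessary: a declined prompt no longer rolls anything back, so checking whether the restricted entries still exist is the only signal that the user aborted.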
