Commit
Blackify py scripts. Continue config changes
CBroz1 committed Aug 8, 2023
1 parent 2bcbbdd commit 92764cc
Showing 22 changed files with 315 additions and 182 deletions.
3 changes: 3 additions & 0 deletions notebooks/README.md
@@ -48,4 +48,7 @@ root Spyglass directory
pip install jupytext
jupytext --to py notebooks/*ipynb
mv notebooks/*py notebooks/py_scripts
black .
```

Unfortunately, jupytext-generated py scripts are not black-compliant by default.
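The same workflow can be scripted in Python (a sketch assuming `jupytext` and `black` are installed; the paths follow the commands above):

```python
import shutil
import subprocess
from pathlib import Path


def move_py_scripts(notebooks_dir: str, out_dir: str) -> list:
    """Move jupytext-generated .py files into the py_scripts folder."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    moved = []
    for script in sorted(Path(notebooks_dir).glob("*.py")):
        shutil.move(str(script), out / script.name)
        moved.append(script.name)
    return moved


def sync_py_scripts(notebooks_dir: str = "notebooks") -> None:
    """Full workflow: convert, move, then blackify (needs jupytext/black)."""
    for nb in Path(notebooks_dir).glob("*.ipynb"):
        # jupytext writes <name>.py next to the notebook
        subprocess.run(["jupytext", "--to", "py", str(nb)], check=True)
    move_py_scripts(notebooks_dir, f"{notebooks_dir}/py_scripts")
    # jupytext output is not black-compliant by default, so reformat
    subprocess.run(["black", f"{notebooks_dir}/py_scripts"], check=True)
```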
3 changes: 2 additions & 1 deletion notebooks/py_scripts/01_Insert_Data.py
@@ -53,6 +53,7 @@

# spyglass.data_import has tools for inserting NWB files into the database
import spyglass.data_import as sgi

# -

# ## Visualizing the database
@@ -91,7 +92,7 @@
# By adding diagrams together, or adding and subtracting levels, we can visualize
# key parts of Spyglass.
#
# _Note:_ Notice the *Selection* tables. This is a design pattern that selects a
# subset of upstream items for further processing. In some cases, these also pair
# the selected data with processing parameters.

30 changes: 15 additions & 15 deletions notebooks/py_scripts/02_Data_Sync.py
@@ -15,7 +15,7 @@
# # Sync Data
#

# DEV note:
# - set up as host, then as client
# - test as collaborator

@@ -24,10 +24,10 @@

# This notebook will cover ...
#
# 1. [General Kachery information](#intro)
# 2. Setting up Kachery as a [host](#host-setup). If you'll use an existing host,
# skip this.
# 3. Setting up Kachery in your [database](#database-setup). If you're using an
# existing database, skip this.
# 4. Adding Kachery [data](#data-setup).
#
@@ -36,7 +36,7 @@
#

# This is one notebook in a multi-part series on Spyglass. Before running, be sure
# to [set up your environment](./00_Setup.ipynb) and run some analyses (e.g.
# [LFP](./12_LFP.ipynb)).
#
# ### Cloud
@@ -46,8 +46,8 @@
# makes it possible to share analysis results, stored in NWB files. When a user
# tries to access a file, Spyglass does the following:
#
# 1. Try to load from the local file system/store.
# 2. If unavailable, check if it is in the relevant sharing table (i.e.,
# `NwbKachery` or `AnalysisNWBKachery`).
# 3. If present, attempt to download from the associated Kachery Resource.
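The three-step fallback above can be sketched generically (hypothetical helpers standing in for Spyglass's actual fetch logic):

```python
from pathlib import Path


def fetch_file(path, sharing_table, download):
    """Resolve a file: local store first, then a Kachery sharing table.

    `sharing_table` maps file names to Kachery URIs (a stand-in for
    NwbKachery/AnalysisNWBKachery); `download` fetches a URI (a stand-in
    for the Kachery Resource request).
    """
    local = Path(path)
    if local.exists():  # 1. local file system/store
        return local
    uri = sharing_table.get(local.name)  # 2. relevant sharing table
    if uri is None:
        raise FileNotFoundError(f"{local.name} is not shared")
    return download(uri)  # 3. download from the associated resource
```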
#
@@ -64,7 +64,7 @@
# 2. `franklab.collaborator`: File sharing with collaborating labs.
# 3. `franklab.public`: Public file sharing (not yet active)
#
# Setting your zone can either be done as an environment variable or an item
# in a DataJoint config.
#
# - Environment variable:
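Either route can be sketched as follows (hedged: `KACHERY_ZONE` is the variable kachery-cloud reads, and the exact DataJoint config key name is an assumption, so check the Spyglass docs):

```python
import json
import os

# Option 1: environment variable, read by kachery-cloud at runtime
os.environ["KACHERY_ZONE"] = "franklab.default"

# Option 2: an item in the DataJoint config file; the key name below is
# an assumption, not confirmed by this commit
dj_config = {"custom": {"kachery_zone": "franklab.default"}}
print(json.dumps(dj_config, indent=2))
```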
@@ -89,9 +89,9 @@
# See
# [instructions](https://github.com/flatironinstitute/kachery-cloud/blob/main/doc/create_kachery_zone.md)
# for setting up new Kachery Zones, including creating a cloud bucket and
# registering it with the Kachery team.
#
# _Notes:_
#
# - Bucket names cannot include periods, so we substitute a dash, as in
# `franklab-default`.
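Since the substitution is mechanical, a one-liner covers it:

```python
def zone_to_bucket(zone_name: str) -> str:
    """Bucket names cannot include periods, so substitute dashes."""
    return zone_name.replace(".", "-")


print(zone_to_bucket("franklab.default"))  # franklab-default
```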
@@ -100,7 +100,7 @@
# ### Resources
#
# See [instructions](https://github.com/scratchrealm/kachery-resource/blob/main/README.md)
# for setting up zone resources. This allows for sharing files on demand. We
# suggest using the same name for the zone and resource.
#
# _Note:_ For each zone, you need to run the local daemon that listens for
@@ -167,7 +167,7 @@
# Once the zone exists, we can add `AnalysisNWB` files we want to share by adding
# entries to the `AnalysisNwbfileKacherySelection` table.
#
# _Note:_ This step depends on having previously run an analysis on the example
# file.

# +
@@ -192,13 +192,13 @@
sgs.AnalysisNwbfileKachery.populate()

# + [markdown] jupyter={"outputs_hidden": true}
# If all of that worked,
#
# 1. go to https://kachery-gateway.figurl.org/admin?zone=your_zone
# (changing your_zone to the name of your zone)
# 2. Go to the Admin/Authorization Settings tab
# 3. Add the GitHub login names and permissions for the users you want to share
#    with.
#
# If those users can connect to your database, they should now be able to use the
# `.fetch_nwb()` method to download any `AnalysisNwbfiles` that have been shared
21 changes: 15 additions & 6 deletions notebooks/py_scripts/11_Curation.py
@@ -50,6 +50,7 @@
dj.config.load("dj_local_conf.json") # load config for database connection info

from spyglass.spikesorting import SpikeSorting

# -

# ## Spikes Sorted
@@ -80,21 +81,29 @@
f"https://sortingview.vercel.app/workspace?workspace={workspace_uri}&channel=franklab"
)

-# This will take you to a workspace on the `sortingview` app. The workspace, which you can think of as a list of recording and associated sorting objects, was created at the end of spike sorting. On the workspace view, you will see a set of recordings that have been added to the workspace.
+# This will take you to a workspace on the `sortingview` app. The workspace, which
+# you can think of as a list of recording and associated sorting objects, was
+# created at the end of spike sorting. On the workspace view, you will see a set
+# of recordings that have been added to the workspace.
#
# ![Workspace view](./../notebook-images/workspace.png)
#
-# Clicking on a recording then takes you to a page that gives you information about the recording as well as the associated sorting objects.
+# Clicking on a recording then takes you to a page that gives you information
+# about the recording as well as the associated sorting objects.
#
# ![Recording view](./../notebook-images/recording.png)
#
-# Click on a sorting to see the curation view. Try exploring the many visualization widgets.
+# Click on a sorting to see the curation view. Try exploring the many
+# visualization widgets.
#
# ![Unit table](./../notebook-images/unittable.png)
#
-# The most important is the `Units Table` and the `Curation` menu, which allows you to give labels to the units. The curation labels will persist even if you suddenly lose connection to the app; this is because the curaiton actions are appended to the workspace as soon as they are created. Note that if you are not logged in with your Google account, `Curation` menu may not be visible. Log in and refresh the page to access this feature.
+# The most important is the `Units Table` and the `Curation` menu, which allows
+# you to give labels to the units. The curation labels will persist even if you
+# suddenly lose connection to the app; this is because the curation actions are
+# appended to the workspace as soon as they are created. Note that if you are not
+# logged in with your Google account, the `Curation` menu may not be visible. Log
+# in and refresh the page to access this feature.
#
# ![Curation](./../notebook-images/curation.png)
#


31 changes: 14 additions & 17 deletions notebooks/py_scripts/14_Theta.py
@@ -27,9 +27,9 @@
# - For additional info on DataJoint syntax, including table definitions and
# inserts, see
# [the Insert Data notebook](./01_Insert_Data.ipynb)
# - To run this notebook, you should have already completed the
#   [LFP](./12_LFP.ipynb) notebook and populated the `LFPBand` table.
#
# In this tutorial, we demonstrate how to generate analytic signals from the LFP
# data, as well as how to compute theta phases and power.

@@ -55,7 +55,6 @@

warnings.simplefilter("ignore", category=DeprecationWarning)
warnings.simplefilter("ignore", category=ResourceWarning)

# -

# ## Acquire Signal
@@ -76,33 +75,33 @@

# We do not need all electrodes for theta phase/power, so we define a list for
# analyses. When working with full data, this list might limit to hippocampal
# reference electrodes.
#
# Make sure that the chosen electrodes already exist in the LFPBand data; if not,
# go to the LFP tutorial to generate them.

# +
electrode_list = [0]

all_electrodes = (  # All available electrode ids
(lfp_band.LFPBandV1() & lfp_key).fetch_nwb()[0]["lfp_band"]
).electrodes.data[:]

np.isin(electrode_list, all_electrodes)  # Check if our list is in 'all'
# -
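The same membership check can be written with stdlib sets so a bad electrode list fails loudly (a hypothetical helper, not a Spyglass API):

```python
def check_electrodes(requested, available):
    """Fail fast if any requested electrode id lacks LFPBand data."""
    missing = sorted(set(requested) - set(available))
    if missing:
        raise ValueError(
            f"Electrodes {missing} not in LFPBand; generate them first "
            "(see the LFP tutorial)."
        )
    return list(requested)


check_electrodes([0], [0, 4, 8])  # passes silently
```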

# Next, we'll compute the theta analytic signal.

# +
-theta_analytic_signal = (lfp_band.LFPBandV1() & lfp_key).compute_analytic_signal(
-    electrode_list=electrode_list
-)
+theta_analytic_signal = (
+    lfp_band.LFPBandV1() & lfp_key
+).compute_analytic_signal(electrode_list=electrode_list)

theta_analytic_signal
# -

# In the dataframe above, the index is the timestamps, and the columns are the
-# analytic sinals of theta band (complex numbers) for each electrode.
+# analytic signals of the theta band (complex numbers) for each electrode.
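Since each entry is a complex number, the phase and power computed next reduce to simple per-sample operations; a stdlib sketch (the table methods apply this over full arrays):

```python
import cmath


def phase_and_power(z: complex) -> tuple:
    """Return (phase in radians, power) for one analytic-signal sample."""
    phase = cmath.phase(z)  # angle of the complex sample
    power = abs(z) ** 2  # squared magnitude = instantaneous power
    return phase, power


# A sample with amplitude 2 on the imaginary axis: phase pi/2, power 4.
print(phase_and_power(2j))
```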

# ## Compute phase and power
#
@@ -162,9 +161,9 @@

fig.tight_layout()
ax1.set_title(
-    f"Theta band amplitude and phase, electode {electrode_id}",
+    f"Theta band amplitude and phase, electrode {electrode_id}",
fontsize=20,
-);
+)
# -

# We can also plot the theta power.
@@ -184,9 +183,9 @@
)
ax.tick_params(axis="y", labelcolor="k")
ax.set_title(
-    f"Theta band power, electode {electrode_id}",
+    f"Theta band power, electrode {electrode_id}",
fontsize=20,
-);
+)
# -


24 changes: 11 additions & 13 deletions notebooks/py_scripts/20_Position_Trodes.py
@@ -76,22 +76,22 @@
nwb_copy_file_name = sgu.nwb_helper_fn.get_nwb_copy_filename(nwb_file_name)
sgc.common_behav.RawPosition() & {"nwb_file_name": nwb_copy_file_name}

# ## Setting parameters
#
# Parameters are set by the `TrodesPosParams` table, with a `default` set
# available. To adjust the default, insert a new set into this table. The
# parameters are...
#
-# - `max_separation`, default 9 cm: maxmium acceptable distance between red and
-#   green LEDs.
-#     - If exceeded, the times are marked as NaNs and inferred by interpolation.
+# - `max_separation`, default 9 cm: maximum acceptable distance between red and
+#   green LEDs.
+#   - If exceeded, the times are marked as NaNs and inferred by interpolation.
# - Useful when the inferred LED position tracks a reflection instead of the
# true position.
-# - `max_speed`, default 300.0 cm/s: maximum speed the animal can move.
-#     - If exceeded, times are marked as NaNs and inferred by interpolation.
-#     - Useful to prevent big jumps in position.
+# - `max_speed`, default 300.0 cm/s: maximum speed the animal can move.
+#   - If exceeded, times are marked as NaNs and inferred by interpolation.
+#   - Useful to prevent big jumps in position.
# - `position_smoothing_duration`, default 0.100 s: LED position smoothing before
#   computing average position to get head position.
# - `speed_smoothing_std_dev`, default 0.100 s: standard deviation of the Gaussian
# kernel used to smooth the head speed.
# - `front_led1`, default 1 (True), use `xloc`/`yloc`: Which LED is the front LED
@@ -120,7 +120,7 @@
# ## Select interval

# Later, we'll pair the above parameters with an interval from our NWB file and
# insert into `TrodesPosSelection`.
#
# First, let's select an interval from the `IntervalList` table.
#
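In DataJoint terms, "pairing" means merging the two primary keys into one selection-table key; a sketch with stand-in dicts (the field names are illustrative, not the exact Spyglass schema):

```python
def make_selection_key(param_key: dict, interval_key: dict) -> dict:
    """Pair a parameter set with an interval to form a selection-table key.

    A stand-in for inserting into TrodesPosSelection; the real keys come
    from TrodesPosParams and IntervalList.
    """
    overlap = param_key.keys() & interval_key.keys()
    assert not overlap, f"conflicting keys: {overlap}"
    return {**param_key, **interval_key}


key = make_selection_key(
    {"trodes_pos_params_name": "default"},  # hypothetical field names
    {"nwb_file_name": "example_.nwb", "interval_list_name": "pos 0 valid times"},
)
print(key)
```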
@@ -133,7 +133,7 @@
# the video itself.
#
# `fetch1_dataframe` returns the position of the LEDs as a pandas dataframe where
# time is the index.

interval_list_name = "pos 0 valid times" # pos # is epoch # minus 1
raw_position_df = (
@@ -263,7 +263,7 @@

# ## Upsampling position
#
# Sometimes we need the position data in smaller time bins, which can be
# achieved with upsampling using the following parameters.
#
# - `is_upsampled`, default 0 (False): If 1, perform upsampling.
@@ -361,5 +361,3 @@
axes[1].set_ylabel("y-velocity [cm/s]", fontsize=18)
axes[1].set_title("Upsampled Head Velocity", fontsize=28)
# -


16 changes: 8 additions & 8 deletions notebooks/py_scripts/21_Position_DLC_1.py
@@ -29,12 +29,12 @@
# inserts, see
# [the Insert Data notebook](./01_Insert_Data.ipynb)
#
# This tutorial will extract position via DeepLabCut (DLC). It will walk through...
# - creating a DLC project
# - extracting and labeling frames
# - training your model
#
# If you already have a pretrained project, you can either skip to the
# [next tutorial](./22_Position_DLC_2.ipynb) to load it into the database, or skip
# to the [following tutorial](./23_Position_DLC_3.ipynb) to start pose estimation
# with a model that is already inserted.
@@ -82,11 +82,11 @@
# <div class="alert alert-block alert-info">
# <b>Notes:</b><ul>
# <li>
The cells within this <code>DLCProject</code> step need to be performed
# in a local Jupyter notebook to allow for use of the frame labeling GUI
# </li>
# <li>
Please do not add to the <code>BodyPart</code> table in the production
# database unless necessary.
# </li>
# </ul>
@@ -125,7 +125,7 @@
# - A team name, as shown in `LabTeam` for setting permissions. Here, we'll
# use "LorenLab".
# - A `project_name`, as a unique identifier for this DLC project. Here, we'll use
#   __"tutorial_scratch_yourinitials"__
# - `bodyparts` is a list of body parts for which we want to extract position.
# The pre-labeled frames we're using include the bodyparts listed below.
# - Number of frames to extract/label as `frames_per_video`. A true project might
@@ -171,14 +171,14 @@
# This step and beyond should be run on a GPU-enabled machine.
# </div>

# #### [DLCModelTraining](#ToC)<a id='DLCModelTraining1'></a>
#
# Please make sure you're running this notebook on a GPU-enabled machine.
#
# Now that we've imported existing frames, we can get ready to train our model.
#
# First, we'll need to define a set of parameters for `DLCModelTrainingParams`, which will get used by DeepLabCut during training. Let's start with `gputouse`,
# which determines which GPU core to use.
#
# The cell below determines which GPU core has space and sets the `gputouse`
# variable accordingly.
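That logic might look like the sketch below: pick the GPU with the most free memory (the memory numbers here are hypothetical; the real cell presumably queries the driver, e.g. via `nvidia-smi`):

```python
def pick_gputouse(free_mem_by_gpu: dict) -> int:
    """Return the id of the GPU with the most free memory (MiB)."""
    if not free_mem_by_gpu:
        raise RuntimeError("no GPUs visible")
    return max(free_mem_by_gpu, key=free_mem_by_gpu.get)


# Values as if queried from the driver; hypothetical numbers
gputouse = pick_gputouse({0: 1024, 1: 10240, 2: 512})
print(gputouse)  # 1
```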
@@ -298,7 +298,7 @@

# ### Next Steps
#
# With our trained model in place, we're ready to move on to
# [pose estimation](./23_Position_DLC_3.ipynb).

# ### [Return To Table of Contents](#TableOfContents)<br>
