
[Bug]: Can't use classes created in the same extension as references (link or include) #1873

Closed

luiztauffer opened this issue Mar 26, 2024 · 3 comments

@luiztauffer
What happened?

I’m working on an extension for fiber photometry data, and in that extension I want to reference (with a link and with an include) one of the neurodata types defined in the same extension. Example:

from pynwb.spec import NWBGroupSpec, NWBDatasetSpec, NWBLinkSpec

fiberphotometryresponse_series = NWBGroupSpec(
    neurodata_type_def="FiberPhotometryResponseSeries",
    neurodata_type_inc="TimeSeries",
    doc="Extends TimeSeries to hold Fiber Photometry data",
    datasets=[
        NWBDatasetSpec(
            name="fibers",
            doc="references row(s) of FibersTable",
            neurodata_type_inc="DynamicTableRegion",
            quantity="?",
        ),
        # ... other attributes ...
    ]
)

deconvolved_fiberphotometryresponse_series_series = NWBGroupSpec(
    neurodata_type_def="DeconvolvedFiberPhotometryResponseSeries",
    neurodata_type_inc="FiberPhotometryResponseSeries",
    doc="Extends FiberPhotometryResponseSeries to hold deconvolved data",
    links=[
        NWBLinkSpec(
            name="raw",
            target_type="FiberPhotometryResponseSeries",
            doc="ref to fiber photometry response series",
        )
    ],
    datasets=[
        NWBDatasetSpec(
            name="deconvolution_filter",
            doc="description of deconvolution filter used",
            dtype="text",
            neurodata_type_inc="VectorData",
            quantity="?",
        ),
        NWBDatasetSpec(
            name="downsampling_filter",
            doc="description of downsampling filter used",
            dtype="text",
            neurodata_type_inc="VectorData",
            quantity="?",
        ),
    ],
)

The spec YAML file is created without problems, but when I try to import the new classes in Python I get an import error.
The error happens at this line: https://github.com/catalystneuro/ndx-photometry/blob/29eb6c43f61e6456e2d124241e97953526a548bd/src/pynwb/ndx_photometry/__init__.py#L24

It seems to be caused by FiberPhotometryResponseSeries not being found in the namespace when it is used as an include for the DeconvolvedFiberPhotometryResponseSeries class.
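
For illustration, here is a minimal self-contained sketch of the same pattern (the ndx-demo namespace and the ParentSeries/ChildSeries names are hypothetical, not from the actual extension). Adding the child spec to the builder before its parent reproduces the error on load:

from pynwb import load_namespaces
from pynwb.spec import NWBGroupSpec, NWBNamespaceBuilder

ns_builder = NWBNamespaceBuilder(
    doc="demo namespace", name="ndx-demo", version="0.1.0",
    author="demo", contact="demo@example.com",
)
ns_builder.include_type("TimeSeries", namespace="core")

parent = NWBGroupSpec(
    neurodata_type_def="ParentSeries",
    neurodata_type_inc="TimeSeries",
    doc="parent type defined in this extension",
)
child = NWBGroupSpec(
    neurodata_type_def="ChildSeries",
    neurodata_type_inc="ParentSeries",  # defined in the same extension file
    doc="child type extending ParentSeries",
)

# Adding the child before the parent reproduces the problem; parent-first works.
for spec in (child, parent):
    ns_builder.add_spec("ndx-demo.extensions.yaml", spec)
ns_builder.export("ndx-demo.namespace.yaml")

# Expected to raise:
# ValueError: Cannot resolve include spec 'ParentSeries' for type 'ChildSeries'
load_namespaces("ndx-demo.namespace.yaml")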

In order to reproduce, you should install the version from this branch: https://github.com/catalystneuro/ndx-photometry/tree/general-improvements

Steps to Reproduce

from ndx_photometry import FiberPhotometry

Traceback

File /mnt/shared_storage/Github/nelson-lab-to-nwb/src/nelson_lab_to_nwb/interfaces/tdt_interface.py:9
      7 from pynwb import NWBFile
      8 from pynwb.core import DynamicTableRegion
----> 9 from ndx_photometry import (
     10     FibersTable,
     11     PhotodetectorsTable,
     12     ExcitationSourcesTable,
     13     FluorophoresTable,
     14     FiberPhotometryResponseSeries,
     15     FiberPhotometry
     16 )
     19 class TdtFiberPhotometryInterface(BaseDataInterface):
     20     """
     21     Custom data interface class for converting TDT Fiber Photometry recordings.
     22     """

File /mnt/shared_storage/Github/ndx-photometry/src/pynwb/ndx_photometry/__init__.py:24
     12     ndx_photometry_specpath = os.path.abspath(
     13         os.path.join(
     14             os.path.dirname(__file__),
   (...)
     20         )
     21     )
     23 # Load the namespace
---> 24 load_namespaces(ndx_photometry_specpath)
     26 # TODO: import your classes here or define your class using get_class to make
     27 # them accessible at the package level
     28 from .photometry import FibersTable, FluorophoresTable, PhotodetectorsTable, ExcitationSourcesTable, FiberPhotometryResponseSeries

File ~/anaconda3/envs/nelson_lab_to_nwb_env/lib/python3.12/site-packages/hdmf/utils.py:672, in docval.<locals>.dec.<locals>.func_call(*args, **kwargs)
    670 def func_call(*args, **kwargs):
    671     pargs = _check_args(args, kwargs)
--> 672     return func(**pargs)

File ~/anaconda3/envs/nelson_lab_to_nwb_env/lib/python3.12/site-packages/pynwb/__init__.py:109, in load_namespaces(**kwargs)
    105 '''
    106 Load namespaces from file
    107 '''
    108 namespace_path = getargs('namespace_path', kwargs)
--> 109 return __TYPE_MAP.load_namespaces(namespace_path)

File ~/anaconda3/envs/nelson_lab_to_nwb_env/lib/python3.12/site-packages/hdmf/utils.py:668, in docval.<locals>.dec.<locals>.func_call(*args, **kwargs)
    666 def func_call(*args, **kwargs):
    667     pargs = _check_args(args, kwargs)
--> 668     return func(args[0], **pargs)

File ~/anaconda3/envs/nelson_lab_to_nwb_env/lib/python3.12/site-packages/hdmf/build/manager.py:473, in TypeMap.load_namespaces(self, **kwargs)
    465 @docval(*get_docval(NamespaceCatalog.load_namespaces),
    466         returns="the namespaces loaded from the given file", rtype=dict)
    467 def load_namespaces(self, **kwargs):
    468     '''Load namespaces from a namespace file.
    469     This method will call load_namespaces on the NamespaceCatalog used to construct this TypeMap. Additionally,
    470     it will process the return value to keep track of what types were included in the loaded namespaces. Calling
    471     load_namespaces here has the advantage of being able to keep track of type dependencies across namespaces.
    472     '''
--> 473     deps = self.__ns_catalog.load_namespaces(**kwargs)
    474     for new_ns, ns_deps in deps.items():
    475         for src_ns, types in ns_deps.items():

File ~/anaconda3/envs/nelson_lab_to_nwb_env/lib/python3.12/site-packages/hdmf/utils.py:668, in docval.<locals>.dec.<locals>.func_call(*args, **kwargs)
    666 def func_call(*args, **kwargs):
    667     pargs = _check_args(args, kwargs)
--> 668     return func(args[0], **pargs)

File ~/anaconda3/envs/nelson_lab_to_nwb_env/lib/python3.12/site-packages/hdmf/spec/namespace.py:537, in NamespaceCatalog.load_namespaces(self, **kwargs)
    535 # now load specs into namespace
    536 for ns in to_load:
--> 537     ret[ns['name']] = self.__load_namespace(ns, reader, resolve=resolve)
    538 self.__included_specs[ns_path_key] = ret
    539 return ret

File ~/anaconda3/envs/nelson_lab_to_nwb_env/lib/python3.12/site-packages/hdmf/spec/namespace.py:447, in NamespaceCatalog.__load_namespace(self, namespace, reader, resolve)
    444     types_to_load = set(types_to_load)
    445 if 'source' in s:
    446     # read specs from file
--> 447     self.__load_spec_file(reader, s['source'], catalog, types_to_load=types_to_load, resolve=resolve)
    448     self.__included_sources.setdefault(ns_name, list()).append(s['source'])
    449 elif 'namespace' in s:
    450     # load specs from namespace

File ~/anaconda3/envs/nelson_lab_to_nwb_env/lib/python3.12/site-packages/hdmf/spec/namespace.py:401, in NamespaceCatalog.__load_spec_file(self, reader, spec_source, catalog, types_to_load, resolve)
    399 for spec_dict in specs:
    400     self.__convert_spec_cls_keys(GroupSpec, self.__group_spec_cls, spec_dict)
--> 401     temp_dict = {k: None for k in __reg_spec(self.__group_spec_cls, spec_dict)}
    402     ret.update(temp_dict)
    403 self.__loaded_specs[spec_source] = ret

File ~/anaconda3/envs/nelson_lab_to_nwb_env/lib/python3.12/site-packages/hdmf/spec/namespace.py:386, in NamespaceCatalog.__load_spec_file.<locals>.__reg_spec(spec_cls, spec_dict)
    384     return
    385 if resolve:
--> 386     self.__resolve_includes(spec_cls, spec_dict, catalog)
    387 spec_obj = spec_cls.build_spec(spec_dict)
    388 return catalog.auto_register(spec_obj, spec_source)

File ~/anaconda3/envs/nelson_lab_to_nwb_env/lib/python3.12/site-packages/hdmf/spec/namespace.py:424, in NamespaceCatalog.__resolve_includes(self, spec_cls, spec_dict, catalog)
    422 if parent_spec is None:
    423     msg = "Cannot resolve include spec '%s' for type '%s'" % (dt_inc, dt_def)
--> 424     raise ValueError(msg)
    425 # replace the inc key value from string to the inc spec so that the spec can be updated with all of the
    426 # attributes, datasets, groups, and links of the inc spec when spec_cls.build_spec(spec_dict) is called
    427 spec_dict[spec_cls.inc_key()] = parent_spec

ValueError: Cannot resolve include spec 'FiberPhotometryResponseSeries' for type 'DeconvolvedFiberPhotometryResponseSeries'

Operating System

Linux

Python Executable

Conda

Python Version

3.10

Package Versions

No response

@rly
Contributor

rly commented Mar 27, 2024

@luiztauffer it looks like the order of the specs matters for resolution. If I move the definition of FiberPhotometryResponseSeries above the definition of DeconvolvedFiberPhotometryResponseSeries, which contains both the link and the neurodata_type_inc to FiberPhotometryResponseSeries, then the import works.

If you reorder the data types in create_extension_spec.py to have fiberphotometryresponse_series before deconvolved_fiberphotometryresponse_series_series:

# Add all new data types to this list
new_data_types = [
    fibers_table,
    photodetectors_table,
    excitationsources_table,
    commandedvoltage_series,
    fiberphotometryresponse_series,
    deconvolved_fiberphotometryresponse_series_series,
    multi_commanded_voltage,
    fiber_photometry,
    fluorophores_table,
]

and then re-generate the spec and re-install the package, the new spec will have a working definition order and the import will succeed.
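
Until the HDMF issue below is addressed, a small helper could enforce this ordering automatically. This is a hypothetical sketch (order_specs is not part of the extension template); it orders only by neurodata_type_inc, which is what the include resolution depends on, and relies on the spec objects being dict-like:

def order_specs(specs):
    """Order specs so each parent type precedes the types that extend it."""
    by_def = {s.get("neurodata_type_def"): s for s in specs}
    ordered, emitted = [], set()

    def visit(spec):
        name = spec.get("neurodata_type_def")
        if name in emitted:
            return
        parent = by_def.get(spec.get("neurodata_type_inc"))
        if parent is not None:
            visit(parent)  # make sure the parent definition is emitted first
        emitted.add(name)
        ordered.append(spec)

    for s in specs:
        visit(s)
    return ordered

new_data_types = order_specs(new_data_types)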

This is not very user-friendly behavior. I created an issue in HDMF for it: hdmf-dev/hdmf#1086

@luiztauffer
Author

Thanks for the workaround, @rly!

@luiztauffer
Author

Closing, as this solved the issue.
