
[Bug]: Issue with DLC interface not loading #1114

Open
vigji opened this issue Oct 15, 2024 · 2 comments

@vigji (Contributor) commented Oct 15, 2024

What happened?

I am replicating the funny behavior I already described in #967: when running DLCInterface conversions, it fails silently, not adding the correct fields to the NWB file.

This time it was more painful, as I had forgotten about the issue I previously raised, and here it was 'in production'. I was running a batch conversion of multiple experiments:

for exp in experiments:
    load_stuff()
    interfaces_pipe = make_interfaces()
    interfaces_pipe.run_conversion()

Only the first experiment ended up missing all the DLC-related data, which drove me completely nuts for a good day of debugging (these conversions are also quite slow operations), until I realized it sounded like the previous issue. I fixed it by adding a seemingly unrelated

from ndx_pose import PoseEstimation, PoseEstimationSeries

at the beginning of the script (alternatively, by running the conversion of the first experiment twice :D).

This was massively annoying; I strongly suggest looking deeper into this, or changing the DLC example to document the workaround.

This time it was happening on a different machine and a different OS! (Same interface though: always VS Code, running a script.)
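For what it's worth, the symptom (a generic `NWBDataInterface` the first time, the real `ndx_pose.pose.PoseEstimation` once the import is added) looks like an import-time type registry that is only populated as a side effect of importing the extension module. A toy, stdlib-only sketch of that failure mode (all names here are hypothetical, not the actual pynwb/ndx_pose internals):

```python
# Hypothetical illustration of an import-time type registry. Extension
# classes are only registered when their module is imported; lookups
# before that silently fall back to a generic base class, mirroring the
# PoseEstimation -> NWBDataInterface downgrade described above.
TYPE_REGISTRY = {"NWBDataInterface": object}  # generic fallback only

def register(name, cls):
    """What an extension module would do at import time."""
    TYPE_REGISTRY[name] = cls

def resolve(name):
    # Falls back silently to the generic type instead of raising.
    return TYPE_REGISTRY.get(name, TYPE_REGISTRY["NWBDataInterface"])

class PoseEstimation:  # stands in for ndx_pose.pose.PoseEstimation
    pass

# Before the extension is "imported": silent degradation, no error.
assert resolve("PoseEstimation") is object

# After registration (the side effect a top-level `import ndx_pose`
# would trigger): the specific class is resolved correctly.
register("PoseEstimation", PoseEstimation)
assert resolve("PoseEstimation") is PoseEstimation
```

This would also explain why running the first conversion twice "fixes" it: the first run imports the extension as a side effect, so the registry is populated by the time the second run resolves the type.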

Steps to Reproduce

The following code results in the bug:

from datetime import datetime
from zoneinfo import ZoneInfo
from pathlib import Path
from neuroconv.datainterfaces import DeepLabCutInterface
import numpy as np

file_path = ".../....h5"
config_file_path = ".../config.yaml"
path_to_save_nwbfile = ".../test_nwb.nwb"

timestamps = np.array([1, 2, 3])  # dummy values; providing timestamps up front speeds up processing

interface = DeepLabCutInterface(file_path=file_path, config_file_path=config_file_path, 
                                subject_name="ind1", verbose=False)
interface.set_aligned_timestamps(timestamps)

metadata = interface.get_metadata()

session_start_time = datetime(2020, 1, 1, 12, 30, 0, tzinfo=ZoneInfo("US/Pacific"))
metadata["NWBFile"].update(session_start_time=session_start_time)

interface.run_conversion(nwbfile_path=path_to_save_nwbfile, metadata=metadata)

from pynwb import NWBHDF5IO

# confirm the PoseEstimation container was written to the processing module
with NWBHDF5IO(path_to_save_nwbfile, "r") as io:
    read_nwbfile = io.read()
    print(read_nwbfile.processing)

The print output is:

Fields:
  data_interfaces: {
    PoseEstimation <class 'pynwb.core.NWBDataInterface'>
  }
  description: processed behavioral data
}

And there is no data inside, unless I run the code twice or add the import at the top, in which case I get:

Fields:
  data_interfaces: {
    PoseEstimation <class 'ndx_pose.pose.PoseEstimation'>
  }
  description: processed behavioral data
}

and the data can be accessed.

Traceback

No response

Operating System

Windows

Python Executable

Conda

Python Version

3.10

Package Versions

neuroconv == 0.6.4

Code of Conduct

@vigji vigji added the bug label Oct 15, 2024
@h-mayorquin (Collaborator) commented

Hey, for provenance, here is the exact link to your previous report:

#967 (comment)

Thanks a bunch. I am able to reproduce the error. We will look into it.

@vigji (Contributor, Author) commented Oct 15, 2024

Great, thanks!
