I am hitting again the odd behavior I already described in #967: DeepLabCutInterface conversions fail silently, without adding the expected fields to the NWB file.

This time it was more painful, as I had forgotten about the previously raised issue, and here it was 'in production': I was running a batch conversion of multiple experiments. Only the first experiment ended up missing all the DLC-related data, which was driving me completely nuts for a good day of debugging (these are also quite slow operations), until I realized it sounded like the previous issue. I fixed it by adding a seemingly unrelated

from ndx_pose import PoseEstimation, PoseEstimationSeries

at the beginning of the script (alternatively, by running the conversion of the first experiment twice :D).

This was massively annoying; I strongly suggest looking deeper into this, or changing the DLC example to document it.

This time it happened on a different machine and a different OS! (Same interface though: always VSCode, running a script.)
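For context, the workaround amounts to placing that import before any conversion runs. A rough sketch of what my batch script looks like with the fix (the `experiments` list, paths, and config variable here are placeholders, not my actual script; my assumption is that importing ndx_pose up front registers the pose extension types before the first file is written):

```python
# Workaround (assumed mechanism): import ndx_pose before the first conversion
# so the pose extension types are available when the first NWB file is written.
from ndx_pose import PoseEstimation, PoseEstimationSeries  # noqa: F401

from neuroconv.datainterfaces import DeepLabCutInterface

# Placeholder: list of (DLC .h5 output, NWB target path) pairs.
for file_path, nwbfile_path in experiments:
    interface = DeepLabCutInterface(file_path=file_path, config_file_path=config_file_path)
    metadata = interface.get_metadata()  # set session_start_time here as in the repro below
    interface.run_conversion(nwbfile_path=nwbfile_path, metadata=metadata)
```

Without the top-level import, only the first iteration of the loop produces a file missing the DLC data; all later iterations are fine.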
Steps to Reproduce
The following code results in the bug:
```python
from datetime import datetime
from zoneinfo import ZoneInfo
from pathlib import Path

from neuroconv.datainterfaces import DeepLabCutInterface
import numpy as np

file_path = ".../....h5"
config_file_path = ".../config.yaml"
path_to_save_nwbfile = ".../test_nwb.nwb"

timestamps = np.array([1, 2, 3])  # stupid, speeds up processing if tstamps provided

interface = DeepLabCutInterface(
    file_path=file_path, config_file_path=config_file_path, subject_name="ind1", verbose=False
)
interface.set_aligned_timestamps(timestamps)

metadata = interface.get_metadata()
session_start_time = datetime(2020, 1, 1, 12, 30, 0, tzinfo=ZoneInfo("US/Pacific"))
metadata["NWBFile"].update(session_start_time=session_start_time)
interface.run_conversion(nwbfile_path=path_to_save_nwbfile, metadata=metadata)

from pynwb import NWBHDF5IO

# confirm the file contains the new TimeSeries in acquisition
with NWBHDF5IO(path_to_save_nwbfile, "r") as io:
    read_nwbfile = io.read()
    print(read_nwbfile.processing)
```
What happened?
The print states:
There is no data inside, unless I run the code twice or add the initial ndx_pose import, in which case I get

and the data can be accessed.
Traceback
No response
Operating System
Windows
Python Executable
Conda
Python Version
3.10
Package Versions
neuroconv == 0.6.4