
KLB file in pipeline ? #30

Open
Xqua opened this issue Feb 21, 2018 · 7 comments


Xqua commented Feb 21, 2018

Hi,

Any tricks or help towards making this workflow available for KLB files? That would be a great help!

Thanks!

@schmiedc
Contributor

Hi Xqua,

One way to approach this is to write everything into .tif; that is what I typically did when I had files that I was not able to process in the standard way.

You can then use the .tif files with the workflow without modifying it.
Each stack needs to be an independent .tif file and adhere to the naming scheme:
img_TL{t}_Angle{a}.tif
img_TL{t}_Angle{a}_Channels{c}.tif
Writing the .tif files can also be done on a cluster, but you would need to write those scripts yourself.
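For illustration, a minimal Python sketch of that naming scheme; the helper name `stack_filename` is hypothetical (not part of the workflow), and whether `{t}` or `{a}` needs zero-padding depends on how your Snakefile is configured:

```python
def stack_filename(timepoint, angle, channel=None):
    """Build a .tif filename following the workflow's naming scheme:
    img_TL{t}_Angle{a}.tif, or img_TL{t}_Angle{a}_Channels{c}.tif
    for multi-channel data. One file per timepoint/angle(/channel)."""
    if channel is None:
        return "img_TL{t}_Angle{a}.tif".format(t=timepoint, a=angle)
    return "img_TL{t}_Angle{a}_Channels{c}.tif".format(t=timepoint, a=angle, c=channel)

print(stack_filename(1, 0))     # img_TL1_Angle0.tif
print(stack_filename(1, 0, 2))  # img_TL1_Angle0_Channels2.tif
```

A conversion script would call something like this once per resaved stack; the actual reading of KLB and writing of TIFF is left to whatever libraries you have available.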

Another approach would be to write everything into .hdf5, but this requires modifying the workflow itself. Also, each timepoint needs to be a separate .h5 file to allow parallel processing.

Cheers,
Christopher


Xqua commented Feb 22, 2018

hi @schmiedc

Sure, that would be an option... but I feel it is probably not the best one: it would require a lot more HDD space than I have lying around.
I can process KLB files through the FIJI interface, and I can also generate XML files for KLBs, so it should be doable to write a script for it, right?
Digging around, I found define_czi.bsh, which I feel I can probably modify for the KLB format fairly easily, no?

I'll give it a try and pop back here if I need any help!

EDIT:
I guess I might not even have to...
The trick I use in FIJI is to define the XML file with BDV; once the XML file is created, I load it in Multiview Reconstruction, which just accepts it.
Maybe I just need to add a switch for this...


schmiedc commented Feb 23, 2018

Hi Xqua,

It is true that the .tif option creates more data and should be avoided if possible.

However, I am not sure your proposed solution can work at all.
Defining a dataset and resaving it into .h5 requires that the Multiview Reconstruction application is able to load and process your files, and as far as I know there is no support for KLB files.
You would need to ask Stephan Preibisch to support this format.

But there is another, albeit smaller, catch: my workflow depends on an old Fiji version.
Even once you are able to load your files in a new version of the Multiview Reconstruction application, you would also need a Fiji version that provides these initial steps (defining and resaving to hdf5).

So are you really sure that the Fiji version I provided is able to load and resave your data?

Cheers,
Christopher


Xqua commented Feb 23, 2018

Hum, I see your point about versions!
As for Multiview Reconstruction, I actually am able to process KLB files without issues.
Basically, the klb-bdv library provided by the SIMview update site allows BDV to open KLB files, and it adds that capability to SCIFIO too.
The process is basically this: create an XML using the KLB function that BDV adds, skip the define-dataset step, and then use this XML in the normal Multiview pipeline.

The problem I can indeed see is that the Fiji version for the snakemake pipeline is from 2015, which did not yet support this. But this might not be a problem if the workflow runs on the newer version? I haven't tried, though...

The easiest solution would be this one:

  • create a define_xml_klb step, which basically just creates an XML file
  • skip the define_tiff/czi step and the resave-as-hdf5 step
  • start the pipeline at detect interest points

That way there are minimal changes to and interference with the rest of the code, and it starts from a similar point: one KLB file per timepoint, easily splittable for multiprocessing.
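A hypothetical sketch of the one-KLB-file-per-timepoint assumption such a define_xml_klb step would rely on: group the input files by timepoint so each parallel job gets exactly one. The filename pattern here is an assumption (SiMView-style `TM000000` timepoint tags); adapt it to however your acquisition software names KLB stacks.

```python
import re
from collections import defaultdict

# Assumed pattern: timepoint encoded as TM<digits> somewhere in the name.
KLB_PATTERN = re.compile(r"TM(?P<t>\d+).*\.klb$")

def group_by_timepoint(filenames):
    """Map timepoint -> list of matching KLB files, so each parallel
    pipeline job (detect interest points, etc.) handles one timepoint."""
    groups = defaultdict(list)
    for name in filenames:
        m = KLB_PATTERN.search(name)
        if m:
            groups[int(m.group("t"))].append(name)
    return dict(groups)

print(group_by_timepoint(["TM000000_CM0.klb", "TM000001_CM0.klb"]))
```

A define_xml_klb step could run a check like this before writing the XML, failing early if a timepoint maps to an unexpected number of files.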

Would you mind if such a script ran in Python, called by a bsh script, instead of ImageJ?


schmiedc commented Mar 8, 2018

Hi Xqua,

sorry for my belated reply.

The issue is how the workflow interacts with Fiji.
It controls Fiji via beanshell scripts that send string commands to it.
If anything changed in the way the Multiview Reconstruction is controlled by a macro, the workflow might fail or produce different results; a simple typo or a lower-/uppercase change is enough.
Also, in these beanshell macros I reproduced a lot of the GUI interface with a reasonable number of choices, so the options there might have changed as well, but that is rather unlikely.
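To illustrate why this is so fragile, here is a Python sketch of how macro-style option strings are assembled; the parameter names and values below are illustrative examples, not the actual strings the workflow sends:

```python
# ImageJ macros drive plugins with flat "key=value key=value" option
# strings; the beanshell scripts assemble such strings from the config.
def assemble_options(**params):
    """Join key=value pairs into a macro-style option string."""
    return " ".join("{0}={1}".format(k, v) for k, v in sorted(params.items()))

old = assemble_options(type_of_registration="Timelapse", downsample="4x")
new = assemble_options(type_of_registration="timelapse", downsample="4x")
print(old == new)  # False: one case change and the strings no longer match
```

Because the plugin matches these strings literally, any rename or case change in a new Fiji version silently breaks the assembled command.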

For these reasons it is very unlikely that a new Fiji version would work seamlessly with the workflow.
It might work, it might not. One would have to try and debug every little step along the way.

This is compounded by the fact that one would need to execute these steps, write them in a macro, compare the resulting string to the string the workflow uses, and then change the string assembly in the beanshell. I am currently undergoing a job change, so I will not have time in the next couple of months.

Your proposed changes specifically would only work if the following workflow steps also accept KLB files. So one would need to add this functionality and make sure that every step in the workflow works with a recent Fiji version.

I see the need to update the workflow, but I cannot promise that I will have time soon for a major update.

Cheers,
Christopher

@schmiedc
Contributor

@Xqua

there is an updated Fiji version available for the workflow:
http://tomancak-srv1.mpi-cbg.de/~schmied/

tested and implemented by a group in Ostrava: #33

This Fiji version implements the following new formats:

  • Micro Manager diSPIM dataset
  • Holographic Imaging Dataset
  • Slidebook6 Dataset

Is this covering your use case?

It should be relatively easy to adjust the define_czi.bsh script then.
https://github.com/mpicbg-scicomp/snakemake-workflows/blob/master/spim_registration/timelapse/define_czi.bsh

Cheers,
Christopher


Xqua commented Jan 9, 2019

hi @schmiedc, I'll look into it as I have new datasets coming along. But TBH I've moved on to using a Google Cloud VM with FIJI and the GUI to run all of my datasets... It was just more time efficient than fighting against cluster madness!
