@kylechampley thanks for making such an amazing set of CT tools available! I've been playing around with LEAP for some of my research projects and I am very happy with the high-quality results.
But now I've been asked to reconstruct some data from our benchtop CT system, and I am having trouble because the data is very large. Our detector is 4096 × 4096 pixels and we collected 1800 projections. For now, all I want to do is flat-field the projection data and reconstruct with FBP, but my computer has only 64 GB of CPU RAM. I think FBP would require almost 400 GB of RAM.
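For what it's worth, the ~400 GB figure checks out with a quick back-of-the-envelope calculation (assuming 32-bit floats and a reconstruction volume of roughly 4096³ voxels, which is a typical default for a 4096 × 4096 detector):

```python
# Rough memory estimate for the dataset described above.
# Assumptions: float32 data, 1800 projections on a 4096 x 4096
# detector, and a ~4096^3 voxel reconstruction volume.
bytes_per_value = 4  # float32

proj_gb = 1800 * 4096 * 4096 * bytes_per_value / 1024**3
vol_gb = 4096**3 * bytes_per_value / 1024**3

print(f"projections: {proj_gb:.1f} GB")  # 112.5 GB
print(f"volume:      {vol_gb:.1f} GB")   # 256.0 GB
print(f"total:       {proj_gb + vol_gb:.1f} GB")
```

That is ~368 GB before counting any scratch buffers the algorithm needs, so well beyond 64 GB of CPU RAM.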
Does LEAP have tools to process this data on my computer?
The short answer is that LEAP does not yet have the capability to handle this automatically, but I am working on a framework for it that should be ready in about 1-3 weeks.
You could, of course, divide up the data yourself and feed chunks of it to LEAP. See the following demo script for how to do this: d13_cropping_subchunking.py.
LEAP will automatically divide the data into chunks small enough for GPU processing, but it requires enough CPU memory to hold the full data for any LEAP algorithm. Removing the CPU memory constraint requires processing the data in chunks and saving each chunk to disk.
Anyway, the new UI I am working on is here. It is definitely NOT ready yet, but when it is, all you'll have to do is specify the file names of the data you want to process and the maximum amount of CPU memory LEAP is allowed to use, and it will run any LEAP algorithm in chunks and save the results to disk. I can let you know when this is ready.