
big data, small RAM #62

Closed
halfsobolev opened this issue Jun 23, 2024 · 2 comments

Comments

@halfsobolev

@kylechampley thanks for making such an amazing set of CT tools available! I've been playing around with LEAP for some of my research projects and I am very happy with the high-quality results.

But now I've been asked to reconstruct some data from our benchtop CT system, and I am having trouble because the data is very large. Our detector is 4096 × 4096 pixels and we collected 1800 projections. For now, all I want to do is flat-field the projection data and reconstruct with FBP, but my computer has only 64 GB of CPU RAM. I think FBP would require almost 400 GB of RAM.
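Here is my back-of-the-envelope estimate for that figure, assuming float32 data and a reconstruction volume of the same 4096³ size (an assumption; LEAP's internal buffering may differ):

```python
# Rough memory estimate for FBP on this dataset, assuming float32 (4 bytes)
# per value and a 4096^3 reconstruction volume.
bytes_per_value = 4  # float32

# Projection data: 1800 projections of 4096 x 4096 detector pixels
projection_bytes = 1800 * 4096 * 4096 * bytes_per_value  # ~120.8 GB

# Reconstruction volume: 4096^3 voxels
volume_bytes = 4096 ** 3 * bytes_per_value               # ~274.9 GB

total_gb = (projection_bytes + volume_bytes) / 1e9
print(f"projections: {projection_bytes / 1e9:.1f} GB")
print(f"volume:      {volume_bytes / 1e9:.1f} GB")
print(f"total:       {total_gb:.1f} GB")  # ~395.7 GB, i.e. almost 400 GB
```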

Does LEAP have tools to process this data on my computer?

@kylechampley
Collaborator

Hello. Glad you find LEAP useful!

The short answer is that LEAP does not yet have the capability to handle this automatically, but I am actually working on a framework for it that should be ready in about 1-3 weeks.

You could, of course, divide up the data yourself and feed chunks of it to LEAP. See the following demo script for how to do this: d13_cropping_subchunking.py.

LEAP will automatically divide the data into chunks small enough for GPU processing, but it requires you to have enough CPU memory to run any LEAP algorithm on the full data. Removing the CPU memory constraint requires processing the data in chunks and saving each chunk to disk.
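To illustrate the chunk-and-save idea with plain NumPy (a generic sketch, not LEAP's API; the function name and chunk size are just for illustration), a flat-field correction that only ever holds one chunk of projections in CPU RAM might look like:

```python
import numpy as np

def flat_field_chunked(raw, flat, dark, out, chunk=100):
    """Apply (raw - dark) / (flat - dark) one chunk of projections at a time.

    `raw` and `out` can be np.memmap arrays backed by files on disk, so only
    `chunk` projections ever need to fit in CPU RAM at once.
    """
    gain = flat.astype(np.float32) - dark.astype(np.float32)
    gain[gain <= 0] = 1.0  # guard against dead/zero pixels
    for i in range(0, raw.shape[0], chunk):
        sl = slice(i, min(i + chunk, raw.shape[0]))
        out[sl] = (raw[sl].astype(np.float32) - dark) / gain
    return out

# Tiny synthetic example; on real data, raw/out would be np.memmap arrays,
# e.g. np.memmap('raw.dat', dtype=np.uint16, mode='r', shape=(1800, 4096, 4096)).
raw = np.full((8, 4, 4), 200.0, dtype=np.float32)
flat = np.full((4, 4), 250.0, dtype=np.float32)
dark = np.full((4, 4), 50.0, dtype=np.float32)
out = np.empty_like(raw)
flat_field_chunked(raw, flat, dark, out, chunk=3)
# (200 - 50) / (250 - 50) = 0.75 everywhere
```

The same loop structure applies to the later steps (taking the negative log for attenuation, filtering, etc.): read a slab, process it, write it back to disk.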

Anyway, the new UI I am working on is here. It is definitely NOT ready yet, but once it is, all you'll have to do is specify the file names of the data you want to process and the maximum amount of CPU memory LEAP is allowed to use; it will then run any LEAP algorithm in chunks and save the results to disk. I can let you know when this is ready.

@kylechampley
Collaborator

So sorry for the long wait! The automatic data chunking algorithms are ready to use here.

You can do this through the GUI or through leapctserver.py.
