NameError: ("Input formula couldn't be processed: ...) #39

Open
mrahimpour opened this issue Aug 20, 2018 · 14 comments
Comments

@mrahimpour

Hi,

Thank you for sharing your code. I am trying to use your package for PET-MRI co-registration. Using the "spm_anat_preproc" and "spm_mrpet_preproc" functions, I am running the PETPVC pipeline on MR/PET images, but I am getting the following error.

ERROR:nipype.workflow:

Traceback (most recent call last):
File "/home/masoomeh/opt/anaconda3/lib/python3.5/site-packages/nipype/pipeline/plugins/multiproc.py", line 70, in run_node
result['result'] = node.run(updatehash=updatehash)
File "/home/masoomeh/opt/anaconda3/lib/python3.5/site-packages/nipype/pipeline/engine/nodes.py", line 480, in run
result = self._run_interface(execute=True)
File "/home/masoomeh/opt/anaconda3/lib/python3.5/site-packages/nipype/pipeline/engine/nodes.py", line 564, in _run_interface
return self._run_command(execute)
File "/home/masoomeh/opt/anaconda3/lib/python3.5/site-packages/nipype/pipeline/engine/nodes.py", line 644, in _run_command
result = self._interface.run(cwd=outdir)
File "/home/masoomeh/opt/anaconda3/lib/python3.5/site-packages/nipype/interfaces/base/core.py", line 521, in run
runtime = self._run_interface(runtime)
File "/home/masoomeh/opt/anaconda3/lib/python3.5/site-packages/nipype/interfaces/utility/wrappers.py", line 144, in _run_interface
out = function_handle(**args)
File "/home/masoomeh/PET/pypes/neuro_pypes/interfaces/nilearn/image.py", line 34, in wrapped
res_img = f(*args, **kwargs)
File "<string>", line 26, in math_img
File "/home/masoomeh/opt/anaconda3/lib/python3.5/site-packages/nilearn-0.4.2-py3.5.egg/nilearn/image/image.py", line 793, in math_img
result = eval(formula, data_dict)
File "<string>", line 1, in <module>
NameError: ("Input formula couldn't be processed, you provided 'img / nan',", "name 'nan' is not defined").

I checked the code but could not find where to modify the formula and define the "val". I would be grateful for any clue to solve this error!
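For what it's worth, the failure mode can be reproduced with plain Python (a minimal sketch; `build_formula` and `safe_formula` are hypothetical helpers, not neuro_pypes functions): when the normalization value computed upstream is NaN, formatting it into the formula string produces `img / nan`, and `eval` then fails because `nan` is not a defined name.

```python
import math

def build_formula(val):
    """Mimic interpolating a scalar into a nilearn math_img formula string."""
    return "img / {}".format(val)

def safe_formula(val):
    """Guarded variant: refuse NaN/inf before building the formula."""
    if not math.isfinite(val):
        raise ValueError(
            "normalization value is {}; check the tissue mask".format(val))
    return "img / {}".format(val)

# A NaN denominator (e.g. the mean over an empty tissue mask) yields the
# literal string 'img / nan', and eval() has no name 'nan' in scope:
formula = build_formula(float("nan"))
try:
    eval(formula, {"img": 1.0})
except NameError as exc:
    print(exc)  # name 'nan' is not defined
```

A guard like `safe_formula` would turn the cryptic NameError into a message that points at the real culprit (the NaN value), which is what makes this error hard to trace otherwise.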

@alexsavio
Member

alexsavio commented Aug 21, 2018 via email

@mrahimpour
Author

mrahimpour commented Aug 21, 2018

Hi alexsavio,

Thanks for your reply. I am trying to use the functions as you explained in the tutorial (https://neuro-pypes.readthedocs.io/en/latest/), following is the code I am using:

import os
import pdb

import warnings

from hansel import Crumb
from neuro_pypes.anat import attach_spm_anat_preprocessing
from neuro_pypes.pet import attach_spm_mrpet_preprocessing
from neuro_pypes.io import build_crumb_workflow
from neuro_pypes.config import update_config
from neuro_pypes.run import run_debug

"""
Matlab-SPM path
"""
import nipype.interfaces.matlab as mlab
mlab.MatlabCommand.set_default_paths('/usr/local/MATLAB/R2018a/toolbox/spm12')

warnings.filterwarnings("always")
#cwd = os.getcwd() #print(cwd)

base_dir = "/home/masoomeh/PET-Quantification/Data"
data_path = os.path.join(base_dir, "{subject_id}", "{modality}", "{image}")

data_crumb = Crumb(data_path, ignore_list=[".*"])
print(data_crumb)

subj_ids = data_crumb['subject_id']
print(subj_ids)

attach_functions = {"spm_anat_preproc": attach_spm_anat_preprocessing,
                    "spm_mrpet_preproc": attach_spm_mrpet_preprocessing}

crumb_arguments = {'anat': [('modality', 'anat_1'), ('image', 'MPRAGE.nii.gz')],
                   'pet': [('modality', 'pet_1'), ('image', 'FET_DYN.nii.gz')]}

output_dir = os.path.join(os.path.dirname(base_dir), "out")
cache_dir = os.path.join(os.path.dirname(base_dir), "wd")

#pdb.set_trace()
wf = build_crumb_workflow(attach_functions,
                          data_crumb=data_crumb,
                          in_out_kwargs=crumb_arguments,
                          output_dir=output_dir,
                          cache_dir=cache_dir)

#pdb.set_trace()
run_debug(wf, plugin="Linear", n_cpus=2)

@alexsavio
Member

alexsavio commented Aug 21, 2018

Hi!

Your code looks fine, thanks.
The issue is probably in the petpvc_mask or intensity_norm functions in:
https://github.com/Neurita/pypes/blob/master/neuro_pypes/pet/utils.py

Pypes calculates the mask or the intensity normalization based on the tissue segmentation provided by SPM12. If you can identify which subject is throwing this error, go to the working directory, look for this subject and its 'tissues' node folder, and have a look at the results; they are probably bad. If you're getting NaN, it is probably because you have an empty tissue mask, but I am not sure; many things can go wrong there. Please have a look at the intermediate results in the wd folder.

@mrahimpour
Author

You are totally right! Checking the working directory for this subject, I have no output for tissues, coreg_pet, etc. But it is weird, because before running the code for PET/MR preprocessing, I ran "spm_anat_preproc" for MR-only tissue segmentation and it worked very well. In the current implementation, I also have correct outputs for tissue segmentation in "wd/main_workflow/spm_anat_preproc/subject_id/new_segment".
I am going through the code to find out where the error is happening. I hope I can find something.
Would you please let me know whether it is possible to run the PET/MR coregistration using your pipeline without the PETPVC step?
Many thanks for your time.

@alexsavio
Member

Hi.

I am sorry, but I don't have a PET/MR workflow without PETPVC. It is actually a good feature request; it makes sense. Although this would be simple to implement directly with Nipype, I understand some features here are not in Nipype.
It shouldn't be complicated to implement. I will give it a try soon, if you don't want to step in ;)

Please let me know if you can find the issue. Have you seen any error in the output log?

Thanks!

@mrahimpour
Author

mrahimpour commented Aug 25, 2018

Hi!

While running the code to find out what causes the "Input formula couldn't be processed: ..." error, I got another error:
"raise RuntimeError("Graph changed during iteration")"

This happens once the nipype workflow has started to execute.

180825-20:40:01,680 nipype.workflow INFO:
Generated workflow graph: /home/masoomeh/PET/wd/main_workflow/main_workflow_colored_workflow.svg (graph2use=colored, simple_form=True).
INFO:nipype.workflow:Generated workflow graph: /home/masoomeh/PET/wd/main_workflow/main_workflow_colored_workflow.svg (graph2use=colored, simple_form=True).
(<class 'RuntimeError'>, RuntimeError('Graph changed during iteration',), <traceback object at 0x7f9c33ffac88>)
/home/masoomeh/opt/anaconda3/lib/python3.5/site-packages/networkx/algorithms/dag.py(189)topological_sort()
-> raise RuntimeError("Graph changed during iteration")
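This error is networkx's guard against the workflow graph being modified while `topological_sort` is iterating over it. The same Python pattern can be seen with any container; a minimal stdlib analogue:

```python
# Mutating a container while iterating over it raises RuntimeError,
# just as networkx's topological_sort does when the workflow graph
# gains or loses nodes mid-iteration.
d = {"a": 1, "b": 2}
try:
    for key in d:
        d["c"] = 3  # mutation during iteration
except RuntimeError as exc:
    print(exc)  # dictionary changed size during iteration
```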

I'll be grateful for any debugging advice or other insight on this.

@alexsavio
Member

alexsavio commented Aug 26, 2018 via email

@mrahimpour
Author

Hi,

I did not mean that these errors are related; I was just asking for some clue. Thanks to your help, by using run_wf() I don't have the "Graph changed during iteration" error anymore!

In order to debug the former error (no output for the tissues node and the following nodes in the PETPVC workflow), I have checked all the nodes and connections but still found nothing. Can I ask which part I need to focus on more, build_crumb_workflow or run_wf? And is it something related to the nipype pipeline or to your pipeline?

I also checked the hansel.crumb paths with this:
crumb ls <data_crumb>
and this listed the path to my data. Is that right?

Sorry for asking lots of questions, I really hope to run your pipeline successfully!

@alexsavio
Member

Hi,

The crumb ls <data_crumb> command checks whether your crumb path to your data is correct and whether it is fetching all your files correctly.
In your case it would be: crumb ls "<base_dir>/{subject_id}/{modality}/{image}"

If it is correctly fetching your data, you need to check what the last node that ran was. For that, have a look at the .svg file with the plot of the graph, check the order of the blocks, and go through your wd folder looking for reports.
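As a sanity check independent of hansel (a sketch; `crumb_to_glob` is a hypothetical helper, not part of hansel), the same pattern can be expanded with the standard library by turning each `{argument}` into a wildcard and globbing:

```python
import glob
import re

def crumb_to_glob(crumb_path):
    """Replace every {argument} in a crumb path with a '*' wildcard."""
    return re.sub(r"\{[^}]+\}", "*", crumb_path)

# e.g. list everything the crumb pattern should match:
pattern = crumb_to_glob(
    "/home/masoomeh/PET-Quantification/Data/{subject_id}/{modality}/{image}")
for path in sorted(glob.glob(pattern)):
    print(path)
```

If this glob lists fewer files than expected, the layout on disk does not match the crumb pattern, which would explain nodes receiving no input.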

Anyway, do you have any crash files?

@mrahimpour
Author

Hi,

I tried to simplify the workflow by removing the rbvpvc and some other nodes; it was easier to check the details in a simplified pipeline. Checking the order of the blocks, I found out that the main problem is in the "coreg_pet" node. I get completely invalid outputs for this node, which causes errors in the following parts. I also checked spm_coregistration in Matlab and got invalid outputs again (it looks like the algorithm does not converge)! I think there is something wrong with the clinical PET/MR data.

@alexsavio
Member

Hi,

I am glad you found the error. If I were you, I would have a look at the files and try to run SPM separately on one or two subjects. If you can't find the issue, maybe I can help you. Just let me know.
If you paste your code with the simplified pipeline here, I could add it to the module and put you as co-author, as you wish.

Good luck!

@mrahimpour
Author

Sure, it would be nice if I can add something useful to your code. I am trying to double-check the simplified code to ensure that I made no mistake. If I can have your email address, I will share the simplified code and also all the other issues that I encountered while running your code.

I also found the problem with the PET/MR data: their origins had not been adjusted correctly! I am looking for an automatic way to do it.
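One common automatic approach (a sketch, under the assumption that "adjusting the origin" means moving it to the volume center, which often helps SPM's coregistration converge; `center_origin_translation` is a hypothetical helper, and with nibabel you would write the resulting translation back into the image affine):

```python
def center_origin_translation(shape, voxel_sizes):
    """Translation (in mm) that places the world origin at the volume center.

    shape: voxels per axis, e.g. (256, 256, 127)
    voxel_sizes: voxel dimensions in mm, e.g. (1.0, 1.0, 1.2)
    """
    return tuple(-(n - 1) / 2.0 * s for n, s in zip(shape, voxel_sizes))
```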

Thanks!

@alexsavio
Member

If you have time, please send me the code to alexsavio at gmail.com. Thanks!

@alexsavio
Member

Hello!
How are you doing? Need any help?
