From gkiar07 at gmail.com Tue Dec 1 12:55:30 2015
From: gkiar07 at gmail.com (Greg Kiar)
Date: Tue, 1 Dec 2015 12:55:30 -0500
Subject: [Neuroimaging] Aligning low and high resolution images in voxel space

Hi,

I'm working with various parcellations of the brain which have been defined at a 3mm resolution, though my data exists in a 1mm space (the MNI152 space). When opening these images with a nifti viewer this is fine, as the affine transform maps them properly to scale. However, when I'm working with the data in Python I would like my labels defined at 3mm to be in the MNI152 space (so that they also overlap in voxel space). I would like to write a script that "ingests" a low resolution (3mm) atlas into the 1mm MNI152 space and nearest-neighbor interpolates the values not defined, if you will. Do you know how I can easily do this?

Thanks so much,
Greg
--
Greg Kiar

From arokem at gmail.com Tue Dec 1 14:53:53 2015
From: arokem at gmail.com (Ariel Rokem)
Date: Tue, 1 Dec 2015 11:53:53 -0800
Subject: [Neuroimaging] Aligning low and high resolution images in voxel space

Hi Greg,

Using the current development version of dipy, or the very-very-very-soon-to-be-released version 0.10 of the software, you can do something like this:

https://gist.github.com/arokem/0aab282a7287245a7e78

Note that this does linear interpolation, but it can easily be adjusted to use nearest neighbor by setting:

    resampled = affine_map.transform(moving.get_data(), interp='nearest')

Cheers,
Ariel

On Tue, Dec 1, 2015 at 9:55 AM, Greg Kiar wrote:
> [...]
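(For readers finding this thread later: a minimal sketch of the approach the gist above takes, written against the dipy 0.10-era AffineMap API. The file names are placeholders, and the identity matrix assumes both images are already aligned in MNI world space, so only the sampling grid changes.)

    import numpy as np
    import nibabel as nib
    from dipy.align.imaffine import AffineMap

    static = nib.load('mni152_1mm.nii.gz')   # 1mm target grid (placeholder name)
    moving = nib.load('atlas_3mm.nii.gz')    # 3mm atlas (placeholder name)

    # Identity world-to-world transform: the two images already share the
    # MNI millimeter space; AffineMap only needs the two grids.
    affine_map = AffineMap(np.eye(4),
                           static.shape, static.affine,
                           moving.shape, moving.affine)

    # Nearest-neighbor interpolation keeps label values intact.
    resampled = affine_map.transform(moving.get_data(), interp='nearest')
    nib.save(nib.Nifti1Image(resampled, static.affine), 'atlas_in_1mm.nii.gz')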
From matthew.brett at gmail.com Tue Dec 1 15:01:39 2015
From: matthew.brett at gmail.com (Matthew Brett)
Date: Tue, 1 Dec 2015 12:01:39 -0800
Subject: [Neuroimaging] Aligning low and high resolution images in voxel space

Hi,

On Tue, Dec 1, 2015 at 9:55 AM, Greg Kiar wrote:
> [...]
> Do you know how I can easily do this?

Maybe something like:

    import numpy as np
    import scipy.ndimage
    import nibabel

    # img_3mm is the atlas, img_1mm the MNI152 target, both from nibabel.load()
    vox2mm3 = img_3mm.affine
    vox2mm1 = img_1mm.affine
    # Map 1mm voxel coordinates to 3mm voxel coordinates via world space.
    vox1to3 = np.linalg.inv(vox2mm3).dot(vox2mm1)
    mat, vec = nibabel.affines.to_matvec(vox1to3)
    # order=0 gives nearest-neighbor interpolation, as wanted for labels;
    # output_shape puts the result on the 1mm grid.
    out = scipy.ndimage.affine_transform(img_3mm.get_data(), mat, vec,
                                         output_shape=img_1mm.shape, order=0)
    out_img = nibabel.Nifti1Image(out, img_1mm.affine, img_1mm.header)
    nibabel.save(out_img, 'my_resampled_regions.nii')

Cheers,
Matthew

From bertrand.thirion at inria.fr Tue Dec 1 15:50:23 2015
From: bertrand.thirion at inria.fr (bthirion)
Date: Tue, 1 Dec 2015 21:50:23 +0100
Subject: [Neuroimaging] Aligning low and high resolution images in voxel space
Message-ID: <565E080F.7080308@inria.fr>

Hi Greg,

You might take a look at nilearn.image.resample_img:
https://nilearn.github.io/modules/generated/nilearn.image.resample_img.html

Best,

Bertrand

On 01/12/2015 18:55, Greg Kiar wrote:
> [...]
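(For completeness, a short sketch of the nilearn route pointed to above; resample_img takes the target grid directly, and interpolation='nearest' preserves label values. File names here are placeholders.)

    import nibabel as nib
    from nilearn.image import resample_img

    atlas_3mm = nib.load('atlas_3mm.nii.gz')    # placeholder names
    mni_1mm = nib.load('mni152_1mm.nii.gz')

    # Resample the atlas onto the 1mm grid defined by the MNI image.
    resampled = resample_img(atlas_3mm,
                             target_affine=mni_1mm.affine,
                             target_shape=mni_1mm.shape,
                             interpolation='nearest')
    resampled.to_filename('atlas_in_1mm.nii.gz')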
From matthew.brett at gmail.com Tue Dec 1 17:05:12 2015
From: matthew.brett at gmail.com (Matthew Brett)
Date: Tue, 1 Dec 2015 14:05:12 -0800
Subject: [Neuroimaging] Aligning low and high resolution images in voxel space

On Tue, Dec 1, 2015 at 12:50 PM, bthirion wrote:
> You might take a look at nilearn.image.resample_img
> https://nilearn.github.io/modules/generated/nilearn.image.resample_img.html

Or indeed :

https://github.com/nipy/nipy/blob/master/nipy/algorithms/resample.py#L19

Or :

https://github.com/nipy/nireg/blob/master/nireg/resample.py#L34

Or :

https://github.com/nipy/nibabel/pull/255
https://github.com/nipy/nibabel/pull/255/files#diff-36210eae1f936c385adb1680895b7db6R114

Not our proudest moment ...

Cheers,
Matthew

From dimitri.papadopoulos at cea.fr Wed Dec 2 09:45:27 2015
From: dimitri.papadopoulos at cea.fr (Dimitri Papadopoulos Orfanos)
Date: Wed, 2 Dec 2015 15:45:27 +0100
Subject: [Neuroimaging] [nipype] updated external mask handling in SPM
Message-ID: <565F0407.5010705@cea.fr>

Hi,

I have a question on the following commit:
https://github.com/nipy/nipype/commit/fe9d0e07a288afefb34e99f488ef194a443d6089

Could someone explain the rationale behind the addition of this piece of code in nipype/interfaces/spm/model.py? We came across this code while trying to reproduce the results obtained with an old version of SPM8 run manually vs. the latest version of SPM8 run from Nipype.

    if isdefined(self.inputs.mask_image):
        # SPM doesn't handle explicit masking properly, especially
        # when you want to use the entire mask image
        postscript = "load SPM;\n"
        postscript += "SPM.xM.VM = spm_vol('%s');\n" % list_to_filename(self.inputs.mask_image)
        postscript += "SPM.xM.I = 0;\n"
        postscript += "SPM.xM.T = [];\n"
        postscript += "SPM.xM.TH = ones(size(SPM.xM.TH))*(%s);\n" % self.inputs.mask_threshold
        postscript += "SPM.xM.xs = struct('Masking', 'explicit masking only');\n"
        postscript += "save SPM SPM;\n"

We have understood almost all causes of the differences in results. The only cause that remains to be explained is this Nipype hack. Has this issue perhaps been discussed on the SPM mailing list?

Best,
--
Dimitri Papadopoulos
CEA/Saclay
I2BM, NeuroSpin
F-91191 Gif-sur-Yvette cedex, France

From garyfallidis at gmail.com Tue Dec 1 15:14:45 2015
From: garyfallidis at gmail.com (Eleftherios Garyfallidis)
Date: Tue, 1 Dec 2015 15:14:45 -0500
Subject: [Neuroimaging] [dipy] Installation Problem

Hi William,

If you install Anaconda and then do pip install nibabel, and the same for dipy, that should work on Windows 7 without a problem. My feeling is that you may have conflicting installations. Make sure that you remove all previous versions and try again. If the problem persists, we may have some problem in our setup.py file with Windows and Python 3.5 which we have missed. If this is the case then we need to open an issue report and investigate further.

Also, you should know that using Python 3.5 you will not have VTK working, so no 3D visualization. This is because VTK is still not available for Python 3. If you want to use dipy for visualization of streamlines, odfs etc. you will need to switch to Anaconda for Python 2.x and do conda install vtk.

You can also have a look at these links and see if you can get some ideas for fixing your problem:

https://groups.google.com/a/continuum.io/forum/#!topic/anaconda/6_reeaIjx5c
http://stackoverflow.com/questions/32366195/python-error-3-the-system-cannot-find-the-path-specified

Let us know how it goes, and apologies for the delayed answer.

Cheers,
Eleftherios

On Fri, Nov 27, 2015 at 2:36 AM, William Liu wrote:
>
> Hi,
>
> I have been trying to install dipy on my Windows 7: I tried to download the dipy folder from the link, put it into the Python34 folder, then typed 'import dipy' in Python, but no luck. Then I decided to use the Anaconda command line:
>
>     pip install dipy
>
> but there was still an error like the one in this picture [attachment scrubbed]. I have also checked that I have the other libraries like nibabel installed. What problems may I have here? Thank you in advance!
>
> Best Regards,
> William Liu
> Electrical & Biomedical Engineering IV

From gkiar07 at gmail.com Tue Dec 1 17:03:29 2015
From: gkiar07 at gmail.com (Greg Kiar)
Date: Tue, 1 Dec 2015 17:03:29 -0500
Subject: [Neuroimaging] Aligning low and high resolution images in voxel space
In-Reply-To: <565E080F.7080308@inria.fr>

This is perfect, thank you so much.
Cheers,
Greg
--
Greg Kiar

On Tue, Dec 1, 2015 at 3:50 PM, bthirion wrote:
> [...]

From satra at mit.edu Wed Dec 2 17:17:15 2015
From: satra at mit.edu (Satrajit Ghosh)
Date: Wed, 2 Dec 2015 17:17:15 -0500
Subject: [Neuroimaging] [nipype] updated external mask handling in SPM
In-Reply-To: <565F0407.5010705@cea.fr>

hi dimitri,

it's been a long while since those lines were written. but i believe this was written, as noted in the comments, to support the case where the user simply wanted spm to use the explicit mask. the other reason for it was that there were several places that could control spm options (e.g., config file). we did not want to rely on the config file.

i believe it was more of a perspective on what an explicit mask meant. i can't remember if this was discussed on the spm list or came from best practices in our lab at that time.

cheers,

satra

On Wed, Dec 2, 2015 at 9:45 AM, Dimitri Papadopoulos Orfanos <dimitri.papadopoulos at cea.fr> wrote:
> [...]
From dimitri.papadopoulos at cea.fr Fri Dec 4 10:20:04 2015
From: dimitri.papadopoulos at cea.fr (Dimitri Papadopoulos Orfanos)
Date: Fri, 4 Dec 2015 16:20:04 +0100
Subject: [Neuroimaging] [nipype] updated external mask handling in SPM
Message-ID: <5661AF24.60109@cea.fr>

Hi Satra,

I understand the whole purpose of this piece of code is to get SPM to use an explicit mask. This is done by manipulating the masking structure "SPM.xM".

I'm not yet entirely convinced this still needs to be done "behind the back of SPM" by modifying SPM.mat outside of the SPM code, but here is one of "John's Gems" that recommends this Nipype hack (it refers to SPM2 though):
http://blogs.warwick.ac.uk/nichols/entry/spm2_gem_12/

    With fMRI data/models, SPM2 is fully capable of doing explicit
    masking, but the user interface for fMRI doesn't ask for it.
    One way to do this type of masking anyway is to change the
    SPM.mat file *after* you specify your model, but *before*
    clicking 'Estimate'. Specifically:

    1. Load the SPM.mat file,
           load SPM
       set the SPM.xM.TH values all to -Inf,
           SPM.xM.TH = -Inf*SPM.xM.TH;
       and, in case you have an image format not allowing NaNs,
       set SPM.xM.I to 0
           SPM.xM.I = 0;
    2. If using a mask image, set SPM.xM.VM to a vector of
       structures, where each structure element is the output
       of spm_vol. For instance:
           SPM.xM.VM = spm_vol('Maskimage');
    3. Finally, save by
           save SPM SPM

Most importantly, I am puzzled by the fact that we don't find the same results when running SPM8 via Nipype as when running SPM8 back in 2010-2012 using batches created from the SPM user interface. At this point I am not sure if this is an issue with Nipype, an issue with our own scripts created in 2009-2010, or perhaps related to changes in SPM8. I would like to get to the bottom of it - time permitting... I don't feel comfortable with the idea that the default SPM workflow could have been modified, and I'd like to understand the cause of the discrepancy in the results.

Best,
Dimitri

On 02/12/2015 23:17, Satrajit Ghosh wrote:
> [...]
From dimitri.papadopoulos at cea.fr Fri Dec 4 10:56:11 2015
From: dimitri.papadopoulos at cea.fr (Dimitri Papadopoulos Orfanos)
Date: Fri, 4 Dec 2015 16:56:11 +0100
Subject: [Neuroimaging] [nipype] Interface Dependencies: SPM12
Message-ID: <5661B79B.6010902@cea.fr>

Dear all,

This page:
http://nipy.org/nipype/users/install.html#interface-dependencies
should probably be changed from:
    SPM5/8
to:
    SPM5 or later

Best,
Dimitri

From vs2426569610 at gmail.com Mon Dec 7 03:52:25 2015
From: vs2426569610 at gmail.com (Jianzhong He)
Date: Mon, 7 Dec 2015 16:52:25 +0800
Subject: [Neuroimaging] A problem about use dipy

arokem,

Hello, I am VSsmall; my real name is Jianzhong He, from China.

I have a problem when using dipy. I saw an example in dipy named "Introduction to Basic Tracking"; the example uses "HARDI150.nii.gz", and I wanted to see what would happen if I used "HARDI193.nii.gz" instead. But it gives a code error; the error is shown in the attached picture [attachment scrubbed].

I am using dipy 0.10.1, installed with conda.

Jianzhong He
from Zhejiang University of Technology

From mrbago at gmail.com Mon Dec 7 14:44:27 2015
From: mrbago at gmail.com (Bago)
Date: Mon, 7 Dec 2015 11:44:27 -0800
Subject: [Neuroimaging] A problem about use dipy

Looking at the error makes me think you're using the gradient table for HARDI150 to fit the HARDI193 data. Make sure your gradient table has the same number of gradients as the number of diffusion and b0 volumes in your data set.

Let me know if that's helpful.

Bago

On Mon, Dec 7, 2015 at 12:52 AM, Jianzhong He wrote:
> [...]
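(An aside for readers hitting the same error: a small sketch of Bago's check in dipy, assuming the usual paired .bval/.bvec files; the file names below are placeholders.)

    import nibabel as nib
    from dipy.io.gradients import read_bvals_bvecs
    from dipy.core.gradients import gradient_table

    img = nib.load('HARDI193.nii.gz')    # placeholder names
    bvals, bvecs = read_bvals_bvecs('HARDI193.bval', 'HARDI193.bvec')
    gtab = gradient_table(bvals, bvecs)

    # The last image dimension counts volumes; it must equal the number of
    # bvals, otherwise model fitting fails with a shape mismatch.
    print(img.shape[-1], len(gtab.bvals))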
From gael.varoquaux at normalesup.org Tue Dec 8 18:55:06 2015
From: gael.varoquaux at normalesup.org (Gael Varoquaux)
Date: Wed, 9 Dec 2015 00:55:06 +0100
Subject: [Neuroimaging] Post-doc position: Learning functional-connectivity biomarkers of pathologies
Message-ID: <20151208235506.GY2001665@phare.normalesup.org>

Dear Nipy community,

I have an exciting post-doc position to fill that I believe may be of much interest to people in this community.

Gaël

Post-doc position: Learning functional-connectivity biomarkers of pathologies
==============================================================================

Parietal (https://team.inria.fr/parietal/) is looking to fill a post-doc position on learning biomarkers from functional connectivity.

Scientific context
------------------

The challenge is to use resting-state fMRI at the level of a population to understand how intrinsic functional connectivity captures pathologies and other cognitive phenotypes. Rest fMRI is a promising tool for large-scale population analysis of brain function, as it is easy to acquire and accumulate. Scans for thousands of subjects have already been shared, and more are to come. However, the signatures of cognition in this modality are weak. Extracting biomarkers is a challenging data processing and machine learning problem. This challenge is the expertise of my research group.

Medical applications cover a wide range of brain pathologies for which diagnosis is challenging, such as autism or Alzheimer's disease. This project is a collaboration with the Child Mind Institute (http://www.childmind.org/), experts on psychiatric disorders and resting-state fMRI, and coordinators of the major data sharing initiatives for rest fMRI data (e.g. ABIDE).

Objectives of the project
-------------------------

The project hinges on processing of very large rest fMRI databases. Important novelties of the project are:

- Building predictive models to discriminate multiple pathologies in large inhomogeneous datasets.
- Using and improving advanced connectomics and brain-parcellation techniques in fMRI.

Expected results include the discovery of neurophenotypes for several brain pathologies, as well as intrinsic brain structures (e.g. functional parcellations or connectomes) that carry signatures of cognition.

Desired profile
---------------

We are looking for a post-doctoral fellow to hire in spring. The ideal candidate would have some, but not all, of the following expertise and interests:

* Experience in advanced processing of fMRI
* General knowledge of brain structure and function
* Good communication skills to write high-impact neuroscience publications
* Good computing skills, in particular with Python. Cluster computing experience is desired.

A great research environment
----------------------------

The work environment is dynamic and exciting, using state-of-the-art machine learning to answer challenging functional neuroimaging questions. The post-doc will be employed by INRIA (http://www.inria.fr), the lead computing research institute in France.
We are a team of computer scientists specialized in image processing and statistical data analysis, integrated in one of the top French brain research centers, NeuroSpin (http://i2bm.cea.fr/dsv/i2bm/Pages/NeuroSpin.aspx), south of Paris. We work mostly in Python. The team includes core contributors to the scikit-learn project (http://scikit-learn.org), for machine learning in Python, and the nilearn project (http://nilearn.github.io), for statistical learning in NeuroImaging.

In addition, the post-doc will interact closely with researchers from the Child Mind Institute (http://www.childmind.org), with deep expertise in brain pathologies and in the details of the fMRI acquisitions. Finally, he or she will have access to advanced storage and grid computing facilities at INRIA.

**Contact**: gael.varoquaux at inria.fr, bertrand.thirion at inria.fr

**Application**: Interested candidates should send a CV and motivation letter

--
Gael Varoquaux
Researcher, INRIA Parietal
NeuroSpin/CEA Saclay, Bat 145, 91191 Gif-sur-Yvette France
Phone: ++ 33-1-69-08-79-68
http://gael-varoquaux.info
http://twitter.com/GaelVaroquaux

From dimitri.papadopoulos at cea.fr Wed Dec 9 06:43:25 2015
From: dimitri.papadopoulos at cea.fr (Dimitri Papadopoulos Orfanos)
Date: Wed, 9 Dec 2015 12:43:25 +0100
Subject: [Neuroimaging] [nipype] updated external mask handling in SPM
In-Reply-To: <5661AF24.60109@cea.fr>
Message-ID: <566813DD.1020805@cea.fr>

Dear all,

I looked into masking in SPM and, with some help, came up with the following findings:

* The SPM user interface only sets "analysis threshold masking"; see file spm_fmri_spm_ui.m around line 375:

    %-Masking
    %==========================================================================

    %-Masking threshold, as proportion of globals
    %--------------------------------------------------------------------------
    try
        gMT = SPM.xM.gMT;
    catch
        gMT = spm_get_defaults('mask.thresh');
    end
    TH = g.*gSF*gMT;

    %-Place masking structure in xM
    %--------------------------------------------------------------------------
    SPM.xM = struct(...
        'T',   ones(q,1),...
        'TH',  TH,...
        'gMT', gMT,...
        'I',   0,...
        'VM',  {[]},...
        'xs',  struct('Masking','analysis threshold'));

* When specifying the model in batch jobs, SPM itself sets "explicit masking" in the following way in file spm_run_fmri_spec.m around line 385:

    %-Explicit mask
    %--------------------------------------------------------------------------
    if ~design_only
        if ~isempty(job.mask{1})
            SPM.xM.VM         = spm_data_hdr_read(job.mask{1});
            SPM.xM.xs.Masking = [SPM.xM.xs.Masking, '+explicit mask'];
        end
    end

* The intent of SPM seems to be that the "explicit mask" is a superset of the final mask, *not* that it will be the exact mask used during estimation. From spm_spm.m around line 50:

    % xM.VM - struct array of explicit mask image handles
    %       - (empty if no explicit masks)
    %       - Explicit mask images are >0 for valid voxels to assess.
    %       - Mask images can have any orientation, voxel size or data
    %         type. They are interpolated using nearest neighbour
    %         interpolation to the voxel locations of the data Y.
    %       - Note that voxels with constant data (i.e. the same value across
    %         scans) are also automatically masked out.

* We find different results with Nipype than when running vanilla SPM batch jobs.

I would therefore suggest removing this Nipype hack by default:
https://github.com/nipy/nipype/commit/fe9d0e07a288afefb34e99f488ef194a443d6089
It could be kept as an option, for an "explicit masking" behavior different from SPM's own.
I intend to propose a patch to remove the current hack. Does anyone have more insight into this Nipype hack that modifies default SPM behavior, and would anyone be against the patch?

Best,
Dimitri

On 04/12/2015 16:20, Dimitri Papadopoulos Orfanos wrote:
> [...]
From satra at mit.edu Wed Dec 9 07:43:22 2015
From: satra at mit.edu (Satrajit Ghosh)
Date: Wed, 9 Dec 2015 07:43:22 -0500
Subject: [Neuroimaging] [nipype] updated external mask handling in SPM
In-Reply-To: <566813DD.1020805@cea.fr>

hi dimitri,

a change that allows both versions would be fine. please submit a PR and we can iterate on how to convey the differences through the option names and description.

cheers,

satra

On Wed, Dec 9, 2015 at 6:43 AM, Dimitri Papadopoulos Orfanos <dimitri.papadopoulos at cea.fr> wrote:
> [...]
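(For readers following along, a rough sketch of what such an opt-in flag could look like. This is purely illustrative, not the interface that was eventually merged; only the general trait-definition style is nipype's, and the trait name is hypothetical.)

    # Illustrative sketch only: an opt-in trait for the SPM.mat rewrite,
    # in the general style of nipype input specs.
    from nipype.interfaces.base import TraitedSpec, traits

    class EstimateModelInputSpecSketch(TraitedSpec):
        # hypothetical name; False would preserve vanilla SPM masking
        use_mask_override = traits.Bool(
            False, usedefault=True,
            desc=('rewrite SPM.xM in SPM.mat after model specification so '
                  'that only the explicit mask is applied (the historical '
                  "nipype behaviour) instead of SPM's default masking"))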
From b02207006 at ntu.edu.tw Wed Dec 9 11:33:47 2015
From: b02207006 at ntu.edu.tw
Date: Wed, 9 Dec 2015 16:33:47 +0000
Subject: [Neuroimaging] Dipy installation problem

Hi, I got a problem while installing the Dipy package on my Windows x64 system. I've solved problems like "unable to find vcvarsall.bat" with solutions found online; that is, the distutils.cfg has been made. But now a new problem pops up. Error messages like "TypeError: unorderable types..." and "Failed building wheel for dipy" show up, and I cannot find a usable solution online. The following is the full error information
(but some of this is limited by the capacity of CMD, I think):

[several hundred "copying dipy\... -> build\lib.win32-3.5\dipy\..." build lines omitted]

running build_ext
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "C:\Users\andylin\AppData\Local\Temp\pip-build-8ygf1kth\dipy\setup.py", line 241, in <module>
    main(**extra_setuptools_args)
  File "C:\Users\andylin\AppData\Local\Temp\pip-build-8ygf1kth\dipy\setup.py", line 234, in main
    **extra_args
  File "C:\Users\andylin\Anaconda3\lib\distutils\core.py", line 148, in setup
    dist.run_commands()
  File "C:\Users\andylin\Anaconda3\lib\distutils\dist.py", line 955, in run_commands
    self.run_command(cmd)
  File "C:\Users\andylin\Anaconda3\lib\distutils\dist.py", line 974, in run_command
    cmd_obj.run()
  File "C:\Users\andylin\Anaconda3\lib\site-packages\wheel\bdist_wheel.py", line 176, in run
    self.run_command('build')
  File "C:\Users\andylin\Anaconda3\lib\distutils\cmd.py", line 313, in run_command
    self.distribution.run_command(command)
  File "C:\Users\andylin\Anaconda3\lib\distutils\dist.py", line 974, in run_command
    cmd_obj.run()
  File "C:\Users\andylin\Anaconda3\lib\distutils\command\build.py", line 135, in run
    self.run_command(cmd_name)
  File "C:\Users\andylin\Anaconda3\lib\distutils\cmd.py", line 313, in run_command
    self.distribution.run_command(command)
  File "C:\Users\andylin\Anaconda3\lib\distutils\dist.py", line 974, in run_command
    cmd_obj.run()
  File "C:\Users\andylin\Anaconda3\lib\distutils\command\build_ext.py", line 307, in run
    force=self.force)
  File "C:\Users\andylin\Anaconda3\lib\distutils\ccompiler.py", line 1031, in new_compiler
    return klass(None, dry_run, force)
  File "C:\Users\andylin\Anaconda3\lib\distutils\cygwinccompiler.py", line 282, in __init__
    CygwinCCompiler.__init__ (self, verbose, dry_run, force)
  File "C:\Users\andylin\Anaconda3\lib\distutils\cygwinccompiler.py", line 126, in __init__
    if self.ld_version >= "2.10.90":
TypeError: unorderable types: NoneType() >= str()
----------------------------------------
Failed building wheel for dipy
Failed to build dipy
Installing collected packages: dipy
  Running setup.py install for dipy
    Complete output from command C:\Users\andylin\Anaconda3\python.exe -c "import setuptools, tokenize;__file__='C:\\Users\\andylin\\AppData\\Local\\Temp\\pip-build-8ygf1kth\\dipy\\setup.py';exec(compile(getattr(tokenize, 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" install --record C:\Users\andylin\AppData\Local\Temp\pip-r_4d6qec-record\install-record.txt --single-version-externally-managed --compile:

    running install
    running build
    running build_py
    running build_ext

    [traceback essentially identical to the one above, via the install command, again ending in:]

      File "C:\Users\andylin\Anaconda3\lib\distutils\cygwinccompiler.py", line 126, in __init__
        if self.ld_version >= "2.10.90":
    TypeError: unorderable types: NoneType() >= str()
    ----------------------------------------

Command "C:\Users\andylin\Anaconda3\python.exe -c "import setuptools, tokenize;__file__='C:\\Users\\andylin\\AppData\\Local\\Temp\\pip-build-8ygf1kth\\dipy\\setup.py';exec(compile(getattr(tokenize, 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" install --record C:\Users\andylin\AppData\Local\Temp\pip-r_4d6qec-record\install-record.txt --single-version-externally-managed --compile" failed with error code 1 in C:\Users\andylin\AppData\Local\Temp\pip-build-8ygf1kth\dipy

C:\Users\andylin\Anaconda3>

From matthew.brett at gmail.com Thu Dec 10 01:01:11 2015
From: matthew.brett at gmail.com (Matthew Brett)
Date: Wed, 9 Dec 2015 22:01:11 -0800
Subject: [Neuroimaging] Dipy installation problem

Hi,

Thanks for the report.

On Wed, Dec 9, 2015 at 8:33 AM, b02207006 at ntu.edu.tw wrote:
> [...]
From matthew.brett at gmail.com  Thu Dec 10 01:01:11 2015
From: matthew.brett at gmail.com (Matthew Brett)
Date: Wed, 9 Dec 2015 22:01:11 -0800
Subject: [Neuroimaging] Dipy installation problem
In-Reply-To: 
References: 
Message-ID: 

Hi,

Thanks for the report.

On Wed, Dec 9, 2015 at 8:33 AM, ??? wrote:
> Hi, I got a problem while installing Dipy package on my Windows x64 system.
> I've solved the problem like "unable to find vcvarsall.bat" by solutions
> online.
> That is, the distutils.cfg has been made.
>
> But now new problem pops up...
> Error information like "TypeError: unorderable types...." and "failed
> building wheel for dipy." shows
> and I can not find usable solution online.
> The followings are the full error information. (but some of this limited by
> the capacity of CMD I think)
>
> [several hundred lines of "copying dipy\... -> build\lib.win32-3.5\dipy\..."
> build output trimmed; the start of the log was cut off by the CMD buffer]
>
> running build_ext
> Traceback (most recent call last):
>   File "<string>", line 1, in <module>
>   File "C:\Users\andylin\AppData\Local\Temp\pip-build-8ygf1kth\dipy\setup.py", line 241, in <module>
>     main(**extra_setuptools_args)
>   File "C:\Users\andylin\AppData\Local\Temp\pip-build-8ygf1kth\dipy\setup.py", line 234, in main
>     **extra_args
>   File "C:\Users\andylin\Anaconda3\lib\distutils\core.py", line 148, in setup
>     dist.run_commands()
>   File "C:\Users\andylin\Anaconda3\lib\distutils\dist.py", line 955, in run_commands
>     self.run_command(cmd)
>   File "C:\Users\andylin\Anaconda3\lib\distutils\dist.py", line 974, in run_command
>     cmd_obj.run()
>   File "C:\Users\andylin\Anaconda3\lib\site-packages\wheel\bdist_wheel.py", line 176, in run
>     self.run_command('build')
>   File "C:\Users\andylin\Anaconda3\lib\distutils\cmd.py", line 313, in run_command
>     self.distribution.run_command(command)
>   File "C:\Users\andylin\Anaconda3\lib\distutils\dist.py", line 974, in run_command
>     cmd_obj.run()
>   File "C:\Users\andylin\Anaconda3\lib\distutils\command\build.py", line 135, in run
>     self.run_command(cmd_name)
>   File "C:\Users\andylin\Anaconda3\lib\distutils\cmd.py", line 313, in run_command
>     self.distribution.run_command(command)
>   File "C:\Users\andylin\Anaconda3\lib\distutils\dist.py", line 974, in run_command
>     cmd_obj.run()
>   File "C:\Users\andylin\Anaconda3\lib\distutils\command\build_ext.py", line 307, in run
>     force=self.force)
>   File "C:\Users\andylin\Anaconda3\lib\distutils\ccompiler.py", line 1031, in new_compiler
>     return klass(None, dry_run, force)
>   File "C:\Users\andylin\Anaconda3\lib\distutils\cygwinccompiler.py", line 282, in __init__
>     CygwinCCompiler.__init__ (self, verbose, dry_run, force)
>   File "C:\Users\andylin\Anaconda3\lib\distutils\cygwinccompiler.py", line 126, in __init__
>     if self.ld_version >= "2.10.90":
> TypeError: unorderable types: NoneType() >= str()
>
> ----------------------------------------
> Failed building wheel for dipy
> Failed to build dipy
> Installing collected packages: dipy
>   Running setup.py install for dipy
>     [second traceback trimmed; it is the identical failure in cygwinccompiler.py]
>
> C:\Users\andylin\Anaconda3>

I'm afraid I don't personally use Anaconda, so I haven't seen an error
like this.

Have you tried installing any other Python packages that need compilation?

Best,

Matthew
From adam.rybinski at outlook.com  Thu Dec 10 04:35:09 2015
From: adam.rybinski at outlook.com (Adam Rybiński)
Date: Thu, 10 Dec 2015 09:35:09 +0000
Subject: [Neuroimaging] Dipy installation problem
In-Reply-To: 
References: ,
Message-ID: 

Hi,

I remember having a similar problem on Windows. It was a while back, and
now I'm on Linux most of the time. Eventually, the way I ran dipy on
Windows was using Anaconda 2.7, and then installing the compiler for
Python 2.7 on Windows:
https://www.microsoft.com/en-us/download/details.aspx?id=44266

I couldn't find a Microsoft compiler for Python 3.5 --- that one is
somehow bundled with their Visual Studio Python IDE, I think.

So if you don't care that much about the Python version, try:

conda create --name python2 python=2.7
activate python2
pip install nibabel dipy
(and all other dependencies you need)

I hope it helps,
Adam
> From: matthew.brett at gmail.com
> Date: Wed, 9 Dec 2015 22:01:11 -0800
> To: neuroimaging at python.org
> Subject: Re: [Neuroimaging] Dipy installation problem
>
> Hi,
>
> Thanks for the report.
>
> On Wed, Dec 9, 2015 at 8:33 AM, ??? wrote:
> > [quoted report, build log and tracebacks trimmed; identical to the
> > message above]
>
"C:\Users\andylin\Anaconda3\lib\distutils\command\build_ext.py", > > line > > 307, in run > > force=self.force) > > File "C:\Users\andylin\Anaconda3\lib\distutils\ccompiler.py", line > > 1031, i > > n new_compiler > > return klass(None, dry_run, force) > > File "C:\Users\andylin\Anaconda3\lib\distutils\cygwinccompiler.py", > > line 2 > > 82, in __init__ > > CygwinCCompiler.__init__ (self, verbose, dry_run, force) > > File "C:\Users\andylin\Anaconda3\lib\distutils\cygwinccompiler.py", > > line 1 > > 26, in __init__ > > if self.ld_version >= "2.10.90": > > TypeError: unorderable types: NoneType() >= str() > > > > ---------------------------------------- > > Command "C:\Users\andylin\Anaconda3\python.exe -c "import setuptools, > > tokenize;_ > > _file__='C:\\Users\\andylin\\AppData\\Local\\Temp\\pip-build-8ygf1kth\\dipy\\set > > up.py';exec(compile(getattr(tokenize, 'open', > > open)(__file__).read().replace('\r > > \n', '\n'), __file__, 'exec'))" install --record > > C:\Users\andylin\AppData\Local\ > > Temp\pip-r_4d6qec-record\install-record.txt > > --single-version-externally-managed > > --compile" failed with error code 1 in > > C:\Users\andylin\AppData\Local\Temp\pip-b > > uild-8ygf1kth\dipy > > > > C:\Users\andylin\Anaconda3> > > I'm afraid I don't personally use Anaconda, so I haven't seen an error > like this. > > Have you tried installing any other Python packages that need compilation? > > Best, > > Matthew > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging -------------- next part -------------- An HTML attachment was scrubbed... URL: From effigies at bu.edu Fri Dec 11 14:43:45 2015 From: effigies at bu.edu (Christopher J Markiewicz) Date: Fri, 11 Dec 2015 14:43:45 -0500 Subject: [Neuroimaging] Effects of motion outliers on HRF model (in sparse acquisition fMRI) Message-ID: <566B2771.3090104@bu.edu> Hi all, I apologize in advance because, as Pythonic as my pipeline is, my issue here isn't really Python-related. However, the people on this list are the most likely to have dealt with similar issues (of places I know to look). If you'd rather I post on NeuroStars, I can, but I'm not sure how much people are actually using that. Anyway, my functional data comes from evenly-spaced, sparse acquisitions (TA=2.25s, TR=3.375s), and I've used artdetect in nipype to tag motion and intensity outliers. It's a fast, event-related design (one event every 2 TRs). In the past, my strategy has been to estimate HRF betas on the full dataset, and then excluding motion outliers in analysis by removing any event estimate that had an above-threshold contribution from an outlier volume. That is, in an NxM design matrix estimating N events from M scans, if scan j is an outlier, we exclude all events i such that DM[i,j] > (e.g.) 10% of max(HRF). Another strategy I'm looking into is to add nuisance regressors for outlier volumes to the design matrix, and limiting bleed-over into unrelated events. This is running into problems with "runs" of outliers, which can leave some events with nothing by which to estimate or only very small contributions from volumes that are going to be dominated by other events. I could remove such events, entirely, but for various reasons (mostly involving maintaining ordering so that off-by-one errors don't slip into our analysis) I'd like to have some representation of each event. 
The best external-to-our-lab resource I could find was this Gabrieli Lab
protocol
(https://github.com/gablab/mindhive/wiki/Example-of-sparse-fMRI-data-analysis-using-BIPS),
which seems to indicate they've included the full dataset and noted
outliers after estimation.

Does anybody have any experience with outlier exclusion at or before HRF
estimation, or is this the current best practice?

Thanks,
--
Christopher J Markiewicz
Ph.D. Candidate, Quantitative Neuroscience Laboratory
Boston University
From satra at mit.edu  Fri Dec 11 15:05:21 2015
From: satra at mit.edu (Satrajit Ghosh)
Date: Fri, 11 Dec 2015 15:05:21 -0500
Subject: [Neuroimaging] Effects of motion outliers on HRF model (in sparse acquisition fMRI)
In-Reply-To: <566B2771.3090104@bu.edu>
References: <566B2771.3090104@bu.edu>
Message-ID: 

hi chris,

this is a standard sparse design and you can use the sparse model in
nipype to get at events and amplitudes that you feed into standard
SPM/FSL designers. the key is to not use a canonical HRF in the modeling
stage.

we have a standard openfmri (not BIDS yet) sparse script that can do the
entire preprocessing and estimation on such data and would be happy to
share.

cheers,

satra

On Fri, Dec 11, 2015 at 2:43 PM, Christopher J Markiewicz wrote:
> [quoted message trimmed]
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
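For concreteness, a sketch of the sparse-model setup Satra is describing.
The trait names below are assumptions from nipype's modelgen module of
roughly this era, and the outlier filename is hypothetical; check both
against your installed version:

    from nipype.algorithms.modelgen import SpecifySparseModel

    spec = SpecifySparseModel()
    spec.inputs.input_units = 'secs'
    spec.inputs.time_repetition = 3.375   # TR: volume-to-volume spacing
    spec.inputs.time_acquisition = 2.25   # TA: actual readout time per volume
    spec.inputs.high_pass_filter_cutoff = 128.
    # artdetect/rapidart output; each flagged volume becomes its own
    # (unconvolved) nuisance column in the design:
    spec.inputs.outlier_files = 'art.run1_outliers.txt'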
From effigies at bu.edu  Fri Dec 11 15:44:05 2015
From: effigies at bu.edu (Christopher J Markiewicz)
Date: Fri, 11 Dec 2015 15:44:05 -0500
Subject: [Neuroimaging] Effects of motion outliers on HRF model (in sparse acquisition fMRI)
In-Reply-To: 
References: <566B2771.3090104@bu.edu>
Message-ID: <566B3595.7060909@bu.edu>

On 12/11/2015 03:05 PM, Satrajit Ghosh wrote:
> hi chris,
>
> this is a standard sparse design and you can use the sparse model in
> nipype to get at events and amplitudes that you feed into standard
> SPM/FSL designers. the key is to not use a canonical HRF in the modeling
> stage.

Thanks Satra. Looking at the SpecifySparseModel code, it looks like
motion outliers are not taken into consideration at estimation time, so
I'm inferring that these pipelines also don't worry about
artifact-induced issues at this stage.

That's fine. I just want to make sure that there isn't a standard (or
emerging consensus) step that we're skipping.

> we have a standard openfmri (not BIDS yet) sparse script that can do the
> entire preprocessing and estimation on such data and would be happy to
> share.

That would be great, at the very least to validate that our approach
hasn't been giving us wildly different results.

Thanks,
--
Christopher J Markiewicz
Ph.D. Candidate, Quantitative Neuroscience Laboratory
Boston University
From satra at mit.edu  Fri Dec 11 15:50:10 2015
From: satra at mit.edu (Satrajit Ghosh)
Date: Fri, 11 Dec 2015 15:50:10 -0500
Subject: [Neuroimaging] Effects of motion outliers on HRF model (in sparse acquisition fMRI)
In-Reply-To: <566B3595.7060909@bu.edu>
References: <566B2771.3090104@bu.edu> <566B3595.7060909@bu.edu>
Message-ID: 

hi chris,

the outliers are taken into account at the estimation stage. one column
for each volume discarded, not convolved with a canonical hrf.

cheers,

satra

On Fri, Dec 11, 2015 at 3:44 PM, Christopher J Markiewicz wrote:
> [quoted message trimmed]
-------------- next part --------------
An HTML attachment was scrubbed...
URL: From jetzel at wustl.edu Sat Dec 12 10:53:21 2015 From: jetzel at wustl.edu (Jo Etzel) Date: Sat, 12 Dec 2015 09:53:21 -0600 Subject: [Neuroimaging] 1st call for papers: PRNI 2016 Message-ID: <566C42F1.3030302@wustl.edu> ******* please accept our apologies for cross-posting ******* ------------------------------------------------------------------------------ FIRST CALL FOR PAPERS PRNI 2016 6th International Workshop on Pattern Recognition in Neuroimaging 22-24 June 2016 Fondazione Bruno Kessler (FBK), Trento, Italy www.prni.org - @PRNI2016 - www.facebook.com/PRNI2016/ ------------------------------------------------------------------------------ Paper submission deadline: 18 March 2016, 11:59 pm PST Acceptance notification: 22 April 2016 Camera-ready paper deadline: 7 May 2016 Oral and poster sessions: 22-24 June 2016 Pattern recognition techniques have become an important tool for neuroimaging data analysis. These techniques are helping to elucidate normal and abnormal brain function, cognition and perception, anatomical and functional brain architecture, biomarkers for diagnosis and personalized medicine, and as a scientific tool to decipher neural mechanisms underlying human cognition. The International Workshop on Pattern Recognition in Neuroimaging (PRNI) aims to: (1) foster dialogue between developers and users of cutting-edge analysis techniques in order to find matches between analysis techniques and neuroscientific questions; (2) showcase recent methodological advances in pattern recognition algorithms for neuroimaging analysis; and (3) identify challenging neuroscientific questions in need of new analysis approaches. PRNI welcomes submissions on topics including, but not limited to: * Learning from neuroimaging data - Algorithms for brain-state decoding or encoding - Optimization and regularization - Bayesian analysis of neuroimaging data - Causal inference and time delay techniques - Network and connectivity models (the connectome) - Dynamic and time-varying models - Dynamical systems and simulations - Empirical mode decomposition, multiscale decompositions - Combination of different data modalities - Efficient algorithms for large-scale data analysis * Interpretability of models and results - High-dimensional data visualization - Multivariate and multiple hypothesis testing - Summarization and presentation of inference results * Applications - Disease diagnosis and prognosis - Real-time decoding of brain states - Analysis of resting-state and task-based data - MEG, EEG, structural MRI, fMRI, diffusion MRI, ECoG, NIRS Authors should prepare full papers with a maximum length of 4 pages (two column IEEE style) for double-blind review. The manuscript submission deadline is 18 March 2016. Accepted manuscripts will be assigned either to an oral or poster sessions; all accepted manuscripts will be included in the workshop proceedings. -- Joset A. Etzel, Ph.D. Research Analyst Cognitive Control & Psychopathology Lab Washington University in St. Louis http://mvpa.blogspot.com/ From rockelcp at mcmaster.ca Tue Dec 15 17:55:41 2015 From: rockelcp at mcmaster.ca (C.P. Rockel) Date: Tue, 15 Dec 2015 17:55:41 -0500 Subject: [Neuroimaging] Mac OSX 10.6.8 problem Message-ID: Hi folks, I am brand new to Python, and I am having difficulties installing Dipy onto my Mac (OSX 10.6.8). I already had X-code, and I just downloaded and set up Anaconda3-2.4.1-MacOSX-x86_64.sh. 
I tried "pipping" all of the dependencies as per the Dipy website, and it said that this packages were already 'satisfied'. However, when I then 'pip install dipy', I get errors such as: SystemError: Cannot locate working compiler ---------------------------------------- Failed building wheel for dipy Failed to build dipy ... AND... Command "/Applications/anaconda3/bin/python3 -c "import setuptools, tokenize;__file__='/private/var/folders/5p/5pdhhqwBGJCf2rrSB7PRn++++TQ/-Tmp-/pip-build-4zbiedwj/dipy/setup.py';exec(compile(getattr(tokenize, 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" install --record /var/folders/5p/5pdhhqwBGJCf2rrSB7PRn++++TQ/-Tmp-/pip-l7x7k8y9-record/install-record.txt --single-version-externally-managed --compile" failed with error code 1 in /private/var/folders/5p/5pdhhqwBGJCf2rrSB7PRn++++TQ/-Tmp-/pip-build-4zbiedwj/dipy As I'm not familiar with the Python environment, I'm not quite sure where to start troubleshooting. Does anyone have any suggestions? Thanks, Conrad -------------- next part -------------- An HTML attachment was scrubbed... URL: From arokem at gmail.com Tue Dec 15 18:08:27 2015 From: arokem at gmail.com (Ariel Rokem) Date: Tue, 15 Dec 2015 15:08:27 -0800 Subject: [Neuroimaging] Mac OSX 10.6.8 problem In-Reply-To: References: Message-ID: Hi Conrad, Looks like pip is not managing to find your compiler, though if you have installed XCode that should be in place. What do you get when you type "gcc" at the command line? Assuming that indicates that the shell can find your compiler -- you should get a message indicating that the compiler couldn't find any input files, or something to that nature -- does typing (as suggested here: http://stackoverflow.com/questions/21193936/getting-errors-when-trying-to-install-matplotlib-and-numpy-using-pip ): export CC=gcc and then running: pip install dipy do the trick for you? Cheers, Ariel On Tue, Dec 15, 2015 at 2:55 PM, C.P. Rockel wrote: > Hi folks, > > I am brand new to Python, and I am having difficulties installing Dipy > onto my Mac (OSX 10.6.8). I already had X-code, and I just downloaded and > set up Anaconda3-2.4.1-MacOSX-x86_64.sh. I tried "pipping" all of the > dependencies as per the Dipy website, and it said that this packages were > already 'satisfied'. > > However, when I then 'pip install dipy', I get errors such as: > > SystemError: Cannot locate working compiler > > ---------------------------------------- > Failed building wheel for dipy > Failed to build dipy > > ... AND... > > Command "/Applications/anaconda3/bin/python3 -c "import setuptools, > tokenize;__file__='/private/var/folders/5p/5pdhhqwBGJCf2rrSB7PRn++++TQ/-Tmp-/pip-build-4zbiedwj/dipy/setup.py';exec(compile(getattr(tokenize, > 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" > install --record > /var/folders/5p/5pdhhqwBGJCf2rrSB7PRn++++TQ/-Tmp-/pip-l7x7k8y9-record/install-record.txt > --single-version-externally-managed --compile" failed with error code 1 in > /private/var/folders/5p/5pdhhqwBGJCf2rrSB7PRn++++TQ/-Tmp-/pip-build-4zbiedwj/dipy > > As I'm not familiar with the Python environment, I'm not quite sure where > to start troubleshooting. > > Does anyone have any suggestions? > > Thanks, > > Conrad > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... 
From rockelcp at mcmaster.ca  Tue Dec 15 18:18:29 2015
From: rockelcp at mcmaster.ca (C.P. Rockel)
Date: Tue, 15 Dec 2015 18:18:29 -0500
Subject: [Neuroimaging] Mac OSX 10.6.8 problem
In-Reply-To: 
References: 
Message-ID: 

Hi Ariel,

when I type 'gcc' I get "command not found". The rest of the commands did
nothing, and I got a similar error when trying to pip install dipy.

I know I have XCode, because I am looking at its icon on my dock. Hmm,
any chance that I don't have the right XCode? How can I test this?

Thanks,

Conrad

On Tue, Dec 15, 2015 at 6:08 PM, Ariel Rokem wrote:
> [quoted message trimmed]
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
From arokem at gmail.com  Tue Dec 15 18:23:27 2015
From: arokem at gmail.com (Ariel Rokem)
Date: Tue, 15 Dec 2015 15:23:27 -0800
Subject: [Neuroimaging] Mac OSX 10.6.8 problem
In-Reply-To: 
References: 
Message-ID: 

Sorry for doing this step-by-step, but did you also install the command
line developer tools?

http://apple.stackexchange.com/questions/48099/gcc-not-found-but-xcode-is-installed

(occurs to me this needs better explanation in our documentation...)

On Tue, Dec 15, 2015 at 3:18 PM, C.P. Rockel wrote:
> [quoted message trimmed]
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
From rockelcp at mcmaster.ca  Tue Dec 15 18:35:33 2015
From: rockelcp at mcmaster.ca (C.P. Rockel)
Date: Tue, 15 Dec 2015 18:35:33 -0500
Subject: [Neuroimaging] Mac OSX 10.6.8 problem
In-Reply-To: 
References: 
Message-ID: 

I tried to update XCode, but it wouldn't let me since my OS is only
10.6.8. :(

I'm not in the mood to update my OS at this point in my PhD...

My XCode version is 3.2.6, and it doesn't have the Downloads option of
which you speak.
Are there any substitutes that you can think of?

Cheers,

Conrad

On Tue, Dec 15, 2015 at 6:23 PM, Ariel Rokem wrote:
> [earlier messages in this thread trimmed]
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
From arokem at gmail.com  Tue Dec 15 18:41:57 2015
From: arokem at gmail.com (Ariel Rokem)
Date: Tue, 15 Dec 2015 15:41:57 -0800
Subject: [Neuroimaging] Mac OSX 10.6.8 problem
In-Reply-To: 
References: 
Message-ID: 

Hmm. Maybe this one is the right one for you?

https://github.com/kennethreitz/osx-gcc-installer

(See also:
http://stackoverflow.com/questions/4360110/installing-gcc-to-mac-os-x-leopard-without-installing-xcode
)

On Tue, Dec 15, 2015 at 3:35 PM, C.P. Rockel wrote:
> [earlier messages in this thread trimmed]
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
From rockelcp at mcmaster.ca  Tue Dec 15 18:44:10 2015
From: rockelcp at mcmaster.ca (C.P. Rockel)
Date: Tue, 15 Dec 2015 18:44:10 -0500
Subject: [Neuroimaging] Mac OSX 10.6.8 problem
In-Reply-To: 
References: 
Message-ID: 

Just found that one also! Currently trying it....

On Tue, Dec 15, 2015 at 6:41 PM, Ariel Rokem wrote:
> [earlier messages in this thread trimmed]
>>>>>> >>>>>> Thanks, >>>>>> >>>>>> Conrad >>>>>> >>>>>> _______________________________________________ >>>>>> Neuroimaging mailing list >>>>>> Neuroimaging at python.org >>>>>> https://mail.python.org/mailman/listinfo/neuroimaging >>>>>> >>>>>> >>>>> >>>>> _______________________________________________ >>>>> Neuroimaging mailing list >>>>> Neuroimaging at python.org >>>>> https://mail.python.org/mailman/listinfo/neuroimaging >>>>> >>>>> >>>> >>>> _______________________________________________ >>>> Neuroimaging mailing list >>>> Neuroimaging at python.org >>>> https://mail.python.org/mailman/listinfo/neuroimaging >>>> >>>> >>> >>> _______________________________________________ >>> Neuroimaging mailing list >>> Neuroimaging at python.org >>> https://mail.python.org/mailman/listinfo/neuroimaging >>> >>> >> >> _______________________________________________ >> Neuroimaging mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging >> >> > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From rockelcp at mcmaster.ca Tue Dec 15 19:18:11 2015 From: rockelcp at mcmaster.ca (C.P. Rockel) Date: Tue, 15 Dec 2015 19:18:11 -0500 Subject: [Neuroimaging] Mac OSX 10.6.8 problem In-Reply-To: References: Message-ID: Ok, that didn't seem to quite work either. It seems like there is a crazy amount of errors, which I can send you if that would help. I uninstalled XCode beforehand, btw. Would I need to re-install Anaconda now that the Kenneth Reitz compiler is in place, perhaps? Thanks, C On Tue, Dec 15, 2015 at 6:41 PM, Ariel Rokem wrote: > Hmm. Maybe this one is the right for you? > > https://github.com/kennethreitz/osx-gcc-installer > > (See also: > http://stackoverflow.com/questions/4360110/installing-gcc-to-mac-os-x-leopard-without-installing-xcode > ) > > > > On Tue, Dec 15, 2015 at 3:35 PM, C.P. Rockel wrote: > >> I tried to update XCode, but it wouldn't let me since my OS is only >> 10.6.8. :( >> >> I'm not in the mood to update my OS at this point in my PhD... >> >> My XCode is version is 3.2.6, and it doesn't have the Downloads option of >> which you speak. >> >> Are there any substitutes that you can think of? >> >> Cheers, >> >> Conrad >> >> On Tue, Dec 15, 2015 at 6:23 PM, Ariel Rokem wrote: >> >>> Sorry for doing this step-by-step, but did you also install the command >>> line developer tools? >>> >>> >>> http://apple.stackexchange.com/questions/48099/gcc-not-found-but-xcode-is-installed >>> >>> (occurs to me this needs better explanation in our documentation...) >>> >>> On Tue, Dec 15, 2015 at 3:18 PM, C.P. Rockel >>> wrote: >>> >>>> Hi Ariel, >>>> >>>> when I type 'gcc' I get "command not found". The rest of the commands >>>> did nothing, and I got a similar error when trying to pip install dipy. >>>> >>>> I know I have XCode, because I am looking at its icon on my doc. Hmmm, >>>> any chance that I don't have the right XCode? How can I test this? >>>> >>>> Thanks, >>>> >>>> Conrad >>>> >>>> On Tue, Dec 15, 2015 at 6:08 PM, Ariel Rokem wrote: >>>> >>>>> Hi Conrad, >>>>> >>>>> Looks like pip is not managing to find your compiler, though if you >>>>> have installed XCode that should be in place. >>>>> >>>>> What do you get when you type "gcc" at the command line? 
>>>>> >>>>> Assuming that indicates that the shell can find your compiler -- you >>>>> should get a message indicating that the compiler couldn't find any input >>>>> files, or something to that nature -- does typing (as suggested here: >>>>> http://stackoverflow.com/questions/21193936/getting-errors-when-trying-to-install-matplotlib-and-numpy-using-pip >>>>> ): >>>>> >>>>> export CC=gcc >>>>> >>>>> and then running: >>>>> >>>>> pip install dipy >>>>> >>>>> do the trick for you? >>>>> >>>>> Cheers, >>>>> >>>>> Ariel >>>>> >>>>> On Tue, Dec 15, 2015 at 2:55 PM, C.P. Rockel >>>>> wrote: >>>>> >>>>>> Hi folks, >>>>>> >>>>>> I am brand new to Python, and I am having difficulties installing >>>>>> Dipy onto my Mac (OSX 10.6.8). I already had X-code, and I just downloaded >>>>>> and set up Anaconda3-2.4.1-MacOSX-x86_64.sh. I tried "pipping" all >>>>>> of the dependencies as per the Dipy website, and it said that this packages >>>>>> were already 'satisfied'. >>>>>> >>>>>> However, when I then 'pip install dipy', I get errors such as: >>>>>> >>>>>> SystemError: Cannot locate working compiler >>>>>> >>>>>> ---------------------------------------- >>>>>> Failed building wheel for dipy >>>>>> Failed to build dipy >>>>>> >>>>>> ... AND... >>>>>> >>>>>> Command "/Applications/anaconda3/bin/python3 -c "import setuptools, >>>>>> tokenize;__file__='/private/var/folders/5p/5pdhhqwBGJCf2rrSB7PRn++++TQ/-Tmp-/pip-build-4zbiedwj/dipy/setup.py';exec(compile(getattr(tokenize, >>>>>> 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" >>>>>> install --record >>>>>> /var/folders/5p/5pdhhqwBGJCf2rrSB7PRn++++TQ/-Tmp-/pip-l7x7k8y9-record/install-record.txt >>>>>> --single-version-externally-managed --compile" failed with error code 1 in >>>>>> /private/var/folders/5p/5pdhhqwBGJCf2rrSB7PRn++++TQ/-Tmp-/pip-build-4zbiedwj/dipy >>>>>> >>>>>> As I'm not familiar with the Python environment, I'm not quite sure >>>>>> where to start troubleshooting. >>>>>> >>>>>> Does anyone have any suggestions? >>>>>> >>>>>> Thanks, >>>>>> >>>>>> Conrad >>>>>> >>>>>> _______________________________________________ >>>>>> Neuroimaging mailing list >>>>>> Neuroimaging at python.org >>>>>> https://mail.python.org/mailman/listinfo/neuroimaging >>>>>> >>>>>> >>>>> >>>>> _______________________________________________ >>>>> Neuroimaging mailing list >>>>> Neuroimaging at python.org >>>>> https://mail.python.org/mailman/listinfo/neuroimaging >>>>> >>>>> >>>> >>>> _______________________________________________ >>>> Neuroimaging mailing list >>>> Neuroimaging at python.org >>>> https://mail.python.org/mailman/listinfo/neuroimaging >>>> >>>> >>> >>> _______________________________________________ >>> Neuroimaging mailing list >>> Neuroimaging at python.org >>> https://mail.python.org/mailman/listinfo/neuroimaging >>> >>> >> >> _______________________________________________ >> Neuroimaging mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging >> >> > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From matthew.brett at gmail.com Tue Dec 15 19:27:16 2015 From: matthew.brett at gmail.com (Matthew Brett) Date: Wed, 16 Dec 2015 00:27:16 +0000 Subject: [Neuroimaging] Mac OSX 10.6.8 problem In-Reply-To: References: Message-ID: Hi, On Wed, Dec 16, 2015 at 12:18 AM, C.P. 
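Before swapping compilers, it can help to check which compiler command pip
(via distutils) will actually try to invoke. A minimal sketch, using only
the Python standard library; on a Unix-like setup it should print something
like ['gcc', ...]:

    from distutils.ccompiler import new_compiler
    from distutils.sysconfig import customize_compiler

    # Build the same kind of compiler object distutils uses for C extensions
    cc = new_compiler()
    customize_compiler(cc)  # picks up CC and related environment variables

    # On Unix-like systems, the command prefix distutils will run, e.g. ['gcc']
    print(cc.compiler)

If this raises, or prints a command that is not on your PATH, extension
builds will fail with errors like the "Cannot locate working compiler"
above.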
From matthew.brett at gmail.com  Tue Dec 15 19:27:16 2015
From: matthew.brett at gmail.com (Matthew Brett)
Date: Wed, 16 Dec 2015 00:27:16 +0000
Subject: [Neuroimaging] Mac OSX 10.6.8 problem

Hi,

On Wed, Dec 16, 2015 at 12:18 AM, C.P. Rockel wrote:

> Would I need to re-install Anaconda now that the Kenneth Reitz compiler
> is in place, perhaps?

Another option is to install Python via the Python.org installers or
homebrew, then:

pip install scipy dipy

If you use the Python.org installers, you may get away without needing
a compiler - if that isn't true, please let me know what error you get.

Cheers,

Matthew

From arokem at gmail.com  Wed Dec 16 11:59:47 2015
From: arokem at gmail.com (Ariel Rokem)
Date: Wed, 16 Dec 2015 08:59:47 -0800
Subject: [Neuroimaging] Homebrew Python for Dipy on Mac? (Was: Mac OSX 10.6.8 problem)

Hi,

On Tue, Dec 15, 2015 at 4:27 PM, Matthew Brett wrote:

> Another option is to install Python via the Python.org installers or
> homebrew, then:
>
> pip install scipy dipy

Interesting. Homebrew + pip might actually be a good way to install Dipy +
dependencies on OS X, because it seems that homebrew can also be used to
install VTK:

https://github.com/nipy/dipy/pull/739#issuecomment-152006600

But there are a few wrinkles. For example, tcl-tk needs to be installed
separately, and a flag needs to be passed to the Python installation for
tkinter to work (we use tkinter in some of the newer viz stuff). VTK also
needs to be installed separately, through its own incantation. Here are
some instructions that seem to work on my machine:

1. Install homebrew from http://brew.sh/
2. brew install homebrew/dupes/tcl-tk
3. brew install python --with-tcl-tk
4. brew install vtk --with-python
5. sudo pip install scipy dipy

Maybe we can somehow test this out on the Travis Mac machines, or on the
buildbots? If this does generally work well, we can also change the
installation instructions in our docs.

Cheers,

Ariel
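Once an install along those lines finishes, a short smoke test can confirm
that everything is importable. A minimal sketch; the vtk and tkinter
imports only matter for the visualization parts, so they are guarded here:

    import numpy
    import scipy
    import dipy
    print("dipy", dipy.__version__)

    try:
        import vtk
        print("vtk", vtk.vtkVersion.GetVTKVersion())
    except ImportError:
        print("no vtk - only needed for visualization")

    try:
        import tkinter  # Python 3 name
    except ImportError:
        import Tkinter as tkinter  # Python 2 name
    print("tkinter ok")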
From stjeansam at gmail.com  Thu Dec 17 11:42:27 2015
From: stjeansam at gmail.com (Samuel St-Jean)
Date: Thu, 17 Dec 2015 17:42:27 +0100
Subject: [Neuroimaging] Getting scaling and offset from nifti header in nibabel

Hello,

I'd like to get the scaling and offset values to compare data from two
volumes acquired in the same scan. Since the floating point values are
wildly different, we need to put them on the same intensity scale. This
can be done from the dicom files directly, but it is a painful, error-prone
process that is difficult to automate.

Turns out mrconvert from mrtrix does not take scaling into account when
your data is in floating point (by design, so even the options to correct
this scaling are disabled), but it still stores the correct values. When
opening the converted nifti files with nibabel, the header just shows nan
values, while mrtrix can parse them correctly. Here is the header from
mrinfo as an example.

************************************************
Image:               "noise.nii.gz"
************************************************
  Format:            NIfTI-1.1
  Dimensions:        128 x 128 x 78
  Voxel size:        1.79665 x 1.79665 x 1.8
  Dimension labels:  0. left->right (mm)
                     1. posterior->anterior (mm)
                     2. inferior->superior (mm)
  Data type:         signed 16 bit integer (little endian)
  Data layout:       [ -0 -1 +2 ]
  Data scaling:      offset = 0, multiplier = 166.825
  Comments:          SAMUEL 9JUIL (SAMU
                     MRtrix version: 0.3.12-1050-g62ace960
  Transform:            0.9993   -0.00287   -0.03691     -115.6
                    -5.649e-11      0.997   -0.07752     -108.4
                       0.03702    0.07746     0.9963     -60.36
                             0          0          0          1

Same header through nibabel:

<class 'nibabel.nifti1.Nifti1Header'> object, endian='<'
sizeof_hdr      : 348
data_type       :
db_name         : SAMUEL 9JUIL (SAMU
extents         : 16384
session_error   : 0
regular         : r
dim_info        : 0
dim             : [  3 128 128  78   1   1   1   1]
intent_p1       : 0.0
intent_p2       : 0.0
intent_p3       : 0.0
intent_code     : none
datatype        : int16
bitpix          : 16
slice_start     : 0
pixdim          : [ 1.  1.79665172  1.79665172  1.79999924  0.  0.  0.  0. ]
vox_offset      : 0.0
scl_slope       : nan
scl_inter       : nan
slice_end       : 0
slice_code      : unknown
xyzt_units      : 10
cal_max         : 0.0
cal_min         : 0.0
slice_duration  : 0.0
toffset         : 0.0
glmax           : 0
glmin           : 0
descrip         : MRtrix version: 0.3.12-1050-g62ace960
aux_file        :
qform_code      : scanner
sform_code      : scanner
quatern_b       : -0.0185005571693
quatern_c       : -0.038780156523
quatern_d       : 0.999076247215
qoffset_x       : 111.719642639
qoffset_y       : 119.12978363
qoffset_z       : -34.2374076843
srow_x          : [ -1.79541993e+00   5.15606394e-03  -6.64402023e-02   1.11719643e+02]
srow_y          : [  1.01492648e-10  -1.79124594e+00  -1.39527366e-01   1.19129784e+02]
srow_z          : [ -0.06651677  -0.13917242   1.79335308 -34.23740768]
intent_name     :
magic           : n+1

Those would probably be scl_slope and scl_inter, but the values just got
lost somewhere, apparently. The same behavior is seen in the nifti header
example here:

http://nipy.org/nibabel/nifti_images.html#the-nifti-header

So, the questions:

1. Did something go wrong, or do the two programs just use different
fields? If so, does nibabel show all fields in the header or only known
ones?

2. Is the scl_slope/scl_inter pair even related to the dicom scaling (and
to the offset/multiplier from mrtrix, though that part goes to their
mailing list if nobody knows)? Philips calls those the SS and SI values,
and you simply need to fix the offset + scaling with these numbers to get
sane intensities out of your scanner, but they are a huge pain to get out.

Thanks,

Samuel
From matthew.brett at gmail.com  Thu Dec 17 12:57:09 2015
From: matthew.brett at gmail.com (Matthew Brett)
Date: Thu, 17 Dec 2015 17:57:09 +0000
Subject: [Neuroimaging] Getting scaling and offset from nifti header in nibabel

Hi,

On Thu, Dec 17, 2015 at 4:42 PM, Samuel St-Jean wrote:

> Those would probably be scl_slope and scl_inter, but the values just got
> lost somewhere, apparently.
>
> 1. Did something go wrong, or do the two programs just use different
> fields? If so, does nibabel show all fields in the header or only known
> ones?

No - nothing wrong. The nibabel reader resets the scale and slope to NaN
by default when it reads in the image. If you want to read the original
scale and slope, you can get them from the image array proxy object, as
in:

img.dataobj.slope
img.dataobj.inter

Is that what you needed?

Matthew
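Putting question and answer together, a minimal sketch (using the
'noise.nii.gz' file from the post above; note that get_data() returns the
already-scaled values, so the proxy attributes are only needed to recover
the scaling itself):

    import numpy as np
    import nibabel as nib

    img = nib.load('noise.nii.gz')

    # The scaling nibabel applied when reading the int16 data from disk;
    # for the header above this should be 166.825 and 0.0
    slope, inter = img.dataobj.slope, img.dataobj.inter
    print(slope, inter)

    # get_data() returns the scaled, floating point array ...
    scaled = img.get_data()

    # ... so the raw stored integers can be recovered by inverting the scaling
    raw = (scaled - inter) / slope
    print(np.allclose(raw, np.round(raw)))  # True, up to rounding error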
From alexandre.gramfort at telecom-paristech.fr  Sat Dec 26 04:14:47 2015
From: alexandre.gramfort at telecom-paristech.fr (Alexandre Gramfort)
Date: Sat, 26 Dec 2015 10:14:47 +0100
Subject: [Neuroimaging] [ANN] MNE-Python 0.11 Release

Hi,

We are pleased to announce the new 0.11 release of MNE-Python.

A few highlights:

Maxwell filtering (SSS) has been implemented primarily by Mark Wronkiewicz
as part of his Google Summer of Code, with the help of Eric Larson and
Samu Taulu. Our implementation includes support for:

- Fine calibration
- Cross-talk correction
- Temporal SSS (tSSS)
- Head position translation
- Internal component regularization
- Compensation for movements, using Maxwell filtering on epoched data

Support for raw movement compensation is planned for a future release.

See
http://martinos.org/mne/stable/generated/mne.preprocessing.maxwell_filter.html

New data readers have also been added for Nicolet, EEGLAB (.set), and CTF
data (.ds).

For a full list of improvements and API changes, see:

http://martinos.org/mne/stable/whats_new.html#version-0-11

To install the latest release, the following command should do the job:

pip install --upgrade --user mne

As usual, we welcome your bug reports, feature requests, critiques and
contributions.

Some links:

- https://github.com/mne-tools/mne-python (code + readme on how to install)
- http://martinos.org/mne/stable/ (full MNE documentation)

Follow us on Twitter: https://twitter.com/mne_python

Regards,

The MNE-Python developers

People who contributed to this release, with their number of commits:

171 Eric Larson
117 Jaakko Leppakangas
 58 Jona Sassenhagen
 52 Mainak Jas
 46 Alexandre Gramfort
 33 Denis A. Engemann
 28 Teon Brooks
 24 Clemens Brunner
 23 Christian Brodbeck
 15 Mark Wronkiewicz
 10 Jean-Remi King
  5 Marijn van Vliet
  3 Fede Raimondo
  2 Alexander Rudiuk
  2 Emily Stephen
  2 lennyvarghese
  1 Marian Dovgialo
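For the curious, a minimal usage sketch of the new entry point ('raw.fif'
is a hypothetical Elekta/Neuromag recording, and the st_duration keyword is
taken from the 0.11 documentation linked above):

    import mne
    from mne.preprocessing import maxwell_filter

    # Load a raw file (newer versions spell this mne.io.read_raw_fif)
    raw = mne.io.Raw('raw.fif', preload=True)

    # Plain SSS with the default settings
    raw_sss = maxwell_filter(raw)

    # Temporal SSS (tSSS), one of the highlights above
    raw_tsss = maxwell_filter(raw, st_duration=10.)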
From bcipolli at ucsd.edu  Sat Dec 26 17:31:20 2015
From: bcipolli at ucsd.edu (Ben Cipollini)
Date: Sat, 26 Dec 2015 14:31:20 -0800
Subject: [Neuroimaging] [ANN] MNE-Python 0.11 Release

Congratulations!

Also, very cool integration with Zenodo. I haven't seen that before, and
it's just what I've been looking for. On another day, I may ask more about
that :)

Ben

On Sat, Dec 26, 2015 at 1:14 AM, Alexandre Gramfort wrote:

> Hi,
>
> We are pleased to announce the new 0.11 release of MNE-Python.