From jhlegarreta at vicomtech.org Wed Jun 1 12:56:25 2016 From: jhlegarreta at vicomtech.org (Jon Haitz Legarreta) Date: Wed, 1 Jun 2016 18:56:25 +0200 Subject: [Neuroimaging] Fwd: [dipy] Issues trying to install dipy In-Reply-To: References: Message-ID: Dear Eleftherios, thanks for the suggestion. Uninstalling MinGW ports me back to the original error of the thread: File: "C:\Anaconda3\lib\distutils\cygwinccompiler.py", line 126, __init__ if self.ld_version >= "2.10.90": TypeError: unorderable types: NoneType() >= str() I agree that no extra compiler should be necessary, but one of the links Ariel suggested was pointing in that direction. Anyway, yes, let's see if we can solve it through a hangout and come back to the list with the solution. Sincerely, JON HAITZ On 1 June 2016 at 00:39, Eleftherios Garyfallidis wrote: > Hi Jon, > > When I develop DIPY using Anaconda in Windows I don't think I need to > install a compiler separately. Anaconda is coming with its own MinGW > compiler. Could it be that you are adding an extra compiler when it is not > necessary? > > Also I 'll be happy to hangout and help you with your installation > problem. Send me an e-mail off the list please to arrange a meeting. > > Cheers, > Eleftherios > > > > On Tue, May 31, 2016 at 5:34 PM Jon Haitz Legarreta < > jhlegarreta at vicomtech.org> wrote: > >> Hi Greg, >> using the virtual env I'm able to run the examples without major issues. >> Indeed, for the virtual env I downloaded the conda-forge dipy package. >> >> But my intent is to build from the latest source. >> >> Thanks, >> JON HAITZ >> >> >> >> >> >> On 31 May 2016 at 22:01, Gregory Lee wrote: >> >>> Hi Jon, >>> >>> Do you need to build from the lastest source or are you just trying to >>> install a working version to run examples? If the later, it may be worth >>> trying the pre-built conda packages Ariel recently created at conda-forge. >>> To install: >>> >>> conda install -c conda-forge dipy >>> >>> - Greg >>> >>> On Tue, May 31, 2016 at 2:58 PM, Jon Haitz Legarreta < >>> jhlegarreta at vicomtech.org> wrote: >>> >>>> Dear Eleftherios, >>>> thanks for the follow-up. >>>> >>>> No, I did not contact the Anaconda developers, since I was a little bit >>>> lost in the messages/source problem. But if your guess is that it is more >>>> likely an Anaconda problem, I will contact them and let you know. >>>> >>>> On the other hand, you are right; I installed Anaconda, then tried to >>>> set up dipy for development. >>>> >>>> Also I have a virtual environment where I downloaded the dipy >>>> dependencies. I do not know if the latter step can be avoided or else, >>>> whether one can be substituted by the other: i.e. and whether when one >>>> installs dipy for development from sources, dipy scripts take care of >>>> putting in place the necessary packages. I just thought that creating a >>>> virtual env with just the dipy dependencies would be cleaner for >>>> development (i.e. avoid clashes with other packages or repos, etc.) >>>> >>>> Since one of the links posted suggested it, I installed MinGW, but had >>>> no other compiler installer on my machine. >>>> >>>> Sincerely, >>>> JON HAITZ >>>> >>>> >>>> >>>> >>>> >>>> On 31 May 2016 at 00:16, Eleftherios Garyfallidis < >>>> garyfallidis at gmail.com> wrote: >>>> >>>>> Hi Jon, >>>>> >>>>> The error is not related to SSE or to OMP. Those are ommitted and then >>>>> the compilation continues properly. The problem appears later. 
Here is the >>>>> message >>>>> >>>>> C:\Anaconda3\libs/python35.lib: error adding symbols: File format not recognized >>>>> collect2.exe: error: ld returned 1 exit status >>>>> >>>>> Have you contacted the Anaconda developers? This looks like a problem >>>>> on their side. >>>>> >>>>> Let us know what they said to you. Otherwise I wonder if this is a >>>>> specific problem with Python 3 or if it affects also Python 2. You may want >>>>> to try that too. The problem does look more likely to be related to the >>>>> compiler used. >>>>> >>>>> Am I correct to say that the only thing that you did was to install >>>>> Anaconda and then pip install dipy? Did you have other compilers already >>>>> installed in your system? >>>>> >>>>> Best regards, >>>>> Eleftherios >>>>> >>>>> On Mon, May 30, 2016 at 5:51 PM Jon Haitz Legarreta < >>>>> jhlegarreta at vicomtech.org> wrote: >>>>> >>>>>> Dear Ariel, >>>>>> thanks for your suggestion. >>>>>> >>>>>> The patches in the link seem to help a little bit, but the process >>>>>> seems still to be unsuccessful: the MinGW gcc complains with the message: >>>>>> gcc: error: /arch:SSE2: No such file or directory >>>>>> >>>>>> Attached is the new log. >>>>>> >>>>>> Again, googling was not of much help. I got bits and parts of related >>>>>> errors, but have no clear picture of the issue. >>>>>> >>>>>> Any suggestion is appreciated. >>>>>> >>>>>> JON HAITZ >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> On 24 May 2016 at 01:08, Ariel Rokem wrote: >>>>>> >>>>>>> >>>>>>> >>>>>>> On Mon, May 23, 2016 at 3:19 PM, Jon Haitz Legarreta < >>>>>>> jhlegarreta at vicomtech.org> wrote: >>>>>>> >>>>>>>> Hi there, >>>>>>>> thank you Matthew and Ariel. >>>>>>>> >>>>>>>> The link pointed by Ariel does not seem to be a solution; after >>>>>>>> having installed MinGW, as suggested in the link and although I'm aware it >>>>>>>> might be unnecessary, the Anaconda3 powershell still yields a similar >>>>>>>> error, now pointing to MSVC (which I do not have on my system): >>>>>>>> >>>>>>>> File "C:\Anaconda3\lib\distutils\cygwinccompiler.py", line 157, >>>>>>>> in __init__ >>>>>>>> self.dll_libraries = get_msvcr() >>>>>>>> File "C:\Anaconda3\lib\distutils\cygwinccompiler.py", line 86, in >>>>>>>> get_msvcr >>>>>>>> raise ValueError("Unknown MS Compiler version %s " % msc_ver) >>>>>>>> ValueError: Unknown MS Compiler version 1900 >>>>>>>> >>>>>>>> Looks like maybe you ran into this corner case? >>>>>>> >>>>>>> http://stackoverflow.com/a/34427014/3532933 >>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>>> I'll try to investigate further, and will let you know. >>>>>>>> >>>>>>>> Kind regards, >>>>>>>> JON HAITZ >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> On 21 May 2016 at 16:55, Ariel Rokem wrote: >>>>>>>> >>>>>>>>> Hi Jon and Matthew, >>>>>>>>> >>>>>>>>> >>>>>>>>> On Sat, May 21, 2016 at 7:30 AM, Matthew Brett < >>>>>>>>> matthew.brett at gmail.com> wrote: >>>>>>>>> >>>>>>>>>> Hi, >>>>>>>>>> >>>>>>>>>> On Sat, May 21, 2016 at 9:18 AM, Jon Haitz Legarreta >>>>>>>>>> wrote: >>>>>>>>>> > Hi there, >>>>>>>>>> > has anybody experienced the issue below? >>>>>>>>>> > >>>>>>>>>> > Thanks, >>>>>>>>>> > JON HAITZ >>>>>>>>>> > >>>>>>>>>> > >>>>>>>>>> > >>>>>>>>>> > >>>>>>>>>> > ---------- Forwarded message ---------- >>>>>>>>>> > From: Jon Haitz Legarreta >>>>>>>>>> > Date: 18 May 2016 at 19:10 >>>>>>>>>> > Subject: [dipy] Issues trying to install dipy >>>>>>>>>> > To: neuroimaging at python.org >>>>>>>>>> > >>>>>>>>>> > >>>>>>>>>> > Hi there, >>>>>>>>>> > I'm a newbie to dipy. 
>>>>>>>>>> > >>>>>>>>>> > I was trying to follow the instructions in [1] to have dipy >>>>>>>>>> installed from >>>>>>>>>> > the source code, so that I could execute the dipy examples. >>>>>>>>>> > >>>>>>>>>> > I'm using Windows 10 and Anaconda 3. >>>>>>>>>> > >>>>>>>>>> > When trying to execute >>>>>>>>>> > python setup.py develop >>>>>>>>>> > >>>>>>>>>> > the Anaconda prompt yields an error that says in the end: >>>>>>>>>> > File: "C:\Anaconda3\lib\distutils\cygwinccompiler.py", line >>>>>>>>>> 126, __init__ >>>>>>>>>> > if self.ld_version >= "2.10.90": >>>>>>>>>> > TypeError: unorderable types: NoneType() >= str() >>>>>>>>>> > >>>>>>>>>> > I've been googling for a solution without success. >>>>>>>>>> > >>>>>>>>>> > I don't know whether this looks like Anaconda3 is trying to use >>>>>>>>>> cygwin >>>>>>>>>> > instead of mingw32, and whether that is the root cause. >>>>>>>>>> > >>>>>>>>>> > In either case, does anyone know how to solve the issue? >>>>>>>>>> > >>>>>>>>>> > Attached is the trace (it's short) of the error if this is of >>>>>>>>>> any help. >>>>>>>>>> > >>>>>>>>>> > >>>>>>>>>> > Thank you, >>>>>>>>>> > JON HAITZ >>>>>>>>>> > >>>>>>>>>> > >>>>>>>>>> > [1] http://nipy.org/dipy/installation.html#install-source-nix >>>>>>>>>> >>>>>>>>>> I'm sorry, I'm afraid I don't personally use Anaconda, so I have >>>>>>>>>> no >>>>>>>>>> experience of fixing compilation errors on Anaconda. Ariel - >>>>>>>>>> have >>>>>>>>>> you come across this? >>>>>>>>>> >>>>>>>>> >>>>>>>>> And I don't personally use Windows... >>>>>>>>> >>>>>>>>> Might this be helpful: >>>>>>>>> >>>>>>>>> >>>>>>>>> http://stackoverflow.com/questions/24683305/python-cant-install-packages-typeerror-unorderable-types-nonetype-str >>>>>>>>> >>>>>>>>> It seems like it could be related, though it's all Greek to me. 
>>>>>>>>> >>>>>>>>> Cheers, >>>>>>>>> >>>>>>>>> Ariel >>>>>>>>> >>>>>>>>> >>>>>>>>>> You could also try on the anaconda support channels (issues, >>>>>>>>>> mailing >>>>>>>>>> list) - it may well be a general problem rather than one specific >>>>>>>>>> to >>>>>>>>>> dipy, >>>>>>>>>> >>>>>>>>>> Best, >>>>>>>>>> >>>>>>>>>> Matthew >>>>>>>>>> _______________________________________________ >>>>>>>>>> Neuroimaging mailing list >>>>>>>>>> Neuroimaging at python.org >>>>>>>>>> https://mail.python.org/mailman/listinfo/neuroimaging >>>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> _______________________________________________ >>>>>>>>> Neuroimaging mailing list >>>>>>>>> Neuroimaging at python.org >>>>>>>>> https://mail.python.org/mailman/listinfo/neuroimaging >>>>>>>>> >>>>>>>>> >>>>>>>> >>>>>>>> _______________________________________________ >>>>>>>> Neuroimaging mailing list >>>>>>>> Neuroimaging at python.org >>>>>>>> https://mail.python.org/mailman/listinfo/neuroimaging >>>>>>>> >>>>>>>> >>>>>>> >>>>>>> _______________________________________________ >>>>>>> Neuroimaging mailing list >>>>>>> Neuroimaging at python.org >>>>>>> https://mail.python.org/mailman/listinfo/neuroimaging >>>>>>> >>>>>>> >>>>>> _______________________________________________ >>>>>> Neuroimaging mailing list >>>>>> Neuroimaging at python.org >>>>>> https://mail.python.org/mailman/listinfo/neuroimaging >>>>>> >>>>> >>>>> _______________________________________________ >>>>> Neuroimaging mailing list >>>>> Neuroimaging at python.org >>>>> https://mail.python.org/mailman/listinfo/neuroimaging >>>>> >>>>> >>>> >>>> _______________________________________________ >>>> Neuroimaging mailing list >>>> Neuroimaging at python.org >>>> https://mail.python.org/mailman/listinfo/neuroimaging >>>> >>>> >>> >>> _______________________________________________ >>> Neuroimaging mailing list >>> Neuroimaging at python.org >>> https://mail.python.org/mailman/listinfo/neuroimaging >>> >>> >> _______________________________________________ >> Neuroimaging mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging >> > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From sulantha.s at gmail.com Fri Jun 3 10:11:59 2016 From: sulantha.s at gmail.com (Sulantha Sanjeewa) Date: Fri, 3 Jun 2016 10:11:59 -0400 Subject: [Neuroimaging] [nibabel] Issue when saving nifti files Message-ID: Dear all, I am having trouble saving nifti files using nibabel. This is a simple load - mask - save situation. However, when I save, the resultant image is only two values [min and max] of the original image (almost like a mask). Can you help me on this. I have added the code here. and a link to the three files as well. Thanks a lot for the help. Best regards, Sulantha. Files Link: https://drive.google.com/open?id=0B-TWCTRv7UM1a0hxaWotbjRSSUE Code: img = nibabel.load('I300779.nii') imgData = img.get_data() mask = nibabel.load('FullBrain.nii') maskData = mask.get_data() maskedImgData = imgData[maskData>0.9] new_image = numpy.zeros(maskData.shape) new_image[maskData>0.9] = maskedImgData aff = mask.affine nibabel.save(nibabel.Nifti1Image(new_image, aff), 'I300779_m.nii') -------------- next part -------------- An HTML attachment was scrubbed... 
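Before handing the output to any converter or viewer, it is worth checking what nibabel actually wrote to disk. A minimal sketch along those lines, reusing the file name from the code above (nothing in it is specific to this data; only nibabel and numpy are needed):

import nibabel
import numpy

# load the masked image that was just written and inspect its value range
out = nibabel.load('I300779_m.nii')
data = out.get_data()            # get_fdata() on newer nibabel versions
print(data.min(), data.max())    # should span the original range, not just two values
print(numpy.unique(data).size)   # number of distinct values in the volume
print(out.get_data_dtype())      # on-disk dtype; note that numpy.zeros defaults to float64

If the full range shows up here, the save itself is fine and the problem sits further downstream, for example in a converter or in the viewer.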
URL: From arokem at gmail.com Fri Jun 3 11:38:38 2016 From: arokem at gmail.com (Ariel Rokem) Date: Fri, 3 Jun 2016 08:38:38 -0700 Subject: [Neuroimaging] [nibabel] Issue when saving nifti files In-Reply-To: References: Message-ID: Hi Sulantha, Thanks for getting in touch. Let's see if I can be helpful. On Fri, Jun 3, 2016 at 7:11 AM, Sulantha Sanjeewa wrote: > Dear all, > I am having trouble saving nifti files using nibabel. This is a simple > load - mask - save situation. However, when I save, the resultant image is > only two values [min and max] of the original image (almost like a mask). > Can you help me on this. I have added the code here. and a link to the > three files as well. > Thanks a lot for the help. > Best regards, > Sulantha. > > Files Link: https://drive.google.com/open?id=0B-TWCTRv7UM1a0hxaWotbjRSSUE > Code: > > img = nibabel.load('I300779.nii') > imgData = img.get_data() > mask = nibabel.load('FullBrain.nii') > maskData = mask.get_data() > maskedImgData = imgData[maskData>0.9] > new_image = numpy.zeros(maskData.shape) > new_image[maskData>0.9] = maskedImgData > aff = mask.affine > nibabel.save(nibabel.Nifti1Image(new_image, aff), 'I300779_m.nii') > > I am not seeing this when I run your code ( https://gist.github.com/arokem/7f996b56d8f7e17ae8377677314a25ad). I also find that whole range of values in the file you processed and provided (try running those two last cells on your file). What are you doing to check the values in "I300779_m.nii"? Maybe the software reading this file downstream is doing something funny? Cheers, Ariel > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From sulantha.s at gmail.com Fri Jun 3 12:32:36 2016 From: sulantha.s at gmail.com (Sulantha Sanjeewa) Date: Fri, 3 Jun 2016 12:32:36 -0400 Subject: [Neuroimaging] [nibabel] Issue when saving nifti files In-Reply-To: References: Message-ID: Hi Ariel, Thanks a lot for helping me to figure this out. In fact, I used nii2mnc to convert the image to MINC before viewing the final image. You are correct. The problem is with that conversion, not from nibabel. I viewed the image with fslview and the image looks as expected. Thanks a lot, sorry to bother you with this. Best regards, Sulantha. On Fri, Jun 3, 2016 at 11:38 AM, Ariel Rokem wrote: > Hi Sulantha, > > Thanks for getting in touch. Let's see if I can be helpful. > > On Fri, Jun 3, 2016 at 7:11 AM, Sulantha Sanjeewa > wrote: > >> Dear all, >> I am having trouble saving nifti files using nibabel. This is a simple >> load - mask - save situation. However, when I save, the resultant image is >> only two values [min and max] of the original image (almost like a mask). >> Can you help me on this. I have added the code here. and a link to the >> three files as well. >> Thanks a lot for the help. >> Best regards, >> Sulantha. 
>> >> Files Link: https://drive.google.com/open?id=0B-TWCTRv7UM1a0hxaWotbjRSSUE >> Code: >> >> img = nibabel.load('I300779.nii') >> imgData = img.get_data() >> mask = nibabel.load('FullBrain.nii') >> maskData = mask.get_data() >> maskedImgData = imgData[maskData>0.9] >> new_image = numpy.zeros(maskData.shape) >> new_image[maskData>0.9] = maskedImgData >> aff = mask.affine >> nibabel.save(nibabel.Nifti1Image(new_image, aff), 'I300779_m.nii') >> >> > I am not seeing this when I run your code ( > https://gist.github.com/arokem/7f996b56d8f7e17ae8377677314a25ad). I also > find that whole range of values in the file you processed and provided (try > running those two last cells on your file). What are you doing to check the > values in "I300779_m.nii"? Maybe the software reading this file downstream > is doing something funny? > > Cheers, > > Ariel > > >> _______________________________________________ >> Neuroimaging mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging >> >> > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From vivekjoshi1894 at gmail.com Sun Jun 5 06:30:07 2016 From: vivekjoshi1894 at gmail.com (Vivek Joshi) Date: Sun, 5 Jun 2016 16:00:07 +0530 Subject: [Neuroimaging] errors installing other modules in dipy Message-ID: hello I ve installed dipy using the instructions given in the dipy website. But when trying to install other modules my setup.py got overwritten. Now i m getting errors trying to fetch sherbrooke_3shell dataset. Also i want to denoise the dataset using wavelet denoising. For that i need to install other modules. How to do that? -------------- next part -------------- An HTML attachment was scrubbed... URL: From arokem at gmail.com Sun Jun 5 11:51:50 2016 From: arokem at gmail.com (Ariel Rokem) Date: Sun, 5 Jun 2016 08:51:50 -0700 Subject: [Neuroimaging] errors installing other modules in dipy In-Reply-To: References: Message-ID: Hi Vivek, On Sun, Jun 5, 2016 at 3:30 AM, Vivek Joshi wrote: > hello > I ve installed dipy using the instructions given in the dipy website. But > when trying to install other modules my setup.py got overwritten. Now i m > getting errors trying to fetch sherbrooke_3shell dataset. > I am not exactly sure what you mean. Could you tell me exactly what you did (step by step)? Which instructions did you follow? The ones to install from the source code ( http://nipy.org/dipy/installation.html#installing-from-source)? Or for installing a released version (I think that would be better in this case: http://nipy.org/dipy/installation.html#installing-a-release)? > Also i want to denoise the dataset using wavelet denoising. For that i > need to install other modules. How to do that? > Again - I am not exactly sure what you mean. Do you mean that you want to install other libraries (e.g., pywavelets )? You should be able to do that in parallel to your Dipy installation. Cheers, Ariel > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... 
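A quick way to tell which dipy (and nibabel) installation Python is actually picking up, which helps when a source checkout and a released package end up side by side, is to print the version and the import path. A minimal check, nothing dipy-specific:

import dipy
import nibabel

# the installed versions and where they are imported from
print(dipy.__version__, dipy.__file__)
print(nibabel.__version__, nibabel.__file__)

If the path points at a half-modified source tree rather than the site-packages directory, reinstalling the released package or removing the stale checkout from the Python path usually clears up this kind of confusion.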
URL: From vivekjoshi1894 at gmail.com Sun Jun 5 12:19:41 2016 From: vivekjoshi1894 at gmail.com (Vivek Joshi) Date: Sun, 5 Jun 2016 21:49:41 +0530 Subject: [Neuroimaging] Neuroimaging Digest, Vol 13, Issue 5 In-Reply-To: References: Message-ID: <57545127.e9c2420a.890f1.ffffcd71@mx.google.com> Hello Can you tell me the steps to install dipy from a release. Steps to install modules like pywavelets nibabel and modules for adaptive denoising. Thank you -----Original Message----- From: "neuroimaging-request at python.org" Sent: ?05-?06-?2016 21:33 To: "neuroimaging at python.org" Subject: Neuroimaging Digest, Vol 13, Issue 5 Send Neuroimaging mailing list submissions to neuroimaging at python.org To subscribe or unsubscribe via the World Wide Web, visit https://mail.python.org/mailman/listinfo/neuroimaging or, via email, send a message with subject or body 'help' to neuroimaging-request at python.org You can reach the person managing the list at neuroimaging-owner at python.org When replying, please edit your Subject line so it is more specific than "Re: Contents of Neuroimaging digest..." Today's Topics: 1. errors installing other modules in dipy (Vivek Joshi) 2. Re: errors installing other modules in dipy (Ariel Rokem) ---------------------------------------------------------------------- Message: 1 Date: Sun, 5 Jun 2016 16:00:07 +0530 From: Vivek Joshi To: neuroimaging at python.org Subject: [Neuroimaging] errors installing other modules in dipy Message-ID: Content-Type: text/plain; charset="utf-8" hello I ve installed dipy using the instructions given in the dipy website. But when trying to install other modules my setup.py got overwritten. Now i m getting errors trying to fetch sherbrooke_3shell dataset. Also i want to denoise the dataset using wavelet denoising. For that i need to install other modules. How to do that? -------------- next part -------------- An HTML attachment was scrubbed... URL: ------------------------------ Message: 2 Date: Sun, 5 Jun 2016 08:51:50 -0700 From: Ariel Rokem To: Neuroimaging analysis in Python Subject: Re: [Neuroimaging] errors installing other modules in dipy Message-ID: Content-Type: text/plain; charset="utf-8" Hi Vivek, On Sun, Jun 5, 2016 at 3:30 AM, Vivek Joshi wrote: > hello > I ve installed dipy using the instructions given in the dipy website. But > when trying to install other modules my setup.py got overwritten. Now i m > getting errors trying to fetch sherbrooke_3shell dataset. > I am not exactly sure what you mean. Could you tell me exactly what you did (step by step)? Which instructions did you follow? The ones to install from the source code ( http://nipy.org/dipy/installation.html#installing-from-source)? Or for installing a released version (I think that would be better in this case: http://nipy.org/dipy/installation.html#installing-a-release)? > Also i want to denoise the dataset using wavelet denoising. For that i > need to install other modules. How to do that? > Again - I am not exactly sure what you mean. Do you mean that you want to install other libraries (e.g., pywavelets )? You should be able to do that in parallel to your Dipy installation. Cheers, Ariel > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: ------------------------------ Subject: Digest Footer _______________________________________________ Neuroimaging mailing list Neuroimaging at python.org https://mail.python.org/mailman/listinfo/neuroimaging ------------------------------ End of Neuroimaging Digest, Vol 13, Issue 5 ******************************************* -------------- next part -------------- An HTML attachment was scrubbed... URL: From stjeansam at gmail.com Sun Jun 5 12:41:06 2016 From: stjeansam at gmail.com (Samuel St-Jean) Date: Sun, 5 Jun 2016 18:41:06 +0200 Subject: [Neuroimaging] Neuroimaging Digest, Vol 13, Issue 5 In-Reply-To: <57545127.e9c2420a.890f1.ffffcd71@mx.google.com> References: <57545127.e9c2420a.890f1.ffffcd71@mx.google.com> Message-ID: <809e2123-207b-e44d-5cf4-d28bc5811c63@gmail.com> A quick beginner guide to python would probably help you https://www.python.org/about/gettingstarted/ As a quick tip, most stuff can be easily installed with pip like this pip install dipy nibabel numpy scipy pywavelets or any other library available from pypi. For other software, most of them are also easily installable through a setup.py file, but I strongly suggest to read a bit about python and how it works first. As for denoising, well, there is nlmeans in dipy (good for structural imaging for example), some other algorithms are probably available in pytohn if you look around and apparently a pr for wavelet mixing denoising is in the works, but it has not been reviewed so far, and you would need to checkout and use the guy branch to try it out. Try to get the examples/some test code running, importing modules and such on your own and we can see afterward how it goes. Le 2016-06-05 ? 18:19, Vivek Joshi a ?crit : > Hello > Can you tell me the steps to install dipy from a release. > Steps to install modules like pywavelets nibabel and modules for > adaptive denoising. > Thank you > ------------------------------------------------------------------------ > From: neuroimaging-request at python.org > > Sent: ?05-?06-?2016 21:33 > To: neuroimaging at python.org > Subject: Neuroimaging Digest, Vol 13, Issue 5 > > Send Neuroimaging mailing list submissions to > neuroimaging at python.org > > To subscribe or unsubscribe via the World Wide Web, visit > https://mail.python.org/mailman/listinfo/neuroimaging > or, via email, send a message with subject or body 'help' to > neuroimaging-request at python.org > > You can reach the person managing the list at > neuroimaging-owner at python.org > > When replying, please edit your Subject line so it is more specific > than "Re: Contents of Neuroimaging digest..." > > > Today's Topics: > > 1. errors installing other modules in dipy (Vivek Joshi) > 2. Re: errors installing other modules in dipy (Ariel Rokem) > > > ---------------------------------------------------------------------- > > Message: 1 > Date: Sun, 5 Jun 2016 16:00:07 +0530 > From: Vivek Joshi > To: neuroimaging at python.org > Subject: [Neuroimaging] errors installing other modules in dipy > Message-ID: > > Content-Type: text/plain; charset="utf-8" > > hello > I ve installed dipy using the instructions given in the dipy website. But > when trying to install other modules my setup.py got overwritten. Now i m > getting errors trying to fetch sherbrooke_3shell dataset. > Also i want to denoise the dataset using wavelet denoising. For that i > need > to install other modules. How to do that? > -------------- next part -------------- > An HTML attachment was scrubbed... 
> URL: > > > ------------------------------ > > Message: 2 > Date: Sun, 5 Jun 2016 08:51:50 -0700 > From: Ariel Rokem > To: Neuroimaging analysis in Python > Subject: Re: [Neuroimaging] errors installing other modules in dipy > Message-ID: > > Content-Type: text/plain; charset="utf-8" > > Hi Vivek, > > On Sun, Jun 5, 2016 at 3:30 AM, Vivek Joshi > wrote: > > > hello > > I ve installed dipy using the instructions given in the dipy > website. But > > when trying to install other modules my setup.py got overwritten. > Now i m > > getting errors trying to fetch sherbrooke_3shell dataset. > > > > I am not exactly sure what you mean. Could you tell me exactly what > you did > (step by step)? Which instructions did you follow? The ones to install > from > the source code ( > http://nipy.org/dipy/installation.html#installing-from-source)? Or for > installing a released version (I think that would be better in this case: > http://nipy.org/dipy/installation.html#installing-a-release)? > > > > Also i want to denoise the dataset using wavelet denoising. For that i > > need to install other modules. How to do that? > > > > Again - I am not exactly sure what you mean. Do you mean that you want to > install other libraries (e.g., pywavelets > )? You should be able to do that in > parallel to your Dipy installation. > > Cheers, > > Ariel > > > > > _______________________________________________ > > Neuroimaging mailing list > > Neuroimaging at python.org > > https://mail.python.org/mailman/listinfo/neuroimaging > > > > > -------------- next part -------------- > An HTML attachment was scrubbed... > URL: > > > ------------------------------ > > Subject: Digest Footer > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > > ------------------------------ > > End of Neuroimaging Digest, Vol 13, Issue 5 > ******************************************* > > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging -------------- next part -------------- An HTML attachment was scrubbed... URL: From vivekjoshi1894 at gmail.com Thu Jun 9 08:44:24 2016 From: vivekjoshi1894 at gmail.com (Vivek Joshi) Date: Thu, 9 Jun 2016 18:14:24 +0530 Subject: [Neuroimaging] regarding other denoising tecniques for 3d mri datasets Message-ID: Hello The denoising technique mentioned in the dipy examples is NLM means technique. Is it possible to denoise the Datasets with 3d wavelet subband mixing tecnique and a LMMSE statistical approach? In case of 3d wavelet subband mixing, how to install supreme.lib module so that we can access pywt? Thank you -------------- next part -------------- An HTML attachment was scrubbed... URL: From stjeansam at gmail.com Fri Jun 10 08:01:21 2016 From: stjeansam at gmail.com (Samuel St-Jean) Date: Fri, 10 Jun 2016 14:01:21 +0200 Subject: [Neuroimaging] regarding other denoising tecniques for 3d mri datasets In-Reply-To: References: Message-ID: I don't see why it would not be possible, of course you would to roll out your own version. 
I have to admit I do not fully understand the question, so here is what I could make out of it: For the lmmse approach, there is a dwi version made for denoising and the authors give some matlab code [1], so that would give you a starting point to turn it into a 3D version (I don't remember all the inner details, so maybe a 3D formulation is not valid, but anyway, that is the version I know of; the statistical framework itself is probably valid for any type of data). As for the lib you mention, if it is this one https://github.com/stefanv/supreme, running the setup.py would do the job. Or you could try writing a nice email to the author also (he might even be reading this mailing list, who knows) for more specific answers. Thirdly, I also remember you asked about piesno. As a word of caution, it is designed as an automatic noise estimator for repeated 3D volumes (dmri, fmri, anything time-series related I'd say) relying on the 'flatness' of the signal along the 4th dimension. So, it would unfortunately not work in theory for 3D mri volumes. I am not saying it is impossible, just be careful how you feed the data to the function. In any case, you can have a look at the inner piesno function, as the public one is a thin wrapper over slices, so it is possible to modify the iterations to work over 3D smartly provided you segment the background yourself, for example, or even just use the noise estimator directly over segmented noise, bypassing the histogram estimation. 2016-06-09 14:44 GMT+02:00 Vivek Joshi : > Hello > The denoising technique mentioned in the dipy examples is the NLM means > technique. > Is it possible to denoise the datasets with a 3d wavelet subband mixing > technique and a LMMSE statistical approach? In case of 3d wavelet subband > mixing, how to install the supreme.lib module so that we can access pywt? > Thank you > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From vivekjoshi1894 at gmail.com Sat Jun 11 06:15:29 2016 From: vivekjoshi1894 at gmail.com (Vivek Joshi) Date: Sat, 11 Jun 2016 15:45:29 +0530 Subject: [Neuroimaging] nlsam installation Message-ID: While installing nlsam from the release we have been getting an error. We installed all modules except spams; while installing spams we get the above error. Can you please help me in denoising the dipy datasets using nlsam? Thank you -------------- next part -------------- An HTML attachment was scrubbed... URL: From stjeansam at gmail.com Sat Jun 11 07:54:15 2016 From: stjeansam at gmail.com (Samuel St-Jean) Date: Sat, 11 Jun 2016 13:54:15 +0200 Subject: Re: [Neuroimaging] nlsam installation In-Reply-To: References: Message-ID: <7e267494-1365-2540-3d55-499e5562a063@gmail.com> Well, for this one I would need the error message, but since this is a public mailing list you can write me directly instead if you need more help (yes, I am the same guy that wrote the software, small world after all). As for the installation, normally if you run pip install https://github.com/samuelstjean/nlsam/archive/master.zip --user --process-dependency-links it should fetch master and compile it, including spams.
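For reference, the non-local means route that already ships with dipy, mentioned earlier in this thread, looks roughly like this. This is a minimal sketch, assuming a dipy release from around this time (0.11 or so) where dipy.denoise provides nlmeans and estimate_sigma; the file name is hypothetical and the exact call signatures should be double-checked against the documentation of the installed version:

import numpy as np
import nibabel
from dipy.denoise.nlmeans import nlmeans
from dipy.denoise.noise_estimate import estimate_sigma

img = nibabel.load('volume.nii')    # hypothetical input file
data = img.get_data()

# crude global noise estimate; piesno is the alternative for repeated (4D) acquisitions
sigma = float(np.mean(estimate_sigma(data)))

denoised = nlmeans(data, sigma=sigma, rician=True)
nibabel.save(nibabel.Nifti1Image(denoised, img.affine), 'volume_denoised.nii')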
It actually fetches spams from my github repo, but the original installer is here http://spams-devel.gforge.inria.fr/downloads.html Doing pip install above_source_zip_file will build it for you, or you can fetch the prebuilt 2.4 for windows from the same website (so that way you don't need to install visual studio to build the whole thing.) I never tried the install on a mac personally, but people running macs have never reported that it does not work, so it should be fine. Spams is only for python2 as an official release, so that could be your problem also. I actually have an un-official version for python3, just requires a few patches made by the fine people over at archlinux (https://aur.archlinux.org/packages/python-spams-svn/). Here is the actual patched version, quickly put together until I (maybe) put a cleaner release on said repo or until they officially rebuild it themselves https://www.dropbox.com/s/wv7562pvhc37o75/spams-python.zip?dl=0 Or there is also the prebuilt nlsam version under the releases over at github for windows and linux if you want to give it a test run also. Le 2016-06-11 à 12:15, Vivek Joshi a écrit : > While installing nlsam using release we have been getting an error. > we installed all modules except spams. while installing spams we get > the above error. > Can you please help me in denoising the dipy datasets using nlsam? > thank you > > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging -------------- next part -------------- An HTML attachment was scrubbed... URL: From decom0405 at gmail.com Sun Jun 12 07:56:48 2016 From: decom0405 at gmail.com (Wook Kim) Date: Sun, 12 Jun 2016 20:56:48 +0900 Subject: [Neuroimaging] Dear. staff of Neuroimaging in python Message-ID: <82ECE4D9-F5ED-4E20-990B-C6E12C25EA15@gmail.com> Hi. I'm a researcher at the Korea Institutes of Radiological and Medical science (Kirams). These days our research team wants to learn about the analysis of fMRI studies. So I was interested in using Python, and recently I got to know about Nipy!!! I really hope to use Nipy in our research, but unfortunately I have little experience with Python. So please help me use Nipy for fMRI studies, and please share a manual or tutorial for Nipy if you have one. (I accessed the Nipy web site and tried your tutorial, but it is hard for me, sorry.) Thanks. From kats.vassia at gmail.com Sun Jun 12 09:46:56 2016 From: kats.vassia at gmail.com (Vassia Katsageorgiou) Date: Sun, 12 Jun 2016 15:46:56 +0200 Subject: [Neuroimaging] Problem with from dipy.viz.fvtk Message-ID: Hello, I recently installed dipy and all the required additional packages, including vtk, but when I try to use fvtk for visualization (for example "r=fvtk.ren()", or "fvtk.line()") I get the error "no module ren, line..". I searched this error and I found these posts: https://github.com/nipy/dipy/issues/1016 https://neurostars.org/p/3724/ saying that vtk needs to be installed (which in my case is installed). I am using python2 under manjaro linux distribution and I have installed vtk-7, I tried also with vtk-6, but the same happens. I also tried to add in the python path the path to vtk's installation and the problem remains. Did anyone using linux have the same problem? Any suggestion? Thank you. Vassia -- *Vasiliki-Maria Katsageorgiou*, M.Eng.
Fellow-PhD, Pattern Analysis and Computer Vision Istituto Italiano di Tecnologia Genova, Italy Phone number: +39 3899171853 E-mail: kats.vassia at gmail.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From arokem at gmail.com Sun Jun 12 12:58:39 2016 From: arokem at gmail.com (Ariel Rokem) Date: Sun, 12 Jun 2016 09:58:39 -0700 Subject: [Neuroimaging] Problem with from dipy.viz.fvtk In-Reply-To: References: Message-ID: Hi Vassia, Thanks for your email. Let's see if I can be helpful. On Sun, Jun 12, 2016 at 6:46 AM, Vassia Katsageorgiou wrote: > Hello, > > I recently installed dipy and all the required additional packages, > including vtk, but when I try to use fvtk for visualization (for example > "r=fvtk.ren()", or "fvtk.line()") I get the error "no module ren, line..". > I searched this error and I found these posts: > > https://github.com/nipy/dipy/issues/1016 > https://neurostars.org/p/3724/ > > saying that vtk needs to be installed (which in my case is installed). I > am using python2 under manjaro linux distribution and I have installed > vtk-7, I tried also with vtk-6, but the same happens. I also tried to add > in the python path the path to vtk's installation and the problem remains. > Did anyone using linux have the same problem? Any suggestion? > But first let me just make sure that I understand what is going on. What happens when you `import vtk` in Python? Cheers, Ariel > Thank you. > Vassia > > -- > *Vasiliki-Maria Katsageorgiou*, M.Eng. > > Fellow-PhD, Pattern Analysis and Computer Vision > Istituto Italiano di Tecnologia > Genova, Italy > > Phone number: +39 3899171853 > E-mail: kats.vassia at gmail.com > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From kats.vassia at gmail.com Sun Jun 12 13:25:26 2016 From: kats.vassia at gmail.com (Vassia Katsageorgiou) Date: Sun, 12 Jun 2016 19:25:26 +0200 Subject: [Neuroimaging] Problem with from dipy.viz.fvtk In-Reply-To: References: Message-ID: Hi Ariel, Thank you very much for the reply! So, this is what happens if you import vtk in python: --------------------------------------------------------------------------- ImportError Traceback (most recent call last) in () ----> 1 import vtk /usr/lib/python2.7/site-packages/vtk/__init__.py in () 88 from .vtkParallelCore import * 89 from .vtkFiltersAMR import * ---> 90 from .vtkIOAMR import * 91 from .vtkRenderingVolumeOpenGL2 import * 92 from .vtkFiltersFlowPaths import * /usr/lib/python2.7/site-packages/vtk/vtkIOAMR.py in () 7 # during build and testing, the modules will be elsewhere, 8 # e.g. in lib directory or Release/Debug config directories ----> 9 from vtkIOAMRPython import * ImportError: No module named vtkIOAMRPython I found this link, which I think will work: http://ghoshbishakh.github.io/blog/blogpost/2016/03/05/buid-vtk.html Though, after "cmake ." I get another error: CMake Error at CMake/vtkCompilerExtras.cmake:47 (if): if given arguments: "cc: error: ARGS: No such file or directory cc (GCC) 6.1.1 20160501 Copyright (C) 2016 Free Software Foundation, Inc. This is free software" " see the source for copying conditions. There is NO warranty" " not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
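When fvtk is missing names like ren and line, the underlying question is usually whether import vtk works at all and, if it does, which VTK build gets picked up. A small diagnostic sketch, not specific to dipy:

import vtk

# which installation Python is importing, and which VTK release it is
print(vtk.__file__)
print(vtk.vtkVersion.GetVTKVersion())

If the import itself fails, as it does in the reply that follows, dipy only defines the fvtk helpers when VTK is importable, so a broken VTK install shows up as apparently missing attributes rather than as an explicit VTK error.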
" "VERSION_GREATER" "4.2.0" "AND" "BUILD_SHARED_LIBS" "AND" "HAVE_GCC_VISIBILITY" "AND" "VTK_USE_GCC_VISIBILITY" "AND" "NOT" "MINGW" "AND" "NOT" "CYGWIN" Unknown arguments specified Call Stack (most recent call first): CMakeLists.txt:286 (include) -- Configuring incomplete, errors occurred! Which I think is due to gcc's compilation.. I am trying to fix this.. If I manage to, I will report it! But of course, if you had to suggest something different, it would be welcomed! :) Thanks again, Vassia On 12 June 2016 at 18:58, Ariel Rokem wrote: > Hi Vassia, > > Thanks for your email. Let's see if I can be helpful. > On Sun, Jun 12, 2016 at 6:46 AM, Vassia Katsageorgiou < > kats.vassia at gmail.com> wrote: > >> Hello, >> >> I recently installed dipy and all the required additional packages, >> including vtk, but when I try to use fvtk for visualization (for example >> "r=fvtk.ren()", or "fvtk.line()") I get the error "no module ren, line..". >> I searched this error and I found these posts: >> >> https://github.com/nipy/dipy/issues/1016 >> https://neurostars.org/p/3724/ >> >> saying that vtk needs to be installed (which in my case is installed). I >> am using python2 under manjaro linux distribution and I have installed >> vtk-7, I tried also with vtk-6, but the same happens. I also tried to add >> in the python path the path to vtk's installation and the problem remains. >> Did anyone using linux have the same problem? Any suggestion? >> > > But first let me just make sure that I understand what is going on. What > happens when you `import vtk` in Python? > > Cheers, > > Ariel > > >> Thank you. >> Vassia >> >> -- >> *Vasiliki-Maria Katsageorgiou*, M.Eng. >> >> Fellow-PhD, Pattern Analysis and Computer Vision >> Istituto Italiano di Tecnologia >> Genova, Italy >> >> Phone number: +39 3899171853 >> E-mail: kats.vassia at gmail.com >> >> _______________________________________________ >> Neuroimaging mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging >> >> > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -- *Vasiliki-Maria Katsageorgiou*, M.Eng. Fellow-PhD, Pattern Analysis and Computer Vision Istituto Italiano di Tecnologia Genova, Italy Phone number: +39 3899171853 E-mail: kats.vassia at gmail.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From stjeansam at gmail.com Sun Jun 12 14:13:57 2016 From: stjeansam at gmail.com (Samuel St-Jean) Date: Sun, 12 Jun 2016 20:13:57 +0200 Subject: [Neuroimaging] Problem with from dipy.viz.fvtk In-Reply-To: References: Message-ID: <46cdd2a0-fb7a-d08b-72f1-58ce3454a077@gmail.com> I am currently on arch (manjaro uses the same repo for the most part), and I just realized it is indeed plain broken, I'll flag a bug report for that. The repos also have vtk6, which get installed to a non standard path, and is also broken unfortunately. If you don't need it right now, they usually fix and rebuild stuff pretty fast, so might be worth to wait a few days before compiling everything in cmake if it's not urgent. Le 2016-06-12 ? 19:25, Vassia Katsageorgiou a ?crit : > Hi Ariel, > > Thank you very much for the reply! 
> So, this is what happens if you import vtk in python: > > --------------------------------------------------------------------------- > ImportError Traceback (most recent call last) > in () > ----> 1 import vtk > > /usr/lib/python2.7/site-packages/vtk/__init__.py in () > 88 from .vtkParallelCore import * > 89 from .vtkFiltersAMR import * > ---> 90 from .vtkIOAMR import * > 91 from .vtkRenderingVolumeOpenGL2 import * > 92 from .vtkFiltersFlowPaths import * > > /usr/lib/python2.7/site-packages/vtk/vtkIOAMR.py in () > 7 # during build and testing, the modules will be elsewhere, > 8 # e.g. in lib directory or Release/Debug config directories > ----> 9 from vtkIOAMRPython import * > > ImportError: No module named vtkIOAMRPython > > I found this link, which I think will work: > http://ghoshbishakh.github.io/blog/blogpost/2016/03/05/buid-vtk.html > > Though, after "cmake ." I get another error: > > CMake Error at CMake/vtkCompilerExtras.cmake:47 (if): > if given arguments: > > "cc: error: ARGS: No such file or directory > > cc (GCC) 6.1.1 20160501 > > Copyright (C) 2016 Free Software Foundation, Inc. > > This is free software" " see the source for copying conditions. > There is > NO > > warranty" " not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR > PURPOSE. > > > > " "VERSION_GREATER" "4.2.0" "AND" "BUILD_SHARED_LIBS" "AND" > "HAVE_GCC_VISIBILITY" "AND" "VTK_USE_GCC_VISIBILITY" "AND" "NOT" "MINGW" > "AND" "NOT" "CYGWIN" > > Unknown arguments specified > Call Stack (most recent call first): > CMakeLists.txt:286 (include) > > > -- Configuring incomplete, errors occurred! > > Which I think is due to gcc's compilation.. I am trying to fix this.. > If I manage to, I will report it! > > But of course, if you had to suggest something different, it would be > welcomed! :) > > Thanks again, > Vassia > > > On 12 June 2016 at 18:58, Ariel Rokem > wrote: > > Hi Vassia, > > Thanks for your email. Let's see if I can be helpful. > On Sun, Jun 12, 2016 at 6:46 AM, Vassia Katsageorgiou > > wrote: > > Hello, > > I recently installed dipy and all the required additional > packages, including vtk, but when I try to use fvtk for > visualization (for example "r=fvtk.ren()", or "fvtk.line()") I > get the error "no module ren, line..". > I searched this error and I found these posts: > > https://github.com/nipy/dipy/issues/1016 > https://neurostars.org/p/3724/ > > saying that vtk needs to be installed (which in my case is > installed). I am using python2 under manjaro linux > distribution and I have installed vtk-7, I tried also with > vtk-6, but the same happens. I also tried to add in the python > path the path to vtk's installation and the problem remains. > Did anyone using linux have the same problem? Any suggestion? > > > But first let me just make sure that I understand what is going > on. What happens when you `import vtk` in Python? > > Cheers, > > Ariel > > Thank you. > Vassia > > -- > *Vasiliki-Maria Katsageorgiou*, M.Eng. > > Fellow-PhD, Pattern Analysis and Computer Vision > Istituto Italiano di Tecnologia > Genova, Italy > > Phone number: +39 3899171853 > E-mail: kats.vassia at gmail.com > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > > > > -- > *Vasiliki-Maria Katsageorgiou*, M.Eng. 
> > Fellow-PhD, Pattern Analysis and Computer Vision > Istituto Italiano di Tecnologia > Genova, Italy > > Phone number: +39 3899171853 > E-mail: kats.vassia at gmail.com > > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging -------------- next part -------------- An HTML attachment was scrubbed... URL: From berleant at stanford.edu Mon Jun 13 02:19:59 2016 From: berleant at stanford.edu (Shoshana Berleant) Date: Mon, 13 Jun 2016 06:19:59 +0000 Subject: [Neuroimaging] Dear. staff of Neuroimaging in python In-Reply-To: <82ECE4D9-F5ED-4E20-990B-C6E12C25EA15@gmail.com> References: <82ECE4D9-F5ED-4E20-990B-C6E12C25EA15@gmail.com> Message-ID: For installation: https://github.com/nipy/nipype/blob/master/doc/users/install.rst List of tutorials: http://nipy.org/nipype/users/pipeline_tutorial.html I am a little unclear on what you are asking for. Hope that helps. On Sun, Jun 12, 2016 at 4:56 AM Wook Kim wrote: > Hi. > I?m the research of Korea Institutes of Radiological and Medical science > (Kirams) > In these day our research team want to know about the analysis of fMRI > study. > So I was interested in using the python. recently I was know the Nipy !!! > And I am really hope that using the Nipy in our research. > But unfortunately I am poor experience of python. > So please help me using the Nipy in fMRI study and if you have some kind > of manual of using Nipy > or tutorial of Nipy. > (When i assessed the Nipy wed site and try to your site of tutorial, but > it is hard to me? sorry) > > Thanks. > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > -------------- next part -------------- An HTML attachment was scrubbed... URL: From kats.vassia at gmail.com Mon Jun 13 07:39:17 2016 From: kats.vassia at gmail.com (Vassia Katsageorgiou) Date: Mon, 13 Jun 2016 13:39:17 +0200 Subject: [Neuroimaging] Problem with from dipy.viz.fvtk In-Reply-To: <46cdd2a0-fb7a-d08b-72f1-58ce3454a077@gmail.com> References: <46cdd2a0-fb7a-d08b-72f1-58ce3454a077@gmail.com> Message-ID: Hi Samuel, thank you for the reply! Finally there is solution for arch linux (also manjaro)! So, in case someone has the same problem, he needs to install *vtk-qt4 7.0.0-2 *(here: https://aur.archlinux.org/packages/vtk-qt4/) and it works! Cheers, Vassia On 12 June 2016 at 20:13, Samuel St-Jean wrote: > I am currently on arch (manjaro uses the same repo for the most part), and > I just realized it is indeed plain broken, I'll flag a bug report for that. > The repos also have vtk6, which get installed to a non standard path, and > is also broken unfortunately. > > > If you don't need it right now, they usually fix and rebuild stuff pretty > fast, so might be worth to wait a few days before compiling everything in > cmake if it's not urgent. > > Le 2016-06-12 ? 19:25, Vassia Katsageorgiou a ?crit : > > Hi Ariel, > > Thank you very much for the reply! 
> So, this is what happens if you import vtk in python: > > --------------------------------------------------------------------------- > ImportError Traceback (most recent call last) > in () > ----> 1 import vtk > > /usr/lib/python2.7/site-packages/vtk/__init__.py in () > 88 from .vtkParallelCore import * > 89 from .vtkFiltersAMR import * > ---> 90 from .vtkIOAMR import * > 91 from .vtkRenderingVolumeOpenGL2 import * > 92 from .vtkFiltersFlowPaths import * > > /usr/lib/python2.7/site-packages/vtk/vtkIOAMR.py in () > 7 # during build and testing, the modules will be elsewhere, > 8 # e.g. in lib directory or Release/Debug config directories > ----> 9 from vtkIOAMRPython import * > > ImportError: No module named vtkIOAMRPython > > I found this link, which I think will work: > > http://ghoshbishakh.github.io/blog/blogpost/2016/03/05/buid-vtk.html > > Though, after "cmake ." I get another error: > > CMake Error at CMake/vtkCompilerExtras.cmake:47 (if): > if given arguments: > > "cc: error: ARGS: No such file or directory > > cc (GCC) 6.1.1 20160501 > > Copyright (C) 2016 Free Software Foundation, Inc. > > This is free software" " see the source for copying conditions. There is > NO > > warranty" " not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR > PURPOSE. > > > > " "VERSION_GREATER" "4.2.0" "AND" "BUILD_SHARED_LIBS" "AND" > "HAVE_GCC_VISIBILITY" "AND" "VTK_USE_GCC_VISIBILITY" "AND" "NOT" "MINGW" > "AND" "NOT" "CYGWIN" > > Unknown arguments specified > Call Stack (most recent call first): > CMakeLists.txt:286 (include) > > > -- Configuring incomplete, errors occurred! > > Which I think is due to gcc's compilation.. I am trying to fix this.. If I > manage to, I will report it! > > But of course, if you had to suggest something different, it would be > welcomed! :) > > Thanks again, > Vassia > > > On 12 June 2016 at 18:58, Ariel Rokem wrote: > >> Hi Vassia, >> >> Thanks for your email. Let's see if I can be helpful. >> On Sun, Jun 12, 2016 at 6:46 AM, Vassia Katsageorgiou < >> kats.vassia at gmail.com> wrote: >> >>> Hello, >>> >>> I recently installed dipy and all the required additional packages, >>> including vtk, but when I try to use fvtk for visualization (for example >>> "r=fvtk.ren()", or "fvtk.line()") I get the error "no module ren, line..". >>> I searched this error and I found these posts: >>> >>> https://github.com/nipy/dipy/issues/1016 >>> https://neurostars.org/p/3724/ >>> >>> saying that vtk needs to be installed (which in my case is installed). I >>> am using python2 under manjaro linux distribution and I have installed >>> vtk-7, I tried also with vtk-6, but the same happens. I also tried to add >>> in the python path the path to vtk's installation and the problem remains. >>> Did anyone using linux have the same problem? Any suggestion? >>> >> >> But first let me just make sure that I understand what is going on. What >> happens when you `import vtk` in Python? >> >> Cheers, >> >> Ariel >> >> >>> Thank you. >>> Vassia >>> >>> -- >>> *Vasiliki-Maria Katsageorgiou*, M.Eng. 
>>> >>> Fellow-PhD, Pattern Analysis and Computer Vision >>> Istituto Italiano di Tecnologia >>> Genova, Italy >>> >>> Phone number: +39 3899171853 <%2B39%203899171853> >>> E-mail: kats.vassia at gmail.com >>> >>> _______________________________________________ >>> Neuroimaging mailing list >>> Neuroimaging at python.org >>> https://mail.python.org/mailman/listinfo/neuroimaging >>> >>> >> >> _______________________________________________ >> Neuroimaging mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging >> >> > > > -- > *Vasiliki-Maria Katsageorgiou*, M.Eng. > > Fellow-PhD, Pattern Analysis and Computer Vision > Istituto Italiano di Tecnologia > Genova, Italy > > Phone number: +39 3899171853 > E-mail: kats.vassia at gmail.com > > > _______________________________________________ > Neuroimaging mailing listNeuroimaging at python.orghttps://mail.python.org/mailman/listinfo/neuroimaging > > > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -- *Vasiliki-Maria Katsageorgiou*, M.Eng. Fellow-PhD, Pattern Analysis and Computer Vision Istituto Italiano di Tecnologia Genova, Italy Phone number: +39 3899171853 E-mail: kats.vassia at gmail.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From michael.patrick.dean at gmail.com Mon Jun 13 09:44:39 2016 From: michael.patrick.dean at gmail.com (Michael Dean) Date: Mon, 13 Jun 2016 08:44:39 -0500 Subject: [Neuroimaging] Conversion of DICOM to Nifti1 Message-ID: Hi all, Trying to reconstruct an older code and was wondering if there are any functions on dipy that converts DICO images into the Nifti1Image format or not. If not, what would you recommend using in order to do this process? Thanks for any advice! -------------- next part -------------- An HTML attachment was scrubbed... URL: From marmaduke.woodman at univ-amu.fr Mon Jun 13 10:56:45 2016 From: marmaduke.woodman at univ-amu.fr (Marmaduke Woodman) Date: Mon, 13 Jun 2016 07:56:45 -0700 Subject: [Neuroimaging] Using Python libs from MATLAB Message-ID: hi all, I'm a dev behind the Virtual Brain (TVB) a neuroimaging simulation library, written in Python. As part of an effort to reach out to some users stuck in MATLAB, I've been testing MATLAB's recent *official* support for Python (2014b+). It is notably better than the many other attempts because it appears to use Py's C API to expose objects and methods as MATLAB objects and methods instead of just providing eval/exec. Generally it's working, but there are some workarounds to avoid segfaults, errors and lack of stdout/err. I've started to collect a series of tips and workarounds and figured there might be common interest in the community for a "helper" library so all our projects can benefit from a larger user base, and maybe ease the transition for potential Python users. 
The most important in order, - Anything touching HDF5 must use exactly the same version as MATLAB or segfault - Scipy.io.{save,load}mat segfault - C++ extension modules whose exception model doesn't match MATLAB's, segfault on exception - Linalg segfault unless using MKL (which is what MATLAB uses) - stdout/err and logging must be redirecting to mexPrintf via a ctypes monkey patch - Numpy/MATLAB array conversion is O(1) op, but can probably be made zero copy with a ctypes + lib mx workaround Aside from these (for which I sent a service request to MathWorks), it's working well and we (TVB) expect our next release to include demo scripts in MATLAB. So, again, if there is common interest in collecting notes and a helper library to monkey patch around the segfaults, it'd be great to not do this alone ;) cheers, Marmaduke -------------- next part -------------- An HTML attachment was scrubbed... URL: From garyfallidis at gmail.com Mon Jun 13 11:09:37 2016 From: garyfallidis at gmail.com (Eleftherios Garyfallidis) Date: Mon, 13 Jun 2016 15:09:37 +0000 Subject: [Neuroimaging] Using Python libs from MATLAB In-Reply-To: References: Message-ID: Thank you Marmaduke this is really helpful. We get many requests for running our libraries from inside Matlab or interfacing with Matlab. I hope we can help resolve the different issues. Have a great day and thank you for the feedback. Cheers, Eleftherios On Mon, Jun 13, 2016 at 10:58 AM Marmaduke Woodman < marmaduke.woodman at univ-amu.fr> wrote: > hi all, > > I'm a dev behind the Virtual Brain (TVB) a neuroimaging simulation > library, written in Python. As part of an effort to reach out to some users > stuck in MATLAB, I've been testing MATLAB's recent *official* support for > Python (2014b+). It is notably better than the many other attempts because > it appears to use Py's C API to expose objects and methods as MATLAB > objects and methods instead of just providing eval/exec. > > Generally it's working, but there are some workarounds to avoid segfaults, > errors and lack of stdout/err. I've started to collect a series of tips and > workarounds and figured there might be common interest in the community for > a "helper" library so all our projects can benefit from a larger user base, > and maybe ease the transition for potential Python users. > > The most important in order, > > - Anything touching HDF5 must use exactly the same version as MATLAB or > segfault > - Scipy.io.{save,load}mat segfault > - C++ extension modules whose exception model doesn't match MATLAB's, > segfault on exception > - Linalg segfault unless using MKL (which is what MATLAB uses) > - stdout/err and logging must be redirecting to mexPrintf via a ctypes > monkey patch > - Numpy/MATLAB array conversion is O(1) op, but can probably be made zero > copy with a ctypes + lib mx workaround > > Aside from these (for which I sent a service request to MathWorks), it's > working well and we (TVB) expect our next release to include demo scripts > in MATLAB. > > So, again, if there is common interest in collecting notes and a helper > library to monkey patch around the segfaults, it'd be great to not do this > alone ;) > > cheers, > Marmaduke > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > -------------- next part -------------- An HTML attachment was scrubbed... 
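On the first item in that list, the HDF5 version clash, the Python side of the comparison is easy to read off: h5py reports the HDF5 library it was built against. A minimal sketch; the MATLAB-side call mentioned in the comment is from memory and should be double-checked against your MATLAB release:

import h5py

# the HDF5 library h5py is linked against; this is what has to match MATLAB's
print(h5py.version.version)        # h5py itself
print(h5py.version.hdf5_version)   # the underlying HDF5 library
# in MATLAB, H5.get_libversion() should report the HDF5 version MATLAB ships

If the two differ, two copies of HDF5 end up loaded into the same process, which is consistent with the segfaults described above.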
URL: From lists at onerussian.com Mon Jun 13 10:44:01 2016 From: lists at onerussian.com (Yaroslav Halchenko) Date: Mon, 13 Jun 2016 10:44:01 -0400 Subject: [Neuroimaging] Conversion of DICOM to Nifti1 In-Reply-To: References: Message-ID: <20160613144401.GL11174@onerussian.com> On Mon, 13 Jun 2016, Michael Dean wrote: > Trying to reconstruct an older code and was wondering if there are any > functions on dipy that converts DICO images into the Nifti1Image format or > not. If not, what would you recommend using in order to do this process? > Thanks for any advice! There is a number of tools out there $> apt-cache dicom nifti # filtering output manually a bit python-dcmstack - DICOM to Nifti conversion dicomnifti - converts DICOM files into the NIfTI format mriconvert - medical image file conversion utility mrtrix - diffusion-weighted MRI white matter tractography nifti2dicom - convert 3D medical images to DICOM 2D series qnifti2dicom - convert 3D medical images to DICOM 2D series (gui) mitools - view, convert and perform basic maths with medical image datasets plastimatch - medical image reconstruction and registration mricron - magnetic resonance image conversion, viewing and analysis dcm2niix - converts DICOM and PAR/REC files into the NIfTI format heudiconv - DICOM converter with support for structure heuristics try e.g. https://github.com/neurolabusc/dcm2niix recent version is within NeuroDebian happen you are using it -- Yaroslav O. Halchenko Center for Open Neuroscience http://centerforopenneuroscience.org Dartmouth College, 419 Moore Hall, Hinman Box 6207, Hanover, NH 03755 Phone: +1 (603) 646-9834 Fax: +1 (603) 646-1419 WWW: http://www.linkedin.com/in/yarik From gael.varoquaux at normalesup.org Mon Jun 13 12:33:05 2016 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Mon, 13 Jun 2016 18:33:05 +0200 Subject: [Neuroimaging] New nilearn release Message-ID: <20160613163305.GA4109022@phare.normalesup.org> Dear neuroimagers, We have just released a new version (0.2.5) of nilearn: machine learning for neuroimaging. http://nilearn.github.io/ This is an incremental release. The highlights are more didactic docs and examples, as well as hemispheric plotting for glass brain and connectomes, visible at the bottom of the following examples: http://nilearn.github.io/auto_examples/01_plotting/plot_demo_glass_brain.html http://nilearn.github.io/auto_examples/03_connectivity/plot_multi_subject_connectome.html Thanks to all the contributors: http://nilearn.github.io/whats_new.html Ga?l From matthew.brett at gmail.com Mon Jun 13 12:47:25 2016 From: matthew.brett at gmail.com (Matthew Brett) Date: Mon, 13 Jun 2016 09:47:25 -0700 Subject: [Neuroimaging] Using Python libs from MATLAB In-Reply-To: References: Message-ID: Hi, On Mon, Jun 13, 2016 at 7:56 AM, Marmaduke Woodman wrote: > hi all, > > I'm a dev behind the Virtual Brain (TVB) a neuroimaging simulation library, > written in Python. As part of an effort to reach out to some users stuck in > MATLAB, I've been testing MATLAB's recent *official* support for Python > (2014b+). It is notably better than the many other attempts because it > appears to use Py's C API to expose objects and methods as MATLAB objects > and methods instead of just providing eval/exec. > > Generally it's working, but there are some workarounds to avoid segfaults, > errors and lack of stdout/err. 
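For a pure-Python take on the DICOM-to-NIfTI question above, pydicom plus nibabel can cover the simple single-series case. This is only a sketch under strong assumptions (one series per directory, the path is a placeholder, slices ordered by ImagePositionPatient, and the affine ignores orientation and origin), so the dedicated converters listed above, dcm2niix in particular, remain the safer default:

    import os
    import numpy as np
    import dicom          # pydicom < 1.0 imports as "dicom"; newer releases use "import pydicom"
    import nibabel as nib

    dicom_dir = 'path/to/one_series'   # placeholder: a directory holding a single series
    slices = [dicom.read_file(os.path.join(dicom_dir, f))
              for f in os.listdir(dicom_dir)]
    # Order slices along the scanner z axis.
    slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))

    vol = np.stack([s.pixel_array for s in slices], axis=-1)

    # Crude affine from in-plane pixel spacing and slice gap only.
    dy, dx = [float(v) for v in slices[0].PixelSpacing]
    dz = abs(float(slices[1].ImagePositionPatient[2]) -
             float(slices[0].ImagePositionPatient[2]))
    affine = np.diag([dx, dy, dz, 1.0])

    nib.save(nib.Nifti1Image(vol, affine), 'converted.nii.gz')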
I've started to collect a series of tips and > workarounds and figured there might be common interest in the community for > a "helper" library so all our projects can benefit from a larger user base, > and maybe ease the transition for potential Python users. > > The most important in order, > > - Anything touching HDF5 must use exactly the same version as MATLAB or > segfault Ouch - I guess this means it's not possible to use h5py or pytables? > - Scipy.io.{save,load}mat segfault That's interesting - it's probably possible to fix that, because that stuff doesn't call any external libraries. Do you have any more details? > - C++ extension modules whose exception model doesn't match MATLAB's, > segfault on exception > - Linalg segfault unless using MKL (which is what MATLAB uses) > - stdout/err and logging must be redirecting to mexPrintf via a ctypes > monkey patch > - Numpy/MATLAB array conversion is O(1) op, but can probably be made zero > copy with a ctypes + lib mx workaround > > Aside from these (for which I sent a service request to MathWorks), it's > working well and we (TVB) expect our next release to include demo scripts in > MATLAB. > > So, again, if there is common interest in collecting notes and a helper > library to monkey patch around the segfaults, it'd be great to not do this > alone ;) Thanks too for the feedback. You might also get some interest over at the numpy / scipy lists. Although the blas / lapack thing is a bit of a downer... Cheers, Matthew From stjeansam at gmail.com Mon Jun 13 13:43:07 2016 From: stjeansam at gmail.com (Samuel St-Jean) Date: Mon, 13 Jun 2016 19:43:07 +0200 Subject: [Neuroimaging] Using Python libs from MATLAB In-Reply-To: References: Message-ID: Regarding loading mat files, I was exactly doing that today, and it turns out anything that is matlab >= 2006b is using a file format without any converter currently existing. There are bug reports about that, but it still is a work in progress since a long time ago, so maybe not waiting on that for now is better to get stuff rolling. See https://savannah.gnu.org/bugs/?45706 and while looking for it I alos found this http://octave.1599824.n4.nabble.com/Support-for-Matlab-s-7-3-file-format-td4676198.html Le 2016-06-13 ? 18:47, Matthew Brett a ?crit : > Hi, > > On Mon, Jun 13, 2016 at 7:56 AM, Marmaduke Woodman > wrote: >> hi all, >> >> I'm a dev behind the Virtual Brain (TVB) a neuroimaging simulation library, >> written in Python. As part of an effort to reach out to some users stuck in >> MATLAB, I've been testing MATLAB's recent *official* support for Python >> (2014b+). It is notably better than the many other attempts because it >> appears to use Py's C API to expose objects and methods as MATLAB objects >> and methods instead of just providing eval/exec. >> >> Generally it's working, but there are some workarounds to avoid segfaults, >> errors and lack of stdout/err. I've started to collect a series of tips and >> workarounds and figured there might be common interest in the community for >> a "helper" library so all our projects can benefit from a larger user base, >> and maybe ease the transition for potential Python users. >> >> The most important in order, >> >> - Anything touching HDF5 must use exactly the same version as MATLAB or >> segfault > Ouch - I guess this means it's not possible to use h5py or pytables? > >> - Scipy.io.{save,load}mat segfault > That's interesting - it's probably possible to fix that, because that > stuff doesn't call any external libraries. 
Do you have any more > details? > >> - C++ extension modules whose exception model doesn't match MATLAB's, >> segfault on exception >> - Linalg segfault unless using MKL (which is what MATLAB uses) >> - stdout/err and logging must be redirecting to mexPrintf via a ctypes >> monkey patch >> - Numpy/MATLAB array conversion is O(1) op, but can probably be made zero >> copy with a ctypes + lib mx workaround >> >> Aside from these (for which I sent a service request to MathWorks), it's >> working well and we (TVB) expect our next release to include demo scripts in >> MATLAB. >> >> So, again, if there is common interest in collecting notes and a helper >> library to monkey patch around the segfaults, it'd be great to not do this >> alone ;) > Thanks too for the feedback. You might also get some interest over > at the numpy / scipy lists. Although the blas / lapack thing is a bit > of a downer... > > Cheers, > > Matthew > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging From njs at pobox.com Mon Jun 13 13:45:47 2016 From: njs at pobox.com (Nathaniel Smith) Date: Mon, 13 Jun 2016 10:45:47 -0700 Subject: [Neuroimaging] Using Python libs from MATLAB In-Reply-To: References: Message-ID: On Jun 13, 2016 7:58 AM, "Marmaduke Woodman" wrote: > [...] > - Anything touching HDF5 must use exactly the same version as MATLAB or segfault > - Scipy.io.{save,load}mat segfault > - C++ extension modules whose exception model doesn't match MATLAB's, segfault on exception > - Linalg segfault unless using MKL (which is what MATLAB uses) What operating systems have you tested these on? These segfaults sound like they have to do with MATLAB making sub-optimal choices in how they handle shared libraries, but the shared library systems on Windows / OS X / Linux are so different that there's a good chance these limitations are operating system specific. (In particular I wouldn't be surprised if you found that all of the above issues happened only on Linux, but not windows or os X. The Linux linker is powerful and does make sense once you get to know it, but it also makes it very very easy to screw things up.) -n -------------- next part -------------- An HTML attachment was scrubbed... URL: From stjeansam at gmail.com Mon Jun 13 13:47:35 2016 From: stjeansam at gmail.com (Samuel St-Jean) Date: Mon, 13 Jun 2016 19:47:35 +0200 Subject: [Neuroimaging] Problem with from dipy.viz.fvtk In-Reply-To: References: <46cdd2a0-fb7a-d08b-72f1-58ce3454a077@gmail.com> Message-ID: <35062012-b2a6-27c0-98bd-f68acfef6c36@gmail.com> Well, for anyone reading this afterward, the packaging from upstream vtk seems to be the cause, and arch is still looking at how to fix it properly. Good to know the workaround with the vtk-qt4 pkgbuild, here is the bug report for people interested until it gets properly fixed https://bugs.archlinux.org/task/49233?project=5&string=vtk&type%5B0%5D=&sev%5B0%5D=&pri%5B0%5D=&due%5B0%5D=&cat%5B0%5D=&status%5B0%5D=open&percent%5B0%5D=&reported%5B0%5D= Le 2016-06-13 ? 13:39, Vassia Katsageorgiou a ?crit : > Hi Samuel, > > thank you for the reply! > Finally there is solution for arch linux (also manjaro)! > So, in case someone has the same problem, he needs to install*vtk-qt4 > 7.0.0-2 *(here: https://aur.archlinux.org/packages/vtk-qt4/) > and it works! 
> > Cheers, > Vassia > > On 12 June 2016 at 20:13, Samuel St-Jean > wrote: > > I am currently on arch (manjaro uses the same repo for the most > part), and I just realized it is indeed plain broken, I'll flag a > bug report for that. The repos also have vtk6, which get installed > to a non standard path, and is also broken unfortunately. > > > If you don't need it right now, they usually fix and rebuild stuff > pretty fast, so might be worth to wait a few days before compiling > everything in cmake if it's not urgent. > > > Le 2016-06-12 ? 19:25, Vassia Katsageorgiou a ?crit : >> Hi Ariel, >> >> Thank you very much for the reply! >> So, this is what happens if you import vtk in python: >> >> --------------------------------------------------------------------------- >> ImportError Traceback (most recent >> call last) >> in () >> ----> 1 import vtk >> >> /usr/lib/python2.7/site-packages/vtk/__init__.py in () >> 88 from .vtkParallelCore import * >> 89 from .vtkFiltersAMR import * >> ---> 90 from .vtkIOAMR import * >> 91 from .vtkRenderingVolumeOpenGL2 import * >> 92 from .vtkFiltersFlowPaths import * >> >> /usr/lib/python2.7/site-packages/vtk/vtkIOAMR.py in () >> 7 # during build and testing, the modules will be >> elsewhere, >> 8 # e.g. in lib directory or Release/Debug config >> directories >> ----> 9 from vtkIOAMRPython import * >> >> ImportError: No module named vtkIOAMRPython >> >> I found this link, which I think will work: >> http://ghoshbishakh.github.io/blog/blogpost/2016/03/05/buid-vtk.html >> >> Though, after "cmake ." I get another error: >> >> CMake Error at CMake/vtkCompilerExtras.cmake:47 (if): >> if given arguments: >> >> "cc: error: ARGS: No such file or directory >> >> cc (GCC) 6.1.1 20160501 >> >> Copyright (C) 2016 Free Software Foundation, Inc. >> >> This is free software" " see the source for copying >> conditions. There is >> NO >> >> warranty" " not even for MERCHANTABILITY or FITNESS FOR A >> PARTICULAR >> PURPOSE. >> >> >> >> " "VERSION_GREATER" "4.2.0" "AND" "BUILD_SHARED_LIBS" "AND" >> "HAVE_GCC_VISIBILITY" "AND" "VTK_USE_GCC_VISIBILITY" "AND" >> "NOT" "MINGW" >> "AND" "NOT" "CYGWIN" >> >> Unknown arguments specified >> Call Stack (most recent call first): >> CMakeLists.txt:286 (include) >> >> >> -- Configuring incomplete, errors occurred! >> >> Which I think is due to gcc's compilation.. I am trying to fix >> this.. If I manage to, I will report it! >> >> But of course, if you had to suggest something different, it >> would be welcomed! :) >> >> Thanks again, >> Vassia >> >> >> On 12 June 2016 at 18:58, Ariel Rokem > > wrote: >> >> Hi Vassia, >> >> Thanks for your email. Let's see if I can be helpful. >> On Sun, Jun 12, 2016 at 6:46 AM, Vassia Katsageorgiou >> > wrote: >> >> Hello, >> >> I recently installed dipy and all the required additional >> packages, including vtk, but when I try to use fvtk for >> visualization (for example "r=fvtk.ren()", or >> "fvtk.line()") I get the error "no module ren, line..". >> I searched this error and I found these posts: >> >> https://github.com/nipy/dipy/issues/1016 >> https://neurostars.org/p/3724/ >> >> saying that vtk needs to be installed (which in my case >> is installed). I am using python2 under manjaro linux >> distribution and I have installed vtk-7, I tried also >> with vtk-6, but the same happens. I also tried to add in >> the python path the path to vtk's installation and the >> problem remains. >> Did anyone using linux have the same problem? Any suggestion? 
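Before rebuilding VTK from source, it can help to confirm which VTK build Python actually picks up and whether the package import itself succeeds; a quick check along these lines (output depends on the local installation):

    try:
        import vtk
        print('VTK %s imported from %s'
              % (vtk.vtkVersion.GetVTKVersion(), vtk.__file__))
    except ImportError as exc:
        # On the broken Arch/Manjaro packaging discussed here, the package
        # __init__ fails while importing vtkIOAMR, so "import vtk" itself dies.
        print('import vtk failed: %s' % exc)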
>> >> >> But first let me just make sure that I understand what is >> going on. What happens when you `import vtk` in Python? >> >> Cheers, >> >> Ariel >> >> Thank you. >> Vassia >> >> -- >> *Vasiliki-Maria Katsageorgiou*, M.Eng. >> >> Fellow-PhD, Pattern Analysis and Computer Vision >> Istituto Italiano di Tecnologia >> Genova, Italy >> >> Phone number: +39 3899171853 >> E-mail: kats.vassia at gmail.com >> >> _______________________________________________ >> Neuroimaging mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging >> >> >> >> _______________________________________________ >> Neuroimaging mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging >> >> >> >> >> -- >> *Vasiliki-Maria Katsageorgiou*, M.Eng. >> >> Fellow-PhD, Pattern Analysis and Computer Vision >> Istituto Italiano di Tecnologia >> Genova, Italy >> >> Phone number: +39 3899171853 >> E-mail: kats.vassia at gmail.com >> >> >> _______________________________________________ >> Neuroimaging mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging > > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > > > > -- > *Vasiliki-Maria Katsageorgiou*, M.Eng. > > Fellow-PhD, Pattern Analysis and Computer Vision > Istituto Italiano di Tecnologia > Genova, Italy > > Phone number: +39 3899171853 > E-mail: kats.vassia at gmail.com > > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging -------------- next part -------------- An HTML attachment was scrubbed... URL: From matthew.brett at gmail.com Mon Jun 13 13:51:26 2016 From: matthew.brett at gmail.com (Matthew Brett) Date: Mon, 13 Jun 2016 10:51:26 -0700 Subject: [Neuroimaging] Using Python libs from MATLAB In-Reply-To: References: Message-ID: Hi, On Mon, Jun 13, 2016 at 10:43 AM, Samuel St-Jean wrote: > Regarding loading mat files, I was exactly doing that today, and it turns > out anything that is matlab >= 2006b is using a file format without any > converter currently existing. > > There are bug reports about that, but it still is a work in progress since a > long time ago, so maybe not waiting on that for now is better to get stuff > rolling. > > See https://savannah.gnu.org/bugs/?45706 and while looking for it I alos > found this > http://octave.1599824.n4.nabble.com/Support-for-Matlab-s-7-3-file-format-td4676198.html Those are for Octave I believe. No, there's no support for the HDF5 matlab 7.3 file format in scipy, because scipy doesn't have a library for reading hdf5. I don't think there's any plan to add that support soon, but it would be good to have, with an optional dependency on h5py. Cheers, Matthew From stjeansam at gmail.com Mon Jun 13 13:58:18 2016 From: stjeansam at gmail.com (Samuel St-Jean) Date: Mon, 13 Jun 2016 19:58:18 +0200 Subject: [Neuroimaging] Using Python libs from MATLAB In-Reply-To: References: Message-ID: Indeed it is for octave, but since it strives ot have matlab compatibility, I just wanted to mention that if they did not got around doing it yet (it has been 10 years after all), might be useful to put on the scipy io wishlist for the python -> matlab bridge and ficuse on other issues (like the segfaults). 
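For reference, the part that already works on the scipy side is the classic (level 5) MAT format, i.e. anything saved without -v7.3; a round-trip under that assumption looks like:

    import numpy as np
    from scipy.io import savemat, loadmat

    # Classic MAT format only: files written with MATLAB's -v7.3 flag are
    # HDF5-based and make loadmat raise NotImplementedError instead.
    savemat('demo.mat', {'A': np.eye(3), 'label': 'hello'})

    contents = loadmat('demo.mat')
    print(contents['A'].shape)   # (3, 3)
    print(contents['label'])     # strings come back as numpy arrays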
In any case, I also tried the scipy one, whihc unfortunately were not working, then went to octave, to find out it is still an issue in what I guess is all projects doing matlab compatibility (else I would have exected them to just use whichever solution the first one came up with, but anyway). Still if you need help testing stuff out down the road I don't mind, would be helpful rather than calling execute stuff from matlab command line. Le 2016-06-13 ? 19:51, Matthew Brett a ?crit : > Hi, > > On Mon, Jun 13, 2016 at 10:43 AM, Samuel St-Jean wrote: >> Regarding loading mat files, I was exactly doing that today, and it turns >> out anything that is matlab >= 2006b is using a file format without any >> converter currently existing. >> >> There are bug reports about that, but it still is a work in progress since a >> long time ago, so maybe not waiting on that for now is better to get stuff >> rolling. >> >> See https://savannah.gnu.org/bugs/?45706 and while looking for it I alos >> found this >> http://octave.1599824.n4.nabble.com/Support-for-Matlab-s-7-3-file-format-td4676198.html > Those are for Octave I believe. No, there's no support for the HDF5 > matlab 7.3 file format in scipy, because scipy doesn't have a library > for reading hdf5. I don't think there's any plan to add that support > soon, but it would be good to have, with an optional dependency on > h5py. > > Cheers, > > Matthew > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging From matthew.brett at gmail.com Mon Jun 13 14:02:27 2016 From: matthew.brett at gmail.com (Matthew Brett) Date: Mon, 13 Jun 2016 11:02:27 -0700 Subject: [Neuroimaging] Using Python libs from MATLAB In-Reply-To: References: Message-ID: On Mon, Jun 13, 2016 at 10:58 AM, Samuel St-Jean wrote: > Indeed it is for octave, but since it strives ot have matlab compatibility, > I just wanted to mention that if they did not got around doing it yet (it > has been 10 years after all), might be useful to put on the scipy io > wishlist for the python -> matlab bridge and ficuse on other issues (like > the segfaults). > > In any case, I also tried the scipy one, whihc unfortunately were not > working, then went to octave, to find out it is still an issue in what I > guess is all projects doing matlab compatibility (else I would have exected > them to just use whichever solution the first one came up with, but anyway). I'm the scipy.io.matlab maintainer. The reason I never got into the 7.3 format was the hdf5 library dependency, but now stuff is getting easier to install (wheels, conda packages), it might be worth revisiting it. Also - I might be wrong, but I don't think Mathworks has documented their 7.3 format. It would probably take a week of work to get something to test for scipy, but it's way down my priority list at the moment. Cheers, Matthew From arokem at gmail.com Mon Jun 13 14:12:27 2016 From: arokem at gmail.com (Ariel Rokem) Date: Mon, 13 Jun 2016 11:12:27 -0700 Subject: [Neuroimaging] Using Python libs from MATLAB In-Reply-To: References: Message-ID: Hi, On Mon, Jun 13, 2016 at 10:43 AM, Samuel St-Jean wrote: > Regarding loading mat files, I was exactly doing that today, and it turns > out anything that is matlab >= 2006b is using a file format without any > converter currently existing. 
> > There are bug reports about that, but it still is a work in progress since > a long time ago, so maybe not waiting on that for now is better to get > stuff rolling. > > See https://savannah.gnu.org/bugs/?45706 and while looking for it I alos > found this > http://octave.1599824.n4.nabble.com/Support-for-Matlab-s-7-3-file-format-td4676198.html Have you tried using h5py? I have had success with that, even for very recent versions of Matlab (2016a), and intricately nested structs. > > Le 2016-06-13 ? 18:47, Matthew Brett a ?crit : > >> Hi, >> >> On Mon, Jun 13, 2016 at 7:56 AM, Marmaduke Woodman >> wrote: >> >>> hi all, >>> >>> I'm a dev behind the Virtual Brain (TVB) a neuroimaging simulation >>> library, >>> written in Python. As part of an effort to reach out to some users stuck >>> in >>> MATLAB, I've been testing MATLAB's recent *official* support for Python >>> (2014b+). It is notably better than the many other attempts because it >>> appears to use Py's C API to expose objects and methods as MATLAB objects >>> and methods instead of just providing eval/exec. >>> >> Yes - as a co-author of an eval/exec solution ( http://arokem.github.io/python-matlab-bridge/), I am happy to see that they are doing the work to provide a more thorough solution. I wish that they told us what they are doing (rather than providing an opaque executable). These kinds of solutions can be reconfigured into many other useful things (think 0MQ). But hey - it's a start! To be fair, there was an attempt at an open-source project in this direction (I believe this one: http://mlabwrap.sourceforge.net/), but of course it's better if Mathworks provide the authoritative solution. > >>> Generally it's working, but there are some workarounds to avoid >>> segfaults, >>> errors and lack of stdout/err. I've started to collect a series of tips >>> and >>> workarounds and figured there might be common interest in the community >>> for >>> a "helper" library so all our projects can benefit from a larger user >>> base, >>> and maybe ease the transition for potential Python users. >>> >>> The most important in order, >>> >>> - Anything touching HDF5 must use exactly the same version as MATLAB or >>> segfault >>> >> Ouch - I guess this means it's not possible to use h5py or pytables? >> > So - yes - the answer here seems to be "yes" (see above). Marmaduke - what exactly do you mean "anything touching HDF5"? What series of steps causes this segfault? > - Scipy.io.{save,load}mat segfault >>> >> That's interesting - it's probably possible to fix that, because that >> stuff doesn't call any external libraries. Do you have any more >> details? >> >> - C++ extension modules whose exception model doesn't match MATLAB's, >>> segfault on exception >>> - Linalg segfault unless using MKL (which is what MATLAB uses) >>> - stdout/err and logging must be redirecting to mexPrintf via a ctypes >>> monkey patch >>> - Numpy/MATLAB array conversion is O(1) op, but can probably be made zero >>> copy with a ctypes + lib mx workaround >>> >>> Aside from these (for which I sent a service request to MathWorks), it's >>> working well and we (TVB) expect our next release to include demo >>> scripts in >>> MATLAB. >>> >>> So, again, if there is common interest in collecting notes and a helper >>> library to monkey patch around the segfaults, it'd be great to not do >>> this >>> alone ;) >>> >> Thanks too for the feedback. You might also get some interest over >> at the numpy / scipy lists. Although the blas / lapack thing is a bit >> of a downer... 
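To make the h5py suggestion concrete: a -v7.3 MAT file is an ordinary HDF5 file whose top-level datasets are named after the saved variables, stored column-major. A minimal reader, assuming a file written with save('data_v73.mat', 'X', '-v7.3') (file and variable names are placeholders), might be:

    import numpy as np
    import h5py

    with h5py.File('data_v73.mat', 'r') as f:
        print(list(f.keys()))     # saved variable names (plus '#refs#' when cells/structs are present)
        X = np.asarray(f['X']).T  # transpose back from MATLAB's column-major layout
    print(X.shape)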
>> >> Cheers, >> >> Matthew >> _______________________________________________ >> Neuroimaging mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging >> > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > -------------- next part -------------- An HTML attachment was scrubbed... URL: From dimitri.papadopoulos at cea.fr Mon Jun 13 14:15:11 2016 From: dimitri.papadopoulos at cea.fr (Dimitri Papadopoulos Orfanos) Date: Mon, 13 Jun 2016 20:15:11 +0200 Subject: [Neuroimaging] Using Python libs from MATLAB In-Reply-To: References: Message-ID: <86ceb205-d4b9-87c3-d62d-801069361eaf@cea.fr> Hi Matthew, There are a few links referring to the Matlab 7.3 format: http://scipy-cookbook.readthedocs.io/items/Reading_mat_files.html#matlab-7-3-and-greater http://stackoverflow.com/questions/4950630/matlab-differences-between-mat-versions There is also this PDF from Matlab which does not contain even once the "HDF" string but does seem to refer to the HDF5-based 7.3 format: http://www.mathworks.com/help/pdf_doc/matlab/matfile_format.pdf Le 13/06/2016 ? 20:02, Matthew Brett a ?crit : > I'm the scipy.io.matlab maintainer. The reason I never got into the > 7.3 format was the hdf5 library dependency, but now stuff is getting > easier to install (wheels, conda packages), it might be worth > revisiting it. Also - I might be wrong, but I don't think Mathworks > has documented their 7.3 format. It would probably take a week of > work to get something to test for scipy, but it's way down my priority > list at the moment. Best, -- Dimitri Papadopoulos From matthew.brett at gmail.com Mon Jun 13 15:10:02 2016 From: matthew.brett at gmail.com (Matthew Brett) Date: Mon, 13 Jun 2016 12:10:02 -0700 Subject: [Neuroimaging] Using Python libs from MATLAB In-Reply-To: <86ceb205-d4b9-87c3-d62d-801069361eaf@cea.fr> References: <86ceb205-d4b9-87c3-d62d-801069361eaf@cea.fr> Message-ID: Hi Dimitri, On Mon, Jun 13, 2016 at 11:15 AM, Dimitri Papadopoulos Orfanos wrote: > Hi Matthew, > > There are a few links referring to the Matlab 7.3 format: > http://scipy-cookbook.readthedocs.io/items/Reading_mat_files.html#matlab-7-3-and-greater Yes, it's not hard to open the files with h5py or similar, the work will be to see how the file structure relates to the saved variables. It's quite possible that it is fairly easy to work out - I haven't looked. > http://stackoverflow.com/questions/4950630/matlab-differences-between-mat-versions > > There is also this PDF from Matlab which does not contain even once the > "HDF" string but does seem to refer to the HDF5-based 7.3 format: > http://www.mathworks.com/help/pdf_doc/matlab/matfile_format.pdf That's the doc that describes the earlier mat file formats (versions 4 and 5). scipy.io implements all the stuff described in that doc. Quoting from the introduction there: "This document describes the internal format of MATLAB Level 4 and Level 5 MAT files." See you, Matthew From satra at mit.edu Mon Jun 13 16:31:09 2016 From: satra at mit.edu (Satrajit Ghosh) Date: Mon, 13 Jun 2016 16:31:09 -0400 Subject: [Neuroimaging] New nilearn release In-Reply-To: <20160613163305.GA4109022@phare.normalesup.org> References: <20160613163305.GA4109022@phare.normalesup.org> Message-ID: congratulations! 
cheers, satra On Mon, Jun 13, 2016 at 12:33 PM, Gael Varoquaux < gael.varoquaux at normalesup.org> wrote: > Dear neuroimagers, > > We have just released a new version (0.2.5) of nilearn: machine learning > for neuroimaging. http://nilearn.github.io/ > > This is an incremental release. The highlights are more didactic docs and > examples, as well as hemispheric plotting for glass brain and > connectomes, visible at the bottom of the following examples: > > http://nilearn.github.io/auto_examples/01_plotting/plot_demo_glass_brain.html > > http://nilearn.github.io/auto_examples/03_connectivity/plot_multi_subject_connectome.html > > Thanks to all the contributors: http://nilearn.github.io/whats_new.html > > Ga?l > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > -------------- next part -------------- An HTML attachment was scrubbed... URL: From krzysztof.gorgolewski at gmail.com Mon Jun 13 16:46:10 2016 From: krzysztof.gorgolewski at gmail.com (Chris Gorgolewski) Date: Mon, 13 Jun 2016 16:46:10 -0400 Subject: [Neuroimaging] New nilearn release In-Reply-To: References: <20160613163305.GA4109022@phare.normalesup.org> Message-ID: Love the new visualisation! On Jun 13, 2016 4:31 PM, "Satrajit Ghosh" wrote: > congratulations! > > cheers, > > satra > > On Mon, Jun 13, 2016 at 12:33 PM, Gael Varoquaux < > gael.varoquaux at normalesup.org> wrote: > >> Dear neuroimagers, >> >> We have just released a new version (0.2.5) of nilearn: machine learning >> for neuroimaging. http://nilearn.github.io/ >> >> This is an incremental release. The highlights are more didactic docs and >> examples, as well as hemispheric plotting for glass brain and >> connectomes, visible at the bottom of the following examples: >> >> http://nilearn.github.io/auto_examples/01_plotting/plot_demo_glass_brain.html >> >> http://nilearn.github.io/auto_examples/03_connectivity/plot_multi_subject_connectome.html >> >> Thanks to all the contributors: http://nilearn.github.io/whats_new.html >> >> Ga?l >> >> _______________________________________________ >> Neuroimaging mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging >> > > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From alexandre.gramfort at telecom-paristech.fr Mon Jun 13 16:50:34 2016 From: alexandre.gramfort at telecom-paristech.fr (Alexandre Gramfort) Date: Mon, 13 Jun 2016 22:50:34 +0200 Subject: [Neuroimaging] New nilearn release In-Reply-To: References: <20160613163305.GA4109022@phare.normalesup.org> Message-ID: congrats folks ! A On Mon, Jun 13, 2016 at 10:46 PM, Chris Gorgolewski wrote: > Love the new visualisation! > > On Jun 13, 2016 4:31 PM, "Satrajit Ghosh" wrote: >> >> congratulations! >> >> cheers, >> >> satra >> >> On Mon, Jun 13, 2016 at 12:33 PM, Gael Varoquaux >> wrote: >>> >>> Dear neuroimagers, >>> >>> We have just released a new version (0.2.5) of nilearn: machine learning >>> for neuroimaging. http://nilearn.github.io/ >>> >>> This is an incremental release. 
The highlights are more didactic docs and >>> examples, as well as hemispheric plotting for glass brain and >>> connectomes, visible at the bottom of the following examples: >>> >>> http://nilearn.github.io/auto_examples/01_plotting/plot_demo_glass_brain.html >>> >>> http://nilearn.github.io/auto_examples/03_connectivity/plot_multi_subject_connectome.html >>> >>> Thanks to all the contributors: http://nilearn.github.io/whats_new.html >>> >>> Ga?l >>> >>> _______________________________________________ >>> Neuroimaging mailing list >>> Neuroimaging at python.org >>> https://mail.python.org/mailman/listinfo/neuroimaging >> >> >> >> _______________________________________________ >> Neuroimaging mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging >> > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > From marmaduke.woodman at univ-amu.fr Tue Jun 14 04:21:31 2016 From: marmaduke.woodman at univ-amu.fr (Marmaduke Woodman) Date: Tue, 14 Jun 2016 10:21:31 +0200 Subject: [Neuroimaging] Using Python libs from MATLAB In-Reply-To: References: <86ceb205-d4b9-87c3-d62d-801069361eaf@cea.fr> Message-ID: hi To respond to the different open points / questions in previous emails: > Ouch - I guess this means it's not possible to use h5py or pytables? It is, but you have to compile them with a version of HDF5 includes that match MATLAB's, so that the API matches. > what exactly do you mean "anything touching HDF5"? What series of steps causes this segfault? In fact, the h5py/pytables modules load symbols from the HDF5 lib, which as a first sanity check, verifies that the header version matches the library version. If not, abort(). MATLAB uses an older version of HDF5 than Anaconda's defaults hence the problem. Compiling these libs with a header matching MATLAB's version works works OK. > anything that is matlab >= 2006b They call it mat file version 7.3. Version 7 or less is fine, libmatio (and scipy) can handle it. > I don't think Mathworks has documented their 7.3 format. Yep, IIRC they put weird stuff in the HDF5 which isn't trivially readable by h5py, but Octave people must have reverse engineered it, so would be a good reference point. > What operating systems have you tested these on? I wouldn't be surprised if you found that all of the above issues happened only on Linux Windows, Mac & Linux. My experience is that toolchains, compiler flags and library versions are more consistent on Windows & Mac, leading to fewer surprises. > Too bad about lin alg The main issue is that MKL used by MATLAB uses the same symbol names but not the same ABI, so NumPy's extension modules' references resolve to MKL symbols and not its own and put the args on the stack in the wrong order. Monkey patching around it would be not difficult by intercepting numpy.linalg calls and routing them to MATLAB's MEX Lapack routines via ctypes. It would also be possible to build NumPy against MATLAB's MKL BLAS & LAPACK routines, though my suggestion on the numpy mailing list about this got no replies. Anaconda with MKL though appears to work fine, since the relevant Python modules (numpy..) are compiled against MKL in that case. >> - Scipy.io.{save,load}mat segfault > That's interesting - it's probably possible to fix that, because that stuff doesn't call any external libraries. Do you have any more details? 
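Relatedly, a quick way to check which BLAS/LAPACK a given NumPy build links against (relevant to the MKL symbol/ABI point above) is NumPy's own build introspection:

    import numpy as np

    # Prints the blas_opt_info / lapack_opt_info recorded at build time,
    # e.g. whether this NumPy was linked against MKL, OpenBLAS or ATLAS.
    np.show_config()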
I thought I had seen a symbol in the stacktrace pointing to libmatio I believe, but I just read mio5.py and it's all in Python, so I'll have to go back and check, maybe it was a h5py load. > I wish that they told us what they are doing (rather than providing an opaque executable). To be fair, it was in their release notes ;) Cheers, Marmaduke On Mon, Jun 13, 2016 at 9:10 PM, Matthew Brett wrote: > Hi Dimitri, > > On Mon, Jun 13, 2016 at 11:15 AM, Dimitri Papadopoulos Orfanos > wrote: > > Hi Matthew, > > > > There are a few links referring to the Matlab 7.3 format: > > > http://scipy-cookbook.readthedocs.io/items/Reading_mat_files.html#matlab-7-3-and-greater > > Yes, it's not hard to open the files with h5py or similar, the work > will be to see how the file structure relates to the saved variables. > It's quite possible that it is fairly easy to work out - I haven't > looked. > > > > http://stackoverflow.com/questions/4950630/matlab-differences-between-mat-versions > > > > There is also this PDF from Matlab which does not contain even once the > > "HDF" string but does seem to refer to the HDF5-based 7.3 format: > > http://www.mathworks.com/help/pdf_doc/matlab/matfile_format.pdf > > That's the doc that describes the earlier mat file formats (versions 4 > and 5). scipy.io implements all the stuff described in that doc. > Quoting from the introduction there: > > "This document describes the internal format of MATLAB Level 4 and > Level 5 MAT files." > > See you, > > Matthew > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > -------------- next part -------------- An HTML attachment was scrubbed... URL: From marmaduke.woodman at univ-amu.fr Tue Jun 14 09:27:03 2016 From: marmaduke.woodman at univ-amu.fr (Marmaduke Woodman) Date: Tue, 14 Jun 2016 15:27:03 +0200 Subject: [Neuroimaging] Using Python libs from MATLAB In-Reply-To: References: <86ceb205-d4b9-87c3-d62d-801069361eaf@cea.fr> Message-ID: On Tue, Jun 14, 2016 at 10:21 AM, Marmaduke Woodman < marmaduke.woodman at univ-amu.fr> wrote: > >> - Scipy.io.{save,load}mat segfault > > That's interesting - it's probably possible to fix that, because that stuff > doesn't call any external libraries. Do you have any more > details? > > I thought I had seen a symbol in the stacktrace pointing to libmatio I > believe, but I just read mio5.py and it's all in Python, so I'll have to go > back and check, maybe it was a h5py load. > just to follow up here, you're right, scipy.io.{save/load}mat is working. I must have blinked. -------------- next part -------------- An HTML attachment was scrubbed... URL: From njs at pobox.com Tue Jun 14 11:03:22 2016 From: njs at pobox.com (Nathaniel Smith) Date: Tue, 14 Jun 2016 08:03:22 -0700 Subject: [Neuroimaging] Using Python libs from MATLAB In-Reply-To: References: <86ceb205-d4b9-87c3-d62d-801069361eaf@cea.fr> Message-ID: On Jun 14, 2016 1:22 AM, "Marmaduke Woodman" wrote: > > > What operating systems have you tested these on? I wouldn't be surprised if you found that all of the above issues happened only on Linux > > Windows, Mac & Linux. My experience is that toolchains, compiler flags and library versions are more consistent on Windows & Mac, leading to fewer surprises. My prediction is that on Windows and OS X, you actually can use inconsistent versions of hdf5 and blas, and it will work. Can you confirm this explicitly? 
It's possible that I'm wrong, but I would be surprised because it's actually impossible to get symbol collisions on those operating systems unless you really try hard to configure things in an extra broken way. This toolchain diversity argument doesn't make a ton of sense to me -- Windows in particular actually has way more toolchain diversity than the other platforms. (I notice they say they're compatible with py27, py33, py34... I wonder what compilers they used for that on Windows.) (This might also explain how they could have shipped with basic stuff like numpy broken, if they only tested on Windows or something...) -n -------------- next part -------------- An HTML attachment was scrubbed... URL: From marmaduke.woodman at univ-amu.fr Tue Jun 14 15:33:18 2016 From: marmaduke.woodman at univ-amu.fr (Marmaduke Woodman) Date: Tue, 14 Jun 2016 21:33:18 +0200 Subject: [Neuroimaging] Using Python libs from MATLAB In-Reply-To: References: <86ceb205-d4b9-87c3-d62d-801069361eaf@cea.fr> Message-ID: On Tue, Jun 14, 2016 at 5:03 PM, Nathaniel Smith wrote: > My prediction is that on Windows and OS X, you actually can use > inconsistent versions of hdf5 and blas, and it will work. Can you confirm > this explicitly? It's possible that I'm wrong, but I would be surprised > because it's actually impossible to get symbol collisions on those > operating systems unless you really try hard to configure things in an > extra broken way. > I can confirm that h5py's call into hdf5 results in an abort due to inconsistent header and library versions (1.8.15 vs 1.8.16 for example). We can call it something else like ABI mismatch, but nevertheless, it seems has to use their version of HDF5 or not at all. I wrote to this list partly so that others interested might help confirm, deny or qualify the difficulties I stated. > (I notice they say they're compatible with py27, py33, py34... I wonder > what compilers they used for that on Windows.) > They are using the C-API to libpython.dll, so probably MSVC..? > (This might also explain how they could have shipped with basic stuff like > numpy broken, if they only tested on Windows or something...) > Not broken, but not supported either. The official response (originating out of a service request for better Python support) was that NumPy isn't a Python builtin type so they don't want to provide automatic conversions between ndarray and MATLAB array. I think though if enough people pester them, they'll add it; I mean what's the point in not supporting ndarray natively? cheers, Marmaduke -------------- next part -------------- An HTML attachment was scrubbed... URL: From njs at pobox.com Tue Jun 14 15:53:17 2016 From: njs at pobox.com (Nathaniel Smith) Date: Tue, 14 Jun 2016 12:53:17 -0700 Subject: [Neuroimaging] Using Python libs from MATLAB In-Reply-To: References: <86ceb205-d4b9-87c3-d62d-801069361eaf@cea.fr> Message-ID: On Jun 14, 2016 12:36 PM, "Marmaduke Woodman" wrote: > > > On Tue, Jun 14, 2016 at 5:03 PM, Nathaniel Smith wrote: >> >> My prediction is that on Windows and OS X, you actually can use inconsistent versions of hdf5 and blas, and it will work. Can you confirm this explicitly? It's possible that I'm wrong, but I would be surprised because it's actually impossible to get symbol collisions on those operating systems unless you really try hard to configure things in an extra broken way. > > I can confirm that h5py's call into hdf5 results in an abort due to inconsistent header and library versions (1.8.15 vs 1.8.16 for example). 
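That mismatch is easy to see from the Python side by comparing the HDF5 version h5py was compiled against with the library actually loaded at runtime (inside MATLAB that will be MATLAB's bundled libhdf5):

    import h5py

    # Version of the HDF5 headers h5py was built against ...
    print(h5py.version.hdf5_version)
    # ... versus the HDF5 library picked up at runtime.
    print('%d.%d.%d' % h5py.h5.get_libversion())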
We can call it something else like ABI mismatch, but nevertheless, it seems has to use their version of HDF5 or not at all. I wrote to this list partly so that others interested might help confirm, deny or qualify the difficulties I stated. Ok. On Windows, I think the only way this could be happening is if everyone using hdf5 is linking to a version of the dll without the version number in the name. The solution is probably for h5py to make sure it links against and ships a library called libhdf5-1.8.16.dll or similar. On OS X... not sure what could be causing this. Maybe h5py's build system is passing some sort of please-give-me-symbol-collisions flag to the linker (-flat_namespace and friends)? Maybe it has some buggy paths embedded in the binaries? On OS X I can't immediately think of a way to make h5py break on MATLAB without making it mostly broken everywhere else too, but I guess someone found a way. It's probably not hard to fix though. On Linux, it's also not too hard to avoid these issues, but if things are broken now then it suggests that MATLAB is doing something bad internally like loading libhdf5 with the RTLD_GLOBAL flag. Which is the flag that means "please monkey patch everything loaded after this with these symbols", equivalent to LD_PRELOAD. And if that's what's going on then there isn't much that h5py can do about it; really the MATLAB folk would need to fix that. >> (I notice they say they're compatible with py27, py33, py34... I wonder what compilers they used for that on Windows.) > > They are using the C-API to libpython.dll, so probably MSVC..? Normally py27 and py33/py34 use different and incompatible versions of MSVC. It's possible to mix them under certain limited circumstances, but this is also another chance to mess things up :-). This matters because if you want to use wheels from PyPI, the py27 wheels all assume one version of msvc, and the py33/py34 wheels all assume a different and incompatible version. (And py35 is yet another different and incompatible version.) -n -------------- next part -------------- An HTML attachment was scrubbed... URL: From Fabio.Bernardoni at uniklinikum-dresden.de Wed Jun 15 13:51:36 2016 From: Fabio.Bernardoni at uniklinikum-dresden.de (Bernardoni, Fabio) Date: Wed, 15 Jun 2016 17:51:36 +0000 Subject: [Neuroimaging] LocalGI with nipype Message-ID: <9CFEF48C0F425C48BC00CA74CE5F5FAC33717807@G06EDBN1.med.tu-dresden.de> Dear all, I am trying to build a nipype workflow that does a preprocessing and computes the local gyrification index (LGI) for a set of subjects. On the freesurfer documentation I read that to compute the LGI, the pial surface must be present (very reasonable), so that one should first run recon all with the -all directive and once this has success one can run recon all with the -localGI directive: 1) recon-all -s -all 2) recon-all -s -localGI When I implement 2) in bash I do something like: if [ ! -f ./surf/rh.pial_lgi ]; then recon-all -s -localGI 2>&1 How do I implement this dependency in nipype? Normally I require that one of the output of process 1 (optimally the pial surface) is an input of process 2. But the pial surface cannot be an input of process 2 because the ReconAll node does not accept the pial surface as input (there is in general no output of ReconAll that is also its input). So, is the solution to create a derived class ReconAll_LGI with these properties? Thanks Fabio Dr. Fabio Bernardoni wiss. Mitarbeiter Psychosoziale Medizin und Entwicklungsneurowissenschaften Tel. 
+49 (0)351 458-5245 Fax +49 (0)351 458-7206 URL http://www.uniklinikum-dresden.de/psm; www.transdenlab.de Universit?tsklinikum Carl Gustav Carus & Medizinische Fakult?t an der Technischen Universit?t Dresden Anstalt des ?ffentlichen Rechts des Freistaates Sachsen Fetscherstra?e 74, 01307 Dresden http://www.uniklinikum-dresden.de Vorstand: Prof. Dr. med. D. M. Albrecht (Sprecher), Wilfried E. B. Winzer Vorsitzender des Aufsichtsrates: Prof. Dr. med. Peter C. Scriba USt.-IDNr.: DE 140 135 217, St.-Nr.: 203 145 03113 From bbfrederick at mclean.harvard.edu Thu Jun 16 09:44:04 2016 From: bbfrederick at mclean.harvard.edu (Frederick, Blaise B.) Date: Thu, 16 Jun 2016 13:44:04 +0000 Subject: [Neuroimaging] Looking for advice regarding releasing some analysis software Message-ID: <8C9BE2CA-4EEB-4310-9A10-7E2F0B161287@partners.org> Hi all, For the last few years my lab has been doing time delay analysis on fMRI and concurrent fMRI/NIRS data, and I?ve written a number of python tools for performing the analysis, and they?ve been refined for several years at this point, and I think they could be generally useful to people, so I?m looking to release the software. I?m fairly new at this, and I?ve found a bunch of recommendations on how to do this, some of which are contradictory. I?d appreciate any help I could get on this. What I?ve done so far: 1) Chosen a license (Apache 2, based on a lot of reading and some conversations) 2) Put the core programs up on github (https://github.com/bbfrederick/delaytools) 3) Tried to put together rudimentary documentation and installation directions. As things exist now, if you install the prerequisites, download the code, and add the main directory to your path, you should be able to run the tools, which is a fine start, but there seems to be a lot more to installation than that (automatically installing dependancies and all that) that I?m unable to figure out. I?d appreciate any feedback on this. I?m a little mystified by the vagaries of constructing a setup.py file, and what constitutes a ?package? and a ?module?. Some of the questions I have: 1) In addition to the main program, I have scads of command line utilities that make preparing and interpreting the data easier - do I just put them all in the top level directory, or in a bin directory? 2) The dependancies for the majority of the tools are very simple (numpy, scipy, scikits-learn, matplotlib, nibabel), but there?s a kind of useful gui tool that requires pyqt4 and pyqtgraph - pyqt does not seem to be installable with pip, so I?m not sure how to handle that (installing it all with anaconda is easy though). How should I handle this? 3) What?s the best way to publicize this? This won?t be useful if nobody can find it. Thanks, Blaise -------------- Blaise Frederick Associate Professor of Psychiatry/Biophysicist Harvard Medical School/McLean Hospital bbfrederick at mclean.harvard.edu http://www.nirs-fmri.net The information in this e-mail is intended only for the person to whom it is addressed. If you believe this e-mail was sent to you in error and the e-mail contains patient information, please contact the Partners Compliance HelpLine at http://www.partners.org/complianceline . If the e-mail was sent to you in error but does not contain patient information, please contact the sender and properly dispose of the e-mail. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From arokem at gmail.com Thu Jun 16 10:20:19 2016 From: arokem at gmail.com (Ariel Rokem) Date: Thu, 16 Jun 2016 07:20:19 -0700 Subject: [Neuroimaging] Looking for advice regarding releasing some analysis software In-Reply-To: <8C9BE2CA-4EEB-4310-9A10-7E2F0B161287@partners.org> References: <8C9BE2CA-4EEB-4310-9A10-7E2F0B161287@partners.org> Message-ID: Hi Blaise, On Thu, Jun 16, 2016 at 6:44 AM, Frederick, Blaise B. < bbfrederick at mclean.harvard.edu> wrote: > Hi all, > > For the last few years my lab has been doing time delay analysis on fMRI > and concurrent fMRI/NIRS data, and I?ve written a number of python tools > for performing the analysis, and they?ve been refined for several years at > this point, and I think they could be generally useful to people, so I?m > looking to release the software. I?m fairly new at this, and I?ve found a > bunch of recommendations on how to do this, some of which are > contradictory. I?d appreciate any help I could get on this. > > What I?ve done so far: > 1) Chosen a license (Apache 2, based on a lot of reading and some > conversations) > 2) Put the core programs up on github ( > https://github.com/bbfrederick/delaytools) > 3) Tried to put together rudimentary documentation and installation > directions. > > As things exist now, if you install the prerequisites, download the code, > and add the main directory to your path, you should be able to run the > tools, which is a fine start, but there seems to be a lot more to > installation than that (automatically installing dependancies and all that) > that I?m unable to figure out. I?d appreciate any feedback on this. I?m a > little mystified by the vagaries of constructing a setup.py file, and what > constitutes a ?package? and a ?module?. > You might find this template project useful: https://github.com/uwescience/shablona It has examples for some of these things (including testing and documentation) > Some of the questions I have: > 1) In addition to the main program, I have scads of command line utilities > that make preparing and interpreting the data easier - do I just put them > all in the top level directory, or in a bin directory? > You can put these in a bin directory and install them using the `scripts` kwarg to the `setup` function. Shablona doesn't have that, but here's how it happens on Dipy: https://github.com/nipy/dipy/blob/master/setup.py#L216 > 2) The dependancies for the majority of the tools are very simple (numpy, > scipy, scikits-learn, matplotlib, nibabel), but there?s a kind of useful > gui tool that requires pyqt4 and pyqtgraph - pyqt does not seem to be > installable with pip, so I?m not sure how to handle that (installing it all > with anaconda is easy though). How should I handle this? > I would handle that as an optional dependency, that only gets imported when it's needed. Another Dipy story: we use VTK in some visualizations, but this is an optional dependency. As of recently, we try to raise an informative error when the import is triggered in the absence of VTK. Something like:"you are trying to use a feature that requires pyqt and pyqtgraph, for instructions on installing these see https:// bbfrederick .github.io/delaytools /documentation/installtion/optional/install.html" would be helpful, I think. 3) What?s the best way to publicize this? This won?t be useful if nobody > can find it. > Announcing your releases to this list (and other lists where potential users hang out) is a good idea. Presenting it at conferences also helps spread the word. 
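Going back to the optional GUI dependency point above, a minimal sketch of that import-guard pattern (the docs pointer is a placeholder):

    try:
        import pyqtgraph
    except ImportError:
        pyqtgraph = None

    def launch_gui():
        # GUI entry point; only this code path needs pyqt4/pyqtgraph.
        if pyqtgraph is None:
            raise ImportError(
                'This feature needs pyqt4 and pyqtgraph; see '
                'https://github.com/bbfrederick/delaytools for install notes.')
        # ... build and show the pyqtgraph window here ...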
Another idea: you can write a paper about your software, so that people can make reference to your software when using it (see recent conversation about that here: https://mail.python.org/pipermail/neuroimaging/2016-April/000875.html). I hope that all helps and I am sure others will chime in with their ideas/opinions, Ariel > Thanks, > Blaise > > -------------- > Blaise Frederick > Associate Professor of Psychiatry/Biophysicist > Harvard Medical School/McLean Hospital > bbfrederick at mclean.harvard.edu > http://www.nirs-fmri.net > > > > The information in this e-mail is intended only for the person to whom it > is > addressed. If you believe this e-mail was sent to you in error and the > e-mail > contains patient information, please contact the Partners Compliance > HelpLine at > http://www.partners.org/complianceline . If the e-mail was sent to you in > error > but does not contain patient information, please contact the sender and > properly > dispose of the e-mail. > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From blaise.frederick at gmail.com Thu Jun 16 10:23:34 2016 From: blaise.frederick at gmail.com (Blaise Frederick) Date: Thu, 16 Jun 2016 10:23:34 -0400 Subject: [Neuroimaging] Looking for advice regarding releasing some analysis software In-Reply-To: References: <8C9BE2CA-4EEB-4310-9A10-7E2F0B161287@partners.org> Message-ID: <16995155-C23C-4255-97A1-059EB8D4EA1F@gmail.com> Thank you! That?s exactly the kind of thing I was looking for? B > On Jun 16, 2016, at 10:20 AM, Ariel Rokem wrote: > > Hi Blaise, > > On Thu, Jun 16, 2016 at 6:44 AM, Frederick, Blaise B. > wrote: > Hi all, > > For the last few years my lab has been doing time delay analysis on fMRI and concurrent fMRI/NIRS data, and I?ve written a number of python tools for performing the analysis, and they?ve been refined for several years at this point, and I think they could be generally useful to people, so I?m looking to release the software. I?m fairly new at this, and I?ve found a bunch of recommendations on how to do this, some of which are contradictory. I?d appreciate any help I could get on this. > > What I?ve done so far: > 1) Chosen a license (Apache 2, based on a lot of reading and some conversations) > 2) Put the core programs up on github (https://github.com/bbfrederick/delaytools ) > 3) Tried to put together rudimentary documentation and installation directions. > > As things exist now, if you install the prerequisites, download the code, and add the main directory to your path, you should be able to run the tools, which is a fine start, but there seems to be a lot more to installation than that (automatically installing dependancies and all that) that I?m unable to figure out. I?d appreciate any feedback on this. I?m a little mystified by the vagaries of constructing a setup.py file, and what constitutes a ?package? and a ?module?. > > You might find this template project useful: https://github.com/uwescience/shablona > > It has examples for some of these things (including testing and documentation) > > Some of the questions I have: > 1) In addition to the main program, I have scads of command line utilities that make preparing and interpreting the data easier - do I just put them all in the top level directory, or in a bin directory? 
> > You can put these in a bin directory and install them using the `scripts` kwarg to the `setup` function. Shablona doesn't have that, but here's how it happens on Dipy: https://github.com/nipy/dipy/blob/master/setup.py#L216 > > 2) The dependancies for the majority of the tools are very simple (numpy, scipy, scikits-learn, matplotlib, nibabel), but there?s a kind of useful gui tool that requires pyqt4 and pyqtgraph - pyqt does not seem to be installable with pip, so I?m not sure how to handle that (installing it all with anaconda is easy though). How should I handle this? > > I would handle that as an optional dependency, that only gets imported when it's needed. Another Dipy story: we use VTK in some visualizations, but this is an optional dependency. As of recently, we try to raise an informative error when the import is triggered in the absence of VTK. Something like:"you are trying to use a feature that requires pyqt and pyqtgraph, for instructions on installing these see https:// bbfrederick .github.io/delaytools /documentation/installtion/optional/install.html" would be helpful, I think. > > 3) What?s the best way to publicize this? This won?t be useful if nobody can find it. > > Announcing your releases to this list (and other lists where potential users hang out) is a good idea. Presenting it at conferences also helps spread the word. Another idea: you can write a paper about your software, so that people can make reference to your software when using it (see recent conversation about that here:https://mail.python.org/pipermail/neuroimaging/2016-April/000875.html ). > > I hope that all helps and I am sure others will chime in with their ideas/opinions, > > Ariel > > > Thanks, > Blaise > > -------------- > Blaise Frederick > Associate Professor of Psychiatry/Biophysicist > Harvard Medical School/McLean Hospital > bbfrederick at mclean.harvard.edu > http://www.nirs-fmri.net > > > > > The information in this e-mail is intended only for the person to whom it is > addressed. If you believe this e-mail was sent to you in error and the e-mail > contains patient information, please contact the Partners Compliance HelpLine at > http://www.partners.org/complianceline . If the e-mail was sent to you in error > but does not contain patient information, please contact the sender and properly > dispose of the e-mail. > > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging -------------- next part -------------- An HTML attachment was scrubbed... URL: From christopherrmullins at gmail.com Thu Jun 16 10:24:26 2016 From: christopherrmullins at gmail.com (Christopher Mullins) Date: Thu, 16 Jun 2016 10:24:26 -0400 Subject: [Neuroimaging] Looking for advice regarding releasing some analysis software In-Reply-To: <8C9BE2CA-4EEB-4310-9A10-7E2F0B161287@partners.org> References: <8C9BE2CA-4EEB-4310-9A10-7E2F0B161287@partners.org> Message-ID: Hi, I'm not a frequent contributor to this list so others may have more acceptable solutions, but I ran into some similar challenges and I'd love to share what I did. 
Regarding pyqt4 and pyqtgraph: I made a desktop application using pyqtgraph, and my solution was to compile it into a binary form with pyinstaller, so that people could use it without having to learn python or do any sort of compiling or terminal work. The documentation[1] on this is pretty clear and I didn't run into many problems. For the command line utilities, I'd usually make a "Utilities" directory to indicate that they're not part of the main application. Chris [1] https://wiki.python.org/moin/PyQt/Deploying_PyQt_Applications -------------- next part -------------- An HTML attachment was scrubbed... URL: From lists at onerussian.com Thu Jun 16 11:01:34 2016 From: lists at onerussian.com (Yaroslav Halchenko) Date: Thu, 16 Jun 2016 11:01:34 -0400 Subject: [Neuroimaging] Looking for advice regarding releasing some analysis software In-Reply-To: References: <8C9BE2CA-4EEB-4310-9A10-7E2F0B161287@partners.org> Message-ID: <20160616150133.GU11174@onerussian.com> Great question(s) and answers so far! a bit more to the plate: On Thu, 16 Jun 2016, Ariel Rokem wrote: > > For the last few years my lab has been doing time delay analysis on fMRI > > and concurrent fMRI/NIRS data, and I?ve written a number of python tools > > for performing the analysis, and they?ve been refined for several years at > > this point, and I think they could be generally useful to people, so I?m > > looking to release the software. I?m fairly new at this, and I?ve found a > > bunch of recommendations on how to do this, some of which are > > contradictory. I?d appreciate any help I could get on this. > > What I?ve done so far: > > 1) Chosen a license (Apache 2, based on a lot of reading and some > > conversations) > > 2) Put the core programs up on github ( > > https://github.com/bbfrederick/delaytools) > > 3) Tried to put together rudimentary documentation and installation > > directions. > > As things exist now, if you install the prerequisites, download the code, > > and add the main directory to your path, you should be able to run the > > tools, which is a fine start, but there seems to be a lot more to > > installation than that (automatically installing dependancies and all that) > > that I?m unable to figure out. I?d appreciate any feedback on this. I?m a > > little mystified by the vagaries of constructing a setup.py file, and what > > constitutes a ?package? and a ?module?. > You might find this template project useful: > https://github.com/uwescience/shablona > It has examples for some of these things (including testing and > documentation) I cannot stress enough on necessity to test scientific code. So I would 1. move actual functionality from bin/* scripts under delaytools/ modules, and just import/invoke them within the bin/scripts 2. now, that you have that functionality straight in the module, you could test it 'easier', so provide delaytools/tests/ and start working on them. > > Some of the questions I have: > > 1) In addition to the main program, I have scads of command line utilities > > that make preparing and interpreting the data easier - do I just put them > > all in the top level directory, or in a bin directory? > You can put these in a bin directory and install them using the `scripts` > kwarg to the `setup` function. 
Shablona doesn't have that, but here's how > it happens on Dipy: https://github.com/nipy/dipy/blob/master/setup.py#L216 depending on 'philosophy' to follow (trust setuptools or not ;) ), you could even generate those scripts automagically by setuptools, so they would be appropriately generated even on Windows. See e.g. https://github.com/duecredit/duecredit/blob/master/setup.py#L96 I then usually do 'pip install -e .' (within virtualenv) so those entry point scripts get generated, and available within virtualenv while referring to actual development copy of the code. Also please take example from dipy and others to use consistent prefix for your commands (e.g. dt_ ?) so users could easily find/complete them in their shell and/or provide the "gateway" script (delaytools) which would execute/pass options to corresponding script/module. Similar to how git, git-annex, cmtk, datalad, ... are doing ;) > > 2) The dependancies for the majority of the tools are very simple (numpy, > > scipy, scikits-learn, matplotlib, nibabel), but there?s a kind of useful > > gui tool that requires pyqt4 and pyqtgraph - pyqt does not seem to be > > installable with pip, so I?m not sure how to handle that (installing it all > > with anaconda is easy though). How should I handle this? > I would handle that as an optional dependency, that only gets imported when > it's needed. Another Dipy story: we use VTK in some visualizations, but > this is an optional dependency. As of recently, we try to raise an > informative error when the import is triggered in the absence of VTK. > Something like:"you are trying to use a feature that requires pyqt and > pyqtgraph, for instructions on installing these see https:// > bbfrederick > .github.io/delaytools > /documentation/installtion/optional/install.html" > would be helpful, I think. I would also recommend to use 'extras_require' for setup call, so you could create one 'gui' where you would list those which you could install via pip, then if someone does pip install delaytools[gui] it would install core dependencies and those needed for GUI. Needless to say that having a Debian package would probably make installation really easy on any debian-based (and may be soon windows ;-)) platform. > 3) What?s the best way to publicize this? This won?t be useful if nobody > > can find it. > Announcing your releases to this list (and other lists where potential > users hang out) is a good idea. Presenting it at conferences also helps > spread the word. Another idea: you can write a paper about your software, > so that people can make reference to your software when using it (see > recent conversation about that here: > https://mail.python.org/pipermail/neuroimaging/2016-April/000875.html). > I hope that all helps and I am sure others will chime in with their > ideas/opinions, when you are ready, I would be happy to (re)tweet and publicize otherwise. Also would be nice if - for those references you listed in README.md you make hyperlinks, so people could immediately go to them - some specific examples/pipelines demonstrated specific uses of the tools - I would encourage to provide duecredit entries pointing to corresponding publications which you want to have cited when particular method is used, and/or pointing to good uses of them (IIRC 'edu' tag for those in duecredit entries). shablona already has some examples, or have a look e.g. 
at pymvpa: https://github.com/PyMVPA/PyMVPA/blob/6a9d4060ad863f99170801854c272b61af51f015/mvpa2/__init__.py#L178 https://github.com/PyMVPA/PyMVPA/blob/master/mvpa2/algorithms/hyperalignment.py#L170 https://github.com/PyMVPA/PyMVPA/blob/master/mvpa2/algorithms/searchlight_hyperalignment.py#L52 -- Yaroslav O. Halchenko Center for Open Neuroscience http://centerforopenneuroscience.org Dartmouth College, 419 Moore Hall, Hinman Box 6207, Hanover, NH 03755 Phone: +1 (603) 646-9834 Fax: +1 (603) 646-1419 WWW: http://www.linkedin.com/in/yarik From arokem at gmail.com Thu Jun 16 11:39:06 2016 From: arokem at gmail.com (Ariel Rokem) Date: Thu, 16 Jun 2016 08:39:06 -0700 Subject: [Neuroimaging] Looking for advice regarding releasing some analysis software In-Reply-To: <20160616150133.GU11174@onerussian.com> References: <8C9BE2CA-4EEB-4310-9A10-7E2F0B161287@partners.org> <20160616150133.GU11174@onerussian.com> Message-ID: On Thu, Jun 16, 2016 at 8:01 AM, Yaroslav Halchenko wrote: > Great question(s) and answers so far! a bit more to the plate: > > On Thu, 16 Jun 2016, Ariel Rokem wrote: > > > For the last few years my lab has been doing time delay analysis on > fMRI > > > and concurrent fMRI/NIRS data, and I?ve written a number of python > tools > > > for performing the analysis, and they?ve been refined for several > years at > > > this point, and I think they could be generally useful to people, so > I?m > > > looking to release the software. I?m fairly new at this, and I?ve > found a > > > bunch of recommendations on how to do this, some of which are > > > contradictory. I?d appreciate any help I could get on this. > > > > What I?ve done so far: > > > 1) Chosen a license (Apache 2, based on a lot of reading and some > > > conversations) > > > 2) Put the core programs up on github ( > > > https://github.com/bbfrederick/delaytools) > > > 3) Tried to put together rudimentary documentation and installation > > > directions. > > > > As things exist now, if you install the prerequisites, download the > code, > > > and add the main directory to your path, you should be able to run the > > > tools, which is a fine start, but there seems to be a lot more to > > > installation than that (automatically installing dependancies and all > that) > > > that I?m unable to figure out. I?d appreciate any feedback on this. > I?m a > > > little mystified by the vagaries of constructing a setup.py file, and > what > > > constitutes a ?package? and a ?module?. > > > > You might find this template project useful: > > https://github.com/uwescience/shablona > > It has examples for some of these things (including testing and > > documentation) > > > I cannot stress enough on necessity to test scientific code. So I would > > 1. move actual functionality from bin/* scripts under delaytools/ > modules, and just import/invoke them within the bin/scripts > > > 2. now, that you have that functionality straight in the module, you > could test it 'easier', so provide delaytools/tests/ and start working > on them. > > > > > > Some of the questions I have: > > > 1) In addition to the main program, I have scads of command line > utilities > > > that make preparing and interpreting the data easier - do I just put > them > > > all in the top level directory, or in a bin directory? > > > > You can put these in a bin directory and install them using the `scripts` > > kwarg to the `setup` function. 
Shablona doesn't have that, but here's how > > it happens on Dipy: > https://github.com/nipy/dipy/blob/master/setup.py#L216 > > depending on 'philosophy' to follow (trust setuptools or not ;) ), > you could even generate those scripts automagically by setuptools, so > they would be appropriately generated even on Windows. See e.g. > > https://github.com/duecredit/duecredit/blob/master/setup.py#L96 > > > I then usually do 'pip install -e .' (within virtualenv) so those entry > point scripts get generated, and available within virtualenv while > referring to actual development copy of the code. > > > Also please take example from dipy and others to use consistent prefix > for your commands (e.g. dt_ ?) so users could easily find/complete them > in their shell and/or provide the "gateway" script (delaytools) > which would execute/pass options to corresponding script/module. > Similar to how git, git-annex, cmtk, datalad, ... are doing ;) > > > > 2) The dependancies for the majority of the tools are very simple > (numpy, > > > scipy, scikits-learn, matplotlib, nibabel), but there?s a kind of > useful > > > gui tool that requires pyqt4 and pyqtgraph - pyqt does not seem to be > > > installable with pip, so I?m not sure how to handle that (installing > it all > > > with anaconda is easy though). How should I handle this? > > > > I would handle that as an optional dependency, that only gets imported > when > > it's needed. Another Dipy story: we use VTK in some visualizations, but > > this is an optional dependency. As of recently, we try to raise an > > informative error when the import is triggered in the absence of VTK. > > Something like:"you are trying to use a feature that requires pyqt and > > pyqtgraph, for instructions on installing these see https:// > > bbfrederick > > .github.io/delaytools > > >/documentation/installtion/optional/install.html" > > would be helpful, I think. > > I would also recommend to use 'extras_require' for setup call, so you > could create one 'gui' where you would list those which you could > install via pip, then if someone does > > pip install delaytools[gui] > > it would install core dependencies and those needed for GUI. > > Needless to say that having a Debian package would probably make > installation really easy on any debian-based (and may be soon windows > ;-)) platform. > > > 3) What?s the best way to publicize this? This won?t be useful if nobody > > > can find it. > > > Announcing your releases to this list (and other lists where potential > > users hang out) is a good idea. Presenting it at conferences also helps > > spread the word. Another idea: you can write a paper about your software, > > so that people can make reference to your software when using it (see > > recent conversation about that here: > > https://mail.python.org/pipermail/neuroimaging/2016-April/000875.html). > > One more idea for publicizing is the nipy.org website. As long as you think that you would like to abide by the code of conduct ( http://nipy.org/conduct.html), you can add your software to the front page of the website. You can also write some blog posts to demonstrate the use of the software, or to describe the software. Details for contributing are here: http://nipy.org/contribute.html > > I hope that all helps and I am sure others will chime in with their > > ideas/opinions, > > when you are ready, I would be happy to (re)tweet and publicize > otherwise. 
Also would be nice if > > - for those references you listed in README.md you make hyperlinks, so > people could immediately go to them > > - some specific examples/pipelines demonstrated specific uses of the tools > > - I would encourage to provide duecredit entries pointing to > corresponding publications which you want to have cited when > particular method is used, and/or pointing to good uses of them (IIRC > 'edu' tag for those in duecredit entries). > > shablona already has some examples, or have a look e.g. at pymvpa: > > > https://github.com/PyMVPA/PyMVPA/blob/6a9d4060ad863f99170801854c272b61af51f015/mvpa2/__init__.py#L178 > > https://github.com/PyMVPA/PyMVPA/blob/master/mvpa2/algorithms/hyperalignment.py#L170 > > https://github.com/PyMVPA/PyMVPA/blob/master/mvpa2/algorithms/searchlight_hyperalignment.py#L52 > > > > -- > Yaroslav O. Halchenko > Center for Open Neuroscience http://centerforopenneuroscience.org > Dartmouth College, 419 Moore Hall, Hinman Box 6207, Hanover, NH 03755 > Phone: +1 (603) 646-9834 Fax: +1 (603) 646-1419 > WWW: http://www.linkedin.com/in/yarik > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > -------------- next part -------------- An HTML attachment was scrubbed... URL: From Fabio.Bernardoni at uniklinikum-dresden.de Thu Jun 16 11:43:04 2016 From: Fabio.Bernardoni at uniklinikum-dresden.de (Bernardoni, Fabio) Date: Thu, 16 Jun 2016 15:43:04 +0000 Subject: [Neuroimaging] Freesurfer LocalGI with nipype Message-ID: <9CFEF48C0F425C48BC00CA74CE5F5FAC337179E3@G06EDBN1.med.tu-dresden.de> Dear all, I am trying to build a nipype workflow that does a preprocessing and computes the local gyrification index (LGI) for a set of subjects. >From the freesurfer documentation, to compute the LGI the pial surface must be present, so that one should first run recon all with the -all directive and once this has success one can run recon all with the -localGI directive: 1) recon-all -s -all 2) recon-all -s -localGI When I implement 2) in bash I do something like: if [ ! -f ./surf/rh.pial_lgi ]; then recon-all -s -localGI 2>&1 How do I implement this dependency in nipype? Normally I require that one of the output of process 1 (optimally the pial surface) is an input of process 2. But the pial surface cannot be an input of process 2 because the ReconAll node does not accept the pial surface as input (there is in general no output of ReconAll that is also its input). By the way, when I define the recon-all node to compute the LGI: reconall_gyri = pe.Node(interface=fs.ReconAll(directive='localGI', flags='-nuintensitycor-3T', subjects_dir=workflow_dir), name="reconall_gyri") I have the impression that nothing is produced, so maybe I am doing something wrong... Thanks for any help, Fabio Dr. Fabio Bernardoni wiss. Mitarbeiter Psychosoziale Medizin und Entwicklungsneurowissenschaften Tel. +49 (0)351 458-5245 Fax +49 (0)351 458-7206 URL http://www.uniklinikum-dresden.de/psm; www.transdenlab.de Universit?tsklinikum Carl Gustav Carus & Medizinische Fakult?t an der Technischen Universit?t Dresden Anstalt des ?ffentlichen Rechts des Freistaates Sachsen Fetscherstra?e 74, 01307 Dresden http://www.uniklinikum-dresden.de Vorstand: Prof. Dr. med. D. M. Albrecht (Sprecher), Wilfried E. B. Winzer Vorsitzender des Aufsichtsrates: Prof. Dr. med. Peter C. 
Scriba USt.-IDNr.: DE 140 135 217, St.-Nr.: 203 145 03113 From matthew.brett at gmail.com Thu Jun 16 13:12:27 2016 From: matthew.brett at gmail.com (Matthew Brett) Date: Thu, 16 Jun 2016 10:12:27 -0700 Subject: [Neuroimaging] Looking for advice regarding releasing some analysis software In-Reply-To: <16995155-C23C-4255-97A1-059EB8D4EA1F@gmail.com> References: <8C9BE2CA-4EEB-4310-9A10-7E2F0B161287@partners.org> <16995155-C23C-4255-97A1-059EB8D4EA1F@gmail.com> Message-ID: Hi, On Thu, Jun 16, 2016 at 7:23 AM, Blaise Frederick wrote: > Thank you! That?s exactly the kind of thing I was looking for? The shablona project looks like a pretty good template. I would personally recommend pytest rather than nose for testing in a new project, because nose is going out of maintenance at the moment - see : https://nose.readthedocs.io/en/latest/#note-to-users . The `scripts` directory is the standard location for stuff that will become command line scripts. The usual practice is to make these scripts be tiny wrappers that import all the substantial code from the main code-base. See https://github.com/matthew-brett/delocate/tree/master/scripts for some examples - but even these scripts have more code than they should (the code should be in the library). I highly recommend that you enable both travis-ci.org testing (as shablona does) and automated code coverage testing (as shablona does). I generally use versioneer for my new projects, to set the project version from git tags - see : https://pypi.python.org/pypi/versioneer and https://github.com/matthew-brett/delocate/blob/master/setup.py for an example. It's a good idea to put a pip `requirements.txt` file in your project directory so people can easily do: pip install -r requirements.txt to get the requirements for your package. See https://github.com/nipy/nibabel for an example. Lastly - and probably most important - it's very useful to get contributing to other projects that you are using, even in small ways, in the form of issues and pull requests. I find that I pick up an enormous amount of best practice by getting feedback from others, and understanding their reasoning. Good luck, Matthew From marmaduke.woodman at univ-amu.fr Thu Jun 16 13:52:09 2016 From: marmaduke.woodman at univ-amu.fr (Marmaduke Woodman) Date: Thu, 16 Jun 2016 19:52:09 +0200 Subject: [Neuroimaging] Using Python libs from MATLAB In-Reply-To: References: <86ceb205-d4b9-87c3-d62d-801069361eaf@cea.fr> Message-ID: hi Nathaniel Thanks for the details on linking. I hadn't had time to test all bugs on all platforms. On Mac, everything seems ok, including h5py and non-MKL linalg. On Linux, with h5py I see HDF5 library warning that lib and header versions don't match, with a stack trace including [ 1] 0x00007f841ce30fc0 /lib/libc.so.6+00217024 abort+00000384 [ 2] 0x00007f840b9c8d75 /soft/matlab2015a/bin/glnxa64/libhdf5.so.8+00224629 H5check_version+00000325 [ 3] 0x00007f83de4c367f /home/duke/mc/envs/test/lib/python3.5/site-packages/h5py/h5f.so+00034431 PyInit_h5f+00006815 ldd shows h5py's dll linked to conda's hdf5, so this perhaps confirms your point about. cheers, Marmaduke On Tue, Jun 14, 2016 at 9:53 PM, Nathaniel Smith wrote: > On Jun 14, 2016 12:36 PM, "Marmaduke Woodman" < > marmaduke.woodman at univ-amu.fr> wrote: > > > > > > On Tue, Jun 14, 2016 at 5:03 PM, Nathaniel Smith wrote: > >> > >> My prediction is that on Windows and OS X, you actually can use > inconsistent versions of hdf5 and blas, and it will work. Can you confirm > this explicitly? 
It's possible that I'm wrong, but I would be surprised > because it's actually impossible to get symbol collisions on those > operating systems unless you really try hard to configure things in an > extra broken way. > > > > I can confirm that h5py's call into hdf5 results in an abort due to > inconsistent header and library versions (1.8.15 vs 1.8.16 for example). We > can call it something else like ABI mismatch, but nevertheless, it seems > has to use their version of HDF5 or not at all. I wrote to this list partly > so that others interested might help confirm, deny or qualify the > difficulties I stated. > > Ok. On Windows, I think the only way this could be happening is if > everyone using hdf5 is linking to a version of the dll without the version > number in the name. The solution is probably for h5py to make sure it links > against and ships a library called libhdf5-1.8.16.dll or similar. > > On OS X... not sure what could be causing this. Maybe h5py's build system > is passing some sort of please-give-me-symbol-collisions flag to the linker > (-flat_namespace and friends)? Maybe it has some buggy paths embedded in > the binaries? On OS X I can't immediately think of a way to make h5py break > on MATLAB without making it mostly broken everywhere else too, but I guess > someone found a way. It's probably not hard to fix though. > > On Linux, it's also not too hard to avoid these issues, but if things are > broken now then it suggests that MATLAB is doing something bad internally > like loading libhdf5 with the RTLD_GLOBAL flag. Which is the flag that > means "please monkey patch everything loaded after this with these > symbols", equivalent to LD_PRELOAD. And if that's what's going on then > there isn't much that h5py can do about it; really the MATLAB folk would > need to fix that. > > >> (I notice they say they're compatible with py27, py33, py34... I wonder > what compilers they used for that on Windows.) > > > > They are using the C-API to libpython.dll, so probably MSVC..? > > Normally py27 and py33/py34 use different and incompatible versions of > MSVC. It's possible to mix them under certain limited circumstances, but > this is also another chance to mess things up :-). > > This matters because if you want to use wheels from PyPI, the py27 wheels > all assume one version of msvc, and the py33/py34 wheels all assume a > different and incompatible version. (And py35 is yet another different and > incompatible version.) > > -n > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From vsochat at stanford.edu Thu Jun 16 15:03:36 2016 From: vsochat at stanford.edu (vanessa sochat) Date: Thu, 16 Jun 2016 12:03:36 -0700 Subject: [Neuroimaging] Looking for advice regarding releasing some analysis software In-Reply-To: <8C9BE2CA-4EEB-4310-9A10-7E2F0B161287@partners.org> References: <8C9BE2CA-4EEB-4310-9A10-7E2F0B161287@partners.org> Message-ID: In case it hasn't been mentioned, I would recommend a documentation strategy that can parse your code and including comments and arguments from functions, etc. If you use read the docs you can have the build happen automatically with pushes to the repo, and it's a familiar, clean interface. On Thu, Jun 16, 2016 at 6:44 AM, Frederick, Blaise B. 
< bbfrederick at mclean.harvard.edu> wrote: > Hi all, > > For the last few years my lab has been doing time delay analysis on fMRI > and concurrent fMRI/NIRS data, and I?ve written a number of python tools > for performing the analysis, and they?ve been refined for several years at > this point, and I think they could be generally useful to people, so I?m > looking to release the software. I?m fairly new at this, and I?ve found a > bunch of recommendations on how to do this, some of which are > contradictory. I?d appreciate any help I could get on this. > > What I?ve done so far: > 1) Chosen a license (Apache 2, based on a lot of reading and some > conversations) > 2) Put the core programs up on github ( > https://github.com/bbfrederick/delaytools) > 3) Tried to put together rudimentary documentation and installation > directions. > > As things exist now, if you install the prerequisites, download the code, > and add the main directory to your path, you should be able to run the > tools, which is a fine start, but there seems to be a lot more to > installation than that (automatically installing dependancies and all that) > that I?m unable to figure out. I?d appreciate any feedback on this. I?m a > little mystified by the vagaries of constructing a setup.py file, and what > constitutes a ?package? and a ?module?. > > Some of the questions I have: > 1) In addition to the main program, I have scads of command line utilities > that make preparing and interpreting the data easier - do I just put them > all in the top level directory, or in a bin directory? > 2) The dependancies for the majority of the tools are very simple (numpy, > scipy, scikits-learn, matplotlib, nibabel), but there?s a kind of useful > gui tool that requires pyqt4 and pyqtgraph - pyqt does not seem to be > installable with pip, so I?m not sure how to handle that (installing it all > with anaconda is easy though). How should I handle this? > 3) What?s the best way to publicize this? This won?t be useful if nobody > can find it. > > Thanks, > Blaise > > -------------- > Blaise Frederick > Associate Professor of Psychiatry/Biophysicist > Harvard Medical School/McLean Hospital > bbfrederick at mclean.harvard.edu > http://www.nirs-fmri.net > > > > The information in this e-mail is intended only for the person to whom it > is > addressed. If you believe this e-mail was sent to you in error and the > e-mail > contains patient information, please contact the Partners Compliance > HelpLine at > http://www.partners.org/complianceline . If the e-mail was sent to you in > error > but does not contain patient information, please contact the sender and > properly > dispose of the e-mail. > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -- Vanessa Villamia Sochat Stanford University (603) 321-0676 -------------- next part -------------- An HTML attachment was scrubbed... URL: From satra at mit.edu Thu Jun 16 21:25:35 2016 From: satra at mit.edu (Satrajit Ghosh) Date: Thu, 16 Jun 2016 21:25:35 -0400 Subject: [Neuroimaging] LocalGI with nipype In-Reply-To: <9CFEF48C0F425C48BC00CA74CE5F5FAC33717807@G06EDBN1.med.tu-dresden.de> References: <9CFEF48C0F425C48BC00CA74CE5F5FAC33717807@G06EDBN1.med.tu-dresden.de> Message-ID: dear fabio, recon-all provides subject_id as an output. so you can send that to a second recon-all (named differently that then computes the localGI. 
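A minimal sketch of that wiring, with placeholder paths and a made-up workflow name; note Fabio's follow-up further down, where the 'localGI' directive did not behave as expected in nipype 0.12.0-rc1, so treat this as the intended pattern rather than a verified recipe.

    import nipype.pipeline.engine as pe
    import nipype.interfaces.freesurfer as fs

    # First node: the full recon-all run.
    # (It would also need its subject_id / T1_files inputs set.)
    recon_all = pe.Node(fs.ReconAll(directive='all',
                                    subjects_dir='/path/to/subjects_dir'),
                        name='recon_all')

    # Second node, named differently: the localGI step.
    recon_lgi = pe.Node(fs.ReconAll(directive='localGI',
                                    subjects_dir='/path/to/subjects_dir'),
                        name='recon_lgi')

    wf = pe.Workflow(name='lgi_wf', base_dir='/path/to/workdir')
    # Passing subject_id from the first node to the second creates the
    # dependency: recon_lgi only runs once recon_all has finished.
    wf.connect(recon_all, 'subject_id', recon_lgi, 'subject_id')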
cheers, satra On Wed, Jun 15, 2016 at 1:51 PM, Bernardoni, Fabio < Fabio.Bernardoni at uniklinikum-dresden.de> wrote: > Dear all, > > I am trying to build a nipype workflow that does a preprocessing and > computes the local gyrification index (LGI) for a set of subjects. > > On the freesurfer documentation I read that to compute the LGI, the pial > surface must be present (very reasonable), so that one should first run > recon all with the -all directive and once this has success one can run > recon all with the -localGI directive: > > 1) recon-all -s -all > 2) recon-all -s -localGI > > When I implement 2) in bash I do something like: > if [ ! -f ./surf/rh.pial_lgi ]; then > recon-all -s -localGI 2>&1 > > How do I implement this dependency in nipype? Normally I require that one > of the output of process 1 (optimally the pial surface) is an input of > process 2. But the pial surface cannot be an input of process 2 because the > ReconAll node does not accept the pial surface as input (there is in > general no output of ReconAll that is also its input). So, is the solution > to create a derived class ReconAll_LGI with these properties? > > Thanks > Fabio > > > > Dr. Fabio Bernardoni > wiss. Mitarbeiter > Psychosoziale Medizin und Entwicklungsneurowissenschaften > Tel. +49 (0)351 458-5245 > Fax +49 (0)351 458-7206 > URL http://www.uniklinikum-dresden.de/psm; www.transdenlab.de > Universit?tsklinikum Carl Gustav Carus & Medizinische Fakult?t > an der Technischen Universit?t Dresden > Anstalt des ?ffentlichen Rechts des Freistaates Sachsen > Fetscherstra?e 74, 01307 Dresden > http://www.uniklinikum-dresden.de > Vorstand: Prof. Dr. med. D. M. Albrecht (Sprecher), Wilfried E. B. Winzer > Vorsitzender des Aufsichtsrates: Prof. Dr. med. Peter C. Scriba > USt.-IDNr.: DE 140 135 217, St.-Nr.: 203 145 03113 > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > -------------- next part -------------- An HTML attachment was scrubbed... URL: From Fabio.Bernardoni at uniklinikum-dresden.de Fri Jun 17 07:41:16 2016 From: Fabio.Bernardoni at uniklinikum-dresden.de (Bernardoni, Fabio) Date: Fri, 17 Jun 2016 11:41:16 +0000 Subject: [Neuroimaging] LocalGI with nipype In-Reply-To: References: <9CFEF48C0F425C48BC00CA74CE5F5FAC33717807@G06EDBN1.med.tu-dresden.de>, Message-ID: <9CFEF48C0F425C48BC00CA74CE5F5FAC33717AA3@G06EDBN1.med.tu-dresden.de> dear satra, thanks a lot for your message. however, for me, the localGI directive does not work. basically the localGI directive (and I suspect something similar would happen to the qcache directive, which is my next step) is always changed to the autorecon2 directive, as I can see in the command.txt file or in the report.rst, or by just printing reconall_gyri.interface.cmdline where I have defined: reconall_gyri = pe.Node(interface=fs.ReconAll(directive='localGI',T1_files=['/scratch/igis/data/mri/nii/64-23-371-1_atd1/20160522_121506aat1mprnssagiso60364-23-371-1atd1.nii'],ignore_exception=False, subject_id ='64-23-371-1_atd1', subjects_dir='/scratch/igis/processing/nipype/smri'), name="reconall_gyri") so basically I think this and qcache are not yet implemented in nipype (version'0.12.0-rc1' ). am I right or is there still something I am overlooking? how can I solve the problem? thanks Fabio Dr. Fabio Bernardoni wiss. Mitarbeiter Psychosoziale Medizin und Entwicklungsneurowissenschaften Tel. 
+49 (0)351 458-5245 Fax +49 (0)351 458-7206 URL http://www.uniklinikum-dresden.de/psm; www.transdenlab.de Universit?tsklinikum Carl Gustav Carus & Medizinische Fakult?t an der Technischen Universit?t Dresden Anstalt des ?ffentlichen Rechts des Freistaates Sachsen Fetscherstra?e 74, 01307 Dresden http://www.uniklinikum-dresden.de Vorstand: Prof. Dr. med. D. M. Albrecht (Sprecher), Wilfried E. B. Winzer Vorsitzender des Aufsichtsrates: Prof. Dr. med. Peter C. Scriba USt.-IDNr.: DE 140 135 217, St.-Nr.: 203 145 03113 ________________________________ Von: Neuroimaging [neuroimaging-bounces+fabio.bernardoni=uniklinikum-dresden.de at python.org]" im Auftrag von "Satrajit Ghosh [satra at mit.edu] Gesendet: Freitag, 17. Juni 2016 03:25 An: Neuroimaging analysis in Python Betreff: Re: [Neuroimaging] LocalGI with nipype dear fabio, recon-all provides subject_id as an output. so you can send that to a second recon-all (named differently that then computes the localGI. cheers, satra On Wed, Jun 15, 2016 at 1:51 PM, Bernardoni, Fabio > wrote: Dear all, I am trying to build a nipype workflow that does a preprocessing and computes the local gyrification index (LGI) for a set of subjects. On the freesurfer documentation I read that to compute the LGI, the pial surface must be present (very reasonable), so that one should first run recon all with the -all directive and once this has success one can run recon all with the -localGI directive: 1) recon-all -s -all 2) recon-all -s -localGI When I implement 2) in bash I do something like: if [ ! -f ./surf/rh.pial_lgi ]; then recon-all -s -localGI 2>&1 How do I implement this dependency in nipype? Normally I require that one of the output of process 1 (optimally the pial surface) is an input of process 2. But the pial surface cannot be an input of process 2 because the ReconAll node does not accept the pial surface as input (there is in general no output of ReconAll that is also its input). So, is the solution to create a derived class ReconAll_LGI with these properties? Thanks Fabio Dr. Fabio Bernardoni wiss. Mitarbeiter Psychosoziale Medizin und Entwicklungsneurowissenschaften Tel. +49 (0)351 458-5245 Fax +49 (0)351 458-7206 URL http://www.uniklinikum-dresden.de/psm; www.transdenlab.de Universit?tsklinikum Carl Gustav Carus & Medizinische Fakult?t an der Technischen Universit?t Dresden Anstalt des ?ffentlichen Rechts des Freistaates Sachsen Fetscherstra?e 74, 01307 Dresden http://www.uniklinikum-dresden.de Vorstand: Prof. Dr. med. D. M. Albrecht (Sprecher), Wilfried E. B. Winzer Vorsitzender des Aufsichtsrates: Prof. Dr. med. Peter C. Scriba USt.-IDNr.: DE 140 135 217, St.-Nr.: 203 145 03113 _______________________________________________ Neuroimaging mailing list Neuroimaging at python.org https://mail.python.org/mailman/listinfo/neuroimaging -------------- next part -------------- An HTML attachment was scrubbed... URL: From satra at mit.edu Fri Jun 17 08:24:41 2016 From: satra at mit.edu (Satrajit Ghosh) Date: Fri, 17 Jun 2016 08:24:41 -0400 Subject: [Neuroimaging] LocalGI with nipype In-Reply-To: <9CFEF48C0F425C48BC00CA74CE5F5FAC33717AA3@G06EDBN1.med.tu-dresden.de> References: <9CFEF48C0F425C48BC00CA74CE5F5FAC33717807@G06EDBN1.med.tu-dresden.de> <9CFEF48C0F425C48BC00CA74CE5F5FAC33717AA3@G06EDBN1.med.tu-dresden.de> Message-ID: hi fabio, could you please open an issue on github? we can continue this discussion there. 
cheers, satra On Fri, Jun 17, 2016 at 7:41 AM, Bernardoni, Fabio < Fabio.Bernardoni at uniklinikum-dresden.de> wrote: > dear satra, > > thanks a lot for your message. however, for me, the localGI directive does > not work. basically the localGI directive (and I suspect something similar > would happen to the qcache directive, which is my next step) is always > changed to the autorecon2 directive, as I can see in the command.txt file > or in the report.rst, or by just printing > > reconall_gyri.interface.cmdline > > where I have defined: > > reconall_gyri = > pe.Node(interface=fs.ReconAll(directive='localGI',T1_files=['/scratch/igis/data/mri/nii/64-23-371-1_atd1/20160522_121506aat1mprnssagiso60364-23-371-1atd1.nii'],ignore_exception=False, > subject_id ='64-23-371-1_atd1', > subjects_dir='/scratch/igis/processing/nipype/smri'), name="reconall_gyri") > > > > so basically I think this and qcache are not yet implemented in nipype > (version'0.12.0-rc1' ). > > am I right or is there still something I am overlooking? how can I solve > the problem? > > thanks > > Fabio > > > > Dr. Fabio Bernardoni > > wiss. Mitarbeiter > Psychosoziale Medizin und Entwicklungsneurowissenschaften > Tel. +49 (0)351 458-5245 > Fax +49 (0)351 458-7206 > URL http://www.uniklinikum-dresden.de/psm > ; > www.transdenlab.de > > > Universit?tsklinikum Carl Gustav Carus & Medizinische Fakult?t > an der Technischen Universit?t Dresden > Anstalt des ?ffentlichen Rechts des Freistaates Sachsen > Fetscherstra?e 74, 01307 Dresden > http://www.uniklinikum-dresden.de > > Vorstand: Prof. Dr. med. D. M. Albrecht (Sprecher), Wilfried E. B. Winzer > Vorsitzender des Aufsichtsrates: Prof. Dr. med. Peter C. Scriba > USt.-IDNr.: DE 140 135 217, St.-Nr.: 203 145 03113 > ------------------------------ > *Von:* Neuroimaging [neuroimaging-bounces+fabio.bernardoni= > uniklinikum-dresden.de at python.org]" im Auftrag von "Satrajit Ghosh [ > satra at mit.edu] > *Gesendet:* Freitag, 17. Juni 2016 03:25 > *An:* Neuroimaging analysis in Python > *Betreff:* Re: [Neuroimaging] LocalGI with nipype > > dear fabio, > > recon-all provides subject_id as an output. so you can send that to a > second recon-all (named differently that then computes the localGI. > > cheers, > > satra > > On Wed, Jun 15, 2016 at 1:51 PM, Bernardoni, Fabio < > Fabio.Bernardoni at uniklinikum-dresden.de> wrote: > >> Dear all, >> >> I am trying to build a nipype workflow that does a preprocessing and >> computes the local gyrification index (LGI) for a set of subjects. >> >> On the freesurfer documentation I read that to compute the LGI, the pial >> surface must be present (very reasonable), so that one should first run >> recon all with the -all directive and once this has success one can run >> recon all with the -localGI directive: >> >> 1) recon-all -s -all >> 2) recon-all -s -localGI >> >> When I implement 2) in bash I do something like: >> if [ ! -f ./surf/rh.pial_lgi ]; then >> recon-all -s -localGI 2>&1 >> >> How do I implement this dependency in nipype? Normally I require that one >> of the output of process 1 (optimally the pial surface) is an input of >> process 2. But the pial surface cannot be an input of process 2 because the >> ReconAll node does not accept the pial surface as input (there is in >> general no output of ReconAll that is also its input). So, is the solution >> to create a derived class ReconAll_LGI with these properties? >> >> Thanks >> Fabio >> >> >> >> Dr. Fabio Bernardoni >> wiss. 
Mitarbeiter >> Psychosoziale Medizin und Entwicklungsneurowissenschaften >> Tel. +49 (0)351 458-5245 >> Fax +49 (0)351 458-7206 >> URL http://www.uniklinikum-dresden.de/psm; www.transdenlab.de >> Universit?tsklinikum Carl Gustav Carus & Medizinische Fakult?t >> an der Technischen Universit?t Dresden >> Anstalt des ?ffentlichen Rechts des Freistaates Sachsen >> Fetscherstra?e 74, 01307 Dresden >> http://www.uniklinikum-dresden.de >> Vorstand: Prof. Dr. med. D. M. Albrecht (Sprecher), Wilfried E. B. Winzer >> Vorsitzender des Aufsichtsrates: Prof. Dr. med. Peter C. Scriba >> USt.-IDNr.: DE 140 135 217, St.-Nr.: 203 145 03113 >> _______________________________________________ >> Neuroimaging mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging >> > > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From satra at mit.edu Fri Jun 17 09:34:58 2016 From: satra at mit.edu (Satrajit Ghosh) Date: Fri, 17 Jun 2016 09:34:58 -0400 Subject: [Neuroimaging] advice on choosing a testing library Message-ID: hi folks, in nipype and other nipy projects we have used nose forever, but as matthew noted, nose will likely not be maintained in the future. python now has a unittest framework built into the standard library, and a large number of other testing frameworks. https://wiki.python.org/moin/PythonTestingToolsTaxonomy are there suggestions on how one might make the decision on which framework to choose? cheers, satra -------------- next part -------------- An HTML attachment was scrubbed... URL: From njs at pobox.com Fri Jun 17 10:25:16 2016 From: njs at pobox.com (Nathaniel Smith) Date: Fri, 17 Jun 2016 07:25:16 -0700 Subject: [Neuroimaging] advice on choosing a testing library In-Reply-To: References: Message-ID: AFAICT, literally everyone uses pytest, so that's what I've been moving to. It's fine. On Jun 17, 2016 06:35, "Satrajit Ghosh" wrote: hi folks, in nipype and other nipy projects we have used nose forever, but as matthew noted, nose will likely not be maintained in the future. python now has a unittest framework built into the standard library, and a large number of other testing frameworks. https://wiki.python.org/moin/PythonTestingToolsTaxonomy are there suggestions on how one might make the decision on which framework to choose? cheers, satra _______________________________________________ Neuroimaging mailing list Neuroimaging at python.org https://mail.python.org/mailman/listinfo/neuroimaging -------------- next part -------------- An HTML attachment was scrubbed... URL: From arokem at gmail.com Fri Jun 17 10:39:18 2016 From: arokem at gmail.com (Ariel Rokem) Date: Fri, 17 Jun 2016 07:39:18 -0700 Subject: [Neuroimaging] advice on choosing a testing library In-Reply-To: References: Message-ID: On Fri, Jun 17, 2016 at 7:25 AM, Nathaniel Smith wrote: > AFAICT, literally everyone uses pytest, so that's what I've been moving > to. It's fine. > +1. No problems with pytest so far (it's been just a few months for me working with it). Now also moving shablona to recommend that ( https://github.com/uwescience/shablona/pull/42) following Matthew's comment the other day. I don't have anything more clever to say about "how to choose", though. 
> On Jun 17, 2016 06:35, "Satrajit Ghosh" wrote: > > hi folks, > > in nipype and other nipy projects we have used nose forever, but as > matthew noted, nose will likely not be maintained in the future. python now > has a unittest framework built into the standard library, and a large > number of other testing frameworks. > > https://wiki.python.org/moin/PythonTestingToolsTaxonomy > > are there suggestions on how one might make the decision on which > framework to choose? > > cheers, > > satra > > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From njvack at wisc.edu Fri Jun 17 10:27:31 2016 From: njvack at wisc.edu (Nate Vack) Date: Fri, 17 Jun 2016 14:27:31 +0000 Subject: [Neuroimaging] advice on choosing a testing library In-Reply-To: References: Message-ID: +1 to pytest. -n On Fri, Jun 17, 2016 at 9:25 AM Nathaniel Smith wrote: > AFAICT, literally everyone uses pytest, so that's what I've been moving > to. It's fine. > On Jun 17, 2016 06:35, "Satrajit Ghosh" wrote: > > hi folks, > > in nipype and other nipy projects we have used nose forever, but as > matthew noted, nose will likely not be maintained in the future. python now > has a unittest framework built into the standard library, and a large > number of other testing frameworks. > > https://wiki.python.org/moin/PythonTestingToolsTaxonomy > > are there suggestions on how one might make the decision on which > framework to choose? > > cheers, > > satra > > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > -------------- next part -------------- An HTML attachment was scrubbed... URL: From satra at mit.edu Fri Jun 17 12:17:32 2016 From: satra at mit.edu (Satrajit Ghosh) Date: Fri, 17 Jun 2016 12:17:32 -0400 Subject: [Neuroimaging] advice on choosing a testing library In-Reply-To: References: Message-ID: thank you folks. is there a particular reason why the standard python unittest library is not being used? it seems from the documentation that the two are quite similar. https://docs.python.org/dev/library/unittest.html#basic-example if everyone is using py.test now, we can start moving towards it. cheers, satra On Fri, Jun 17, 2016 at 10:27 AM, Nate Vack wrote: > +1 to pytest. > > -n > > On Fri, Jun 17, 2016 at 9:25 AM Nathaniel Smith wrote: > >> AFAICT, literally everyone uses pytest, so that's what I've been moving >> to. It's fine. >> On Jun 17, 2016 06:35, "Satrajit Ghosh" wrote: >> >> hi folks, >> >> in nipype and other nipy projects we have used nose forever, but as >> matthew noted, nose will likely not be maintained in the future. python now >> has a unittest framework built into the standard library, and a large >> number of other testing frameworks. >> >> https://wiki.python.org/moin/PythonTestingToolsTaxonomy >> >> are there suggestions on how one might make the decision on which >> framework to choose? 
>> >> cheers, >> >> satra >> >> >> _______________________________________________ >> Neuroimaging mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging >> >> _______________________________________________ >> Neuroimaging mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging >> > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From njvack at wisc.edu Fri Jun 17 12:55:40 2016 From: njvack at wisc.edu (Nate Vack) Date: Fri, 17 Jun 2016 16:55:40 +0000 Subject: [Neuroimaging] advice on choosing a testing library In-Reply-To: References: Message-ID: I actually like pytest largely because it's so similar to unittest; however, on test errors it does a bunch of nice magic introspection to show me what's amiss. unittest is also good. I've also made use of the pytest's test parametrize decorator to test reading a whole heaping ton of file versions with one method, though there's an element of "this is dark magic and I kind of feel like maybe I should write tests to test my tests." In the end, it saved me a ton of code so I held my nose and did it; here's the example: https://github.com/njvack/bioread/blob/master/test/test_reader.py#L80 -n On Fri, Jun 17, 2016 at 11:23 AM Satrajit Ghosh wrote: > thank you folks. > > is there a particular reason why the standard python unittest library is > not being used? it seems from the documentation that the two are quite > similar. > > https://docs.python.org/dev/library/unittest.html#basic-example > > if everyone is using py.test now, we can start moving towards it. > > cheers, > > satra > > On Fri, Jun 17, 2016 at 10:27 AM, Nate Vack wrote: > >> +1 to pytest. >> >> -n >> >> On Fri, Jun 17, 2016 at 9:25 AM Nathaniel Smith wrote: >> >>> AFAICT, literally everyone uses pytest, so that's what I've been moving >>> to. It's fine. >>> On Jun 17, 2016 06:35, "Satrajit Ghosh" wrote: >>> >>> hi folks, >>> >>> in nipype and other nipy projects we have used nose forever, but as >>> matthew noted, nose will likely not be maintained in the future. python now >>> has a unittest framework built into the standard library, and a large >>> number of other testing frameworks. >>> >>> https://wiki.python.org/moin/PythonTestingToolsTaxonomy >>> >>> are there suggestions on how one might make the decision on which >>> framework to choose? >>> >>> cheers, >>> >>> satra >>> >>> >>> _______________________________________________ >>> Neuroimaging mailing list >>> Neuroimaging at python.org >>> https://mail.python.org/mailman/listinfo/neuroimaging >>> >>> _______________________________________________ >>> Neuroimaging mailing list >>> Neuroimaging at python.org >>> https://mail.python.org/mailman/listinfo/neuroimaging >>> >> >> _______________________________________________ >> Neuroimaging mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging >> >> > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > -------------- next part -------------- An HTML attachment was scrubbed... 
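For readers who have not seen it, the parametrize pattern Nate mentions looks roughly like this; the reader function and file names are toy stand-ins, not bioread code.

    import pytest

    # Toy stand-in for a real file reader.
    def read_version(filename):
        return int(filename.rsplit("_v", 1)[1].split(".")[0])

    @pytest.mark.parametrize("filename,expected", [
        ("data_v1.acq", 1),
        ("data_v2.acq", 2),
        ("data_v3.acq", 3),
    ])
    def test_read_version(filename, expected):
        # pytest generates one test case per (filename, expected) pair.
        assert read_version(filename) == expected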
URL: From japsai at gmail.com Fri Jun 17 14:55:00 2016 From: japsai at gmail.com (Jasper van den Bosch) Date: Fri, 17 Jun 2016 11:55:00 -0700 Subject: [Neuroimaging] advice on choosing a testing library In-Reply-To: References: Message-ID: I have never felt like I missed out when using unittest. The reason I've known people to choose nose or py.test is that you need less boilerplate code. But when you need to test hierarchies of classes, being used to the object-oriented style of unittest pays off. Jasper On 17 June 2016 at 09:55, Nate Vack wrote: > I actually like pytest largely because it's so similar to unittest; > however, on test errors it does a bunch of nice magic introspection to show > me what's amiss. unittest is also good. > > I've also made use of the pytest's test parametrize decorator to test > reading a whole heaping ton of file versions with one method, though > there's an element of "this is dark magic and I kind of feel like maybe I > should write tests to test my tests." In the end, it saved me a ton of code > so I held my nose and did it; here's the example: > > https://github.com/njvack/bioread/blob/master/test/test_reader.py#L80 > > -n > > On Fri, Jun 17, 2016 at 11:23 AM Satrajit Ghosh wrote: > >> thank you folks. >> >> is there a particular reason why the standard python unittest library is >> not being used? it seems from the documentation that the two are quite >> similar. >> >> https://docs.python.org/dev/library/unittest.html#basic-example >> >> if everyone is using py.test now, we can start moving towards it. >> >> cheers, >> >> satra >> >> On Fri, Jun 17, 2016 at 10:27 AM, Nate Vack wrote: >> >>> +1 to pytest. >>> >>> -n >>> >>> On Fri, Jun 17, 2016 at 9:25 AM Nathaniel Smith wrote: >>> >>>> AFAICT, literally everyone uses pytest, so that's what I've been moving >>>> to. It's fine. >>>> On Jun 17, 2016 06:35, "Satrajit Ghosh" wrote: >>>> >>>> hi folks, >>>> >>>> in nipype and other nipy projects we have used nose forever, but as >>>> matthew noted, nose will likely not be maintained in the future. python now >>>> has a unittest framework built into the standard library, and a large >>>> number of other testing frameworks. >>>> >>>> https://wiki.python.org/moin/PythonTestingToolsTaxonomy >>>> >>>> are there suggestions on how one might make the decision on which >>>> framework to choose? >>>> >>>> cheers, >>>> >>>> satra >>>> >>>> >>>> _______________________________________________ >>>> Neuroimaging mailing list >>>> Neuroimaging at python.org >>>> https://mail.python.org/mailman/listinfo/neuroimaging >>>> >>>> _______________________________________________ >>>> Neuroimaging mailing list >>>> Neuroimaging at python.org >>>> https://mail.python.org/mailman/listinfo/neuroimaging >>>> >>> >>> _______________________________________________ >>> Neuroimaging mailing list >>> Neuroimaging at python.org >>> https://mail.python.org/mailman/listinfo/neuroimaging >>> >>> >> _______________________________________________ >> Neuroimaging mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging >> > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... 
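A small sketch of the class-based style Jasper refers to, with throwaway data: a base TestCase carries the shared checks, and a subclass re-runs them against a different configuration.

    import unittest

    class BaseValuesTest(unittest.TestCase):
        values = [1, 2, 3]

        def test_not_empty(self):
            self.assertGreater(len(self.values), 0)

        def test_all_positive(self):
            self.assertTrue(all(v > 0 for v in self.values))

    class FloatValuesTest(BaseValuesTest):
        # Inherits both tests and runs them against different data.
        values = [0.5, 1.5, 2.5]

    if __name__ == "__main__":
        unittest.main()

pytest also collects and runs TestCase subclasses like these, so choosing pytest as the runner does not rule out this style.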
URL: From berleant at stanford.edu Fri Jun 17 14:46:43 2016 From: berleant at stanford.edu (Shoshana Berleant) Date: Fri, 17 Jun 2016 18:46:43 +0000 Subject: [Neuroimaging] advice on choosing a testing library In-Reply-To: References: Message-ID: It looks like pytest can run our existing test with minimal changes, too, which is a big plus: https://pytest.org/latest/nose.html On Fri, Jun 17, 2016 at 9:55 AM Nate Vack wrote: > I actually like pytest largely because it's so similar to unittest; > however, on test errors it does a bunch of nice magic introspection to show > me what's amiss. unittest is also good. > > I've also made use of the pytest's test parametrize decorator to test > reading a whole heaping ton of file versions with one method, though > there's an element of "this is dark magic and I kind of feel like maybe I > should write tests to test my tests." In the end, it saved me a ton of code > so I held my nose and did it; here's the example: > > https://github.com/njvack/bioread/blob/master/test/test_reader.py#L80 > > -n > > On Fri, Jun 17, 2016 at 11:23 AM Satrajit Ghosh wrote: > >> thank you folks. >> >> is there a particular reason why the standard python unittest library is >> not being used? it seems from the documentation that the two are quite >> similar. >> >> https://docs.python.org/dev/library/unittest.html#basic-example >> >> if everyone is using py.test now, we can start moving towards it. >> >> cheers, >> >> satra >> >> On Fri, Jun 17, 2016 at 10:27 AM, Nate Vack wrote: >> >>> +1 to pytest. >>> >>> -n >>> >>> On Fri, Jun 17, 2016 at 9:25 AM Nathaniel Smith wrote: >>> >>>> AFAICT, literally everyone uses pytest, so that's what I've been moving >>>> to. It's fine. >>>> On Jun 17, 2016 06:35, "Satrajit Ghosh" wrote: >>>> >>>> hi folks, >>>> >>>> in nipype and other nipy projects we have used nose forever, but as >>>> matthew noted, nose will likely not be maintained in the future. python now >>>> has a unittest framework built into the standard library, and a large >>>> number of other testing frameworks. >>>> >>>> https://wiki.python.org/moin/PythonTestingToolsTaxonomy >>>> >>>> are there suggestions on how one might make the decision on which >>>> framework to choose? >>>> >>>> cheers, >>>> >>>> satra >>>> >>>> >>>> _______________________________________________ >>>> Neuroimaging mailing list >>>> Neuroimaging at python.org >>>> https://mail.python.org/mailman/listinfo/neuroimaging >>>> >>>> _______________________________________________ >>>> Neuroimaging mailing list >>>> Neuroimaging at python.org >>>> https://mail.python.org/mailman/listinfo/neuroimaging >>>> >>> >>> _______________________________________________ >>> Neuroimaging mailing list >>> Neuroimaging at python.org >>> https://mail.python.org/mailman/listinfo/neuroimaging >>> >>> >> _______________________________________________ >> Neuroimaging mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging >> > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > -------------- next part -------------- An HTML attachment was scrubbed... 
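As a rough illustration of what "minimal changes" means in practice, a nose-style module like the sketch below (plain test_* functions with xunit-style module setup/teardown, no pytest imports) is collected and run by py.test as-is; the data here is made up.

    DATA = {}

    def setup_module():
        DATA["values"] = [1, 2, 3]

    def teardown_module():
        DATA.clear()

    def test_sum():
        assert sum(DATA["values"]) == 6

    def test_length():
        assert len(DATA["values"]) == 3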
URL: From garyfallidis at gmail.com Wed Jun 22 10:57:52 2016 From: garyfallidis at gmail.com (Eleftherios Garyfallidis) Date: Wed, 22 Jun 2016 14:57:52 +0000 Subject: [Neuroimaging] [dipy] [google hangout] Issues trying to install dipy In-Reply-To: References: Message-ID: Hi all, Just wanted to give you a quick feedback about the meeting we had with Jon to try and install DIPY for development work in Windows 10 (64bit) using an Anaconda installation. What we found is that if you are using recent Anaconda in Windows you do not need to install mingw or libpython (with conda). All you need after or before installing Anaconda is to install one of the free versions (or not) of the Visual C++ compiler (e.g. community edition). After that then cython will find the correct compiler and build the shared objects with no problem. So, in summary the steps are: 1. Install Anaconda for your windows version. 2. Install Visual C++ (or Studio) community edition (or other). 3. pip install nibabel 4. conda install vtk 5. cd dipy; 6. python setup.py develop (or python setup.py build_ext --inplace) I hope this helps. We need to update our installation page! Cygwin/mingw is no more for anaconda users. We should also check what happens with Openmp when Visual C++ is used. Many thanks to Jon for reporting the problem. Best regards, Eleftherios On Sat, Jun 18, 2016 at 8:50 PM Eleftherios Garyfallidis < garyfallidis at gmail.com> wrote: > Let's say just Saturday for now. Sunday is harder for me. > See you on Saturday at 11am EST. > > Best, > Eleftherios > > On Thu, Jun 16, 2016 at 6:36 PM Jon Haitz Legarreta < > jhlegarreta at vicomtech.org> wrote: > >> Dear Prof. Garyfallidis, >> OK, I will then be around at that time on both Saturday and Sunday so >> that we can start the hangout when you send me the invitation. >> See you then. >> >> Again, thanks for your time Prof. Garyfallidis. >> >> Sincerely, >> JON HAITZ >> >> >> -- >> >> Jon Haitz Legarreta Gorro?o >> Investigador / Researcher >> eSalud y Aplicaciones Biom?dicas / eHealth and Biomedical Applications >> Vicomtech-IK4 - Visual Interaction Communication Technologies >> Mikeletegi Pasealekua, 57 - Parque Tecnol?gico >> 20009 Donostia - San Sebasti?n - Spain >> Tel: +[34] 943 30 92 30 >> Fax: +[34] 943 30 93 93 >> e-mail: jhlegarreta at vicomtech.orgwww.vicomtech.org >> *** member of IK4 Research Alliance ****www.ik4.es >> *** member of GraphicsMedia.net ****www.graphicsmedia.net >> ----------------------------------------------------- >> Vicomtech-IK4 is an ISO 9001:2000 certified institute >> ----------------------------------------------------- >> Aviso Legal - Pol?tica de privacidad (http://www.vicomtech.org/proteccion-datos) >> Lege Oharra - Pribatutasun politika (http://www.vicomtech.org/eu/proteccion-datos) >> Legal Notice - Privacy policy (http://www.vicomtech.org/en/proteccion-datos) >> >> >> On 17 June 2016 at 00:29, Eleftherios Garyfallidis < >> garyfallidis at gmail.com> wrote: >> >>> Okay, thank you for letting me know. I will be travelling to Europe next >>> Monday. Maybe the easiest will be to have a short meeting during the >>> weekend let's say 11am EST. Otherwise, next week. But schedule for next >>> week is still chaotic. I will be also in OHBM exhibition in Geneva. >>> >>> On Thu, Jun 16, 2016 at 6:24 PM Jon Haitz Legarreta < >>> jhlegarreta at vicomtech.org> wrote: >>> >>>> Dear Prof. Garyfallidis, >>>> I have just been notified about a work meeting that will tentatively >>>> last until 10:00 EST, so I will not be able to make it. 
I am sorry Prof. >>>> Garyfallidis. >>>> >>>> We will need to arrange the hangout for another day. I will be >>>> available through the weekend as well, but otherwise I am open to hold the >>>> hangout next week on either Monday, Tuesday or Wednesday between 8:00-12:00 >>>> EST. >>>> >>>> If this suits you better, then great. Otherwise, we will need to >>>> postpone it. >>>> >>>> Using google hangout is not an issue. >>>> >>>> And yes, for the time we will be doing the hangout I will remove all >>>> Anaconda, nibabel and dipy so that we can start from a fresh installation. >>>> >>>> Thanks for your patience and your time. >>>> >>>> Sincerely, >>>> JON HAITZ >>>> >>>> >>>> -- >>>> >>>> Jon Haitz Legarreta Gorro?o >>>> Investigador / Researcher >>>> eSalud y Aplicaciones Biom?dicas / eHealth and Biomedical Applications >>>> Vicomtech-IK4 - Visual Interaction Communication Technologies >>>> Mikeletegi Pasealekua, 57 - Parque Tecnol?gico >>>> 20009 Donostia - San Sebasti?n - Spain >>>> Tel: +[34] 943 30 92 30 >>>> Fax: +[34] 943 30 93 93 >>>> e-mail: jhlegarreta at vicomtech.orgwww.vicomtech.org >>>> *** member of IK4 Research Alliance ****www.ik4.es >>>> *** member of GraphicsMedia.net ****www.graphicsmedia.net >>>> ----------------------------------------------------- >>>> Vicomtech-IK4 is an ISO 9001:2000 certified institute >>>> ----------------------------------------------------- >>>> Aviso Legal - Pol?tica de privacidad (http://www.vicomtech.org/proteccion-datos) >>>> Lege Oharra - Pribatutasun politika (http://www.vicomtech.org/eu/proteccion-datos) >>>> Legal Notice - Privacy policy (http://www.vicomtech.org/en/proteccion-datos) >>>> >>>> >>>> On 17 June 2016 at 00:04, Eleftherios Garyfallidis < >>>> garyfallidis at gmail.com> wrote: >>>> >>>>> Hi Jon, >>>>> >>>>> Apologies for the delay. I can hangout with you at 9am EST (EDT). >>>>> It will have to be a relatively short meeting as I will have another >>>>> meeting at 10am EST. I hope we can figure out a solution for you during >>>>> this time. >>>>> >>>>> You may want to try to completely erase all previous installations >>>>> both dipy, nibabel, and anaconda. So, you have a fresh system. >>>>> >>>>> I prefer to use google hangouts as we may need to screenshare. >>>>> >>>>> I hope this okay. >>>>> Cheers, >>>>> Eleftherios >>>>> >>>>> >>>>> >>>>> On Thu, Jun 16, 2016 at 5:38 PM Jon Haitz Legarreta < >>>>> jhlegarreta at vicomtech.org> wrote: >>>>> >>>>>> Dear Prof. Garyfallidis, >>>>>> would you be available to do the hangout tomorrow morning, between >>>>>> 08:00-15:00 EDT? >>>>>> >>>>>> Thank you. 
>>>>>> >>>>>> Sincerely, >>>>>> JON HAITZ >>>>>> >>>>>> -- >>>>>> >>>>>> Jon Haitz Legarreta Gorro?o >>>>>> Investigador / Researcher >>>>>> eSalud y Aplicaciones Biom?dicas / eHealth and Biomedical Applications >>>>>> Vicomtech-IK4 - Visual Interaction Communication Technologies >>>>>> Mikeletegi Pasealekua, 57 - Parque Tecnol?gico >>>>>> 20009 Donostia - San Sebasti?n - Spain >>>>>> Tel: +[34] 943 30 92 30 >>>>>> Fax: +[34] 943 30 93 93 >>>>>> e-mail: jhlegarreta at vicomtech.orgwww.vicomtech.org >>>>>> *** member of IK4 Research Alliance ****www.ik4.es >>>>>> *** member of GraphicsMedia.net ****www.graphicsmedia.net >>>>>> ----------------------------------------------------- >>>>>> Vicomtech-IK4 is an ISO 9001:2000 certified institute >>>>>> ----------------------------------------------------- >>>>>> Aviso Legal - Pol?tica de privacidad (http://www.vicomtech.org/proteccion-datos) >>>>>> Lege Oharra - Pribatutasun politika (http://www.vicomtech.org/eu/proteccion-datos) >>>>>> Legal Notice - Privacy policy (http://www.vicomtech.org/en/proteccion-datos) >>>>>> >>>>>> >>>>>> ---------- Forwarded message ---------- >>>>>> From: Jon Haitz Legarreta >>>>>> Date: 6 June 2016 at 00:16 >>>>>> Subject: Fwd: [dipy] [google hangout] Issues trying to install dipy >>>>>> To: Eleftherios Garyfallidis >>>>>> >>>>>> >>>>>> Dear Prof. Garyfallidis, >>>>>> could we program the hangout to discuss about the dipy installation >>>>>> issues from sources at some point this or next week? >>>>>> >>>>>> My availability would be as follows at this time: >>>>>> Friday, June 10th: 08:00-15:00 EDT >>>>>> Wednesday, June 15th: 9:00-15:00 EDT >>>>>> Friday, June 17th: 08:00-15:00 EDT >>>>>> >>>>>> Otherwise, I will be available through the weekend as well (except >>>>>> for this Saturday, June 11th). >>>>>> >>>>>> I already posted the question to the Anaconda users' forum [1], but >>>>>> it got little attention from the community. >>>>>> >>>>>> Thank you. >>>>>> >>>>>> Sincerely, >>>>>> JON HAITZ >>>>>> >>>>>> [1] >>>>>> https://groups.google.com/a/continuum.io/forum/#!topic/anaconda/eQcYKQubSJ8 >>>>>> >>>>>> >>>>>> ---------- Forwarded message ---------- >>>>>> From: Jon Haitz Legarreta >>>>>> Date: 1 June 2016 at 19:14 >>>>>> Subject: [dipy] [google hangout] Issues trying to install dipy >>>>>> To: garyfallidis at gmail.com >>>>>> >>>>>> >>>>>> Dear Prof. Garyfallidis, >>>>>> first of all, thank you for your support and availability to try to >>>>>> fix the dipy installation issues I am experiencing. >>>>>> >>>>>> Concerning your offer to have a google hangout, given my schedule, >>>>>> and the fact that the time shift between and Spain, where I am based, and >>>>>> Sherbrooke is 8 hours, my availability for the following days would be as >>>>>> follows: >>>>>> >>>>>> Friday, June 3rd: 08:00-11:00 EDT >>>>>> Wednesday, June 8th: 9:00-15:00 EDT >>>>>> Friday, June 10th: 08:00-11:00 EDT >>>>>> >>>>>> If this is too tight for your schedule, we could look for an >>>>>> alternative. I would also be available through this weekend. But, of >>>>>> course, I would understand that you preferred to have the google hangout >>>>>> during the week. >>>>>> >>>>>> I look forward to your reply. >>>>>> >>>>>> Sincerely, >>>>>> Jon Haitz LEGARRETA >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> >>>> >> -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From albayenes at gmail.com Wed Jun 22 11:08:50 2016 From: albayenes at gmail.com (=?UTF-8?B?RW5lcyBBbGJheSAtINin2YbYsyDYp9mE2KjYp9mJ?=) Date: Wed, 22 Jun 2016 18:08:50 +0300 Subject: [Neuroimaging] Dipy - Basic Tracking references Message-ID: Hi all, I have noticed that in general, examples in documentation of Dipy have references. But Introduction to Basic Tracking page does not have any reference. Does anybody know any reference for this code? -- Enes Albay -------------- next part -------------- An HTML attachment was scrubbed... URL: From garyfallidis at gmail.com Wed Jun 22 11:13:42 2016 From: garyfallidis at gmail.com (Eleftherios Garyfallidis) Date: Wed, 22 Jun 2016 15:13:42 +0000 Subject: [Neuroimaging] Dipy - Basic Tracking references In-Reply-To: References: Message-ID: Hello Enes, Thank you for your excellent question. We do need the citations! Please cite the main DIPY paper. http://journal.frontiersin.org/article/10.3389/fninf.2014.00008/full Another paper explaining the tracking framework will be submitted asap. But for now please do cite the paper of the link above. Best regards, Eleftherios ?On Wed, Jun 22, 2016 at 5:09 PM ?Enes Albay - ??? ??????? < albayenes at gmail.com> wrote:? > Hi all, > > I have noticed that in general, examples in documentation of Dipy have > references. But Introduction to Basic Tracking > > page does not have any reference. > > Does anybody know any reference for this code? > > -- > Enes Albay > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > -------------- next part -------------- An HTML attachment was scrubbed... URL: From albayenes at gmail.com Wed Jun 22 11:21:18 2016 From: albayenes at gmail.com (=?UTF-8?B?RW5lcyBBbGJheSAtINin2YbYsyDYp9mE2KjYp9mJ?=) Date: Wed, 22 Jun 2016 18:21:18 +0300 Subject: [Neuroimaging] Dipy - Basic Tracking references In-Reply-To: References: Message-ID: Thanks for your reply. On Wed, Jun 22, 2016 at 6:13 PM, Eleftherios Garyfallidis < garyfallidis at gmail.com> wrote: > Hello Enes, > > Thank you for your excellent question. We do need the citations! > > Please cite the main DIPY paper. > > http://journal.frontiersin.org/article/10.3389/fninf.2014.00008/full > > Another paper explaining the tracking framework will be submitted asap. > But for now please do cite the paper of the link above. > > Best regards, > Eleftherios > > > ?On Wed, Jun 22, 2016 at 5:09 PM ?Enes Albay - ??? ??????? < > albayenes at gmail.com> wrote:? > >> Hi all, >> >> I have noticed that in general, examples in documentation of Dipy have >> references. But Introduction to Basic Tracking >> >> page does not have any reference. >> >> Does anybody know any reference for this code? >> >> -- >> Enes Albay >> _______________________________________________ >> Neuroimaging mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging >> > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -- Enes Albay -------------- next part -------------- An HTML attachment was scrubbed... URL: From jhlegarreta at vicomtech.org Wed Jun 22 17:53:51 2016 From: jhlegarreta at vicomtech.org (Jon Haitz Legarreta) Date: Wed, 22 Jun 2016 23:53:51 +0200 Subject: [Neuroimaging] [dipy] [google hangout] Issues trying to install dipy In-Reply-To: References: Message-ID: Hi there, yep, that's right. 
Thanks for the follow-up message, Eleftherios, and sorry for lagging behind with the answer. The procedure we followed is the one Eleftherios described. We also created an environment variable called PYTHONPATH pointing to the location where the dipy sources reside in my machine. I don't know whether this was strictly necessary in the end. I run the automatic tests with the following results: run: 539; skipped=3, errors=2, failures=1 I guess it's OK. In my case, I was not particularly put off by the apparent dependency on VS, since anyways sooner or later I was to install it for other C++ developments. Now, this is feed for thought: although VS Community Edition is free, as it seems from the effort we made, it looks like we need to install the entire tool. I ignore whether other "lighter" solutions exist. Since Anaconda kept looking for the VS compiler, although I explicitly set to use its built-in MinGW32, so it looks unavoidable. But I guess it would be interesting to find the root of the issue or to find a workaround to avoid such a heavy dependency. I ignore whether this issue is new on Win 10 systems (may be it was already present on Win 8?), or specific to the Anaconda distributions I downloaded, or a combination of both. We tried both with a Anaconda with Python 2.7 and 3.0 versions with similar concerns. The related question I posted [1] to the Anaconda mailing list is still unanswered. Thanks for the effort Elefhterios. JON HAITZ [1] https://groups.google.com/a/continuum.io/forum/#!searchin/anaconda/dipy/anaconda/eQcYKQubSJ8/Dd4Wt4FPAwAJ -- On 22 June 2016 at 16:57, Eleftherios Garyfallidis wrote: > Hi all, > > Just wanted to give you a quick feedback about the meeting we had with Jon > to try and install DIPY for development work in Windows 10 (64bit) using an > Anaconda installation. > > What we found is that if you are using recent Anaconda in Windows you do > not need to install mingw or libpython (with conda). All you need after or > before installing Anaconda is to install one of the free versions (or not) > of the Visual C++ compiler (e.g. community edition). After that then cython > will find the correct compiler and build the shared objects with no problem. > So, in summary the steps are: > 1. Install Anaconda for your windows version. > 2. Install Visual C++ (or Studio) community edition (or other). > 3. pip install nibabel > 4. conda install vtk > 5. cd dipy; > 6. python setup.py develop (or python setup.py build_ext --inplace) > > I hope this helps. We need to update our installation page! Cygwin/mingw > is no more for anaconda users. We should also check what happens with > Openmp when Visual C++ is used. > > Many thanks to Jon for reporting the problem. > > Best regards, > Eleftherios > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jhlegarreta at vicomtech.org Wed Jun 22 17:54:40 2016 From: jhlegarreta at vicomtech.org (Jon Haitz Legarreta) Date: Wed, 22 Jun 2016 23:54:40 +0200 Subject: [Neuroimaging] [dipy] Dashboard to submit test results? Message-ID: Hi there, is there any public dashboard where we can automatically submit the results of running the dipy tests? Thank you, JON HAITZ -- -------------- next part -------------- An HTML attachment was scrubbed... URL: From alexsavio at gmail.com Wed Jun 22 18:05:06 2016 From: alexsavio at gmail.com (alexsavio at gmail.com) Date: Thu, 23 Jun 2016 00:05:06 +0200 Subject: [Neuroimaging] [dipy] Dashboard to submit test results? In-Reply-To: References: Message-ID: Hi Jon, Qu? tal? 
:) If you enable Travis on your dipy fork repository you will automatically see the testing results when you push code to your repository. For that, first you need to link your Github account to a Travis account. I hope this helps. Cheers, Alex Alexandre Manh?es Savio PhD, Medical Imaging, Machine Learning Klinikum rechts der Isar, TUM, M?nchen https://alexsavio.github.io Nebenstellennummer: 4570 On 22 June 2016 at 23:54, Jon Haitz Legarreta wrote: > Hi there, > is there any public dashboard where we can automatically submit the > results of running the dipy tests? > > Thank you, > JON HAITZ > > -- > > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From garyfallidis at gmail.com Wed Jun 22 18:10:52 2016 From: garyfallidis at gmail.com (Eleftherios Garyfallidis) Date: Wed, 22 Jun 2016 22:10:52 +0000 Subject: [Neuroimaging] [dipy] [google hangout] Issues trying to install dipy In-Reply-To: References: Message-ID: Thanks for the feedback Jon! On Wed, Jun 22, 2016, 11:54 PM Jon Haitz Legarreta < jhlegarreta at vicomtech.org> wrote: > Hi there, > yep, that's right. Thanks for the follow-up message, Eleftherios, and > sorry for lagging behind with the answer. > > The procedure we followed is the one Eleftherios described. > > We also created an environment variable called PYTHONPATH pointing to the > location where the dipy sources reside in my machine. I don't know whether > this was strictly necessary in the end. > > I run the automatic tests with the following results: > run: 539; > skipped=3, errors=2, failures=1 > > I guess it's OK. > > In my case, I was not particularly put off by the apparent dependency on > VS, since anyways sooner or later I was to install it for other C++ > developments. > > Now, this is feed for thought: although VS Community Edition is free, as > it seems from the effort we made, it looks like we need to install the > entire tool. I ignore whether other "lighter" solutions exist. Since > Anaconda kept looking for the VS compiler, although I explicitly set to use > its built-in MinGW32, so it looks unavoidable. But I guess it would be > interesting to find the root of the issue or to find a workaround to avoid > such a heavy dependency. > > I ignore whether this issue is new on Win 10 systems (may be it was > already present on Win 8?), or specific to the Anaconda distributions I > downloaded, or a combination of both. We tried both with a Anaconda with > Python 2.7 and 3.0 versions with similar concerns. > > The related question I posted [1] to the Anaconda mailing list is still > unanswered. > > Thanks for the effort Elefhterios. > JON HAITZ > > [1] > https://groups.google.com/a/continuum.io/forum/#!searchin/anaconda/dipy/anaconda/eQcYKQubSJ8/Dd4Wt4FPAwAJ > > -- > > On 22 June 2016 at 16:57, Eleftherios Garyfallidis > wrote: > >> Hi all, >> >> Just wanted to give you a quick feedback about the meeting we had with >> Jon to try and install DIPY for development work in Windows 10 (64bit) >> using an Anaconda installation. >> >> What we found is that if you are using recent Anaconda in Windows you do >> not need to install mingw or libpython (with conda). All you need after or >> before installing Anaconda is to install one of the free versions (or not) >> of the Visual C++ compiler (e.g. community edition). 
After that then cython >> will find the correct compiler and build the shared objects with no problem. >> So, in summary the steps are: >> 1. Install Anaconda for your windows version. >> 2. Install Visual C++ (or Studio) community edition (or other). >> 3. pip install nibabel >> 4. conda install vtk >> 5. cd dipy; >> 6. python setup.py develop (or python setup.py build_ext --inplace) >> >> I hope this helps. We need to update our installation page! Cygwin/mingw >> is no more for anaconda users. We should also check what happens with >> Openmp when Visual C++ is used. >> >> Many thanks to Jon for reporting the problem. >> >> Best regards, >> Eleftherios >> >> -------------- next part -------------- An HTML attachment was scrubbed... URL: From jhlegarreta at vicomtech.org Thu Jun 23 01:31:43 2016 From: jhlegarreta at vicomtech.org (Jon Haitz Legarreta) Date: Thu, 23 Jun 2016 07:31:43 +0200 Subject: [Neuroimaging] [dipy] Dashboard to submit test results? In-Reply-To: References: Message-ID: Hi Alex, OK, thanks. It definitely helps ! Kind regards, JON HAITZ -- On 23 June 2016 at 00:05, alexsavio at gmail.com wrote: > Hi Jon, > > Qu? tal? :) > > If you enable Travis on your dipy fork repository you will automatically > see the testing results when you push code to your repository. > For that, first you need to link your Github account to a Travis account. > > I hope this helps. > > Cheers, > Alex > > > Alexandre Manh?es Savio > PhD, Medical Imaging, Machine Learning > Klinikum rechts der Isar, TUM, M?nchen > https://alexsavio.github.io > Nebenstellennummer: 4570 > > On 22 June 2016 at 23:54, Jon Haitz Legarreta > wrote: > >> Hi there, >> is there any public dashboard where we can automatically submit the >> results of running the dipy tests? >> >> Thank you, >> JON HAITZ >> >> -- >> >> >> _______________________________________________ >> Neuroimaging mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging >> >> > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From rahim.mehdi at gmail.com Thu Jun 23 13:43:04 2016 From: rahim.mehdi at gmail.com (Mehdi Rahim) Date: Thu, 23 Jun 2016 19:43:04 +0200 Subject: [Neuroimaging] Nilearn tutorial @OHBM on 06/26 Message-ID: Dear all, We are pleased to announce a Nilearn (https://nilearn.github.io) tutorial at OHBM - Open Science Special Interest Group Hackathon - on Sunday June 26th from 9 am to 4:30 pm. The tutorial will cover: - Basic examples (to get the ball rolling) - Manipulating brain image volumes (loading, masking, confound removal, etc.) - Visualization of brain images (stats maps, glass brains, connectomes) - Decoding and predicting from brain images (classical predictors, structured MVPA, etc.) - Functional connectivity (estimation of connectivity matrices, etc.) - ICA/dictionary learning/parcellations. We hope to see you there! The Nilearn team -------------- next part -------------- An HTML attachment was scrubbed... 
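For readers who have not used Nilearn before, the masking and plotting topics listed above look roughly like the following sketch; the file names are placeholders and the parameters are only illustrative, not the tutorial's actual material:

import matplotlib
matplotlib.use("Agg")  # headless backend, optional

from nilearn import plotting
from nilearn.input_data import NiftiMasker

func_img = "func.nii.gz"   # placeholder 4D functional image
stat_img = "zmap.nii.gz"   # placeholder statistical map

# Mask the 4D image into a (time points x voxels) array.
masker = NiftiMasker(standardize=True, detrend=True)
data = masker.fit_transform(func_img)
print(data.shape)

# Plot the statistical map and save the figure to disk.
display = plotting.plot_stat_map(stat_img, threshold=3.0)
display.savefig("zmap.png")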
URL: From matthew.brett at gmail.com Thu Jun 23 20:59:43 2016 From: matthew.brett at gmail.com (Matthew Brett) Date: Thu, 23 Jun 2016 17:59:43 -0700 Subject: [Neuroimaging] Some nipy tutorials Message-ID: Hi, I just did some small nipy tutorials, and a barely modified copy of Omar's excellent dipy affine tutorial, linked from here: http://practical-neuroimaging.github.io/analysis-clinic/nipy_intro.html I'd be very grateful for feedback, corrections, etc. Cheers, Matthew From stjeansam at gmail.com Fri Jun 24 02:35:19 2016 From: stjeansam at gmail.com (Samuel St-Jean) Date: Fri, 24 Jun 2016 08:35:19 +0200 Subject: [Neuroimaging] [nibabel] sform/qform flipping left - right in the affine and (possibly) fsl Message-ID: <91ecab46-10b5-ef86-aedf-f8b3781beffa@gmail.com> Hello, So this is probably gonna look as a confused question because I still do not fully understand the issue myself. Anyway, reading upon https://github.com/nipy/nibabel/pull/90 and playing with loading a nifti and saving it back, it seems like sometimes the affine, sform and qform do not fully agree upon something and end up overwriting each other. This also seems to only happen with data that went through fsl, like the HCP datasets. So basically a normal pipeline involving nibabel is like volume = nib.load('my_data.nii.gz') data = volume.get_data() affine volume.get_affine() ## Do stuff on data nib.save(nib.Nifti1Image(data, affine),'my_new_data.nii.gz') Note how I did not save the header in this case. Now, for almost all cases, this works fine and a new header get created. And now for the question : Is it a bad idea to strip out the header (since dtype, pixdim and other things might change depending on the processing involved)? Is it possible to save back exactly the same header / sform / qform so that data won't be flipped in e.g. fslvew afterward? Is it just because software don't play well together or there is an unseen issue I don't get here? I also remember a few years back playing the whole afternoon with get/set qform and sform rewriting our affines everytime, so I though might as well ask everyone since we never fully figured out the logic between what sets the affine in the header and why. Thanks for reading, Samuel From arokem at gmail.com Sat Jun 25 13:48:30 2016 From: arokem at gmail.com (Ariel Rokem) Date: Sat, 25 Jun 2016 10:48:30 -0700 Subject: [Neuroimaging] Fwd: [GitHub] Subscribed to nipy/staged-recipes notifications In-Reply-To: <576e580529742_383a3f927642f2a044874d@github-lowworker18-cp1-prd.iad.github.net.mail> References: <576e580529742_383a3f927642f2a044874d@github-lowworker18-cp1-prd.iad.github.net.mail> Message-ID: Hey Satra, Just checking in about this: what is the plan for this? As you might know, there are already a few conda-forge recipes for nipy packages: https://github.com/conda-forge/nibabel-feedstock/ https://github.com/conda-forge/nitime-feedstock/ https://github.com/conda-forge/pydicom-feedstock/ https://github.com/conda-forge/dipy-feedstock/ https://github.com/conda-forge/nilearn-feedstock/ https://github.com/conda-forge/dcmstack-feedstock/ (and more that I don't know of?) Is there any advantage to creating a fork for the nipy org? (BTW: I can't speak for the nilearn recipe, because I didn't create it, but if anyone is interested in joining as a maintainer of any of the others, feel free to add yourself in a PR to the relevant section of the recipe yaml file). 
(BTBTW: If anyone is interested in help/guidance is setting up one of these, let me know and I can share from my own experience; the conda-forg org is super-helpful though, so they will also help shepherd you through the process. A really great community!) Thanks! Ariel ---------- Forwarded message ---------- From: GitHub Date: Sat, Jun 25, 2016 at 3:08 AM Subject: [GitHub] Subscribed to nipy/staged-recipes notifications To: arokem at gmail.com Hey there, we?re just writing to let you know that you?ve been automatically subscribed to a repository on GitHub. nipy/staged-recipes forked by satra from conda-forge/staged-recipes A place to submit conda recipes before they become fully fledged conda-forge feedstocks https://github.com/nipy/staged-recipes You?ll receive notifications for all issues, pull requests, and comments that happen inside the repository. If you would like to stop watching this repository, you can manage your settings here: https://github.com/nipy/staged-recipes/subscription You can unwatch this repository immediately by clicking here: https://github.com/nipy/staged-recipes/unsubscribe_via_email/AAHPNpd6rPkNEoi4twlYrh2YpbcP_4nBks5qPP4FgaJpZM4DsRgl You were automatically subscribed because you?ve been given push access to the repository. Thanks! -------------- next part -------------- An HTML attachment was scrubbed... URL: From satra at mit.edu Sat Jun 25 18:19:05 2016 From: satra at mit.edu (Satrajit Ghosh) Date: Sun, 26 Jun 2016 00:19:05 +0200 Subject: [Neuroimaging] Fwd: [GitHub] Subscribed to nipy/staged-recipes notifications In-Reply-To: References: <576e580529742_383a3f927642f2a044874d@github-lowworker18-cp1-prd.iad.github.net.mail> Message-ID: hi ariel, i didn't really know what i was doing, but felt that if i were to create some staging for feedstocks i might as well do it under the nipy repo. i was planning to make one for nipype and perhaps a few others. if you think i should fork it elsewhere let me know. cheers, satra On Sat, Jun 25, 2016 at 7:48 PM, Ariel Rokem wrote: > Hey Satra, > > Just checking in about this: what is the plan for this? As you might know, > there are already a few conda-forge recipes for nipy packages: > > https://github.com/conda-forge/nibabel-feedstock/ > https://github.com/conda-forge/nitime-feedstock/ > https://github.com/conda-forge/pydicom-feedstock/ > https://github.com/conda-forge/dipy-feedstock/ > https://github.com/conda-forge/nilearn-feedstock/ > https://github.com/conda-forge/dcmstack-feedstock/ > > (and more that I don't know of?) > > Is there any advantage to creating a fork for the nipy org? > > > (BTW: I can't speak for the nilearn recipe, because I didn't create it, > but if anyone is interested in joining as a maintainer of any of the > others, feel free to add yourself in a PR to the relevant section of the > recipe yaml file). > > (BTBTW: If anyone is interested in help/guidance is setting up one of > these, let me know and I can share from my own experience; the conda-forg > org is super-helpful though, so they will also help shepherd you through > the process. A really great community!) > > Thanks! > > Ariel > > > ---------- Forwarded message ---------- > From: GitHub > Date: Sat, Jun 25, 2016 at 3:08 AM > Subject: [GitHub] Subscribed to nipy/staged-recipes notifications > To: arokem at gmail.com > > > > Hey there, we?re just writing to let you know that you?ve been > automatically subscribed to a repository on GitHub. 
> > nipy/staged-recipes forked by satra from conda-forge/staged-recipes > A place to submit conda recipes before they become fully fledged > conda-forge feedstocks > https://github.com/nipy/staged-recipes > > You?ll receive notifications for all issues, pull requests, and comments > that happen inside the repository. If you would like to stop watching this > repository, you can manage your settings here: > > https://github.com/nipy/staged-recipes/subscription > > You can unwatch this repository immediately by clicking here: > > > https://github.com/nipy/staged-recipes/unsubscribe_via_email/AAHPNpd6rPkNEoi4twlYrh2YpbcP_4nBks5qPP4FgaJpZM4DsRgl > > You were automatically subscribed because you?ve been given push access to > the repository. > > Thanks! > > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From arokem at gmail.com Sat Jun 25 18:32:13 2016 From: arokem at gmail.com (Ariel Rokem) Date: Sat, 25 Jun 2016 15:32:13 -0700 Subject: [Neuroimaging] Fwd: [GitHub] Subscribed to nipy/staged-recipes notifications In-Reply-To: References: <576e580529742_383a3f927642f2a044874d@github-lowworker18-cp1-prd.iad.github.net.mail> Message-ID: On Sat, Jun 25, 2016 at 3:19 PM, Satrajit Ghosh wrote: > hi ariel, > > i didn't really know what i was doing, but felt that if i were to create > some staging for feedstocks i might as well do it under the nipy repo. i > was planning to make one for nipype and perhaps a few others. > Ah - I thought you were thinking of doing something clever where we would somehow circumvent the limits on the Github API and on the CI servers, or maybe you were starting a new conda channel :-) if you think i should fork it elsewhere let me know. > I've been forking these to my own user account, and making PRs from there into staged-recipes. You can add multiple maintainers by adding their GH user names in the relevant section of the recipe, so you can share the responsibility widely. The conda-forge elves (for lack of a better term; "admins" perhaps?) are pretty pro-active in adding people related to a project to the list of maintainers. > cheers, > > satra > > On Sat, Jun 25, 2016 at 7:48 PM, Ariel Rokem wrote: > >> Hey Satra, >> >> Just checking in about this: what is the plan for this? As you might >> know, there are already a few conda-forge recipes for nipy packages: >> >> https://github.com/conda-forge/nibabel-feedstock/ >> https://github.com/conda-forge/nitime-feedstock/ >> https://github.com/conda-forge/pydicom-feedstock/ >> https://github.com/conda-forge/dipy-feedstock/ >> https://github.com/conda-forge/nilearn-feedstock/ >> https://github.com/conda-forge/dcmstack-feedstock/ >> >> (and more that I don't know of?) >> >> Is there any advantage to creating a fork for the nipy org? >> >> >> (BTW: I can't speak for the nilearn recipe, because I didn't create it, >> but if anyone is interested in joining as a maintainer of any of the >> others, feel free to add yourself in a PR to the relevant section of the >> recipe yaml file). >> >> (BTBTW: If anyone is interested in help/guidance is setting up one of >> these, let me know and I can share from my own experience; the conda-forg >> org is super-helpful though, so they will also help shepherd you through >> the process. A really great community!) >> >> Thanks! 
>> >> Ariel >> >> >> ---------- Forwarded message ---------- >> From: GitHub >> Date: Sat, Jun 25, 2016 at 3:08 AM >> Subject: [GitHub] Subscribed to nipy/staged-recipes notifications >> To: arokem at gmail.com >> >> >> >> Hey there, we?re just writing to let you know that you?ve been >> automatically subscribed to a repository on GitHub. >> >> nipy/staged-recipes forked by satra from conda-forge/staged-recipes >> A place to submit conda recipes before they become fully fledged >> conda-forge feedstocks >> https://github.com/nipy/staged-recipes >> >> You?ll receive notifications for all issues, pull requests, and comments >> that happen inside the repository. If you would like to stop watching this >> repository, you can manage your settings here: >> >> https://github.com/nipy/staged-recipes/subscription >> >> You can unwatch this repository immediately by clicking here: >> >> >> https://github.com/nipy/staged-recipes/unsubscribe_via_email/AAHPNpd6rPkNEoi4twlYrh2YpbcP_4nBks5qPP4FgaJpZM4DsRgl >> >> You were automatically subscribed because you?ve been given push access >> to the repository. >> >> Thanks! >> >> >> _______________________________________________ >> Neuroimaging mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging >> >> > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From matthew.brett at gmail.com Sat Jun 25 18:01:19 2016 From: matthew.brett at gmail.com (Matthew Brett) Date: Sat, 25 Jun 2016 15:01:19 -0700 Subject: [Neuroimaging] nipy, dipy manylinux wheels Message-ID: Hi, I took the liberty of building and testing and then uploading manylinux1 [1] wheels for nipy [2] and dipy [3]. On almost any Intel / AMD-based Linux you can now get numpy, scipy, nipy, dipy all with binary wheels, so installation should be pretty quick and painless. Try with something like: virtualenv for-wheels source for-wheels/bin/activate python -m pip install --upgrade pip # get latest pip pip install numpy nibabel scipy nipy dipy While you are at it: pip install scikit-learn This also works on OSX, but that's been so for a while now. Cheers, Matthew [1] https://www.python.org/dev/peps/pep-0513 [2] https://travis-ci.org/MacPython/nipy-wheels [3] https://travis-ci.org/MacPython/dipy-wheels From stjeansam at gmail.com Mon Jun 27 08:04:48 2016 From: stjeansam at gmail.com (Samuel St-Jean) Date: Mon, 27 Jun 2016 14:04:48 +0200 Subject: [Neuroimaging] [nibabel] sform/qform flipping left - right in the affine and (possibly) fsl In-Reply-To: <91ecab46-10b5-ef86-aedf-f8b3781beffa@gmail.com> References: <91ecab46-10b5-ef86-aedf-f8b3781beffa@gmail.com> Message-ID: So, to add a more practical example to what I outlined above, seems like saving a dataset in this manner preserves the sform, but changes the qform. According to the doc, they should both provide the same information in reconstructing the affine (possible from different origins). So I compared the header of an original nifti file and the same thing after saving it without headers, and here are the interesting parts. 
- sform is identical - sform name changed from sform_name Scanner Anat sform_code 1 to sform_name Aligned Anat sform_code 2 - some qform fields went from qform_name Scanner Anat qform_code 1 qto_xyz:1 1.796652 0.000000 0.000000 -115.754318 qto_xyz:2 0.000000 1.795833 -0.054337 -90.885376 qto_xyz:3 0.000000 0.054236 1.799180 54.572971 to qform_name Unknown qform_code 0 qto_xyz:1 1.796652 0.000000 0.000000 0.000000 qto_xyz:2 0.000000 1.796652 0.000000 0.000000 qto_xyz:3 0.000000 0.000000 1.800000 0.000000 Which leaves me wondering what is to be trusted in these fields. Seems like the translation of origin in qform changed (and the name is unknown, so nibabel ignores it on loading), but the sform name changed without changing the actual matrix and now has the translation of the qform. Is there a way to know how these fields are consistent when recreated by nibabel versus the original provided in the (now stripped) header? Or maybe the fsl suite just does not respect some convention I am unaware of (like trusting undefined qform over sform or something else like that), which is also possible, and would fall out of scope of this mailing list. Samuel 2016-06-24 8:35 GMT+02:00 Samuel St-Jean : > Hello, > > So this is probably gonna look as a confused question because I still do > not fully understand the issue myself. Anyway, reading upon > https://github.com/nipy/nibabel/pull/90 and playing with loading a nifti > and saving it back, it seems like sometimes the affine, sform and qform do > not fully agree upon something and end up overwriting each other. > > This also seems to only happen with data that went through fsl, like the > HCP datasets. So basically a normal pipeline involving nibabel is like > > > volume = nib.load('my_data.nii.gz') > data = volume.get_data() > affine volume.get_affine() > > ## Do stuff on data > > nib.save(nib.Nifti1Image(data, affine),'my_new_data.nii.gz') > > > Note how I did not save the header in this case. Now, for almost all > cases, this works fine and a new header get created. And now for the > question : > > Is it a bad idea to strip out the header (since dtype, pixdim and other > things might change depending on the processing involved)? > Is it possible to save back exactly the same header / sform / qform so > that data won't be flipped in e.g. fslvew afterward? Is it just because > software don't play well together or there is an unseen issue I don't get > here? > > I also remember a few years back playing the whole afternoon with get/set > qform and sform rewriting our affines everytime, so I though might as well > ask everyone since we never fully figured out the logic between what sets > the affine in the header and why. > > > Thanks for reading, > > Samuel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jean.christophe.houde at gmail.com Mon Jun 27 09:06:50 2016 From: jean.christophe.houde at gmail.com (Jean-Christophe Houde) Date: Mon, 27 Jun 2016 09:06:50 -0400 Subject: [Neuroimaging] [nibabel] sform/qform flipping left - right in the affine and (possibly) fsl In-Reply-To: References: <91ecab46-10b5-ef86-aedf-f8b3781beffa@gmail.com> Message-ID: Just a quick note: qform_code = 0 normally means that it is not set, so I'm surprised to see some values in your last qto_xyz... 2016-06-27 8:04 GMT-04:00 Samuel St-Jean : > So, to add a more practical example to what I outlined above, seems like > saving a dataset in this manner preserves the sform, but changes the qform. 
> According to the doc, they should both provide the same information in > reconstructing the affine (possible from different origins). So I compared > the header of an original nifti file and the same thing after saving it > without headers, and here are the interesting parts. > > - sform is identical > - sform name changed from > > sform_name Scanner Anat > sform_code 1 > > to > > sform_name Aligned Anat > sform_code 2 > > - some qform fields went from > > qform_name Scanner Anat > qform_code 1 > qto_xyz:1 1.796652 0.000000 0.000000 -115.754318 > qto_xyz:2 0.000000 1.795833 -0.054337 -90.885376 > qto_xyz:3 0.000000 0.054236 1.799180 54.572971 > > to > > qform_name Unknown > qform_code 0 > qto_xyz:1 1.796652 0.000000 0.000000 0.000000 > qto_xyz:2 0.000000 1.796652 0.000000 0.000000 > qto_xyz:3 0.000000 0.000000 1.800000 0.000000 > > Which leaves me wondering what is to be trusted in these fields. Seems > like the translation of origin in qform changed (and the name is unknown, > so nibabel ignores it on loading), but the sform name changed without > changing the actual matrix and now has the translation of the qform. Is > there a way to know how these fields are consistent when recreated by > nibabel versus the original provided in the (now stripped) header? Or maybe > the fsl suite just does not respect some convention I am unaware of (like > trusting undefined qform over sform or something else like that), which is > also possible, and would fall out of scope of this mailing list. > > Samuel > > 2016-06-24 8:35 GMT+02:00 Samuel St-Jean : > >> Hello, >> >> So this is probably gonna look as a confused question because I still do >> not fully understand the issue myself. Anyway, reading upon >> https://github.com/nipy/nibabel/pull/90 and playing with loading a nifti >> and saving it back, it seems like sometimes the affine, sform and qform do >> not fully agree upon something and end up overwriting each other. >> >> This also seems to only happen with data that went through fsl, like the >> HCP datasets. So basically a normal pipeline involving nibabel is like >> >> >> volume = nib.load('my_data.nii.gz') >> data = volume.get_data() >> affine volume.get_affine() >> >> ## Do stuff on data >> >> nib.save(nib.Nifti1Image(data, affine),'my_new_data.nii.gz') >> >> >> Note how I did not save the header in this case. Now, for almost all >> cases, this works fine and a new header get created. And now for the >> question : >> >> Is it a bad idea to strip out the header (since dtype, pixdim and other >> things might change depending on the processing involved)? >> Is it possible to save back exactly the same header / sform / qform so >> that data won't be flipped in e.g. fslvew afterward? Is it just because >> software don't play well together or there is an unseen issue I don't get >> here? >> >> I also remember a few years back playing the whole afternoon with get/set >> qform and sform rewriting our affines everytime, so I though might as well >> ask everyone since we never fully figured out the logic between what sets >> the affine in the header and why. >> >> >> Thanks for reading, >> >> Samuel >> >> > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... 
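As an aside for anyone reproducing the comparison above, the transform codes can be inspected directly from nibabel rather than from an external header dump; a minimal sketch, with a placeholder file name:

import nibabel as nib

img = nib.load('my_data.nii.gz')  # placeholder file name

# coded=True returns the affine together with its transform code, so a
# qform_code of 0 ("unknown", i.e. not set) is visible directly.
sform, sform_code = img.get_sform(coded=True)
qform, qform_code = img.get_qform(coded=True)
print('sform code:', sform_code)
print('qform code:', qform_code)

# img.affine is the affine nibabel itself picks (sform if valid,
# otherwise qform, otherwise the fall-back header geometry).
affine = img.affine
print(affine)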
URL: From pauldmccarthy at gmail.com Mon Jun 27 11:49:11 2016 From: pauldmccarthy at gmail.com (paul mccarthy) Date: Mon, 27 Jun 2016 16:49:11 +0100 Subject: [Neuroimaging] indexed access to gziped files In-Reply-To: References: <20160311063312.GF3792063@phare.normalesup.org> <5F6A858FD00E5F4A82E3206D2D854EF8A26E6C2E@EXMB10.ohsu.edu> Message-ID: Howdy all, Apologies for taking so long on this - I've been busy with my real work. But I've managed to get my indexed gzip project to a useable state - check it out here: https://github.com/pauldmccarthy/indexed_gzip I've tested it a fair bit via direct usage, but would like to add a bit more test coverage. And I'd like to add some basic write support - a function which writes out the full file, but re-builds the index as it does so, so the file can then be re-opened, with fast random-access available immediately. Apart from that, it's ready to be used. As far as nibabel integration goes, I'm not really sure if any changes to nibabel are necessary - I think it would be perfectly reasonable to put the onus on the user to manage their own file handle, and create their nibabel images via a file map. The only potential change that I think would be useful is the ability to create an image directly from a file handle (instead of using from_file_map). What does everybody think? Cheers, Paul On 15 March 2016 at 01:58, Matthew Brett wrote: > Hi, > > On Mon, Mar 14, 2016 at 2:12 PM, paul mccarthy > wrote: > > Hi Matthew, > > > > Thanks for clarifying the flieobj 'dance'! > > > > I had meant to ask you about cython - it looks like a good option (and is > > recommended in the official docs - > > https://docs.python.org/3/howto/cporting.html), so I'll look into it. > > Perhaps the best way forward would be for me to drop the mailing list a > line > > when I've got something in a more useable state. > > That would be great. Please feel free to ask for help with Cython, we > have a lot of collective experience here on the list. > > Cheers, > > Matthew > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > -------------- next part -------------- An HTML attachment was scrubbed... URL: From matthew.brett at gmail.com Mon Jun 27 14:20:33 2016 From: matthew.brett at gmail.com (Matthew Brett) Date: Mon, 27 Jun 2016 11:20:33 -0700 Subject: [Neuroimaging] [nibabel] sform/qform flipping left - right in the affine and (possibly) fsl In-Reply-To: References: <91ecab46-10b5-ef86-aedf-f8b3781beffa@gmail.com> Message-ID: Hi, On Mon, Jun 27, 2016 at 6:06 AM, Jean-Christophe Houde wrote: > Just a quick note: qform_code = 0 normally means that it is not set, so I'm > surprised to see some values in your last qto_xyz... > > 2016-06-27 8:04 GMT-04:00 Samuel St-Jean : >> >> So, to add a more practical example to what I outlined above, seems like >> saving a dataset in this manner preserves the sform, but changes the qform. >> According to the doc, they should both provide the same information in >> reconstructing the affine (possible from different origins). So I compared >> the header of an original nifti file and the same thing after saving it >> without headers, and here are the interesting parts. 
>> >> - sform is identical >> - sform name changed from >> >> sform_name Scanner Anat >> sform_code 1 >> >> to >> >> sform_name Aligned Anat >> sform_code 2 >> >> - some qform fields went from >> >> qform_name Scanner Anat >> qform_code 1 >> qto_xyz:1 1.796652 0.000000 0.000000 -115.754318 >> qto_xyz:2 0.000000 1.795833 -0.054337 -90.885376 >> qto_xyz:3 0.000000 0.054236 1.799180 54.572971 >> >> to >> >> qform_name Unknown >> qform_code 0 >> qto_xyz:1 1.796652 0.000000 0.000000 0.000000 >> qto_xyz:2 0.000000 1.796652 0.000000 0.000000 >> qto_xyz:3 0.000000 0.000000 1.800000 0.000000 Just to clarify - the rules for sform / qform are the following: If you don't pass a header to image construction, then the input `affine` gets set into both sform and qform, with default sform / qform output codes. If you do pass a header, and the input `affine` is the same as the affine implied by the header (either via the sform or qform), or the input affine is None, then the header (therefore sform / qform) is unchanged. If you do pass a header, and the affine is different from that implied by the header, both sform and qform get set to match the input affine. Does that explain what you are seeing? Cheers, Matthew From jhlegarreta at vicomtech.org Mon Jun 27 17:35:41 2016 From: jhlegarreta at vicomtech.org (Jon Haitz Legarreta) Date: Mon, 27 Jun 2016 23:35:41 +0200 Subject: [Neuroimaging] Fwd: [dipy] Dashboard to submit test results? In-Reply-To: References: Message-ID: Hi again, I may have been too fast in answering: as expected according to your words, and the documentation, travis will display the testing results when pushing a patch. But what if I want to run the tests for the latest version on the remote (in this case, the dipy master), even if I do not modify the source? Is it possible to be sending the test results to some common travis dashboard so that maintainers can see the results from potentially contributing developers? Thanks, JON HAITZ ---------- Forwarded message ---------- From: Jon Haitz Legarreta Date: 23 June 2016 at 07:31 Subject: Re: [Neuroimaging] [dipy] Dashboard to submit test results? To: Neuroimaging analysis in Python Hi Alex, OK, thanks. It definitely helps ! Kind regards, JON HAITZ -- On 23 June 2016 at 00:05, alexsavio at gmail.com wrote: > Hi Jon, > > Qu? tal? :) > > If you enable Travis on your dipy fork repository you will automatically > see the testing results when you push code to your repository. > For that, first you need to link your Github account to a Travis account. > > I hope this helps. > > Cheers, > Alex > > > Alexandre Manh?es Savio > PhD, Medical Imaging, Machine Learning > Klinikum rechts der Isar, TUM, M?nchen > https://alexsavio.github.io > Nebenstellennummer: 4570 > > On 22 June 2016 at 23:54, Jon Haitz Legarreta > wrote: > >> Hi there, >> is there any public dashboard where we can automatically submit the >> results of running the dipy tests? >> >> Thank you, >> JON HAITZ >> >> -- >> >> >> _______________________________________________ >> Neuroimaging mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging >> >> > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... 
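Independently of any shared dashboard, the suite whose numbers were quoted earlier in the thread (run, skipped, errors, failures) can be rerun locally; a small sketch, with the caveat that passing the bare package name to nose is an assumption, and a path to the dipy source tree can be used instead:

import nose

# Run the tests of the installed dipy package with nose, the runner the
# project used at the time; returns True if everything passed.
passed = nose.run(argv=['nosetests', '--verbosity=2', 'dipy'])
print('all tests passed:', passed)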
URL: From stjeansam at gmail.com Tue Jun 28 02:50:30 2016 From: stjeansam at gmail.com (Samuel St-Jean) Date: Tue, 28 Jun 2016 08:50:30 +0200 Subject: [Neuroimaging] [nibabel] sform/qform flipping left - right in the affine and (possibly) fsl In-Reply-To: References: <91ecab46-10b5-ef86-aedf-f8b3781beffa@gmail.com> Message-ID: I'll need to play a bit with it today and ask some guy about the flipping, I am reporting for someone else actually. So, it seems to be harmless to put back the header in the case of unchanged affine (as it is in my situation), could be that other software downstream does not really like something else. I did not reuse it because of dtypes, voxelsize and other things which may wildly change on processed data. Anyway, this example is using your case number 1, no header, affine is gotten from get_affine. Of course it was not modified during the processing, and I end up with a different sform/qform. Accoridng to your first point, I expected them to be identical, but I end up with a sligthly different (and diagonal) qform. They also have different names, so ifthey were set to the same thing I would expect them to be pristine copies. That's a bit troublesome now as JC pointed out, since we all use the same set of scripts for the heavy processing, which don't reuse headers as I can remember. Does not seem to cause any problem for us, don't want to receive hate mail by other people using some of our stuff and saying we broke their data though. Are sform and qform set to match the affine in exactly the same way, or does some transformation is applied and then they are identical? I went through the info on the nibabel website about headers and affine, so maybe I just missed it also if anyone has other info to add. Le 2016-06-27 ? 20:20, Matthew Brett a ?crit : > Hi, > > On Mon, Jun 27, 2016 at 6:06 AM, Jean-Christophe Houde > wrote: >> Just a quick note: qform_code = 0 normally means that it is not set, so I'm >> surprised to see some values in your last qto_xyz... >> >> 2016-06-27 8:04 GMT-04:00 Samuel St-Jean : >>> So, to add a more practical example to what I outlined above, seems like >>> saving a dataset in this manner preserves the sform, but changes the qform. >>> According to the doc, they should both provide the same information in >>> reconstructing the affine (possible from different origins). So I compared >>> the header of an original nifti file and the same thing after saving it >>> without headers, and here are the interesting parts. >>> >>> - sform is identical >>> - sform name changed from >>> >>> sform_name Scanner Anat >>> sform_code 1 >>> >>> to >>> >>> sform_name Aligned Anat >>> sform_code 2 >>> >>> - some qform fields went from >>> >>> qform_name Scanner Anat >>> qform_code 1 >>> qto_xyz:1 1.796652 0.000000 0.000000 -115.754318 >>> qto_xyz:2 0.000000 1.795833 -0.054337 -90.885376 >>> qto_xyz:3 0.000000 0.054236 1.799180 54.572971 >>> >>> to >>> >>> qform_name Unknown >>> qform_code 0 >>> qto_xyz:1 1.796652 0.000000 0.000000 0.000000 >>> qto_xyz:2 0.000000 1.796652 0.000000 0.000000 >>> qto_xyz:3 0.000000 0.000000 1.800000 0.000000 > Just to clarify - the rules for sform / qform are the following: > > If you don't pass a header to image construction, then the input > `affine` gets set into both sform and qform, with default sform / > qform output codes. 
> > If you do pass a header, and the input `affine` is the same as the > affine implied by the header (either via the sform or qform), or the > input affine is None, then the header (therefore sform / qform) is > unchanged. > > If you do pass a header, and the affine is different from that implied > by the header, both sform and qform get set to match the input affine. > > Does that explain what you are seeing? > > Cheers, > > Matthew > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging From yoh at onerussian.com Mon Jun 27 18:24:51 2016 From: yoh at onerussian.com (Yaroslav Halchenko) Date: Tue, 28 Jun 2016 00:24:51 +0200 Subject: [Neuroimaging] indexed access to gziped files In-Reply-To: References: <20160311063312.GF3792063@phare.normalesup.org> <5F6A858FD00E5F4A82E3206D2D854EF8A26E6C2E@EXMB10.ohsu.edu> Message-ID: <3D62826A-B584-418F-B491-B08CD7DD10C0@onerussian.com> Hi Paul, I just want to say - carry on! I haven't tried it yet but we might later make use of your project within datalad project to access content within archives... I hope we will investigate that opportunity in the near future Related issue https://github.com/datalad/datalad/issues/373 Cheers On June 27, 2016 5:49:11 PM GMT+02:00, paul mccarthy wrote: >Howdy all, > >Apologies for taking so long on this - I've been busy with my real >work. >But I've managed to get my indexed gzip project to a useable state - >check >it out here: > >https://github.com/pauldmccarthy/indexed_gzip > >I've tested it a fair bit via direct usage, but would like to add a bit >more test coverage. And I'd like to add some basic write support - a >function which writes out the full file, but re-builds the index as it >does >so, so the file can then be re-opened, with fast random-access >available >immediately. > >Apart from that, it's ready to be used. > >As far as nibabel integration goes, I'm not really sure if any changes >to >nibabel are necessary - I think it would be perfectly reasonable to put >the >onus on the user to manage their own file handle, and create their >nibabel >images via a file map. > >The only potential change that I think would be useful is the ability >to >create an image directly from a file handle (instead of using >from_file_map). What does everybody think? > >Cheers, > >Paul > >On 15 March 2016 at 01:58, Matthew Brett >wrote: > >> Hi, >> >> On Mon, Mar 14, 2016 at 2:12 PM, paul mccarthy > >> wrote: >> > Hi Matthew, >> > >> > Thanks for clarifying the flieobj 'dance'! >> > >> > I had meant to ask you about cython - it looks like a good option >(and is >> > recommended in the official docs - >> > https://docs.python.org/3/howto/cporting.html), so I'll look into >it. >> > Perhaps the best way forward would be for me to drop the mailing >list a >> line >> > when I've got something in a more useable state. >> >> That would be great. Please feel free to ask for help with Cython, >we >> have a lot of collective experience here on the list. >> >> Cheers, >> >> Matthew >> _______________________________________________ >> Neuroimaging mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging >> > > >------------------------------------------------------------------------ > >_______________________________________________ >Neuroimaging mailing list >Neuroimaging at python.org >https://mail.python.org/mailman/listinfo/neuroimaging -- Sent from a phone which beats iPhone. 
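To make the file-map route Paul describes concrete, here is a rough sketch of handing an IndexedGzipFile to nibabel; the IndexedGzipFile constructor arguments are an assumption based on the project README and may differ between versions, and the file name is a placeholder:

import nibabel as nib
import indexed_gzip as igzip

# Open a .nii.gz through the index-backed gzip reader (assumed API).
fobj = igzip.IndexedGzipFile(filename='big_image.nii.gz')

# Hand the open file object to nibabel via a file map, i.e. the
# "manage your own file handle" route described above.
fh = nib.FileHolder(fileobj=fobj)
img = nib.Nifti1Image.from_file_map({'header': fh, 'image': fh})

# Random access into the data should now avoid decompressing the whole
# gzip stream from the start on every seek.
print(img.shape)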
-------------- next part -------------- An HTML attachment was scrubbed... URL: From Maxime.Descoteaux at USherbrooke.ca Tue Jun 28 05:17:31 2016 From: Maxime.Descoteaux at USherbrooke.ca (Maxime Descoteaux) Date: Tue, 28 Jun 2016 09:17:31 +0000 Subject: [Neuroimaging] Diffusion workshop in Lisbon Message-ID: <9C6EC371-7D88-47EF-B341-5E9AA2B36FAD@usherbrooke.ca> -------------------------------- Subject: Diffusion MRI Workshop, 12-16 September, 2016 -------------------------------- Registration is officially open for the 4th ISMRM Diffusion MRI Workshop "Breaking Barriers in Diffusion MRI" to be held September 12-16 in Lisbon, Portugal! Dates: September 12-16, 2016 Venue: Sheraton Lisboa, Lisbon, Portugal Deadline for abstracts: July 21, 2016 Facebook: https://www.facebook.com/DSGworkshop2016/ As with the first three workshops held in 2002, 2005 and 2013, the workshop has been designed as the primary scientific venue for basic scientists and clinical researchers in the diffusion MR community to learn, meet the experts, discuss the latest advances in the field, and network. There is an optional bootcamp for trainees and those new to the field of diffusion MR on September 12. The formal scientific program from September 13-16 will cover the full scope of diffusion MR topics from acquisition to microstructure and modeling, anatomy, tractography, connectomics, validation, clinical application, and future strategies for innovation. Whether you are new to the field, a PhD student, postdoc, senior researcher or physician, we hope that you will be able to participate in the workshop and interact with the leaders in the field. Please note that the deadline for scientific abstract submission is July 21, 2016. For physicians in practice, the ISMRM is accredited by the ACGME to provide up to 25.75 AMA PRA Category 1 credits for participation in the workshop. For additional details, please visit the workshop website: http://www.ismrm.org/workshops/Diffusion16 or contact any of the organizing committee: Mara Cercignani, Brighton & Sussex Medical School, Brighton, England, UK Maxime Descoteaux, Sherbrooke University, Sherbrooke, QC, Canada Tim B. Dyrby, Danish Research Centre for Magnetic Resonance, Hvidovre, Denmark Christopher Hess, University of California, San Francisco, CA, USA Alexander Leemans, University Medical Center Utrecht, Utrecht, The Netherlands -------------- next part -------------- An HTML attachment was scrubbed... URL: From Maxime.Descoteaux at USherbrooke.ca Tue Jun 28 05:17:30 2016 From: Maxime.Descoteaux at USherbrooke.ca (Maxime Descoteaux) Date: Tue, 28 Jun 2016 09:17:30 +0000 Subject: [Neuroimaging] Diffusion workshop in Lisbon Message-ID: -------------------------------- Subject: Diffusion MRI Workshop, 12-16 September, 2016 -------------------------------- Registration is officially open for the 4th ISMRM Diffusion MRI Workshop "Breaking Barriers in Diffusion MRI" to be held September 12-16 in Lisbon, Portugal! Dates: September 12-16, 2016 Venue: Sheraton Lisboa, Lisbon, Portugal Deadline for abstracts: July 21, 2016 Facebook: https://www.facebook.com/DSGworkshop2016/ As with the first three workshops held in 2002, 2005 and 2013, the workshop has been designed as the primary scientific venue for basic scientists and clinical researchers in the diffusion MR community to learn, meet the experts, discuss the latest advances in the field, and network. There is an optional bootcamp for trainees and those new to the field of diffusion MR on September 12. 
The formal scientific program from September 13-16 will cover the full scope of diffusion MR topics from acquisition to microstructure and modeling, anatomy, tractography, connectomics, validation, clinical application, and future strategies for innovation. Whether you are new to the field, a PhD student, postdoc, senior researcher or physician, we hope that you will be able to participate in the workshop and interact with the leaders in the field.

Please note that the deadline for scientific abstract submission is July 21, 2016.

For physicians in practice, the ISMRM is accredited by the ACGME to provide up to 25.75 AMA PRA Category 1 credits for participation in the workshop.

For additional details, please visit the workshop website: http://www.ismrm.org/workshops/Diffusion16 or contact any of the organizing committee:

Mara Cercignani, Brighton & Sussex Medical School, Brighton, England, UK
Maxime Descoteaux, Sherbrooke University, Sherbrooke, QC, Canada
Tim B. Dyrby, Danish Research Centre for Magnetic Resonance, Hvidovre, Denmark
Christopher Hess, University of California, San Francisco, CA, USA
Alexander Leemans, University Medical Center Utrecht, Utrecht, The Netherlands

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From matthew.brett at gmail.com  Tue Jun 28 09:13:18 2016
From: matthew.brett at gmail.com (Matthew Brett)
Date: Tue, 28 Jun 2016 06:13:18 -0700
Subject: [Neuroimaging] [nibabel] sform/qform flipping left - right in the affine and (possibly) fsl
In-Reply-To: 
References: <91ecab46-10b5-ef86-aedf-f8b3781beffa@gmail.com>
Message-ID: 

Hi,

On Mon, Jun 27, 2016 at 11:50 PM, Samuel St-Jean wrote:
> I'll need to play a bit with it today and ask some guy about the flipping; I
> am reporting for someone else actually. So, it seems to be harmless to put
> back the header in the case of an unchanged affine (as it is in my situation);
> it could be that other software downstream does not handle anything else well.
> I did not reuse it because of dtypes, voxel sizes and other things which may
> wildly change on processed data.
>
> Anyway, this example is using your case number 1: no header, and the affine
> obtained from get_affine. Of course it was not modified during the processing,
> and I end up with a different sform/qform. According to your first point, I
> expected them to be identical, but I end up with a slightly different (and
> diagonal) qform. They also have different names, so if they were set to the
> same thing I would expect them to be pristine copies.
>
> That's a bit troublesome now, as JC pointed out, since we all use the same
> set of scripts for the heavy processing, which don't reuse headers as far as I
> can remember. It does not seem to cause any problem for us, but we don't want
> to receive hate mail from other people using some of our stuff and saying we
> broke their data.
>
> Are the sform and qform set to match the affine in exactly the same way, or is
> some transformation applied so that they end up identical? I went through the
> info on the nibabel website about headers and affines, so maybe I just missed
> it; also, if anyone has other info to add, please do.

Well - remember that the qform is stored differently from the sform.
In particular, the qform cannot store shears, because it's the
composition of the translations on a pure rotation on the zooms,
whereas the sform can, because it stores a full 3 x 4 affine matrix.
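As a small illustration of that difference (standard nibabel calls, not taken from the thread), an affine with a shear survives the sform round trip but, if I read nibabel's behaviour right, only its closest shear-free approximation ends up in the qform; the array shape and the size of the shear below are arbitrary:

    import numpy as np
    import nibabel as nib

    affine = np.eye(4)
    affine[0, 1] = 0.05                    # a small shear term
    img = nib.Nifti1Image(np.zeros((4, 4, 4), dtype=np.float32), affine)

    img.set_sform(affine)                  # the sform stores the full 3 x 4 affine
    img.set_qform(affine)                  # the qform keeps translations + rotation + zooms only

    sform, sform_code = img.get_sform(coded=True)
    qform, qform_code = img.get_qform(coded=True)

    print(np.allclose(sform, affine))      # expected True: the shear is preserved
    print(np.allclose(qform, affine))      # expected False: the shear cannot be represented
    print(sform_code, qform_code)          # the two transform codes need not match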
The sform and the qform will have different so-called "transform codes",
if the sform and qform both get set (rules as described in my last
email). This is because the default transform codes are different for
the sform and qform, and we use the defaults.

The flip should always be the same though, when the sform and qform get
set together - at least if the external software is reading the header
correctly.

Cheers,

Matthew

From hao.freesurfer at hotmail.com  Wed Jun 29 04:25:07 2016
From: hao.freesurfer at hotmail.com (Hao wen)
Date: Wed, 29 Jun 2016 08:25:07 +0000
Subject: [Neuroimaging] wrap a matlab script for Nipype
Message-ID: 

Hello,

I am new to nipype, and recently I have wanted to wrap a MATLAB script for Nipype, but the example on your website is not very specific, and I checked all the posts you have written about it; it is still not clear to me, whether for the command-line route or for writing the MATLAB interface ourselves. Could you put together a more complete tutorial? In fact, the tutorial on the nipype website is not very well structured, in my opinion...

Specifically, here is my case: I want to wrap a MATLAB function with a nipype FreeSurfer node. My MATLAB function just takes some jpg files and saves them to the output directory. I followed some options from the tutorial and other posts. By the way, I am on Ubuntu 12.04.

def runmatlab():
    from nipype.interfaces.matlab import MatlabCommand
    mlab = MatlabCommand()
    mlab.inputs.script = "/aramis/home/wen/HAO_lab/testhelloworld.m"
    out = mlab.run()
    out.outputs['matlab_output']
    print out.outputs.matlab_output
    return out.outputs.matlab_output

from nipype.interfaces.utility import Function
import nipype.pipeline.engine as pe

runmatlab = pe.Node(name='runmatlab',
                    interface=Function(input_names=[],
                                       output_names=['out_file'],
                                       function=runmatlab))
runmatlab.run()

The MATLAB file just prints 'hello world'.

I got this in the _report file:

Node: utility
=============

Hierarchy : runmatlab
Exec ID : runmatlab

Original Inputs
---------------

* function_str : S'def runmatlab():\n    from nipype.interfaces.matlab import MatlabCommand\n    mlab = MatlabCommand()\n    mlab.inputs.script = "/aramis/home/wen/HAO_lab/testhelloworld.m"\n    out = mlab.run()\n    out.outputs[\'matlab_output\'] \n    print out.outputs.matlab_output\n    return out.outputs.matlab_output\n' .
* ignore_exception : False

In the runmatlab node directory, I got some files.

Also, I tried the tutorial example 2 on the nipype website: http://nipy.org/nipype/devel/matlab_interface_devel.html. In my Spyder session I got no display, but in my directory I got the pyscript.m; when I run it in MATLAB myself, I get the result. Does that mean that Spyder can't find my MATLAB? But I saw in some post that you said MatlabCommand will not use the MATLAB engine; it is just given some MATLAB script to run?

Any advice will be appreciated :)

Hao

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From hao.freesurfer at hotmail.com  Wed Jun 29 04:32:38 2016
From: hao.freesurfer at hotmail.com (Hao wen)
Date: Wed, 29 Jun 2016 08:32:38 +0000
Subject: [Neuroimaging] wrap a matlab script for Nipype
In-Reply-To: 
References: 
Message-ID: 

For the function version (runmatlab), in Spyder I got the result like this:

RuntimeError: Command:
matlab -nodesktop -nosplash -singleCompThread -r "addpath('/tmp/tmppHQYo8/runmatlab');pyscript;exit"
Standard output:
Warning: Unable to open display ':1'. You will not be able to display graphics on the screen.
Warning: No window system found. Java option 'MWT' ignored

< M A T L A B (R) >
Copyright 1984-2010 The MathWorks, Inc.
Version 7.11.0.584 (R2010b) 64-bit (glnxa64)
August 16, 2010

To get started, type one of these: helpwin, helpdesk, or demo.
For product information, visit www.mathworks.com.

Executing pyscript at 29-Jun-2016 10:28:27:
-------------------------------------------------------------------------------------
MATLAB Version 7.11.0.584 (R2010b)
MATLAB License Number: 295724
Operating System: Linux 3.11.0-26-generic #45~precise1-Ubuntu SMP Tue Jul 15 04:02:35 UTC 2014 x86_64
Java VM Version: Java 1.6.0_17-b04 with Sun Microsystems Inc. Java HotSpot(TM) 64-Bit Server VM mixed mode
-------------------------------------------------------------------------------------
MATLAB                              Version 7.11   (R2010b)
Database Toolbox                    Version 3.8    (R2010b)
Image Processing Toolbox            Version 7.1    (R2010b)
MATLAB Compiler                     Version 4.14   (R2010b)
Neural Network Toolbox              Version 7.0    (R2010b)
Optimization Toolbox                Version 5.1    (R2010b)
Parallel Computing Toolbox          Version 5.0    (R2010b)
Parallel Computing Toolbox          Version 5.0    (R2010b)
Signal Processing Toolbox           Version 6.14   (R2010b)
Statistical Parametric Mapping      Version 5236   (SPM8)
Statistics Toolbox                  Version 7.4    (R2010b)
Wavelet Toolbox                     Version 4.6    (R2010b)
Standard error:
/usr/cenir/matlabR2010b_64/bin/matlab: 1: /servernas/usr_cenir/matlabR2010b_64/bin/util/oscheck.sh: /lib64/libc.so.6: not found
MATLAB code threw an exception:
Undefined function or variable 'testhelloworld'.
File:/tmp/tmppHQYo8/runmatlab/pyscript.m Name:pyscript Line:5
Return code: 0
Interface MatlabCommand failed to run.
Interface Function failed to run.

In the command line, the addpath actually points to the wrong path; my MATLAB script is in:
/aramis/home/wen/HAO_lab/testhelloworld.m

I think it is because I did not find the right documentation for nipype, since I am new to nipype...

Thanks,
Bonne journée
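If it helps, here is a minimal sketch of how MatlabCommand appears to be meant to be used for this case: the script input takes MATLAB code as a string rather than the path to an .m file, and the directory that contains the function goes into paths, so that the generated pyscript.m can resolve the name. The directory and function name below are the ones quoted above; everything else is illustrative rather than tested:

    from nipype.interfaces.matlab import MatlabCommand

    mlab = MatlabCommand()
    mlab.inputs.paths = ['/aramis/home/wen/HAO_lab']  # folder that holds testhelloworld.m
    mlab.inputs.script = 'testhelloworld;'            # MATLAB code to run, not a file path
    mlab.inputs.mfile = True                          # write the code out as pyscript.m
    result = mlab.run()                               # pyscript.m lands in the working directory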
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From hao.freesurfer at hotmail.com  Wed Jun 29 13:37:57 2016
From: hao.freesurfer at hotmail.com (Hao wen)
Date: Wed, 29 Jun 2016 17:37:57 +0000
Subject: [Neuroimaging] wrap Matlab script using MatlabCommand in Nipype
Message-ID: 

Hi everyone,

I wrapped a MATLAB script. In the MATLAB function I did not define the output, but I save some output images to a specific folder, and I use MatlabCommand in Nipype. Here is my Nipype code:

from nipype.interfaces.utility import Function
import nipype.pipeline.engine as pe

def runmatlab(ContrastLinearModel, Format, CSVFilenames, PATH_TO_RECON_ALL_OUTPUTS, a_required_path):
    from nipype.interfaces.matlab import MatlabCommand
    matlab = MatlabCommand()
    matlab.inputs.paths = [a_required_path]  # this is the path to add into matlab, addpath
    matlab.inputs.script = """NipypeSurfStat('%s', '%s', '%s', '%s')""" % (ContrastLinearModel, Format, CSVFilenames, PATH_TO_RECON_ALL_OUTPUTS)
    matlab.inputs.mfile = True  # this will create a file: pyscript.m, which is the default name
    out = matlab.run()
    return out

a_required_path = '/aramis/dataARAMIS/users/junhao.wen/Data_AD_CN/Group_analysis/SurfStat/Junhao_script/Code/NipypeSurfStat'
# output_dir = '/aramis/dataARAMIS/users/junhao.wen/Data_AD_CN/Group_analysis/SurfStat/Junhao_script/Figures'
PATH_TO_RECON_ALL_OUTPUTS = '/aramis/dataARAMIS/users/junhao.wen/Data_AD_CN/ADNI_60_AD_CN_object_recon_all_output'
CSVFilenames = '/aramis/dataARAMIS/users/junhao.wen/Data_AD_CN/Group_analysis/SurfStat/Junhao_script/Database/GroupAnalysisDataAramis/60_AD_CN_GROUPT_ANALYSIS.csv'
Format = '%s %s %s %f'
ContrastLinearModel = '1 + Label'

SurfStat = pe.Node(name='SurfStat',
                   interface=Function(input_names=['ContrastLinearModel', 'Format', 'CSVFilenames', 'PATH_TO_RECON_ALL_OUTPUTS', 'a_required_path'],
                                      output_names=['out_file'],
                                      function=runmatlab))
SurfStat.inputs.ContrastLinearModel = ContrastLinearModel
SurfStat.inputs.Format = Format
SurfStat.inputs.CSVFilenames = CSVFilenames
SurfStat.inputs.PATH_TO_RECON_ALL_OUTPUTS = PATH_TO_RECON_ALL_OUTPUTS
SurfStat.inputs.a_required_path = a_required_path
SurfStat.run()

But I got an error, and when I checked my MATLAB script, it is caused by this line:

save2jpeg(strcat('Figures/', fileName, '/ContrastPositive-TValue.jpg'));

The error from nipype is:

Standard error:
Segmentation fault (core dumped)
Return code: 139
Interface MatlabCommand failed to run.
Interface Function failed to run.

I googled, and the explanations I found say:
1. Very basically, it means your program is trying to access a memory area it is not supposed to.
2. To the current directory.
3. You need to make sure your program doesn't do 1.

I have been stuck on this for a long time; any advice will be appreciated :)

Good day! Bonne journée!

Hao
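One thing that may be worth checking, given that 'Figures/...' is a relative path: the MATLAB process started by MatlabCommand appears to run in the node's working directory (the addpath('/tmp/tmp.../runmatlab') in the earlier log points there), so a rough sketch of one workaround is to create an absolute output directory from Python and pass it into the MATLAB function, so that save2jpeg never depends on where the node happens to run. This assumes NipypeSurfStat is changed to take an extra output_dir argument and to build its save2jpeg paths from it - that change is hypothetical, not something shown above:

    import os
    from nipype.interfaces.matlab import MatlabCommand

    def run_surfstat(contrast, fmt, csv_file, recon_all_dir, script_dir, output_dir):
        # Imports repeated inside so the function could also be used in a
        # nipype Function node, which serializes the function source.
        import os
        from nipype.interfaces.matlab import MatlabCommand

        if not os.path.isdir(output_dir):
            os.makedirs(output_dir)              # make sure the target folder exists

        matlab = MatlabCommand()
        matlab.inputs.paths = [script_dir]       # folder containing NipypeSurfStat.m
        matlab.inputs.script = (
            "NipypeSurfStat('%s', '%s', '%s', '%s', '%s')"
            % (contrast, fmt, csv_file, recon_all_dir, output_dir)
        )
        matlab.inputs.mfile = True
        return matlab.run()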
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From alexandre.gramfort at telecom-paristech.fr  Thu Jun 30 12:50:32 2016
From: alexandre.gramfort at telecom-paristech.fr (Alexandre Gramfort)
Date: Thu, 30 Jun 2016 18:50:32 +0200
Subject: [Neuroimaging] [JOB] Post Doc Machine Learning for neuroscience time series
Message-ID: 

*Post-Doc/Research position in:*
*Supervised learning on neuroscience time-series with application to epilepsy*

Place: TELECOM ParisTech, 75013 Paris, France
Duration: 1 year (extension possible)
Start: Any date from September 1st, 2016
Salary: according to background and experience

*Keywords:* machine learning, time-series, optimization, conditional random fields (CRF), representation learning, electroencephalography (EEG)

*Position description*

The objective of the project is to develop machine learning tools that can facilitate the review, the visualization, the processing and the annotation of clinical intracranial EEG data in the context of epilepsy. Epilepsy is a pathology that leads to prototypical patterns in the recorded time series (spikes, high-frequency oscillations, seizures). The aim is to build algorithms that automatically pinpoint such events in raw intracranial EEG data. The approach envisioned is based on state-of-the-art machine learning techniques (representation learning, conditional random fields).

The position is funded by a joint grant between Alexandre Gramfort and Slim Essid at the Signal and Image Processing department at Telecom ParisTech, the companies Dataiku and Bioserenity, as well as the ICM Institute at the Salpêtrière hospital.

*Work Environment*

TELECOM ParisTech is the leading graduate school of Institut TELECOM, with more than 160 research professors and over 250 Engineering degrees, 50 PhDs and 150 specialized masters (post graduates) awarded per year. The signal and image processing (TSI) department conducts leading research in the fields of statistics, machine learning and signal processing, with regular publications in leading conferences (NIPS, ICML, ICASSP, etc.) and journals (JMLR, IEEE Trans. Med. Imaging, IEEE Trans. Signal Processing, etc.). The candidate will be integrated into a team formed by 6 PIs, more than 10 PhD students, 4 engineers and 3 post-docs, among which 6 persons are dedicated to the statistical analysis of electrophysiological signals. The local expertise is unique, with significant experience both in signal processing, machine learning and statistics, and in applied neuroscience data munging.

*Candidate Profile*

As minimum requirements, the candidate will have:
- a PhD in computer science, statistics / machine learning, or signal processing
- strong programming skills (experience with Python is a definite plus)
- strong communication skills in English.

The ideal candidate would also have:
- Prior experience with EEG data analysis and/or electrophysiology signals
- Ability to work in a multi-partner collaborative environment.
- Basic knowledge of French (not required).

*Contacts*

Interested applicants can contact Alexandre Gramfort or Slim Essid for more information, or directly email a candidacy letter including a Curriculum Vitae, a list of publications and a statement of research interests.

- Alexandre Gramfort (alexandre.gramfort at telecom-paristech.fr)
- Slim Essid (slim.essid at telecom-paristech.fr)

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 