From adam at rybinski.info  Mon Jan  4 07:07:08 2016
From: adam at rybinski.info (Adam Rybinski)
Date: Mon, 4 Jan 2016 12:07:08 +0000
Subject: [Neuroimaging] [Dipy] global tractography
Message-ID: 

Hi,

Recently I got interested in the concepts of global tractography. Obviously there are many different algorithms, but I have one particular article in mind (Galinsky, Frank 2015):
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4417445/

I wonder what would be needed to implement something like this in Dipy. Where would you start, and are there any building blocks already in Dipy that would be suitable for it (maybe the reconstruction parts of GQI)?

Cheers,
Adam
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From stjeansam at gmail.com  Mon Jan  4 09:43:03 2016
From: stjeansam at gmail.com (Samuel St-Jean)
Date: Mon, 4 Jan 2016 15:43:03 +0100
Subject: [Neuroimaging] Getting scaling and offset from nifti header in nibabel
In-Reply-To: 
References: 
Message-ID: 

That is a good question which I am also trying to find an answer to. Actually, what are the slope and inter fields specifically referring to?

I know that for some reason the scanner likes to recast everything between 0 and 4096 (probably for storage reasons), so you cannot mix and match acquisitions from the same session but acquired in different blocks because of that. Because of some scaling (which I think might be slope and inter), it also means that your magnitude/phase data has weird values (which make no sense for the phase data), but correcting for this scaling from the dicom headers fixes it.

This field (which has a private tag for Philips, but I can try to dig it out from somewhere with pydicom) lets you recover sane values, so I was wondering if it is related to the slope and inter fields from the nifti file, in which case there would be no need to dig it out from the dicom. The whole point is that if this nifti field is the same as the one in the dicom, I could dump my custom, half-baked dicom converter for problematic files and just scale the converted nifti myself (provided a converter works; magnitude/phase data are so seldom used that I have no tool working on those, so suggestions are welcome).

Didn't know about the dataobj thing though, thanks for pointing it out.

2015-12-17 18:57 GMT+01:00 Matthew Brett :

> Hi,
>
> On Thu, Dec 17, 2015 at 4:42 PM, Samuel St-Jean wrote:
> > Hello,
> >
> > I'd like to get the scaling and offset values to compare data from two
> > volumes acquired in the same scan. Since the floating point values are
> > wildly different, we need to put them on the same intensity scale. This
> > can be done from the dicom files directly, but it is a painful and
> > error prone process and difficult to automate.
> >
> > Turns out mrconvert from mrtrix does not take scaling into account when
> > your data is in floating point (by design, so even the options to
> > correct this scaling are disabled), but it still stores the correct
> > values. When opening the converted nifti files in nibabel the headers
> > just show nan values, while mrtrix can parse them correctly. Here is
> > the header from mrinfo as an example.
> >
> > ************************************************
> > Image:               "noise.nii.gz"
> > ************************************************
> >   Format:            NIfTI-1.1
> >   Dimensions:        128 x 128 x 78
> >   Voxel size:        1.79665 x 1.79665 x 1.8
> >   Dimension labels:  0. left->right (mm)
> >                      1. posterior->anterior (mm)
> >                      2.
inferior->superior (mm) > > Data type: signed 16 bit integer (little endian) > > Data layout: [ -0 -1 +2 ] > > Data scaling: offset = 0, multiplier = 166.825 > > Comments: SAMUEL 9JUIL (SAMU > > MRtrix version: 0.3.12-1050-g62ace960 > > Transform: 0.9993 -0.00287 -0.03691 -115.6 > > -5.649e-11 0.997 -0.07752 -108.4 > > 0.03702 0.07746 0.9963 -60.36 > > 0 0 0 1 > > > > Same header through nibabel > > > > object, endian='<' > > sizeof_hdr : 348 > > data_type : > > db_name : SAMUEL 9JUIL (SAMU > > extents : 16384 > > session_error : 0 > > regular : r > > dim_info : 0 > > dim : [ 3 128 128 78 1 1 1 1] > > intent_p1 : 0.0 > > intent_p2 : 0.0 > > intent_p3 : 0.0 > > intent_code : none > > datatype : int16 > > bitpix : 16 > > slice_start : 0 > > pixdim : [ 1. 1.79665172 1.79665172 1.79999924 0. > > 0. 0. > > 0. ] > > vox_offset : 0.0 > > scl_slope : nan > > scl_inter : nan > > slice_end : 0 > > slice_code : unknown > > xyzt_units : 10 > > cal_max : 0.0 > > cal_min : 0.0 > > slice_duration : 0.0 > > toffset : 0.0 > > glmax : 0 > > glmin : 0 > > descrip : MRtrix version: 0.3.12-1050-g62ace960 > > aux_file : > > qform_code : scanner > > sform_code : scanner > > quatern_b : -0.0185005571693 > > quatern_c : -0.038780156523 > > quatern_d : 0.999076247215 > > qoffset_x : 111.719642639 > > qoffset_y : 119.12978363 > > qoffset_z : -34.2374076843 > > srow_x : [ -1.79541993e+00 5.15606394e-03 -6.64402023e-02 > > 1.11719643e+02] > > srow_y : [ 1.01492648e-10 -1.79124594e+00 -1.39527366e-01 > > 1.19129784e+02] > > srow_z : [ -0.06651677 -0.13917242 1.79335308 -34.23740768] > > intent_name : > > magic : n+1 > > > > > > Those would probably be scl_slope and scl_inter, but the values just got > > lost somewhere apparently. The same behavior is seen from the example in > > nifti header here : > > http://nipy.org/nibabel/nifti_images.html#the-nifti-header > > > > So the questions > > > > 1. Did something went wrong, or do both program just use different > field? If > > so, does nibabel show all fields in the header or only known ones? > > 2. Is the field scl_slope/scl_inter even related to the dicom scaling > (and > > offset/multiplier from mrtrix, though this one goes in their mailing > list if > > nobody knows)? Philips call those SS and SI values, and you simply need > to > > fix the offset + scaling with these numbers to get sane intensities out > of > > your scanner, but they are a huge pain to get out. > > No - nothing wrong. The nibabel reader resets the scale and slope to > NaN by default when it reads in the image. > > If you want to read the original scale and slope, you can get them > from the image array proxy object, as in: > > img.dataobj.slope > img.dataobj.inter > > Is that what you needed? > > Matthew > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > -------------- next part -------------- An HTML attachment was scrubbed... URL: From matthew.brett at gmail.com Mon Jan 4 09:47:15 2016 From: matthew.brett at gmail.com (Matthew Brett) Date: Mon, 4 Jan 2016 14:47:15 +0000 Subject: [Neuroimaging] Getting scaling and offset from nifti header in nibabel In-Reply-To: References: Message-ID: Hi, On Mon, Jan 4, 2016 at 2:43 PM, Samuel St-Jean wrote: > That is a good question which I am also trying to find an answer to. > Actually, what are the slope and inter field specifically referring to? 
> > I know that for some reason the scanner likes to recast everything between 0 > and 4096 (probably for storage reason), so you cannot mix and match > acquisitions from the same session but acquired in different block because > of that. > Because of some scaling (which I think might be slope and inter), it also > means that your magnitude/phase data has weird values (which make no sense > for the phase data), but correcting for this scaling from the dicom headers > fix it. > > This field (which has a private tag for philips, but I can try to dig it out > from somewhere with pydicom) let's you indeed recover sane values, so I was > wondering if it is related to the slope and inter field from the nifti file, > thus no need to dig it out from the dicom. The whole point if this nifti > field is the same as the one in the dicom, I could dump my custom, > half-baked dicom converter for problematic files and just scale the > converted nifti myself (provided a converter works, magnitude/phase data are > so seldom used that I have no tool working on those, if anyone has > suggestions.) > > Didn't know about the dataobj thing though, thanks for pointing it out. The slope and inter fields are what was in the nifti header when you loaded the image. How those values came about will depend on how the niftis got generated by your scanner or conversion, it's difficult to speculate further. You could have a look at the DICOM values and see how they compare, but that might not give you a reliable answer, Cheers, Matthew From stjeansam at gmail.com Mon Jan 4 09:49:49 2016 From: stjeansam at gmail.com (Samuel St-Jean) Date: Mon, 4 Jan 2016 15:49:49 +0100 Subject: [Neuroimaging] [Dipy] global tractography In-Reply-To: References: Message-ID: You could try browsing through https://github.com/nipy/dipy/tree/master/dipy/tracking , all of the local tracking stuff is in there. There are also the examples ( http://nipy.org/dipy/examples_index.html#fiber-tracking-new-experimental) which could give you an idea behind the current stuff implemented and how it works. Best of all, tracking is not tied to a local model (such as GQI), but rather works with the ODF or the peaks depending on the tracking algorithm used. Since they share some stuff in common, looking at the implementation from the first link should give you an idea of how they work. 2016-01-04 13:07 GMT+01:00 Adam Rybinski : > Hi, > recently I got interested in the concepts of global > tractography---obviously there are many different algorithms--- but I have > one particular article in my mind (Galinsky, Frank 2015) > http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4417445/ > I wonder, what would be needed to implement something like this in > Dipy---where to start in your opinion?---are there any building blocks that > would be suitable for it already in Dipy?(maybe reconstruction parts of > GQI). > > Cheers, > Adam > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From stjeansam at gmail.com Mon Jan 4 09:59:17 2016 From: stjeansam at gmail.com (Samuel St-Jean) Date: Mon, 4 Jan 2016 15:59:17 +0100 Subject: [Neuroimaging] Getting scaling and offset from nifti header in nibabel In-Reply-To: References: Message-ID: So I guess I'll experiment with converters that work and see if they give the same field as the dicom themselves, thanks. 
Makes sense after all that those fields are written by the converter, which might not be reliable. What are they supposed to contain / what is their usual purpose? Are they values you should apply to the data for whatever reason, or is that not defined in the nifti standard?

2016-01-04 15:47 GMT+01:00 Matthew Brett :

> Hi,
>
> On Mon, Jan 4, 2016 at 2:43 PM, Samuel St-Jean wrote:
> > That is a good question which I am also trying to find an answer to.
> > Actually, what are the slope and inter fields specifically referring to?
> >
> > I know that for some reason the scanner likes to recast everything
> > between 0 and 4096 (probably for storage reasons), so you cannot mix
> > and match acquisitions from the same session but acquired in different
> > blocks because of that.
> > Because of some scaling (which I think might be slope and inter), it
> > also means that your magnitude/phase data has weird values (which make
> > no sense for the phase data), but correcting for this scaling from the
> > dicom headers fixes it.
> >
> > This field (which has a private tag for Philips, but I can try to dig
> > it out from somewhere with pydicom) lets you recover sane values, so I
> > was wondering if it is related to the slope and inter fields from the
> > nifti file, in which case there would be no need to dig it out from
> > the dicom. The whole point is that if this nifti field is the same as
> > the one in the dicom, I could dump my custom, half-baked dicom
> > converter for problematic files and just scale the converted nifti
> > myself (provided a converter works; magnitude/phase data are so seldom
> > used that I have no tool working on those, so suggestions are welcome).
> >
> > Didn't know about the dataobj thing though, thanks for pointing it out.
>
> The slope and inter fields are what was in the nifti header when you
> loaded the image.
>
> How those values came about will depend on how the niftis got
> generated by your scanner or conversion; it's difficult to speculate
> further. You could have a look at the DICOM values and see how they
> compare, but that might not give you a reliable answer,
>
> Cheers,
>
> Matthew
> _______________________________________________
> Neuroimaging mailing list
> Neuroimaging at python.org
> https://mail.python.org/mailman/listinfo/neuroimaging
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From matthew.brett at gmail.com  Mon Jan  4 10:05:29 2016
From: matthew.brett at gmail.com (Matthew Brett)
Date: Mon, 4 Jan 2016 15:05:29 +0000
Subject: [Neuroimaging] Getting scaling and offset from nifti header in nibabel
In-Reply-To: 
References: 
Message-ID: 

On Mon, Jan 4, 2016 at 2:59 PM, Samuel St-Jean wrote:
> So I guess I'll experiment with converters that work and see if they give
> the same fields as the dicoms themselves, thanks. Makes sense after all
> that those fields are written by the converter, which might not be
> reliable. What are they supposed to contain / what is their usual purpose?
> Are they values you should apply to the data for whatever reason, or is
> that not defined in the nifti standard?

The nifti standard just says that, if you want to add a scale and
intercept to the data, you should use these fields.

If the author of the nifti has set these fields, then they probably
intend for you to use them, and nibabel does use them, for
`np.array(img.dataobj)` and `img.get_data()`.

You can ask nibabel to give you the data without the scaling, but
that is more involved.
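For example, something like this (an untested sketch, assuming the "noise.nii.gz" file from your earlier message is on disk) shows the difference between the header fields and the scaling the array proxy applies:

    import numpy as np
    import nibabel as nib

    img = nib.load('noise.nii.gz')

    # The header object nibabel hands back has scl_slope / scl_inter reset
    # to nan, as discussed above
    print(img.header['scl_slope'], img.header['scl_inter'])

    # The scaling that will actually be applied is kept on the array proxy
    slope, inter = img.dataobj.slope, img.dataobj.inter

    # Scaled data, i.e. slope * stored_values + inter, returned as floats
    scaled = img.get_data()

    # Invert the scaling by hand to get back the stored values
    # (int16 for this particular file)
    raw = np.round((scaled - inter) / slope).astype(np.int16)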
Cheers, Matthew From garyfallidis at gmail.com Tue Jan 5 05:05:32 2016 From: garyfallidis at gmail.com (Eleftherios Garyfallidis) Date: Tue, 5 Jan 2016 12:05:32 +0200 Subject: [Neuroimaging] [Dipy] global tractography In-Reply-To: References: Message-ID: Hi Adam, Thank you for your interest in implementing global tracking in Dipy. This is exciting. I am sure there are many functions in Dipy to help you implement this algorithm but at the same time this framework presented in the paper is quite novel. That means that there is some work to be done. It would help if you can write down the main algorithm with the blocks that need implementing. In that way it will be easier to give you feedback on what is already available. It will also be nice to see if we can use the same framework with input from other reconstruction models. But this can wait. What is crucial is to make sure that this algorithm is validated and compared with existing tracking algorithms found in Dipy so that we know were we are standing. Also it will be nice to use this algorithm with same stopping criteria as used for local tracking (if possible). This tutorial is a must http://nipy.org/dipy/examples_built/tracking_tissue_classifier.html#example-tracking-tissue-classifier for understanding our current design on stopping tracking. For implementing a global tracking algorithm you need to create a folder under dipy/tracking/ called global (in contrast to local) and then put there the modules that are relevant to the global tracking. For example you can create a file called esp.py (for entropy spectrum pathways). The GQI implementation is available here https://github.com/nipy/dipy/blob/master/dipy/reconst/gqi.py https://github.com/nipy/dipy/blob/master/dipy/reconst/tests/test_gqi.py If you need more specific help ask directly in gitter https://gitter.im/nipy/dipy As this may be your first important contribution we should also arrange a hangout to discuss more about the specific problem and the coding practices used in Dipy. We can arrange date and time in gitter. Best regards, Eleftherios On Mon, Jan 4, 2016 at 4:49 PM, Samuel St-Jean wrote: > You could try browsing through > https://github.com/nipy/dipy/tree/master/dipy/tracking , all of the local > tracking stuff is in there. There are also the examples ( > http://nipy.org/dipy/examples_index.html#fiber-tracking-new-experimental) > which could give you an idea behind the current stuff implemented and how > it works. Best of all, tracking is not tied to a local model (such as GQI), > but rather works with the ODF or the peaks depending on the tracking > algorithm used. Since they share some stuff in common, looking at the > implementation from the first link should give you an idea of how they work. > > 2016-01-04 13:07 GMT+01:00 Adam Rybinski : > >> Hi, >> recently I got interested in the concepts of global >> tractography---obviously there are many different algorithms--- but I have >> one particular article in my mind (Galinsky, Frank 2015) >> http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4417445/ >> I wonder, what would be needed to implement something like this in >> Dipy---where to start in your opinion?---are there any building blocks that >> would be suitable for it already in Dipy?(maybe reconstruction parts of >> GQI). 
>> >> Cheers, >> Adam >> >> _______________________________________________ >> Neuroimaging mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging >> >> > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From arokem at gmail.com Tue Jan 5 14:56:23 2016 From: arokem at gmail.com (Ariel Rokem) Date: Tue, 5 Jan 2016 11:56:23 -0800 Subject: [Neuroimaging] Transatlantic Health Data Science Workshop Call for Papers - Due 1/20/16 Message-ID: Hi everyone. This workshop might be of interest to members of this list: https://sites.google.com/site/usukhealthdata/home Cheers, Ariel -------------- next part -------------- An HTML attachment was scrubbed... URL: From stjeansam at gmail.com Thu Jan 14 05:28:49 2016 From: stjeansam at gmail.com (Samuel St-Jean) Date: Thu, 14 Jan 2016 11:28:49 +0100 Subject: [Neuroimaging] [nibabel] Loading data directly instead of using a memmap Message-ID: Hello, While processing some hcp data, we decided to use directly nifti files instead of using gzipped file as they use quite a lot of ram (there are some PRs fixing this under the work in nibabel apparently). So when you load a regular nifti file, it gets a memmap instead of a proper numpy array, which does not support the same feature and sometimes ends up producing really weird bugs down the line ( https://github.com/numpy/numpy/issues/6750). So, we just ended up casting the memmap to a regular numpy array with something like data = np.array(data) While this works, is it memory usage friendly (hcp data is ~4go after all) or does it keep a reference in the background? Is there a better way to achieve similar results, like for example forcing nibabel to load a numpy array directly instead of memmap? If anyone has suggestions for better usability, feel free to share. Samuel -------------- next part -------------- An HTML attachment was scrubbed... URL: From njs at pobox.com Thu Jan 14 06:14:22 2016 From: njs at pobox.com (Nathaniel Smith) Date: Thu, 14 Jan 2016 03:14:22 -0800 Subject: [Neuroimaging] [nibabel] Loading data directly instead of using a memmap In-Reply-To: References: Message-ID: On Thu, Jan 14, 2016 at 2:28 AM, Samuel St-Jean wrote: > Hello, > > While processing some hcp data, we decided to use directly nifti files > instead of using gzipped file as they use quite a lot of ram (there are some > PRs fixing this under the work in nibabel apparently). So when you load a > regular nifti file, it gets a memmap instead of a proper numpy array, which > does not support the same feature and sometimes ends up producing really > weird bugs down the line (https://github.com/numpy/numpy/issues/6750). > > So, we just ended up casting the memmap to a regular numpy array with > something like > > data = np.array(data) > > While this works, is it memory usage friendly (hcp data is ~4go after all) > or does it keep a reference in the background? Is there a better way to > achieve similar results, like for example forcing nibabel to load a numpy > array directly instead of memmap? It costs a few hundred bytes of memory, and otherwise will act identically except that you lose access to the special mmap methods. I wouldn't worry about it :-). -n -- Nathaniel J. 
Smith -- http://vorpus.org From stjeansam at gmail.com Thu Jan 14 06:24:40 2016 From: stjeansam at gmail.com (Samuel St-Jean) Date: Thu, 14 Jan 2016 12:24:40 +0100 Subject: [Neuroimaging] [nibabel] Loading data directly instead of using a memmap In-Reply-To: References: Message-ID: Oh, a simple fix after all, thanks! 2016-01-14 12:14 GMT+01:00 Nathaniel Smith : > On Thu, Jan 14, 2016 at 2:28 AM, Samuel St-Jean > wrote: > > Hello, > > > > While processing some hcp data, we decided to use directly nifti files > > instead of using gzipped file as they use quite a lot of ram (there are > some > > PRs fixing this under the work in nibabel apparently). So when you load a > > regular nifti file, it gets a memmap instead of a proper numpy array, > which > > does not support the same feature and sometimes ends up producing really > > weird bugs down the line (https://github.com/numpy/numpy/issues/6750). > > > > So, we just ended up casting the memmap to a regular numpy array with > > something like > > > > data = np.array(data) > > > > While this works, is it memory usage friendly (hcp data is ~4go after > all) > > or does it keep a reference in the background? Is there a better way to > > achieve similar results, like for example forcing nibabel to load a numpy > > array directly instead of memmap? > > It costs a few hundred bytes of memory, and otherwise will act > identically except that you lose access to the special mmap methods. I > wouldn't worry about it :-). > > -n > > -- > Nathaniel J. Smith -- http://vorpus.org > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > -------------- next part -------------- An HTML attachment was scrubbed... URL: From satra at mit.edu Fri Jan 15 17:15:20 2016 From: satra at mit.edu (Satrajit Ghosh) Date: Fri, 15 Jan 2016 17:15:20 -0500 Subject: [Neuroimaging] NIBIB@NIH funds Nipype development Message-ID: hi all, we wanted to announce some fantastic news regarding nipype. we have received funding from nibib to support nipype development for the next four years starting today. thanks to this grant, we will be able to harden, extend, and disseminate nipype and other nipy community projects. more importantly, chris and i and a few others will be able to spend a chunk of time on nipype. the key goals of the project are to improve usability and interactivity, introspection of computation, and interoperability with other projects, databases, and web-services (neurovault, neurosynth, and others). the focus will be on reproducible dataflows in biomedicine. the resources will also support additional personnel, an annual workshop, and cloud resources for some web services. as soon as we take care of a few administrative pieces, we will focus on nipype engineering. i do want to thank the extended nipy community for its extensive support towards nipype development, maintenance, and adoption over the last 5 years. we couldn't have gotten to this point without you. we are thrilled with the opportunity, very excited about the possibilities, and look forward to continued engagement with the community. quoting eleftherios: go go team! cheers, satra -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From pieper at isomics.com Fri Jan 15 18:11:28 2016 From: pieper at isomics.com (Steve Pieper) Date: Fri, 15 Jan 2016 18:11:28 -0500 Subject: [Neuroimaging] NIBIB@NIH funds Nipype development In-Reply-To: References: Message-ID: Congratulations Satra, Chris, and all! On Fri, Jan 15, 2016 at 5:15 PM, Satrajit Ghosh wrote: > hi all, > > we wanted to announce some fantastic news regarding nipype. we have > received funding from nibib to support nipype development for the next four > years starting today. thanks to this grant, we will be able to harden, > extend, and disseminate nipype and other nipy community projects. more > importantly, chris and i and a few others will be able to spend a chunk of > time on nipype. > > the key goals of the project are to improve usability and interactivity, > introspection of computation, and interoperability with other projects, > databases, and web-services (neurovault, neurosynth, and others). the focus > will be on reproducible dataflows in biomedicine. the resources will also > support additional personnel, an annual workshop, and cloud resources for > some web services. as soon as we take care of a few administrative pieces, > we will focus on nipype engineering. > > i do want to thank the extended nipy community for its extensive support > towards nipype development, maintenance, and adoption over the last 5 > years. we couldn't have gotten to this point without you. we are thrilled > with the opportunity, very excited about the possibilities, and look > forward to continued engagement with the community. > > quoting eleftherios: go go team! > > cheers, > > satra > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From krzysztof.gorgolewski at gmail.com Fri Jan 15 17:47:10 2016 From: krzysztof.gorgolewski at gmail.com (Chris Filo Gorgolewski) Date: Fri, 15 Jan 2016 14:47:10 -0800 Subject: [Neuroimaging] NIBIB@NIH funds Nipype development In-Reply-To: References: Message-ID: Go, go team! Well deserved! On Fri, Jan 15, 2016 at 2:15 PM, Satrajit Ghosh wrote: > hi all, > > we wanted to announce some fantastic news regarding nipype. we have > received funding from nibib to support nipype development for the next four > years starting today. thanks to this grant, we will be able to harden, > extend, and disseminate nipype and other nipy community projects. more > importantly, chris and i and a few others will be able to spend a chunk of > time on nipype. > > the key goals of the project are to improve usability and interactivity, > introspection of computation, and interoperability with other projects, > databases, and web-services (neurovault, neurosynth, and others). the focus > will be on reproducible dataflows in biomedicine. the resources will also > support additional personnel, an annual workshop, and cloud resources for > some web services. as soon as we take care of a few administrative pieces, > we will focus on nipype engineering. > > i do want to thank the extended nipy community for its extensive support > towards nipype development, maintenance, and adoption over the last 5 > years. we couldn't have gotten to this point without you. we are thrilled > with the opportunity, very excited about the possibilities, and look > forward to continued engagement with the community. > > quoting eleftherios: go go team! 
> > cheers, > > satra > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From garyfallidis at gmail.com Fri Jan 15 18:58:49 2016 From: garyfallidis at gmail.com (Eleftherios Garyfallidis) Date: Sat, 16 Jan 2016 01:58:49 +0200 Subject: [Neuroimaging] NIBIB@NIH funds Nipype development In-Reply-To: References: Message-ID: Many congratulations! GGT!!! :D On Sat, Jan 16, 2016 at 12:47 AM, Chris Filo Gorgolewski < krzysztof.gorgolewski at gmail.com> wrote: > Go, go team! Well deserved! > > On Fri, Jan 15, 2016 at 2:15 PM, Satrajit Ghosh wrote: > >> hi all, >> >> we wanted to announce some fantastic news regarding nipype. we have >> received funding from nibib to support nipype development for the next four >> years starting today. thanks to this grant, we will be able to harden, >> extend, and disseminate nipype and other nipy community projects. more >> importantly, chris and i and a few others will be able to spend a chunk of >> time on nipype. >> >> the key goals of the project are to improve usability and interactivity, >> introspection of computation, and interoperability with other projects, >> databases, and web-services (neurovault, neurosynth, and others). the focus >> will be on reproducible dataflows in biomedicine. the resources will also >> support additional personnel, an annual workshop, and cloud resources for >> some web services. as soon as we take care of a few administrative pieces, >> we will focus on nipype engineering. >> >> i do want to thank the extended nipy community for its extensive support >> towards nipype development, maintenance, and adoption over the last 5 >> years. we couldn't have gotten to this point without you. we are thrilled >> with the opportunity, very excited about the possibilities, and look >> forward to continued engagement with the community. >> >> quoting eleftherios: go go team! >> >> cheers, >> >> satra >> >> _______________________________________________ >> Neuroimaging mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging >> >> > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From vsochat at stanford.edu Fri Jan 15 19:10:15 2016 From: vsochat at stanford.edu (vanessa sochat) Date: Fri, 15 Jan 2016 16:10:15 -0800 Subject: [Neuroimaging] NIBIB@NIH funds Nipype development In-Reply-To: References: Message-ID: That's pretty awesome, great job guys! :O) On Fri, Jan 15, 2016 at 3:58 PM, Eleftherios Garyfallidis < garyfallidis at gmail.com> wrote: > Many congratulations! GGT!!! :D > > On Sat, Jan 16, 2016 at 12:47 AM, Chris Filo Gorgolewski < > krzysztof.gorgolewski at gmail.com> wrote: > >> Go, go team! Well deserved! >> >> On Fri, Jan 15, 2016 at 2:15 PM, Satrajit Ghosh wrote: >> >>> hi all, >>> >>> we wanted to announce some fantastic news regarding nipype. we have >>> received funding from nibib to support nipype development for the next four >>> years starting today. thanks to this grant, we will be able to harden, >>> extend, and disseminate nipype and other nipy community projects. more >>> importantly, chris and i and a few others will be able to spend a chunk of >>> time on nipype. 
>>> >>> the key goals of the project are to improve usability and interactivity, >>> introspection of computation, and interoperability with other projects, >>> databases, and web-services (neurovault, neurosynth, and others). the focus >>> will be on reproducible dataflows in biomedicine. the resources will also >>> support additional personnel, an annual workshop, and cloud resources for >>> some web services. as soon as we take care of a few administrative pieces, >>> we will focus on nipype engineering. >>> >>> i do want to thank the extended nipy community for its extensive support >>> towards nipype development, maintenance, and adoption over the last 5 >>> years. we couldn't have gotten to this point without you. we are thrilled >>> with the opportunity, very excited about the possibilities, and look >>> forward to continued engagement with the community. >>> >>> quoting eleftherios: go go team! >>> >>> cheers, >>> >>> satra >>> >>> _______________________________________________ >>> Neuroimaging mailing list >>> Neuroimaging at python.org >>> https://mail.python.org/mailman/listinfo/neuroimaging >>> >>> >> >> _______________________________________________ >> Neuroimaging mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging >> >> > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -- Vanessa Villamia Sochat Stanford University (603) 321-0676 -------------- next part -------------- An HTML attachment was scrubbed... URL: From pellman.john at gmail.com Fri Jan 15 18:50:25 2016 From: pellman.john at gmail.com (John Pellman) Date: Fri, 15 Jan 2016 18:50:25 -0500 Subject: [Neuroimaging] NIBIB@NIH funds Nipype development In-Reply-To: References: Message-ID: Congrats y'all! On Jan 15, 2016 6:11 PM, "Steve Pieper" wrote: > Congratulations Satra, Chris, and all! > > On Fri, Jan 15, 2016 at 5:15 PM, Satrajit Ghosh wrote: > >> hi all, >> >> we wanted to announce some fantastic news regarding nipype. we have >> received funding from nibib to support nipype development for the next four >> years starting today. thanks to this grant, we will be able to harden, >> extend, and disseminate nipype and other nipy community projects. more >> importantly, chris and i and a few others will be able to spend a chunk of >> time on nipype. >> >> the key goals of the project are to improve usability and interactivity, >> introspection of computation, and interoperability with other projects, >> databases, and web-services (neurovault, neurosynth, and others). the focus >> will be on reproducible dataflows in biomedicine. the resources will also >> support additional personnel, an annual workshop, and cloud resources for >> some web services. as soon as we take care of a few administrative pieces, >> we will focus on nipype engineering. >> >> i do want to thank the extended nipy community for its extensive support >> towards nipype development, maintenance, and adoption over the last 5 >> years. we couldn't have gotten to this point without you. we are thrilled >> with the opportunity, very excited about the possibilities, and look >> forward to continued engagement with the community. >> >> quoting eleftherios: go go team! 
>> >> cheers, >> >> satra >> >> _______________________________________________ >> Neuroimaging mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging >> >> > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From lists at onerussian.com Fri Jan 15 23:00:29 2016 From: lists at onerussian.com (Yaroslav Halchenko) Date: Fri, 15 Jan 2016 23:00:29 -0500 Subject: [Neuroimaging] NIBIB@NIH funds Nipype development In-Reply-To: References: Message-ID: <20160116040029.GL22242@onerussian.com> Congrats! what a better way could be to celebrate than by cutting a fresh release? $> git describe --tags 0.11.0-425-gd410fbb ;-) Live long and prosper Nipype! On Fri, 15 Jan 2016, Satrajit Ghosh wrote: > hi all, > we wanted to announce some fantastic news regarding nipype. we have > received funding from nibib to support nipype development for the next > four years starting today. thanks to this grant, we will be able to > harden, extend, and disseminate nipype and other nipy community projects. > more importantly, chris and i and a few others will be able to spend a > chunk of time on nipype. > the key goals of the project are to improve usability and interactivity, > introspection of computation, and interoperability with other projects, > databases, and web-services (neurovault, neurosynth, and others). the > focus will be on reproducible dataflows in biomedicine. the resources will > also support additional personnel, an annual workshop, and cloud resources > for some web services. as soon as we take care of a few administrative > pieces, we will focus on nipype engineering. > i do want to thank the extended nipy community for its extensive support > towards nipype development, maintenance, and adoption over the last 5 > years. we couldn't have gotten to this point without you. we are thrilled > with the opportunity, very excited about the possibilities, and look > forward to continued engagement with the community. > quoting eleftherios: go go team! > cheers, > satra > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging -- Yaroslav O. Halchenko Center for Open Neuroscience http://centerforopenneuroscience.org Dartmouth College, 419 Moore Hall, Hinman Box 6207, Hanover, NH 03755 Phone: +1 (603) 646-9834 Fax: +1 (603) 646-1419 WWW: http://www.linkedin.com/in/yarik From nolan.nichols at gmail.com Fri Jan 15 23:01:39 2016 From: nolan.nichols at gmail.com (Nolan Nichols) Date: Fri, 15 Jan 2016 20:01:39 -0800 Subject: [Neuroimaging] NIBIB@NIH funds Nipype development In-Reply-To: References: Message-ID: Very exciting, congrats! It's been a long time coming! Cheers, Nolan sent via mobile, please excuse brevity and typos On Jan 15, 2016 15:11, "Steve Pieper" wrote: > Congratulations Satra, Chris, and all! > > On Fri, Jan 15, 2016 at 5:15 PM, Satrajit Ghosh wrote: > >> hi all, >> >> we wanted to announce some fantastic news regarding nipype. we have >> received funding from nibib to support nipype development for the next four >> years starting today. thanks to this grant, we will be able to harden, >> extend, and disseminate nipype and other nipy community projects. more >> importantly, chris and i and a few others will be able to spend a chunk of >> time on nipype. 
>> >> the key goals of the project are to improve usability and interactivity, >> introspection of computation, and interoperability with other projects, >> databases, and web-services (neurovault, neurosynth, and others). the focus >> will be on reproducible dataflows in biomedicine. the resources will also >> support additional personnel, an annual workshop, and cloud resources for >> some web services. as soon as we take care of a few administrative pieces, >> we will focus on nipype engineering. >> >> i do want to thank the extended nipy community for its extensive support >> towards nipype development, maintenance, and adoption over the last 5 >> years. we couldn't have gotten to this point without you. we are thrilled >> with the opportunity, very excited about the possibilities, and look >> forward to continued engagement with the community. >> >> quoting eleftherios: go go team! >> >> cheers, >> >> satra >> >> _______________________________________________ >> Neuroimaging mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging >> >> > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jbpoline at gmail.com Fri Jan 15 17:55:03 2016 From: jbpoline at gmail.com (JB Poline) Date: Fri, 15 Jan 2016 14:55:03 -0800 Subject: [Neuroimaging] NIBIB@NIH funds Nipype development In-Reply-To: References: Message-ID: Congrats Satra - this is great news. JB On Fri, Jan 15, 2016 at 2:15 PM, Satrajit Ghosh wrote: > hi all, > > we wanted to announce some fantastic news regarding nipype. we have > received funding from nibib to support nipype development for the next four > years starting today. thanks to this grant, we will be able to harden, > extend, and disseminate nipype and other nipy community projects. more > importantly, chris and i and a few others will be able to spend a chunk of > time on nipype. > > the key goals of the project are to improve usability and interactivity, > introspection of computation, and interoperability with other projects, > databases, and web-services (neurovault, neurosynth, and others). the focus > will be on reproducible dataflows in biomedicine. the resources will also > support additional personnel, an annual workshop, and cloud resources for > some web services. as soon as we take care of a few administrative pieces, > we will focus on nipype engineering. > > i do want to thank the extended nipy community for its extensive support > towards nipype development, maintenance, and adoption over the last 5 > years. we couldn't have gotten to this point without you. we are thrilled > with the opportunity, very excited about the possibilities, and look > forward to continued engagement with the community. > > quoting eleftherios: go go team! > > cheers, > > satra > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bcipolli at ucsd.edu Fri Jan 15 23:22:19 2016 From: bcipolli at ucsd.edu (Ben Cipollini) Date: Fri, 15 Jan 2016 20:22:19 -0800 Subject: [Neuroimaging] NIBIB@NIH funds Nipype development In-Reply-To: References: Message-ID: It is super awesome news! Well-deserved, looking forward to what comes next! 
Ben On Fri, Jan 15, 2016 at 2:15 PM, Satrajit Ghosh wrote: > hi all, > > we wanted to announce some fantastic news regarding nipype. we have > received funding from nibib to support nipype development for the next four > years starting today. thanks to this grant, we will be able to harden, > extend, and disseminate nipype and other nipy community projects. more > importantly, chris and i and a few others will be able to spend a chunk of > time on nipype. > > the key goals of the project are to improve usability and interactivity, > introspection of computation, and interoperability with other projects, > databases, and web-services (neurovault, neurosynth, and others). the focus > will be on reproducible dataflows in biomedicine. the resources will also > support additional personnel, an annual workshop, and cloud resources for > some web services. as soon as we take care of a few administrative pieces, > we will focus on nipype engineering. > > i do want to thank the extended nipy community for its extensive support > towards nipype development, maintenance, and adoption over the last 5 > years. we couldn't have gotten to this point without you. we are thrilled > with the opportunity, very excited about the possibilities, and look > forward to continued engagement with the community. > > quoting eleftherios: go go team! > > cheers, > > satra > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From gael.varoquaux at normalesup.org Sat Jan 16 04:25:28 2016 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Sat, 16 Jan 2016 10:25:28 +0100 Subject: [Neuroimaging] NIBIB@NIH funds Nipype development In-Reply-To: References: Message-ID: <20160116092528.GA938010@phare.normalesup.org> That's an excellent news. And it's well deserved. The scientific community is benefiting a lot from your work. GGT! Ga?l On Fri, Jan 15, 2016 at 05:15:20PM -0500, Satrajit Ghosh wrote: > hi all, > we wanted to announce some fantastic news regarding nipype. we have received > funding from nibib to support nipype development for the next four years > starting today. thanks to this grant, we will be able to harden, extend, and > disseminate nipype and other nipy community projects. more importantly, chris > and i and a few others will be able to spend a chunk of time on nipype. > the key goals of the project are to improve usability and interactivity, > introspection of computation, and interoperability with other projects, > databases, and web-services (neurovault, neurosynth, and others). the focus > will be on reproducible dataflows in biomedicine. the resources will also > support additional personnel, an annual workshop, and cloud resources for some > web services. as soon as we take care of a few administrative pieces, we will > focus on nipype engineering. > i do want to thank the extended nipy community for its extensive support > towards nipype development, maintenance, and adoption over the last 5 years. we > couldn't have gotten to this point without you. we are thrilled with the > opportunity, very excited about the possibilities, and look forward to > continued engagement with the community. > quoting eleftherios: go go team! 
> cheers, > satra > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging -- Gael Varoquaux Researcher, INRIA Parietal NeuroSpin/CEA Saclay , Bat 145, 91191 Gif-sur-Yvette France Phone: ++ 33-1-69-08-79-68 http://gael-varoquaux.info http://twitter.com/GaelVaroquaux From bertrand.thirion at inria.fr Sat Jan 16 06:15:27 2016 From: bertrand.thirion at inria.fr (bthirion) Date: Sat, 16 Jan 2016 12:15:27 +0100 Subject: [Neuroimaging] NIBIB@NIH funds Nipype development In-Reply-To: References: Message-ID: <569A264F.6070901@inria.fr> Congratulations ! This is well served ! GGT Bertrand On 15/01/2016 23:15, Satrajit Ghosh wrote: > hi all, > > we wanted to announce some fantastic news regarding nipype. we have > received funding from nibib to support nipype development for the next > four years starting today. thanks to this grant, we will be able to > harden, extend, and disseminate nipype and other nipy community > projects. more importantly, chris and i and a few others will be able > to spend a chunk of time on nipype. > > the key goals of the project are to improve usability and > interactivity, introspection of computation, and interoperability with > other projects, databases, and web-services (neurovault, neurosynth, > and others). the focus will be on reproducible dataflows in > biomedicine. the resources will also support additional personnel, an > annual workshop, and cloud resources for some web services. as soon as > we take care of a few administrative pieces, we will focus on nipype > engineering. > > i do want to thank the extended nipy community for its extensive > support towards nipype development, maintenance, and adoption over the > last 5 years. we couldn't have gotten to this point without you. we > are thrilled with the opportunity, very excited about the > possibilities, and look forward to continued engagement with the > community. > > quoting eleftherios: go go team! > > cheers, > > satra > > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging -------------- next part -------------- An HTML attachment was scrubbed... URL: From arman.eshaghi at yahoo.com Sat Jan 16 08:36:05 2016 From: arman.eshaghi at yahoo.com (Arman Eshaghi) Date: Sat, 16 Jan 2016 13:36:05 +0000 (UTC) Subject: [Neuroimaging] NIBIB@NIH funds Nipype development In-Reply-To: <569A264F.6070901@inria.fr> References: <569A264F.6070901@inria.fr> Message-ID: <1436367887.6374133.1452951365530.JavaMail.yahoo@mail.yahoo.com> Congrats. Very well deserved. Arman On Saturday, January 16, 2016 11:27 AM, bthirion wrote: Congratulations ! This is well served ! GGT Bertrand On 15/01/2016 23:15, Satrajit Ghosh wrote: hi all, we wanted to announce some fantastic news regarding nipype. we have received funding from nibib to support nipype development for the next four years starting today. thanks to this grant, we will be able to harden, extend, and disseminate nipype and other nipy community projects. more importantly, chris and i and a few others will be able to spend a chunk of time on nipype. the key goals of the project are to improve usability and interactivity, introspection of computation, and interoperability with other projects, databases, and web-services (neurovault, neurosynth, and others). the focus will be on reproducible dataflows in biomedicine. 
the resources will also support additional personnel, an annual workshop, and cloud resources for some web services. as soon as we take care of a few administrative pieces, we will focus on nipype engineering. i do want to thank the extended nipy community for its extensive support towards nipype development, maintenance, and adoption over the last 5 years. we couldn't have gotten to this point without you. we are thrilled with the opportunity, very excited about the possibilities, and look forward to continued engagement with the community. quoting eleftherios: go go team! cheers, satra _______________________________________________ Neuroimaging mailing list Neuroimaging at python.org https://mail.python.org/mailman/listinfo/neuroimaging _______________________________________________ Neuroimaging mailing list Neuroimaging at python.org https://mail.python.org/mailman/listinfo/neuroimaging -------------- next part -------------- An HTML attachment was scrubbed... URL: From alexandre.gramfort at telecom-paristech.fr Sat Jan 16 08:53:24 2016 From: alexandre.gramfort at telecom-paristech.fr (Alexandre Gramfort) Date: Sat, 16 Jan 2016 14:53:24 +0100 Subject: [Neuroimaging] NIBIB@NIH funds Nipype development In-Reply-To: <1436367887.6374133.1452951365530.JavaMail.yahoo@mail.yahoo.com> References: <569A264F.6070901@inria.fr> <1436367887.6374133.1452951365530.JavaMail.yahoo@mail.yahoo.com> Message-ID: really nice guys congrats ! Alex From szorowi1 at gmail.com Sun Jan 17 11:16:20 2016 From: szorowi1 at gmail.com (Sam Zorowitz) Date: Sun, 17 Jan 2016 11:16:20 -0500 Subject: [Neuroimaging] NIBIB@NIH funds Nipype development In-Reply-To: References: <569A264F.6070901@inria.fr> <1436367887.6374133.1452951365530.JavaMail.yahoo@mail.yahoo.com> Message-ID: Congrats all! On Sat, Jan 16, 2016 at 8:53 AM, Alexandre Gramfort < alexandre.gramfort at telecom-paristech.fr> wrote: > really nice guys congrats ! > > Alex > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > -------------- next part -------------- An HTML attachment was scrubbed... URL: From alexis.roche at gmail.com Sun Jan 17 15:02:28 2016 From: alexis.roche at gmail.com (Alexis Roche) Date: Sun, 17 Jan 2016 21:02:28 +0100 Subject: [Neuroimaging] NIBIB@NIH funds Nipype development In-Reply-To: References: <569A264F.6070901@inria.fr> <1436367887.6374133.1452951365530.JavaMail.yahoo@mail.yahoo.com> Message-ID: Great news! Congrats all! Alexis Le 17 janv. 2016 17:16, "Sam Zorowitz" a ?crit : > Congrats all! > > On Sat, Jan 16, 2016 at 8:53 AM, Alexandre Gramfort < > alexandre.gramfort at telecom-paristech.fr> wrote: > >> really nice guys congrats ! >> >> Alex >> _______________________________________________ >> Neuroimaging mailing list >> Neuroimaging at python.org >> https://mail.python.org/mailman/listinfo/neuroimaging >> > > > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From arokem at gmail.com Sun Jan 17 20:49:00 2016 From: arokem at gmail.com (Ariel Rokem) Date: Sun, 17 Jan 2016 17:49:00 -0800 Subject: [Neuroimaging] Nosetests is going away? 
Message-ID: Hi everyone, I just noticed the following "note to users" on the nose website ( https://nose.readthedocs.org/en/latest/): "Nose has been in maintenance mode for the past several years and will likely cease without a new person/team to take over maintainership. New projects should consider using Nose2 , py.test , or just plain unittest/unittest2." Is that a real thing? Should we be worried? Make a plan to move all of our tests to py.test? Nose2? Unittests(2)? Maintain it ourselves? ;-) Cheers, Ariel -------------- next part -------------- An HTML attachment was scrubbed... URL: From michael.hanke at gmail.com Mon Jan 18 00:40:39 2016 From: michael.hanke at gmail.com (Michael Hanke) Date: Mon, 18 Jan 2016 06:40:39 +0100 Subject: [Neuroimaging] NIBIB@NIH funds Nipype development In-Reply-To: References: Message-ID: <20160118054039.GQ6233@meiner> Congrats! Looking forward to see all the good things coming in the future. Best, Michael On Fri, Jan 15, 2016 at 05:15:20PM -0500, Satrajit Ghosh wrote: > hi all, > > we wanted to announce some fantastic news regarding nipype. we have > received funding from nibib to support nipype development for the next four > years starting today. thanks to this grant, we will be able to harden, > extend, and disseminate nipype and other nipy community projects. more > importantly, chris and i and a few others will be able to spend a chunk of > time on nipype. > > the key goals of the project are to improve usability and interactivity, > introspection of computation, and interoperability with other projects, > databases, and web-services (neurovault, neurosynth, and others). the focus > will be on reproducible dataflows in biomedicine. the resources will also > support additional personnel, an annual workshop, and cloud resources for > some web services. as soon as we take care of a few administrative pieces, > we will focus on nipype engineering. > > i do want to thank the extended nipy community for its extensive support > towards nipype development, maintenance, and adoption over the last 5 > years. we couldn't have gotten to this point without you. we are thrilled > with the opportunity, very excited about the possibilities, and look > forward to continued engagement with the community. > > quoting eleftherios: go go team! > > cheers, > > satra > _______________________________________________ > Neuroimaging mailing list > Neuroimaging at python.org > https://mail.python.org/mailman/listinfo/neuroimaging -- J.-Prof. Dr. Michael Hanke Psychoinformatik Labor, Institut f?r Psychologie II Otto-von-Guericke-Universit?t Magdeburg, Universit?tsplatz 2, Geb.24 Tel.: +49(0)391-67-18481 Fax: +49(0)391-67-11947 GPG: 4096R/7FFB9E9B From msh at nmr.mgh.harvard.edu Mon Jan 18 01:23:53 2016 From: msh at nmr.mgh.harvard.edu (Matti Hamalainen) Date: Mon, 18 Jan 2016 08:23:53 +0200 Subject: [Neuroimaging] NIBIB@NIH funds Nipype development In-Reply-To: References: Message-ID: <47C9A6ED-2B06-4CEB-9E11-B18964B0AB64@nmr.mgh.harvard.edu> Hi Satra et al. Great to hear nipype got funding from NIBIB. Keep up the good work! - Matti > On Jan 16, 2016, at 12:15 AM, Satrajit Ghosh wrote: > > hi all, > > we wanted to announce some fantastic news regarding nipype. we have received funding from nibib to support nipype development for the next four years starting today. thanks to this grant, we will be able to harden, extend, and disseminate nipype and other nipy community projects. more importantly, chris and i and a few others will be able to spend a chunk of time on nipype. 
>
> the key goals of the project are to improve usability and interactivity,
> introspection of computation, and interoperability with other projects,
> databases, and web-services (neurovault, neurosynth, and others). the focus
> will be on reproducible dataflows in biomedicine. the resources will also
> support additional personnel, an annual workshop, and cloud resources for
> some web services. as soon as we take care of a few administrative pieces,
> we will focus on nipype engineering.
>
> i do want to thank the extended nipy community for its extensive support
> towards nipype development, maintenance, and adoption over the last 5
> years. we couldn't have gotten to this point without you. we are thrilled
> with the opportunity, very excited about the possibilities, and look
> forward to continued engagement with the community.
>
> quoting eleftherios: go go team!
>
> cheers,
>
> satra
> _______________________________________________
> Neuroimaging mailing list
> Neuroimaging at python.org
> https://mail.python.org/mailman/listinfo/neuroimaging

---------
Matti Hamalainen, Ph.D.
Athinoula A. Martinos Center for Biomedical Imaging
Massachusetts General Hospital
msh at nmr.mgh.harvard.edu
mhamalainen at partners.org

The information in this e-mail is intended only for the person to whom it is addressed. If you believe this e-mail was sent to you in error and the e-mail contains patient information, please contact the Partners Compliance HelpLine at http://www.partners.org/complianceline . If the e-mail was sent to you in error but does not contain patient information, please contact the sender and properly dispose of the e-mail.

From matthew.brett at gmail.com  Mon Jan 18 02:29:49 2016
From: matthew.brett at gmail.com (Matthew Brett)
Date: Sun, 17 Jan 2016 23:29:49 -0800
Subject: [Neuroimaging] Nosetests is going away?
In-Reply-To: 
References: 
Message-ID: 

Hi,

On Sun, Jan 17, 2016 at 5:49 PM, Ariel Rokem wrote:
> Hi everyone,
>
> I just noticed the following "note to users" on the nose website
> (https://nose.readthedocs.org/en/latest/):
>
> "Nose has been in maintenance mode for the past several years and will
> likely cease without a new person/team to take over maintainership. New
> projects should consider using Nose2, py.test, or just plain
> unittest/unittest2."
>
> Is that a real thing? Should we be worried? Make a plan to move all of our
> tests to py.test? Nose2? Unittests(2)?

I suggest we wait it out, until we run into problems, and then switch
to one of the alternatives.

Numpy and scipy are still using nosetests, for example.

Cheers,

Matthew

From stjeansam at gmail.com  Mon Jan 18 05:01:43 2016
From: stjeansam at gmail.com (Samuel St-Jean)
Date: Mon, 18 Jan 2016 11:01:43 +0100
Subject: [Neuroimaging] [dipy] Understanding the ACT tracking
Message-ID: 

Hello,

I ran the example about the new tracking API over here [1], but there is a small note at the bottom which I don't really understand. The note reads:

    Notes
    Currently in ACT the proposed method that cuts streamlines going
    through subcortical gray matter regions is not implemented. The
    backtracking technique for streamlines reaching INVALIDPOINT is not
    implemented either.

So for the second part, it does not do backtracking, but what about the first sentence? Should one read it as if it does not keep streamlines going from GM to GM?

So I tried the example on the given dataset, and I get strange results. No idea if pictures work on the mailing list, so I'll link to the issue [2].
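For reference, the classifier setup in that example looks roughly like the following (paraphrased from the tutorial rather than copied from my exact script, so take the names -- csa_peaks, include_map, exclude_map, seeds, affine -- as the tutorial's placeholders):

    from dipy.tracking.local import ActTissueClassifier, LocalTracking

    # include/exclude maps are built from the PVE maps (GM/WM/CSF) as in
    # the tutorial
    act_classifier = ActTissueClassifier(include_map, exclude_map)

    # csa_peaks is the direction getter, seeds come from the seeding mask
    streamlines = list(LocalTracking(csa_peaks, act_classifier, seeds,
                                     affine, step_size=0.5))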
From vsochat at stanford.edu Fri Jan 15 23:38:25 2016 From: vsochat at stanford.edu (vanessa sochat) Date: Fri, 15 Jan 2016 20:38:25 -0800 Subject: [Neuroimaging] NIBIB@NIH funds Nipype development In-Reply-To: <20160116040029.GL22242@onerussian.com> References: <20160116040029.GL22242@onerussian.com> Message-ID:

+1 Yaroslav! :)

On Fri, Jan 15, 2016 at 8:00 PM, Yaroslav Halchenko wrote:
> Congrats! what a better way could be to celebrate than by cutting a fresh release?
>
> $> git describe --tags
> 0.11.0-425-gd410fbb
>
> ;-)
>
> Live long and prosper Nipype!
>
> On Fri, 15 Jan 2016, Satrajit Ghosh wrote:
> > hi all, we wanted to announce some fantastic news regarding nipype. [...]
>
> --
> Yaroslav O. Halchenko
> Center for Open Neuroscience http://centerforopenneuroscience.org
> Dartmouth College, 419 Moore Hall, Hinman Box 6207, Hanover, NH 03755
> Phone: +1 (603) 646-9834 Fax: +1 (603) 646-1419
> WWW: http://www.linkedin.com/in/yarik

-- Vanessa Villamia Sochat Stanford University (603) 321-0676

From g.flandin at ucl.ac.uk Mon Jan 18 03:45:58 2016 From: g.flandin at ucl.ac.uk (Flandin, Guillaume) Date: Mon, 18 Jan 2016 08:45:58 +0000 Subject: [Neuroimaging] NIBIB@NIH funds Nipype development In-Reply-To: References: Message-ID:

Fantastic news, Satra. Congratulations! G.

________________________________ From: Neuroimaging on behalf of Satrajit Ghosh Sent: 15 January 2016 22:15 To: Neuroimaging analysis in Python Subject: [Neuroimaging] NIBIB at NIH funds Nipype development

hi all, we wanted to announce some fantastic news regarding nipype. [...]

From raghav.mehta at research.iiit.ac.in Wed Jan 20 22:36:39 2016 From: raghav.mehta at research.iiit.ac.in (Raghav Mehta) Date: Thu, 21 Jan 2016 09:06:39 +0530 (IST) Subject: [Neuroimaging] reg. load nifti image
Message-ID: <434111074.226028.1453347399827.JavaMail.zimbra@research.iiit.ac.in>

Hi, I am trying to read Nifti images using Nibabel, but when I use nib.load on files from one particular dataset I get the following error:

Traceback (most recent call last):
  File "", line 1, in
  File "/usr/local/lib/python2.7/dist-packages/nibabel/loadsave.py", line 44, in load
    return guessed_image_type(filename).from_filename(filename, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/nibabel/keywordonly.py", line 16, in wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/nibabel/analyze.py", line 986, in from_filename
    return klass.from_file_map(file_map, mmap=mmap)
  File "/usr/local/lib/python2.7/dist-packages/nibabel/keywordonly.py", line 16, in wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/nibabel/analyze.py", line 956, in from_file_map
    img._affine = header.get_best_affine()
  File "/usr/local/lib/python2.7/dist-packages/nibabel/nifti1.py", line 642, in get_best_affine
    return self.get_qform()
  File "/usr/local/lib/python2.7/dist-packages/nibabel/nifti1.py", line 822, in get_qform
    quat = self.get_qform_quaternion()
  File "/usr/local/lib/python2.7/dist-packages/nibabel/nifti1.py", line 797, in get_qform_quaternion
    return fillpositive(bcd, self.quaternion_threshold)
  File "/usr/local/lib/python2.7/dist-packages/nibabel/quaternions.py", line 99, in fillpositive
    raise ValueError('w2 should be positive, but is %e' % w2)
ValueError: w2 should be positive, but is -6.401211e-07

I know this is due to the fill option in the quaternions.py file. My question is how I can disable that option and read these files. This issue only occurs for this particular dataset; I am able to read files from other datasets normally. Can anyone help me in solving this issue?

Regards, Raghav Mehta

From matthew.brett at gmail.com Thu Jan 21 03:51:41 2016 From: matthew.brett at gmail.com (Matthew Brett) Date: Thu, 21 Jan 2016 00:51:41 -0800 Subject: [Neuroimaging] reg. load nifti image
In-Reply-To: References: <434111074.226028.1453347399827.JavaMail.zimbra@research.iiit.ac.in> Message-ID:

Hi, On Wed, Jan 20, 2016 at 7:36 PM, Raghav Mehta wrote:
> Hi, I am trying to read Nifti images using Nibabel, but when I use nib.load on files from one particular dataset I get the following error: [...]
> ValueError: w2 should be positive, but is -6.401211e-07
>
> I know this is due to the fill option in the quaternions.py file. My question is how I can disable that option and read these files. This issue only occurs for this particular dataset; I am able to read files from other datasets normally. Can anyone help me in solving this issue?

It looks like our threshold is too strict here. Does this work?

>>> import numpy as np
>>> import nibabel as nib
>>> nib.Nifti1Header.quaternion_threshold = np.finfo(np.float32).eps * 10
>>> img = nib.load('bad_image.nii')

? Best, Matthew

From raghav.mehta at research.iiit.ac.in Thu Jan 21 04:11:36 2016 From: raghav.mehta at research.iiit.ac.in (Raghav Mehta) Date: Thu, 21 Jan 2016 14:41:36 +0530 (IST) Subject: [Neuroimaging] reg. load nifti image In-Reply-To: References: <434111074.226028.1453347399827.JavaMail.zimbra@research.iiit.ac.in> Message-ID: <1756044187.267108.1453367496593.JavaMail.zimbra@research.iiit.ac.in>

Hi, The pre-defined threshold is -6.6e-14, while the value of w2 for my data is -6.401211e-07. So when I change the threshold it works perfectly. Can you please explain how this may affect processing of the data, as I had to multiply the quaternion threshold by 1e9 to get a proper threshold value.

Regards, Raghav Mehta

----- Original Message ----- From: "Matthew Brett" To: "neuroimaging" Sent: Thursday, January 21, 2016 2:21:41 PM Subject: Re: [Neuroimaging] reg. load nifti image
Hi, On Wed, Jan 20, 2016 at 7:36 PM, Raghav Mehta wrote:
> [...]

It looks like our threshold is too strict here. Does this work?

>>> import numpy as np
>>> import nibabel as nib
>>> nib.Nifti1Header.quaternion_threshold = np.finfo(np.float32).eps * 10
>>> img = nib.load('bad_image.nii')

? Best, Matthew

From matthew.brett at gmail.com Thu Jan 21 13:10:05 2016 From: matthew.brett at gmail.com (Matthew Brett) Date: Thu, 21 Jan 2016 10:10:05 -0800 Subject: [Neuroimaging] reg. load nifti image In-Reply-To: <1756044187.267108.1453367496593.JavaMail.zimbra@research.iiit.ac.in> References: <434111074.226028.1453347399827.JavaMail.zimbra@research.iiit.ac.in> <1756044187.267108.1453367496593.JavaMail.zimbra@research.iiit.ac.in> Message-ID:

Hi, On Thu, Jan 21, 2016 at 1:11 AM, Raghav Mehta wrote:
> Hi, The pre-defined threshold is -6.6e-14, while the value of w2 for my data is -6.401211e-07. So when I change the threshold it works perfectly. Can you please explain how this may affect processing of the data, as I had to multiply the quaternion threshold by 1e9 to get a proper threshold value.

Oh - whoops - I missed off a negative sign, maybe that explains the confusion. The default threshold is:

In [11]: -np.finfo(np.float32).eps * 3
Out[11]: -3.5762786865234375e-07

Your value appears to be a little below that, at -6.401211e-07. So, with a threshold of:

In [13]: -np.finfo(np.float32).eps * 10
Out[13]: -1.1920928955078125e-06

you should be fairly safe, with a reasonable threshold.

>>> import numpy as np
>>> import nibabel as nib
>>> nib.Nifti1Header.quaternion_threshold = -np.finfo(np.float32).eps * 10  # notice new minus sign
>>> img = nib.load('bad_image.nii')

Does that work? Matthew
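Putting the two replies above together, a small wrapper can keep the strict default and only loosen the threshold when the w2 error actually occurs. This is just a sketch, not part of nibabel itself; the factor of 10 and the file name are arbitrary examples:

import numpy as np
import nibabel as nib

def load_with_loose_qform_threshold(fname, factor=10):
    # Try the normal load first; on the "w2 should be positive" ValueError,
    # retry with a looser negative threshold and then restore the default.
    try:
        return nib.load(fname)
    except ValueError:
        default = nib.Nifti1Header.quaternion_threshold
        nib.Nifti1Header.quaternion_threshold = -np.finfo(np.float32).eps * factor
        try:
            return nib.load(fname)
        finally:
            nib.Nifti1Header.quaternion_threshold = default

img = load_with_loose_qform_threshold('bad_image.nii')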
From kangik at snu.ac.kr Thu Jan 21 03:13:05 2016 From: kangik at snu.ac.kr (Kevin Kangik Cho) Date: Thu, 21 Jan 2016 17:13:05 +0900 Subject: [Neuroimaging] PySurfer installation problem Message-ID:

Dear PySurfer community, I decided to become a fan of PySurfer after seeing the example gallery on the website. Thank you for the nice tool. I'm using:

Ubuntu 15.01
VTK 6.3.0
python-vtk through apt-get (5.1)
python and Mayavi using Canopy (Enthought)
pysurfer through Canopy pip (also tried the most recent git version)

Although 'pysurfer fsaverage lh pial' works perfectly, an error occurs when running plot_basic.py from the examples. Could you please comment on the problem below?

X Error of failed request: GLXBadCurrentWindow
Major opcode of failed request: 150 (GLX)
Minor opcode of failed request: 1 (X_GLXRender)
Serial number of failed request: 48
Current serial number in output stream: 49

Thank you for the help in advance! Also, could you possibly comment on the optimal system settings for PySurfer? OS X vs Ubuntu (+ versions)

Kevin

From ilya.kuzovkin at gmail.com Thu Jan 21 09:46:03 2016 From: ilya.kuzovkin at gmail.com (Ilya Kuzovkin) Date: Thu, 21 Jan 2016 14:46:03 +0000 Subject: [Neuroimaging] [PySurfer] MNI coordinate to cortex area name Message-ID:

Hi, What would be the right way to obtain a brain area name from an MNI location? I give (-45, 50, 5) in and get "V4" out? Thanks in advance, - Ilya

From pellman.john at gmail.com Sun Jan 24 12:33:40 2016 From: pellman.john at gmail.com (John Pellman) Date: Sun, 24 Jan 2016 12:33:40 -0500 Subject: [Neuroimaging] HackerNews user open sources his brain Message-ID:

Thought this might be of interest to y'all: https://news.ycombinator.com/item?id=10961355

From krzysztof.gorgolewski at gmail.com Sun Jan 24 12:38:04 2016 From: krzysztof.gorgolewski at gmail.com (Chris Filo Gorgolewski) Date: Sun, 24 Jan 2016 09:38:04 -0800 Subject: [Neuroimaging] HackerNews user open sources his brain In-Reply-To: References: Message-ID:

I did something similar a while ago: http://blog.chrisgorgolewski.org/2014/10/this-is-my-brain-sharing-risk.html

On Jan 24, 2016 9:34 AM, "John Pellman" wrote:
> Thought this might be of interest to y'all: https://news.ycombinator.com/item?id=10961355

From jbpoline at gmail.com Sun Jan 24 13:09:31 2016 From: jbpoline at gmail.com (JB Poline) Date: Sun, 24 Jan 2016 10:09:31 -0800 Subject: [Neuroimaging] [PySurfer] MNI coordinate to cortex area name In-Reply-To: References: Message-ID:

Hi, There's this thread on researchgate that mentions a few ways and caveats: https://www.researchgate.net/post/Can_anyone_suggest_an_online_tool_for_entering_coordinates_to_view_brain_location_eg_MNI_vs_Talairach cheers JB

On Thu, Jan 21, 2016 at 6:46 AM, Ilya Kuzovkin wrote:
> Hi, What would be the right way to obtain a brain area name from an MNI location?
> I give (-45, 50, 5) in and get "V4" out? Thanks in advance, - Ilya

From mwaskom at stanford.edu Sun Jan 24 13:15:37 2016 From: mwaskom at stanford.edu (Michael Waskom) Date: Sun, 24 Jan 2016 10:15:37 -0800 Subject: [Neuroimaging] [PySurfer] MNI coordinate to cortex area name In-Reply-To: References: Message-ID:

I'd like to plug this paper from Devlin and Poldrack: http://www.ncbi.nlm.nih.gov/pubmed/17870621 I think it's very important to read and understand before putting too much faith in tools that tell you what region your MNI coordinate is in.

On Sun, Jan 24, 2016 at 10:09 AM, JB Poline wrote:
> Hi, There's this thread on researchgate that mentions a few ways and caveats: [...]

From jbpoline at gmail.com Sun Jan 24 13:22:59 2016 From: jbpoline at gmail.com (JB Poline) Date: Sun, 24 Jan 2016 10:22:59 -0800 Subject: [Neuroimaging] [PySurfer] MNI coordinate to cortex area name In-Reply-To: References: Message-ID:

+1 Also read Jason Bohland's paper: http://www.pubmedcentral.nih.gov/articlerender.fcgi?artid=2748707&tool=pmcentrez&rendertype=abstract

On Sun, Jan 24, 2016 at 10:15 AM, Michael Waskom wrote:
> I'd like to plug this paper from Devlin and Poldrack: http://www.ncbi.nlm.nih.gov/pubmed/17870621 [...]
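With the caveats from those two papers in mind, the purely mechanical part of such a lookup is an affine transform from MNI millimetres into the voxel grid of whichever labelled atlas you decide to trust. A rough sketch with nibabel follows; the atlas file name and the label_names dictionary (integer label to region name) are placeholders, not a recommendation of any particular atlas:

import numpy as np
import nibabel as nib
from nibabel.affines import apply_affine

atlas_img = nib.load('atlas_labels_mni.nii.gz')  # any labelled volume in MNI space
atlas_data = atlas_img.get_data()
label_names = {0: 'background'}  # placeholder lookup table for the atlas labels

def region_at(mni_xyz):
    # Map the mm coordinate into the atlas voxel grid and read off the label.
    ijk = apply_affine(np.linalg.inv(atlas_img.affine), mni_xyz)
    i, j, k = np.round(ijk).astype(int)
    return label_names.get(int(atlas_data[i, j, k]), 'unknown')

print(region_at((-45, 50, 5)))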
From jrudascas at gmail.com Wed Jan 27 20:19:13 2016 From: jrudascas at gmail.com (Jorge Rudas) Date: Wed, 27 Jan 2016 20:19:13 -0500 Subject: [Neuroimaging] ROI register Message-ID:

Hi dipy experts, I have an ROI in MNI space. I need to register the ROI to subject space. What is the correct pipeline for this? Some example? Regards, Jorge Rudas

From arokem at gmail.com Wed Jan 27 20:28:21 2016 From: arokem at gmail.com (Ariel Rokem) Date: Wed, 27 Jan 2016 17:28:21 -0800 Subject: [Neuroimaging] ROI register In-Reply-To: References: Message-ID:

Hi Jorge, On Wed, Jan 27, 2016 at 5:19 PM, Jorge Rudas wrote:
> Hi dipy experts, I have an ROI in MNI space. I need to register the ROI to subject space. What is the correct pipeline for this? Some example?

Github is down right this moment, but when it comes back up, you should be able to see a couple of examples of that here: https://github.com/arokem/AFQ-notebooks Look at the notebook called "AFQ-registration-callosum". The template ROIs used are now available to download here: http://hdl.handle.net/1773/34926

From vsochat at gmail.com Wed Jan 27 22:15:34 2016 From: vsochat at gmail.com (vanessa s) Date: Wed, 27 Jan 2016 19:15:34 -0800 Subject: [Neuroimaging] ROI register In-Reply-To: References: Message-ID:

It's ok, everyone can put down the doomsday vices, put down the zombie repellent, emerge from the bomb shelters... Github is back online! Huzzah!! :D

On Wed, Jan 27, 2016 at 5:28 PM, Ariel Rokem wrote:
> Hi Jorge, [...]
> Github is down right this moment, but when it comes back up, you should be able to see a couple of examples of that here: https://github.com/arokem/AFQ-notebooks [...]

-- Vanessa Villamia Sochat Stanford University '16 (603) 321-0676
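For readers who want the shape of that pipeline before opening the notebook mentioned above: the core of it is a nonlinear (SyN) registration between the subject image and the MNI template, followed by warping the ROI with nearest-neighbour interpolation. A condensed, hypothetical sketch with dipy follows; subject_fa.nii.gz, mni_template.nii.gz and roi_in_mni.nii.gz are placeholder file names, and the affine pre-alignment step done in the notebook is skipped here:

import nibabel as nib
from dipy.align.imwarp import SymmetricDiffeomorphicRegistration
from dipy.align.metrics import CCMetric

subject_img = nib.load('subject_fa.nii.gz')     # subject-space image (e.g. FA or mean b0)
template_img = nib.load('mni_template.nii.gz')  # MNI template
roi_img = nib.load('roi_in_mni.nii.gz')         # ROI defined in MNI space

# Symmetric diffeomorphic (SyN) registration, template as static, subject as moving.
sdr = SymmetricDiffeomorphicRegistration(CCMetric(3), level_iters=[10, 10, 5])
mapping = sdr.optimize(template_img.get_data(), subject_img.get_data(),
                       template_img.affine, subject_img.affine)

# The ROI lives in the static (MNI) space, so the inverse transform brings it
# into subject space; nearest-neighbour interpolation keeps the mask binary.
roi_subject = mapping.transform_inverse(roi_img.get_data(), interpolation='nearest')
nib.save(nib.Nifti1Image(roi_subject, subject_img.affine), 'roi_in_subject.nii.gz')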
From garyfallidis at gmail.com Sun Jan 31 16:35:09 2016 From: garyfallidis at gmail.com (Eleftherios Garyfallidis) Date: Sun, 31 Jan 2016 16:35:09 -0500 Subject: [Neuroimaging] [Nipy-devel][Dipy] A new circle of Google Summer of Code starts - time for new proposals Message-ID:

Hello all, Taking part in Google Summer of Code (GSoC) is indeed rewarding for our project, as it allows new algorithms to be merged and at the same time grows our development team with excellent contributors. After what I believe was a successful GSoC participation last year, a new circle starts for this year (2016). Last year it was Ariel and me who did most of the mentoring. This year we would like to hear others' ideas too. Therefore, we welcome other developers/scientists who would like to mentor or propose new projects. For those who want to mentor, we will happily act as co-mentors to help them with the process and give extra feedback to the relevant students. In the following link we have started adding projects that we think would be interesting for this year's GSoC: https://github.com/nipy/dipy/wiki/Google-Summer-of-Code-2016 Feel free to add your projects to the wiki or suggest ideas in this thread. We also welcome the previous participants of GSoC (Julio and Rafael) to take part as mentors this year. Finally, this year I am hoping to be able to get more than 2 projects funded. Hopefully 4, but that is not certain. Waiting for your ideas/suggestions. What would you like to see in Dipy that could be developed by a student during this summer? Best regards, Eleftherios