From satra at mit.edu  Sun Nov 11 18:15:45 2018
From: satra at mit.edu (Satrajit Ghosh)
Date: Sun, 11 Nov 2018 18:15:45 -0500
Subject: [Neuroimaging] Please join us at Coastal Coding for Reproducible Neuroimaging: 10-16 Jan 2019 in Miami, FL
In-Reply-To: <8A316B65-EDD9-4B36-8E99-58FE28C156E7@fiu.edu>
References: <8A316B65-EDD9-4B36-8E99-58FE28C156E7@fiu.edu>
Message-ID: 

---------- Forwarded message ---------
From: Angela Laird
Date: Fri, Nov 9, 2018 at 3:42 PM
Subject: Please join us at Coastal Coding for Reproducible Neuroimaging: 10-16 Jan 2019 in Miami, FL
To:

Dear Colleagues,

Florida International University (FIU) is hosting a Miami workshop + code sprint in January 2019 called *Coastal Coding for Reproducible Neuroimaging*:

*Jan 10-12: ReproNim + Nipype 2.0 Workshop*

The goals of this workshop are two-fold: first, to enable attendees to become familiar with best practices for reproducible analysis from ReproNim; second, to learn how to create scalable and re-executable neuroimaging workflows with the upcoming Nipype 2.0 platform. This event is targeted towards novice neuroimaging users who meet the following requirements: (1) working knowledge of Python 3; and (2) familiarity with scripting in at least one neuroimaging package (e.g., AFNI, ANTS, FSL, FreeSurfer, SPM). Advanced users are welcome to learn about the changes and hack with Nipype developers.

*Jan 14-16: NiMARE Code Sprint*

NiMARE (Neuroimaging Meta-Analysis Research Environment) is a new Python package for performing analyses using neuroimaging meta-analytic data. Current tools for neuroimaging meta-analyses are spread out across several packages and languages. NiMARE aims to provide a standardized interface for a broad range of algorithms, along with interfaces to open meta-analytic resources like Neurosynth, NeuroVault, and brainspell. The three-day NiMARE coding sprint will occur after the Nipype workshop, with a one-day rest period in between, during which participants can enjoy Miami.
At the sprint, we'll be working to design a standard for representing meta-analytic data, to add implementations of new tools, and to improve the documentation and test coverage of the existing codebase. Anyone interested in contributing to this new package is welcome to join.

*Both the workshop and code sprint will be held at FIU's scenic Biscayne Bay Campus*, conveniently located 17 miles from Miami International Airport (MIA) and 19 miles from Fort Lauderdale-Hollywood International Airport (FLL). Registration costs are $125 for one event or $200 for both events. Fees cover breakfast, lunch, snacks, and plenty of coffee.

*Full details can be found at:* https://coastal-coding.github.io

Please feel free to forward to your colleagues!

Best,
Angie

*****
Dr. Angie Laird, Ph.D.
Professor, Department of Physics
Director, Center for Imaging Science
Florida International University
11200 SW 8th ST
Miami, FL 33199
Office: AHC-4 Room 310
Lab: AHC-4 Room 380
http://cismri.fiu.edu
http://neurolab.fiu.edu
Phone: (305) 348-6737
Fax: (305) 348-6700

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From krishnainet at gmail.com  Mon Nov 12 14:48:48 2018
From: krishnainet at gmail.com (sri krishna)
Date: Tue, 13 Nov 2018 01:18:48 +0530
Subject: [Neuroimaging] transformation to/from dicom to hl7
Message-ID: 

Does the library *Neuroimaging* support to/from transformation between HL7 and DICOM (HL7<=>DICOM)? If yes, can you please help by pointing to the API signatures to achieve the same?

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From christophe at pallier.org  Tue Nov 13 10:06:35 2018
From: christophe at pallier.org (Christophe Pallier)
Date: Tue, 13 Nov 2018 16:06:35 +0100
Subject: [Neuroimaging] computing cluster size thresholds
Message-ID: 

Dear all,

I am using nistats to perform a second-level group analysis. I'd like to estimate the FWHM from the model's residuals, in order to use AFNI's 3dFWHMx and 3dClustSim to look at the distribution of cluster size under the null. Is there a simple way to save a map of the residuals? Or is there even a better idea?

--
Christophe Pallier
INSERM-CEA Cognitive Neuroimaging Lab, Neurospin, bat 145, 91191 Gif-sur-Yvette Cedex, France
Tel: 00 33 1 69 08 79 34
Personal web site: http://www.pallier.org
Lab web site: http://www.unicog.org

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From gael.varoquaux at normalesup.org  Mon Nov 19 11:42:24 2018
From: gael.varoquaux at normalesup.org (Gael Varoquaux)
Date: Mon, 19 Nov 2018 17:42:24 +0100
Subject: [Neuroimaging] New release of nilearn
Message-ID: <20181119164224.zcoe5amqrjodaefk@phare.normalesup.org>

Hi everyone,

We are happy to announce the release of version 0.5 of nilearn, a tool for multivariate analysis of brain imaging data in Python.
Maybe the most exciting addition in this release is interactive plotting that naturally embeds in the Jupyter notebook, to plot volumes, surfaces, or connectomes:
http://nilearn.github.io/plotting/index.html#interactive-plotting

Nilearn can be used for decoding or biomarker extraction, resting-state fMRI analysis, or more general manipulation of brain images in Python:
http://nilearn.github.io/user_guide.html

A detailed changelog can be found at:
http://nilearn.github.io/whats_new.html

Nilearn is developed by a variety of contributors around the world:
https://github.com/nilearn/nilearn/graphs/contributors

Gaël

From satra at mit.edu  Mon Nov 19 11:47:17 2018
From: satra at mit.edu (Satrajit Ghosh)
Date: Mon, 19 Nov 2018 11:47:17 -0500
Subject: [Neuroimaging] New release of nilearn
In-Reply-To: <20181119164224.zcoe5amqrjodaefk@phare.normalesup.org>
References: <20181119164224.zcoe5amqrjodaefk@phare.normalesup.org>
Message-ID: 

Awesome. Thank you, nilearn team.

cheers,
satra

On Mon, Nov 19, 2018 at 11:43 AM Gael Varoquaux < gael.varoquaux at normalesup.org> wrote:

> Hi everyone,
>
> We are happy to announce the release of version 0.5 of nilearn, a tool
> for multivariate analysis of brain imaging data in Python.
>
> Maybe the most exciting addition in this release is interactive plotting
> that naturally embeds in the Jupyter notebook, to plot volumes, surfaces,
> or connectomes:
> http://nilearn.github.io/plotting/index.html#interactive-plotting
>
> Nilearn can be used for decoding or biomarker extraction, resting-state
> fMRI analysis, or more general manipulation of brain images in Python:
> http://nilearn.github.io/user_guide.html
>
> A detailed changelog can be found at:
> http://nilearn.github.io/whats_new.html
>
> Nilearn is developed by a variety of contributors around the world:
> https://github.com/nilearn/nilearn/graphs/contributors
>
> Gaël
>
> _______________________________________________
> Neuroimaging mailing list
> Neuroimaging at python.org
> https://mail.python.org/mailman/listinfo/neuroimaging
>

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From njvack at wisc.edu  Mon Nov 19 13:56:15 2018
From: njvack at wisc.edu (Nate Vack)
Date: Mon, 19 Nov 2018 12:56:15 -0600
Subject: [Neuroimaging] New release of nilearn
In-Reply-To: <13696_1542645824_0PIG004MV94U7U00_20181119164224.zcoe5amqrjodaefk@phare.normalesup.org>
References: <13696_1542645824_0PIG004MV94U7U00_20181119164224.zcoe5amqrjodaefk@phare.normalesup.org>
Message-ID: 

Excellent! nilearn is great, by the way - thanks for all your work on it.

Best,
-Nate

On Mon, Nov 19, 2018 at 10:43 AM Gael Varoquaux < gael.varoquaux at normalesup.org> wrote:

> Hi everyone,
>
> We are happy to announce the release of version 0.5 of nilearn, a tool
> for multivariate analysis of brain imaging data in Python.
>
> Maybe the most exciting addition in this release is interactive plotting
> that naturally embeds in the Jupyter notebook, to plot volumes, surfaces,
> or connectomes:
> http://nilearn.github.io/plotting/index.html#interactive-plotting
>
> Nilearn can be used for decoding or biomarker extraction, resting-state
> fMRI analysis, or more general manipulation of brain images in Python:
> http://nilearn.github.io/user_guide.html
>
> A detailed changelog can be found at:
> http://nilearn.github.io/whats_new.html
>
> Nilearn is developed by a variety of contributors around the world:
> https://github.com/nilearn/nilearn/graphs/contributors
>
> Gaël
>
> _______________________________________________
> Neuroimaging mailing list
> Neuroimaging at python.org
> https://mail.python.org/mailman/listinfo/neuroimaging
>

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From alexandre.gramfort at inria.fr  Tue Nov 20 06:15:45 2018
From: alexandre.gramfort at inria.fr (Alexandre Gramfort)
Date: Tue, 20 Nov 2018 12:15:45 +0100
Subject: [Neuroimaging] [ANN] MNE-Python 0.17
Message-ID: 

Dear community,

We are very pleased to announce the new 0.17 release of MNE-Python (http://martinos.org/mne/stable/).

A few highlights
============
- This release will be the last release compatible with Python 2. The next version will be Python 3.5+ only.
- Better support for Annotations, including readers for EEGLAB, BrainVision, EDF, CSV, TXT and Brainstorm formats, and a new tutorial in the documentation to dive in.
- Better support for importing data from FieldTrip and Neuromag 122 systems.
- Add capability to read and save Epochs containing complex data (e.g. after Hilbert transform).
- Add optically pumped magnetometer dataset and examples.
- New source morph object to unify morphing any type of source estimates (surface or volume) from one subject to another for group studies. It is now possible to do group studies when working on the volume with MNE.
- Add ability to read and write beamformers.
- New source power spectral estimation example for resting-state data
- Better support for Reports: they can now be loaded/saved in HDF5, and existing figures can be removed from a report.
- Add support for reading MATLAB v7.3+ files for EEGLAB.
- Add interactive visualization of volume source estimates using plot_volume_source_estimates
- New BIDS-compatible raw filename construction
- Better helmet visualization for Artemis123 and CTF

Notable API changes
================
- Deprecation of annot and annotmap parameters in mne.io.read_raw_edf
- Deprecated mne.SourceEstimate.morph_precomputed, mne.SourceEstimate.morph, mne.compute_morph_matrix, mne.morph_data_precomputed and mne.morph_data in favor of mne.compute_source_morph
- Calling mne.Epochs.decimate no longer copies the data when decim=1.
- Warning messages are now only logged when warn_explicit is set (unless a logging file is being used), to avoid duplicate warning messages.
- src.kind now equals 'mixed' (and not 'combined') for a mixed source space (i.e., made of surfaces and volume grids)
- The default value of stop_receive_thread in mne.realtime.RtEpochs.stop has been changed to True
- Using mne.io.Raw.add_channels on an instance with memmapped data will now resize the memmap file to append the new channels on Windows and Linux
- Mismatches in CTF compensation grade are now checked in inverse computation

For a full list of improvements and API changes, see:

http://martinos.org/mne/stable/whats_new.html#version-0-17

To install the latest release the following command should do the job:

$ pip install --upgrade mne

As usual, we welcome your bug reports, feature requests, critiques, and contributions.
Some links:
- https://github.com/mne-tools/mne-python (code + readme on how to install)
- http://martinos.org/mne/stable/ (full MNE documentation)

Follow us on Twitter: https://twitter.com/mne_news

Regards,
The MNE-Python developers

40 people made commits that contributed to this release (in alphabetical order):

* Alexandre Gramfort
* Antoine Gauthier
* Britta Westner
* Christian Brodbeck
* Clemens Brunner
* Daniel McCloy
* David Sabbagh
* Denis A. Engemann
* Eric Larson
* Ezequiel Mikulan
* Henrich Kolkhorst
* Hubert Banville
* Jasper J.F. van den Bosch
* Jen Evans
* Joan Massich
* Johan van der Meer
* Jona Sassenhagen
* Kambiz Tavabi
* Lorenz Esch
* Luke Bloy
* MJAS1
* Mainak Jas
* Marcin Koculak
* Marijn van Vliet
* Peter J. Molfese
* Sam Perry
* Sara Sommariva
* Sergey Antopolskiy
* Sheraz Khan
* Stefan Appelhoff
* Stefan Repplinger
* Steven Bethard
* Teekuningas
* Teon Brooks
* Thomas Hartmann
* Thomas Jochmann
* Tom Dupré la Tour
* Tristan Stenner
* buildqa
* jeythekey

From seralouk at hotmail.com  Tue Nov 20 06:32:36 2018
From: seralouk at hotmail.com (serafim loukas)
Date: Tue, 20 Nov 2018 11:32:36 +0000
Subject: [Neuroimaging] New release of nilearn
In-Reply-To: <20181119164224.zcoe5amqrjodaefk@phare.normalesup.org>
References: <20181119164224.zcoe5amqrjodaefk@phare.normalesup.org>
Message-ID: <82276B40-8F96-4188-9FC4-90AF63D8C866@hotmail.com>

Amazing news. Thank you for this amazing Python module.

Serafeim

> On 19 Nov 2018, at 17:42, Gael Varoquaux wrote:
>
> Hi everyone,
>
> We are happy to announce the release of version 0.5 of nilearn, a tool
> for multivariate analysis of brain imaging data in Python.
>
> Maybe the most exciting addition in this release is interactive plotting
> that naturally embeds in the Jupyter notebook, to plot volumes, surfaces,
> or connectomes:
> http://nilearn.github.io/plotting/index.html#interactive-plotting
>
> Nilearn can be used for decoding or biomarker extraction, resting-state
> fMRI analysis, or more general manipulation of brain images in Python:
> http://nilearn.github.io/user_guide.html
>
> A detailed changelog can be found at:
> http://nilearn.github.io/whats_new.html
>
> Nilearn is developed by a variety of contributors around the world:
> https://github.com/nilearn/nilearn/graphs/contributors
>
> Gaël
>
> _______________________________________________
> Neuroimaging mailing list
> Neuroimaging at python.org
> https://mail.python.org/mailman/listinfo/neuroimaging

From elef at indiana.edu  Tue Nov 20 06:36:49 2018
From: elef at indiana.edu (Eleftherios Garyfallidis)
Date: Tue, 20 Nov 2018 06:36:49 -0500
Subject: [Neuroimaging] New release of nilearn
In-Reply-To: <82276B40-8F96-4188-9FC4-90AF63D8C866@hotmail.com>
References: <20181119164224.zcoe5amqrjodaefk@phare.normalesup.org> <82276B40-8F96-4188-9FC4-90AF63D8C866@hotmail.com>
Message-ID: 

Great, thanks for all the hard work.

On Tue, Nov 20, 2018 at 6:32 AM serafim loukas wrote:

> Amazing news. Thank you for this amazing Python module.
>
> Serafeim
>
> > On 19 Nov 2018, at 17:42, Gael Varoquaux > wrote:
> >
> > Hi everyone,
> >
> > We are happy to announce the release of version 0.5 of nilearn, a tool
> > for multivariate analysis of brain imaging data in Python.
> >
> > Maybe the most exciting addition in this release is interactive plotting
> > that naturally embeds in the Jupyter notebook, to plot volumes, surfaces,
> > or connectomes:
> > http://nilearn.github.io/plotting/index.html#interactive-plotting
> >
> > Nilearn can be used for decoding or biomarker extraction, resting-state
> > fMRI analysis, or more general manipulation of brain images in Python:
> > http://nilearn.github.io/user_guide.html
> >
> > A detailed changelog can be found at:
> > http://nilearn.github.io/whats_new.html
> >
> > Nilearn is developed by a variety of contributors around the world:
> > https://github.com/nilearn/nilearn/graphs/contributors
> >
> > Gaël
> >
> > _______________________________________________
> > Neuroimaging mailing list
> > Neuroimaging at python.org
> > https://mail.python.org/mailman/listinfo/neuroimaging
>
> _______________________________________________
> Neuroimaging mailing list
> Neuroimaging at python.org
> https://mail.python.org/mailman/listinfo/neuroimaging
>

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From satra at mit.edu  Tue Nov 20 07:12:33 2018
From: satra at mit.edu (Satrajit Ghosh)
Date: Tue, 20 Nov 2018 07:12:33 -0500
Subject: [Neuroimaging] [ANN] MNE-Python 0.17
In-Reply-To: 
References: 
Message-ID: 

great to see all these new releases! keep up the good work!

cheers,
satra

On Tue, Nov 20, 2018 at 6:17 AM Alexandre Gramfort < alexandre.gramfort at inria.fr> wrote:

> Dear community,
>
> We are very pleased to announce the new 0.17 release of MNE-Python
> (http://martinos.org/mne/stable/).
>
> A few highlights
> ============
> - This release will be the last release compatible with Python 2. The
> next version will be Python 3.5+ only.
> - Better support for Annotations, including readers for EEGLAB,
> BrainVision, EDF, CSV, TXT and Brainstorm formats, and a new tutorial
> in the documentation to dive in.
> - Better support for importing data from FieldTrip and Neuromag 122 systems.
> - Add capability to read and save Epochs containing complex data (e.g.
> after Hilbert transform).
> - Add optically pumped magnetometer dataset and examples.
> - New source morph object to unify morphing any type of source
> estimates (surface or volume) from one subject to another for group
> studies. It is now possible to do group studies when working on the
> volume with MNE.
> - Add ability to read and write beamformers.
> - New source power spectral estimation example for resting-state data
> - Better support for Reports: they can now be loaded/saved in HDF5, and
> existing figures can be removed from a report.
> - Add support for reading MATLAB v7.3+ files for EEGLAB.
> - Add interactive visualization of volume source estimates using
> plot_volume_source_estimates
> - New BIDS-compatible raw filename construction
> - Better helmet visualization for Artemis123 and CTF
>
> Notable API changes
> ================
> - Deprecation of annot and annotmap parameters in mne.io.read_raw_edf
> - Deprecated mne.SourceEstimate.morph_precomputed,
> mne.SourceEstimate.morph, mne.compute_morph_matrix,
> mne.morph_data_precomputed and mne.morph_data in favor of
> mne.compute_source_morph
> - Calling mne.Epochs.decimate no longer copies the data when decim=1.
> - Warning messages are now only logged when warn_explicit is set
> (unless a logging file is being used), to avoid duplicate warning
> messages.
> - src.kind now equals 'mixed' (and not 'combined') for a mixed
> source space (i.e., made of surfaces and volume grids)
> - The default value of stop_receive_thread in
> mne.realtime.RtEpochs.stop has been changed to True
> - Using mne.io.Raw.add_channels on an instance with memmapped data
> will now resize the memmap file to append the new channels on Windows
> and Linux
> - Mismatches in CTF compensation grade are now checked in inverse
> computation
>
> For a full list of improvements and API changes, see:
>
> http://martinos.org/mne/stable/whats_new.html#version-0-17
>
> To install the latest release the following command should do the job:
>
> $ pip install --upgrade mne
>
> As usual, we welcome your bug reports, feature requests, critiques,
> and contributions.
>
> Some links:
> - https://github.com/mne-tools/mne-python (code + readme on how to
> install)
> - http://martinos.org/mne/stable/ (full MNE documentation)
>
> Follow us on Twitter: https://twitter.com/mne_news
>
> Regards,
> The MNE-Python developers
>
> 40 people made commits that contributed to this release (in alphabetical
> order):
>
> * Alexandre Gramfort
> * Antoine Gauthier
> * Britta Westner
> * Christian Brodbeck
> * Clemens Brunner
> * Daniel McCloy
> * David Sabbagh
> * Denis A. Engemann
> * Eric Larson
> * Ezequiel Mikulan
> * Henrich Kolkhorst
> * Hubert Banville
> * Jasper J.F. van den Bosch
> * Jen Evans
> * Joan Massich
> * Johan van der Meer
> * Jona Sassenhagen
> * Kambiz Tavabi
> * Lorenz Esch
> * Luke Bloy
> * MJAS1
> * Mainak Jas
> * Marcin Koculak
> * Marijn van Vliet
> * Peter J. Molfese
> * Sam Perry
> * Sara Sommariva
> * Sergey Antopolskiy
> * Sheraz Khan
> * Stefan Appelhoff
> * Stefan Repplinger
> * Steven Bethard
> * Teekuningas
> * Teon Brooks
> * Thomas Hartmann
> * Thomas Jochmann
> * Tom Dupré la Tour
> * Tristan Stenner
> * buildqa
> * jeythekey
> _______________________________________________
> Neuroimaging mailing list
> Neuroimaging at python.org
> https://mail.python.org/mailman/listinfo/neuroimaging
>

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From arokem at gmail.com  Tue Nov 20 12:25:26 2018
From: arokem at gmail.com (Ariel Rokem)
Date: Tue, 20 Nov 2018 09:25:26 -0800
Subject: [Neuroimaging] dmriprep
Message-ID: 

Hello everyone,

We've started a new project that we call "dmriprep". The goal is to provide a flexible, yet opinionated, workflow for preprocessing of diffusion MRI data, that produces not only preprocessed data, but also

We invite your comments, issues and feature requests here:

https://github.com/nipy/dmriprep

These are early days for this thing, so join the fun on the ground floor!

Best wishes,

The dipy/dmriprep developers

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From arokem at gmail.com  Tue Nov 20 12:26:39 2018
From: arokem at gmail.com (Ariel Rokem)
Date: Tue, 20 Nov 2018 09:26:39 -0800
Subject: [Neuroimaging] dmriprep
In-Reply-To: 
References: 
Message-ID: 

Whoops. Hit "send" too soon!

On Tue, Nov 20, 2018 at 9:25 AM Ariel Rokem wrote:

> Hello everyone,
>
> We've started a new project that we call "dmriprep". The goal is to
> provide a flexible, yet opinionated, workflow for preprocessing of
> diffusion MRI data, that produces not only preprocessed data, but also
>

... but also detailed QA from your data, so that you can decide how you would like to process the data in subsequent steps.

We invite your comments, issues and feature requests here:

> https://github.com/nipy/dmriprep
>
> These are early days for this thing, so join the fun on the ground floor!
>
> Best wishes,
>
> The dipy/dmriprep developers

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From elef at indiana.edu  Tue Nov 20 13:07:27 2018
From: elef at indiana.edu (Eleftherios Garyfallidis)
Date: Tue, 20 Nov 2018 13:07:27 -0500
Subject: [Neuroimaging] dmriprep
In-Reply-To: 
References: 
Message-ID: 

Great, thanks!

On Tue, Nov 20, 2018 at 12:27 PM Ariel Rokem wrote:

> Whoops. Hit "send" too soon!
>
> On Tue, Nov 20, 2018 at 9:25 AM Ariel Rokem wrote:
>
>> Hello everyone,
>>
>> We've started a new project that we call "dmriprep". The goal is to
>> provide a flexible, yet opinionated, workflow for preprocessing of
>> diffusion MRI data, that produces not only preprocessed data, but also
>>
>
> ... but also detailed QA from your data, so that you can decide how you
> would like to process the data in subsequent steps.
>
> We invite your comments, issues and feature requests here:
>
>> https://github.com/nipy/dmriprep
>>
>> These are early days for this thing, so join the fun on the ground floor!
>>
>> Best wishes,
>>
>> The dipy/dmriprep developers
>
> _______________________________________________
> Neuroimaging mailing list
> Neuroimaging at python.org
> https://mail.python.org/mailman/listinfo/neuroimaging
>

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From tracygilliam5 at gmail.com  Tue Nov 20 14:54:17 2018
From: tracygilliam5 at gmail.com (tracy gilliam)
Date: Tue, 20 Nov 2018 13:54:17 -0600
Subject: [Neuroimaging] Research Collaboration
Message-ID: 

Dear Colleagues,

Reaching out today for help. While the National Science Foundation did not approve funding for our Small Business Innovation Research proposal, we did receive detailed assessments from the three merit reviews and the panel review, which gave us keen insights from our peers and the division into our research focus. Out of 1203 entries, only 150 grants were awarded. With this excellent feedback, we would like to move forward with our research by collaborating with university sources.
Our research needs to connect with fMRI studies in which researchers used photographs to map areas of the brain. We are looking for introductions or recommendations. We want to understand how cognitive identities can be segmented to understand patterns of change in communities, cities, states and the nation.

If you are interested, need more information, or can recommend studies, please message me.

Thanks so much. Happy Thanksgiving!

Tracy Gilliam
The Neurographics Group
TracyGilliam5 at gmail.com

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From alessandro.stranieri at gmail.com  Fri Nov 30 11:26:58 2018
From: alessandro.stranieri at gmail.com (Alessandro Stranieri)
Date: Fri, 30 Nov 2018 18:26:58 +0200
Subject: [Neuroimaging] Understanding atlas usage
Message-ID: 

Hi,

I am working on a small project with the ADHD data. My background is software engineering, but I am very new to nipy and neuroscience in general, so I might get some terminology wrong.

Currently I am working my way through the correlation/connectome examples. I have a few doubts but, in order to keep it simple, I will just post a couple, hoping for help.

1. The fetch_adhd function states that a maximum of 40 subjects can be retrieved. This means that if I want to use more, I need to do all the processing myself from the files at https://www.nitrc.org. Is that correct?

2. nilearn provides some atlases, and I have more or less understood how to use them. However, on NITRC one can also find 2 functional parcellation templates, and time series are provided for those parcellations. I think I managed to create and display connectomes with those templates but, what if I want to know the label of a region? Those regions are labelled by numbers. For example, if I use the aal atlas, I can see 'Precentral_L', 'Precentral_R' and so on. Can I do the same with the cc200 for example?
I fear that, since those parcellations are functional and not anatomical, there are no real labels associated with the regions. But I would still like to have a data structure that allows me to query things like: this region is in V1, or in the frontal lobe, or other.

Best Regards,
Alessandro

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From bertrand.thirion at inria.fr  Fri Nov 30 16:59:28 2018
From: bertrand.thirion at inria.fr (bthirion)
Date: Fri, 30 Nov 2018 22:59:28 +0100
Subject: [Neuroimaging] Understanding atlas usage
In-Reply-To: 
References: 
Message-ID: 

On 30/11/2018 17:26, Alessandro Stranieri wrote:
> Hi,
>
> I am working on a small project with the ADHD data. My background is
> software engineering, but I am very new to nipy and neuroscience in
> general, so I might get some terminology wrong.
>
> Currently I am working my way through the correlation/connectome
> examples. I have a few doubts but, in order to keep it simple, I will
> just post a couple, hoping for help.

You're probably using Nilearn. Could you post these questions on Neurostars.org, so that other people can benefit from the answers?

> 1. The fetch_adhd function states that a maximum of 40 subjects can
> be retrieved. This means that if I want to use more, I need to do all
> the processing myself from the files at
> https://www.nitrc.org. Is that correct?

Yes.

> 2. nilearn provides some atlases, and I have more or less understood
> how to use them. However, on NITRC one can also find 2 functional
> parcellation templates, and time series are provided
> for those parcellations. I think I managed to create and display
> connectomes with those templates but, what if I want to know the label
> of a region? Those regions are labelled by numbers.
> For example, if I use the aal atlas, I can see 'Precentral_L',
> 'Precentral_R' and so on. Can I do the same with the cc200 for example?
> I fear that, since those parcellations are functional
> and not anatomical, there are no real labels associated with the regions.
> But I would still like to have a data structure that allows me to
> query things like: this region is in V1, or in the
> frontal lobe, or other.

The best way to learn about parcellations is to run the corresponding examples of the library. Each atlas comes with different information, so you may not have anatomical labels if the atlas comes from a functional parcellation (e.g. one derived from resting-state data).

HTH,
Bertrand

-------------- next part --------------
An HTML attachment was scrubbed...
URL:
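[A note on the last question: since a functional parcellation such as cc200 ships only numbered regions, one pragmatic workaround is to name each numbered region by the anatomical-atlas label that covers most of its voxels. The sketch below is not part of nilearn; it uses tiny synthetic arrays, and the label table and helper function are made up for illustration (a real label table ships with atlases like AAL).]

```python
import numpy as np

# Toy stand-ins for two parcellations defined on the same voxel grid:
# a "functional" atlas with numbered regions (like cc200) and an
# "anatomical" atlas whose indices have names (like AAL).
functional = np.zeros((4, 4, 4), dtype=int)
functional[:2] = 1               # functional region 1
functional[2:] = 2               # functional region 2

anatomical = np.zeros((4, 4, 4), dtype=int)
anatomical[:2] = 10              # mostly overlaps functional region 1
anatomical[2:] = 20              # mostly overlaps functional region 2
anatomical[0, 0, 0] = 20         # a little disagreement between the two

# Hypothetical label table; a real one comes bundled with the atlas.
anat_labels = {10: "Precentral_L", 20: "Precentral_R"}

def dominant_anatomical_label(region, functional, anatomical, anat_labels):
    """Name a numbered functional region by the anatomical label
    covering the largest share of its voxels."""
    mask = functional == region
    indices, counts = np.unique(anatomical[mask], return_counts=True)
    return anat_labels[indices[np.argmax(counts)]]

print(dominant_anatomical_label(1, functional, anatomical, anat_labels))
# region 1 overlaps anatomical index 10 in 31 of 32 voxels -> Precentral_L
```

With real data, the two integer arrays would come from co-registered atlas images (e.g. loaded with nibabel and `get_fdata()`), and one would typically also report the overlap fraction, so regions that straddle several anatomical areas can be flagged rather than silently assigned one name.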