From Farshid.Sepehrband at loni.usc.edu Thu Sep 5 19:34:53 2019 From: Farshid.Sepehrband at loni.usc.edu (Farshid Sepehrband) Date: Thu, 5 Sep 2019 23:34:53 +0000 Subject: [Neuroimaging] postdoc position In-Reply-To: References: Message-ID: <9C2CCAF1-75D7-41F9-9209-F719835AF451@ini.usc.edu> Dear neuroimaging group member, We are looking for a postdoctoral fellow to join us at the USC Stevens Neuroimaging and Informatics Institute. See the attached flyer for more information. I would be most grateful if you spread the word. Thanks. Best, Farshid Farshid Sepehrband, PhD Assistant Professor USC Stevens Neuroimaging and Informatics Institute Keck School of Medicine, University of Southern California 2025 Zonal Ave, Los Angeles, CA 90033, USA (323) 44-BRAIN | direct: (323) 865-1710 farshid.sepehrband at loni.usc.edu | http://www.ini.usc.edu LinkedIn: http://lnkd.in/wDzQk7 | Twitter: @fsepehrband -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: postdoc_img.pdf Type: application/pdf Size: 278826 bytes Desc: postdoc_img.pdf URL: From arokem at gmail.com Wed Sep 11 11:11:02 2019 From: arokem at gmail.com (Ariel Rokem) Date: Wed, 11 Sep 2019 08:11:02 -0700 Subject: [Neuroimaging] Post-doc position at Stanford University (in collaboration with University of Washington) Message-ID: Postdoc Position at Stanford University - Leveraging big data to understand the neurobiological underpinnings of reading and math abilities The Yeatman (Stanford University) and Rokem (University of Washington) labs have 2 years of funding for a jointly mentored postdoc who is interested in capitalizing on new innovations in data science and diffusion MRI methods to understand the neurobiological underpinnings of reading and math abilities. 
With the emergence of public datasets containing tens of thousands of subjects (e.g., ABCD, Healthy Brain Network, Human Connectome Project), there are new opportunities to apply computational approaches to understanding the multivariate relationship between white matter development and academic skills. Beyond identifying correlations between a single behavioral measure and diffusion properties in a single white matter tract, we would like to progress towards a model characterizing how the interrelated developmental trajectories of the brain's many white matter connections relate to different components of academic development (e.g., reading, math, executive function). This project would involve working with large, publicly available datasets, and developing/applying new statistical approaches to relate measures of brain anatomy to cognitive skills. Related work: Yeatman J.D., Richie-Halford A., Smith J.K., Keshavan A., Rokem A. (2018). A browser-based tool for visualization and analysis of diffusion MRI data. Nature Communications. 9(1):940. Starting salary range: $65,000-$75,000/yr depending on experience. Applicants should have a PhD in neuroscience, psychology, or related fields, or a PhD in computer science, statistics, physics or engineering fields, and an interest in rapidly learning about measurements of the living human brain. Either way, the ideal candidate should possess strong computational skills, proficiency in Python programming, and familiarity with MATLAB. Experience with diffusion MRI data analysis or other related neuroimaging is a plus but not required. Must enjoy working in a fast-paced, collaborative and open environment and have strong writing and communication skills. 
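[For illustration only; this sketch is not part of the posting. The multivariate brain-behavior modeling described above - relating diffusion measures across many tracts to a behavioral score - can be sketched as a ridge regression. All data here are synthetic stand-ins; nothing comes from ABCD, HBN, or HCP.]

```python
# Hypothetical sketch: ridge regression relating per-tract diffusion
# measures (e.g., mean FA) to a behavioral score, on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
n_subjects, n_tracts = 200, 20

# Fake "mean FA per tract" matrix; a reading score that depends on a few
# tracts plus noise. true_w is the ground-truth weight vector.
fa = rng.normal(0.5, 0.05, size=(n_subjects, n_tracts))
true_w = np.zeros(n_tracts)
true_w[:3] = [4.0, -2.0, 3.0]
reading = fa @ true_w + rng.normal(0.0, 0.1, size=n_subjects)

# Center predictors and outcome, then solve the ridge normal equations:
# w = (X'X + lambda * I)^-1 X'y
X = fa - fa.mean(axis=0)
y = reading - reading.mean()
lam = 1e-3
w_hat = np.linalg.solve(X.T @ X + lam * np.eye(n_tracts), X.T @ y)

# The recovered weights should approximate true_w on the first 3 tracts
# and be near zero elsewhere.
print(np.round(w_hat[:3], 1))
```

In practice one would cross-validate the penalty and predictive accuracy across subjects rather than inspect weights directly, but the closed-form solve above captures the core of the approach.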
Start date: Any time To apply, please send: 1) A curriculum vitae 2) Contact information for three references 3) A two-paragraph letter (less than 1 page) describing: (1) a finding from your PhD (or previous postdoc) research that you are excited about and (2) a project you would like to tackle in the lab. Jason Yeatman (jyeatman at stanford.edu) Ariel Rokem (arokem at gmail.com) -------------- next part -------------- An HTML attachment was scrubbed... URL: From frakkopesto at gmail.com Thu Sep 12 09:02:41 2019 From: frakkopesto at gmail.com (Franco Pestilli) Date: Thu, 12 Sep 2019 09:02:41 -0400 Subject: [Neuroimaging] Multiple postdoctoral positions at Indiana University Message-ID: Post-doctoral fellow position(s) for the study of the natural visual environments of infants and young children and their implications for visual, cognitive and language development and machine learning, at Indiana University. The larger collaborative project involves analyses of the properties of a very large corpus of head camera images (500 million) collected by infants 1 to 24 months of age with respect to low-, mid- and higher-level properties, the examination of the statistical structure of early learned visual categories (and their in-home naming by parents), the design and implementation of computational experiments using machine learning and computer vision models, as well as experiments with infants testing novel predictions from these analyses and models. The post-doctoral fellow(s) will take part in the intellectually rich cognitive science, computational neuroscience, vision, developmental, and computer science communities at Indiana University under the Emerging Areas of Research Initiative titled Learning: Brains, Machines and Children. Collaborators on the larger project include Linda Smith, David Crandall, Franco Pestilli, Rowan Candy, Jason Gold, and Chen Yu. 
This is an excellent opportunity for individuals with past training in one or more of the following: infant statistical learning, infant visual development (including face and object perception), visual neuroscience, adult vision, computer vision. Other areas of training with computational and/or experimental backgrounds will be considered. Please apply to Linda Smith, smith4 at indiana.edu, with "Visual Environments" in the subject heading, by sending a cover letter stating your interest in this project, your CV, and a research statement. References will be requested after initial contact. The timing of these position(s) is open; although we hope to appoint one position this fall, January or spring start dates are also possible. Franco Pestilli, PhD Associate Professor Psychology, Neuroscience and Cognitive Science; Engineering, Computer Science, and Optometry by courtesy Midwest Big Data Hub | brainlife.io/plab Indiana University, Bloomington franpest at indiana.edu -------------- next part -------------- An HTML attachment was scrubbed... URL: From jdgispert at barcelonabeta.org Fri Sep 20 03:25:10 2019 From: jdgispert at barcelonabeta.org (Juan Domingo Gispert López) Date: Fri, 20 Sep 2019 09:25:10 +0200 Subject: [Neuroimaging] Researcher Position in Artificial Intelligence on Magnetic Resonance Imaging for Alzheimer's Prevention Message-ID: The Barcelonaβeta Brain Research Center (BBRC) is looking for a researcher to develop deep learning methods to predict preclinical Alzheimer's disease based on Magnetic Resonance and wet biomarkers, as part of its Alzheimer's Prevention Program. This is a full-time position and the person selected will report to the Neuroimaging group leader. 
Main Responsibilities: - To develop and apply Machine/Deep Learning (ML/DL) techniques to MRI data to predict preclinical AD stages on the ALFA/ALFA+ datasets from the BBRC - To develop the ML/DL algorithms so that they are compatible with use in real-world scenarios - To liaise with third parties to validate the performance of the developed algorithms - To write technical reports and journal articles and present results at scientific conferences We offer: - The research position is scheduled initially for two years (open to renewal). - Salary will be in accordance with qualifications and experience. Further information and application instructions: https://fpmaragall.org/wp-content/uploads/2014/07/Researcher-Position-in-Deep-Learning-Job-Description.pdf -- Juan Domingo Gispert Group Leader - Neuroimaging Barcelonaβeta Brain Research Center (BBRC) jdgispert at barcelonabeta.org T. (+34) 93 326 31 67 C/ Wellington, 30 08005 Barcelona www.barcelonabeta.org www.fpmaragall.org ------------------------------ Confidencialitat correu electrònic BBRC. Confidencialidad correo electrónico BBRC. E-mail confidentiality BBRC. Gràcies per estalviar paper. Gracias por ahorrar papel. Thanks for saving paper. -------------- next part -------------- An HTML attachment was scrubbed... URL: From alexandre.gramfort at inria.fr Tue Sep 24 15:12:28 2019 From: alexandre.gramfort at inria.fr (Alexandre Gramfort) Date: Tue, 24 Sep 2019 21:12:28 +0200 Subject: [Neuroimaging] [ANN] MNE-Python 0.19 Message-ID: Dear community! We, the 67 people who contributed, are very pleased to announce the new 0.19 release of MNE-Python (http://mne.tools/stable/). A few highlights ================ - New URL http://mne.tools - Reorganized documentation, with 19 new or revised tutorials. - Automatic MRI fiducial estimation based on MNI Talairach transforms. - Improved plotting support, including new Butterfly plots for PSD epochs, and 3D sensor connectivity plots. 
- Speed improvements in clustering, coregistration, volumetric source space creation, and other parts of the codebase. - More supported file formats, including Curry files, the updated NYU New York 2019 system for KIT, and the new Annotations support for CTF marker files. In addition, we caught and fixed more than 41 bugs! Notable API changes =================== - New minimum supported dependencies, most critically **Python >= 3.5 is now required**. - Complete reworking of EEG channel montage/digitization, including improvements for EEG source modeling with no MRI (surrogate MRIs). - Fixes to volumetric morphing and plotting functions. - Update of the FIF constants. For a full list of improvements and API changes see: http://mne.tools/stable/whats_new.html#version-0-19 To install the latest release the following command should do the job: $ pip install --upgrade mne As usual, we welcome your bug reports, feature requests, feedback, and contributions. Some links: - https://github.com/mne-tools/mne-python (code + readme on how to install) - http://mne.tools/stable/ (full MNE documentation) Follow us on Twitter for general news (https://twitter.com/mne_news) and for a regular feed of merged PRs (https://twitter.com/mne_python). Regards, The MNE-Python developers 67 people made commits that contributed to this release (in alphabetical order): * Abram Hindle * Achilleas Koutsou * Alexander Kovrig * Alexandre Gramfort * Antoine Gauthier * Britta Westner * Bruno Nicenboim * Burkhard Maess * Chris Bailey * Chris Holdgraf * Christian Brodbeck * Christian Clauss * Clemens Brunner * Cristóbal Moïnne-Loccoz * Daniel McCloy * David Haslacher * Denis A. 
Engemann * Dirk Gütlin * Elizabeth DuPre * Eric Larson * Evgenii Kalenkovich * Fede Raimondo * Guillaume Favelier * Hubert Banville * Ivana Kojcic * Jean-Remi King * Jeff Hanna * Joan Massich * Johannes Kasper * Jon Houck * Jona Sassenhagen * Jose Alanis * Jussi Nurminen * Kambiz Tavabi * Katarina Slama * Keith Doelling * Kostiantyn Maksymenko * Larry Eisenman * Legrand Nicolas * Lorenz Esch * Luke Bloy * Mainak Jas * Maksymenko Kostiantyn * Marijn van Vliet * Mikolaj Magnuski * Milan Rybář * Nathalie Gayraud * Nikolas Chalas * Oleh Kozynets * Quentin Bertrand * Richard Höchenberger * Robert Seymour * Samuel Deslauriers-Gauthier * Sebastián Castaño * Simon Kern * Stanislas Chambon * Stefan Appelhoff * Stefan Repplinger * Steve Matindi * Teon Brooks * Theodore Papadopoulo * Thomas Donoghue * Thomas Hartmann * Thomas Radman * Eberhard Eich * Joshua J Bear * Paul Roujansky -------------- next part -------------- An HTML attachment was scrubbed... URL: From bertrand.thirion at inria.fr Tue Sep 24 15:17:15 2019 From: bertrand.thirion at inria.fr (bthirion) Date: Tue, 24 Sep 2019 21:17:15 +0200 Subject: [Neuroimaging] [ANN] MNE-Python 0.19 In-Reply-To: References: Message-ID: Congratulations to all of you! Best, Bertrand On 24/09/2019 21:12, Alexandre Gramfort wrote: > [...] From jbpoline at gmail.com Tue Sep 24 16:35:51 2019 From: jbpoline at gmail.com (JB Poline) Date: Tue, 24 Sep 2019 16:35:51 -0400 Subject: [Neuroimaging] [ANN] MNE-Python 0.19 In-Reply-To: References: Message-ID: Indeed! On Tue, Sep 24, 2019 at 3:17 PM bthirion wrote: > Congratulations to all of you! > Best, > Bertrand > > [...] From markiewicz at stanford.edu Tue Sep 24 17:06:12 2019 From: markiewicz at stanford.edu (Christopher Markiewicz) Date: Tue, 24 Sep 2019 21:06:12 +0000 Subject: [Neuroimaging] [ANN] Nibabel 2.5.1 Message-ID: Hi all, Nibabel 2.5.1 was released yesterday. These fixes deal primarily with edge cases; no critical issues demand an immediate upgrade. As a reminder, the 2.5.x series will have extended bug-fix-only support for Python 2.7 (and probably 3.4, in passing) through the end of 2020, while Nibabel 3+ will require Python 3.5+. 
A Nibabel 3.0 release candidate is scheduled for November, with a 3.0 release targeting December or early January. For more details or to weigh in on our schedule, see the following issues: - Release schedule: https://github.com/nipy/nibabel/issues/734 - Python 2 support: https://github.com/nipy/nibabel/issues/735 Thanks to new contributor Henry Braun, and to everybody who contributed their time with bug reports, reviews or code. Please cite this version using the Zenodo archive: https://doi.org/10.5281/zenodo.3458246 The full changelog follows. --- Bug fix release for the 2.5.x series. Most work on NiBabel so far has been by Matthew Brett (MB), Chris Markiewicz (CM), Michael Hanke (MH), Marc-Alexandre Côté (MC), Ben Cipollini (BC), Paul McCarthy (PM), Chris Cheng (CC), Yaroslav Halchenko (YOH), Satra Ghosh (SG), Eric Larson (EL), Demian Wassermann, and Stephan Gerhard. References like "pr/298" refer to github pull request numbers. 2.5.1 (Monday 23 September 2019) ================================ The 2.5.x series is the last with support for either Python 2 or Python 3.4. Extended support for the 2.5.x series will last through December 2020. 
Enhancements ------------ * Ignore endianness in ``nib-diff`` if values match (pr/799) (YOH, reviewed by CM) Bug fixes --------- * Correctly handle Philips DICOMs w/ derived volume (pr/795) (Mathias Goncalves, reviewed by CM) * Raise CSA tag limit to 1000, parametrize for future relaxing (pr/798, backported to 2.5.x in pr/800) (Henry Braun, reviewed by CM, MB) * Coerce data types to match NIfTI intent codes when writing GIFTI data arrays (pr/806) (CM, reported by Tom Holroyd) Maintenance ----------- * Require h5py 2.10 for Windows + Python < 3.6 to resolve unexpected dtypes in Minc2 data (pr/804) (CM, reviewed by YOH) API changes and deprecations ---------------------------- * Deprecate ``nicom.dicomwrappers.Wrapper.get_affine()`` in favor of ``affine`` property; final removal in nibabel 4.0 (pr/796) (YOH, reviewed by CM) -- Chris Markiewicz Center for Reproducible Neuroscience Stanford University -------------- next part -------------- An HTML attachment was scrubbed... URL: