[Numpy-discussion] Loading large NIfTI file -> MemoryError

Matthew Brett matthew.brett at gmail.com
Fri Jan 3 05:39:25 EST 2014


Hi,

On Tue, Dec 31, 2013 at 1:29 PM, Julian Taylor
<jtaylor.debian at googlemail.com> wrote:
>
> On 31.12.2013 14:13, Amira Chekir wrote:
> > Hello everyone,
> >
> > I am trying to load a (large) NIfTI file (dMRI from the Human Connectome
> > Project, about 1 GB) with NiBabel.
> >
> > import nibabel as nib
> > img = nib.load("dmri.nii.gz")
> > data = img.get_data()
> >
> > The program crashes during "img.get_data()" with a "MemoryError"
> > (my machine has 4 GB of RAM).
> >
> > Any suggestions?
>
> are you using a 64-bit operating system?
> which version of numpy?

I think you want the nipy-devel mailing list for this question:

http://nipy.org/nibabel/

I'm guessing that the reader is loading the raw data - which is, say,
int16 - and then multiplying by the scale factors to make a float64
image, which is four times larger (8 bytes per voxel instead of 2).
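
For example, a quick check like this shows the on-disk dtype and how
much memory the scaled float64 array will need, without reading the
data itself (a minimal sketch; newer nibabel versions also spell
"img.get_header()" as the "img.header" attribute):

import numpy as np
import nibabel as nib

img = nib.load("dmri.nii.gz")   # reads the header; the data stays on disk
hdr = img.get_header()

print(hdr.get_data_dtype())     # on-disk dtype, e.g. int16
print(img.shape)

# memory needed for the float64 array that get_data() builds
n_voxels = np.prod(img.shape, dtype=np.int64)
print("%.1f GB as float64" % (n_voxels * 8 / 1e9))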

We're working on an iterative load API at the moment that might help
by loading the image slice by slice:

https://github.com/nipy/nibabel/pull/211

It should be merged in a week or so - but it would be very helpful if
you could try out the proposal to see whether it helps.
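
The shape of the idea is roughly this - note that the loop below is
only a sketch, not the interface in the pull request, and whether the
array proxy (img.dataobj) supports slicing like this depends on your
nibabel version:

import numpy as np
import nibabel as nib

img = nib.load("dmri.nii.gz")

# process one 3D volume at a time instead of building the whole
# scaled float64 array in memory
for t in range(img.shape[-1]):
    vol = np.asarray(img.dataobj[..., t])
    # ... work on this volume, then let it be garbage collected ...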

Best,

Matthew


