[Neuroimaging] iterating a workflow over inputs

Ian Malone ibmalone at gmail.com
Mon Jan 9 13:59:30 EST 2017


Hi,

I've got a relatively complex workflow that I'd like to use as a
sub-workflow of another one; however, it needs to be iterated over some
of the inputs. I suppose I could replace the appropriate pe.Node()s in
it with MapNode()s, but there are a fair number of them and quite a
few connections. (I also think this would prevent it being used on
single-instance inputs without first packing them into a list, though
I could be wrong.)
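For reference, converting an individual node would look something like
this (a rough sketch; the FSL smoothing interface and names are just
placeholders, not my actual nodes):

import nipype.pipeline.engine as pe
import nipype.interfaces.fsl as fsl

# A pe.Node converted to a pe.MapNode: iterfield names the inputs that
# arrive as lists, and the node runs once per list element.
smooth = pe.MapNode(fsl.IsotropicSmooth(fwhm=4),
                    iterfield=['in_file'],
                    name='smooth')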

Instead, here's what I've actually tried with the whole workflow:
sub_workflow = pe.MapNode(create_my_workflow(),
                          iterfield=['in_4d_file',
                                     'in_text_file'],
                          name='data_fit')

Unsurprisingly it fails with:
    raise IOError('interface must be an instance of an Interface')
IOError: interface must be an instance of an Interface

I haven't been able to find much discussion of this, though there was
one mention of wrapping the workflow in a function. That looked more
like it was intended to trigger a separate cluster submission of the
sub-workflow (and would require using the function's parameters and
return value to connect up the inputs and outputs):

https://groups.google.com/forum/#!topic/nipy-user/zMGPJ74_fJU
# def reuse_wrapper(subject, etc.):
#     ...
#     reuseWorkflow = create_run_first_all()  # get my run_first_all workflow
#     reuseWorkflow.run(plugin="Condor")
#
# rfa_node = MapNode(Function(function='reuse_wrapper', etc.),
#                    iterfield='subject')
# topLevelWorkflow.connect(...)  # subjects to rfa_node
# topLevelWorkflow.run(plugin="Condor")
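If I've read that right, filled out it would be roughly the following
(untested sketch; the 'inputnode' field names are guesses based on my
own setup, my_workflows is a made-up module, and 'out_file' is just an
invented output name):

from nipype.pipeline.engine import MapNode
from nipype.interfaces.utility import Function

def reuse_wrapper(in_4d_file, in_text_file):
    # Imports live inside the function body so nipype can serialise it.
    from my_workflows import create_my_workflow  # hypothetical module
    wf = create_my_workflow()
    # Assumes the sub-workflow exposes its inputs on an 'inputnode'.
    wf.inputs.inputnode.in_4d_file = in_4d_file
    wf.inputs.inputnode.in_text_file = in_text_file
    wf.run(plugin='Condor')  # each call is its own cluster submission
    # Return something so the Function node has an output to connect;
    # real outputs would have to be collected from the finished run.
    return in_4d_file

rfa_node = MapNode(Function(input_names=['in_4d_file', 'in_text_file'],
                            output_names=['out_file'],
                            function=reuse_wrapper),
                   iterfield=['in_4d_file', 'in_text_file'],
                   name='data_fit')

That matches my reading that all the inputs and outputs would have to
go through the function signature rather than normal workflow
connections.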

Is this at all possible, or should I bite the bullet and start
MapNode-ing the individual nodes inside the sub-workflow?

-- 
imalone

