Ideas for creating processes

bobicanprogram icanbob at gmail.com
Thu Mar 11 10:54:44 EST 2010


On Mar 10, 4:52 pm, J <dreadpiratej... at gmail.com> wrote:
> I'm working on a project and thought I'd ask for a suggestion on how
> to proceed (I've got my own ideas, but I wanted to see if I was on the
> right track).
>
> For now, I've got this:
>
> import os
> import sys
>
> ## base dir is assumed here; adjust to taste
> homedir = '/home/user/files'
>
> def main():
>     ## get our list of directories to refresh
>     releases = sys.argv[1:]
>     if len(releases) < 1:
>         print "You need to provide at least one dir to update"
>         sys.exit()
>
>     ## Let's figure out what there is to update
>     updateDirs = []
>     for rel in releases:
>         currentDir = os.path.join(homedir, rel)
>         for item in os.listdir(currentDir):
>             updateDirs.append(os.path.join(currentDir, item))
>
> which builds a list of full pathnames to the directories that need to
> be updated (the updates will eventually be carried out by calling
> rsync or zsync).
>
> The directory hierarchy looks like this:
>
> /home/user/files
> /home/user/files/version1
> /home/user/files/version1/type1
> /home/user/files/version1/type2
> /home/user/files/version2
> /home/user/files/version2/type1
> /home/user/files/version2/type2
>
> and the list ends up looking like this:
>
> ['/home/user/files/version1/type1',
>  '/home/user/files/version1/type2',
>  '/home/user/files/version2/type1',
>  '/home/user/files/version2/type2']
>
> the next thing I need to do is figure out how to update those.
>
> the quick and dirty would be (as I'm imagining it at the moment):
> for path in pathlist:
>     chdir into path
>     execute rsync or zsync
>
> but that gets me moving into one dir, updating, then moving into another.
>
> What I was wondering about, though, is spawning off separate rsync
> processes to run concurrently (each rsync will hit a different remote
> dir for each local dir).
>
> so I'm trying to figure out a good way to do this:
>
> for path in pathlist:
>     kick off an individual rsync|zsync process to update path
>
> wait for all child processes to end
> exit program.
>
> I've been looking at subprocess because, at the moment, that's all I
> really know...
> But is there a better way of kicking off multiple simultaneous
> processes so I can update all dirs at once instead of one at a time?
>
> No, this isn't homework; it's something I'm working on to sync a
> couple of directories of ISO images to grab nightly builds.
> Yes, there are plenty of pre-made scripts out there, but I don't want
> to even look at those because I figured this would be a good learning
> experience, and I want to try to solve this as much on my own as I can
> without just cutting and pasting from someone else's program.
>
> So, with that said, any ideas on the best way to proceed?  I'm going
> to start looking at ways to use subprocess to do this, or would there
> be a better way (multi-threading, maybe)?
>
> Or am I even in the right ballpark?
>
> Cheers
> Jeff


You might be able to use the SIMPL toolkit for this one.
(http://www.icanprogram.com/06py/lesson1/lesson1.html)

You could wrap the rsync executable as a SIMPL receiver module and
then send messages to it from inside your Python script to kick it off
and synchronize the actions.
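
If you'd rather stay with the subprocess module you already mentioned,
a rough sketch of the kick-them-all-off-then-wait pattern you describe
might look something like this.  It's only a sketch: the rsync options
and the remote_for() helper (however you map a local dir to its remote
counterpart) are placeholders you'd have to fill in for your setup.

import subprocess

def sync_all(pathlist, remote_for):
    # remote_for is a placeholder: a function that maps a local dir
    # to the remote dir it should be synced from
    procs = []
    for path in pathlist:
        # start one rsync per local dir; Popen returns immediately,
        # so all of them run at the same time
        # cwd=path also saves you from os.chdir()'ing between updates
        cmd = ['rsync', '-a', remote_for(path), '.']
        procs.append(subprocess.Popen(cmd, cwd=path))

    # your "wait for all child processes to end" step
    for p in procs:
        p.wait()

Since rsync does the real work in separate OS processes anyway, plain
subprocess is enough here; threads wouldn't buy you anything extra.
If you ever have dozens of dirs you may want to launch them in batches,
but for a handful of directories this is fine.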

bob


