[Moin-user] wget a wiki page and its subpage

Ralf Gross Ralf-Lists at ralfgross.de
Tue Jan 31 05:44:01 EST 2006


Thomas Waldmann wrote:
> >I'm trying to get a wiki page and its subpage with many images through
> >wget.
> 
> Some user agents (including wget) receive special treatment because people 
> often use them to DoS wikis. So make sure you change the user agent string 
> it sends (and if it is not your own wiki: USE CAREFULLY).

Changing the agent string didn't help.
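(For the record, the kind of invocation I have in mind looks roughly like
this; the URL and the agent string are only placeholders:

  wget -r -np -p -k --user-agent="Mozilla/5.0" http://wiki.example.org/SomePage

Here -r recurses, -np keeps wget from climbing above the starting page,
-p pulls in images and stylesheets, and -k rewrites the links for local
viewing.)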
 
> >I need to do this because we want to give the pages to a customer as
> >documentation. After retrieving the pages as HTML, I'm going to try to
> >convert them to PDF.
> 
> Maybe look at moin-dump, too.

moin-dump doesn't handle the attachments. But I found
http://moinmoin.wikiwikiweb.de/MoinDump, which dumps the whole wiki
including attachments and fixes the attachment paths. It'd be
nice if I could limit the output to the pages I really need, but it's
OK for now.
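
If limiting it ever becomes necessary, filtering the dump afterwards might
be enough. A rough, untested sketch in Python; it assumes the dump ends up
as one <PageName>.html file per page plus an attachments/ directory that
the rewritten links point into, so the names and patterns below would need
adjusting to whatever MoinDump actually writes:

import os
import re
import shutil

DUMP_DIR = "wiki-dump"      # where MoinDump wrote its output (assumed name)
OUT_DIR = "customer-docs"   # the subset to hand to the customer
WANTED = ["SomePage", "SomeSubPage"]   # page file names, minus .html (check the dump)

os.makedirs(OUT_DIR, exist_ok=True)

for page in WANTED:
    html_path = os.path.join(DUMP_DIR, page + ".html")
    shutil.copy(html_path, OUT_DIR)
    # copy every attachment the page references (src/href pointing into attachments/)
    with open(html_path) as f:
        html = f.read()
    for ref in re.findall(r'(?:src|href)="(attachments/[^"]+)"', html):
        dest = os.path.join(OUT_DIR, ref)
        os.makedirs(os.path.dirname(dest), exist_ok=True)
        shutil.copy(os.path.join(DUMP_DIR, ref), dest)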

Ralf



