Writing multiple files at a time

subhabangalore at gmail.com
Sun Jun 29 13:32:00 EDT 2014


On Sunday, June 29, 2014 7:31:37 PM UTC+5:30, Roy Smith wrote:
> In article <mailman.11325.1404048700.18130.python-list at python.org>,
>  Dave Angel <davea at davea.name> wrote:
> 
> > subhabangalore at gmail.com Wrote in message:
> > > Dear Group,
> > > 
> > > I am trying to crawl multiple URLs. As they are coming I want to write them
> > > as string, as they are coming, preferably in a queue.
> > > 
> > > If any one of the esteemed members of the group may kindly help.
> > 
> > From your subject line, it appears you want to keep multiple files open,
> > and write to each in an arbitrary order. That's no problem, up to the
> > operating system limits. Define a class that holds the URL information and
> > for each instance, add an attribute for an output file handle.
> > 
> > Don't forget to close each file when you're done with the corresponding URL.
> 
> One other thing to mention is that if you're doing anything with
> fetching URLs from Python, you almost certainly want to be using Kenneth
> Reitz's excellent requests module (http://docs.python-requests.org/).
> The built-in urllib support in Python works, but requests is so much
> simpler to use.
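
A minimal sketch of that class-plus-open-file-handle idea, assuming Python 2 and urllib2; the class name, paths, and URLs are hypothetical, not from the thread:

import urllib2

class UrlOutput(object):
    # hypothetical helper: holds one URL plus the output file it is written to
    def __init__(self, url, path):
        self.url = url
        self.outfile = open(path, 'w')   # file handle kept open as an attribute

    def write_page(self):
        # download the page source and write it to this URL's own file
        self.outfile.write(urllib2.urlopen(self.url).read())

    def close(self):
        self.outfile.close()             # close once this URL is finished

pages = [UrlOutput("http://example.com/a", "/python27/a.html"),
         UrlOutput("http://example.com/b", "/python27/b.html")]
for p in pages:
    p.write_page()
    p.close()

Because each handle stays open until close() is called, the files can be written to in any order before they are closed.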

Dear Group,

Sorry if I miscommunicated. 

I am opening multiple URLs with urllib.urlopen; each URL returns a huge HTML source. As the pages are read I am concatenating them into one big txt file as a single string.
From this big txt file I am trying to take out the HTML body of each URL and write each one to its own file, with attempts like:

# file1 is the big concatenated txt file, already open for reading
for i, line in enumerate(file1):
    with open("/python27/newfile_%i.txt" % i, 'w') as f:
        f.write(line)    # each line goes to its own numbered file

This generally works, but I was wondering whether there are better options.
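
One possibility along those lines is to skip the intermediate concatenated file and write each page straight to its own file as it is fetched. A minimal sketch, assuming Python 2, the requests module recommended above, and a hypothetical list of URLs:

import requests

urls = [
    "http://example.com/page1",   # hypothetical URLs; substitute the real crawl list
    "http://example.com/page2",
]

for i, url in enumerate(urls):
    html = requests.get(url).content              # raw HTML source of one page
    with open("/python27/page_%i.html" % i, 'w') as f:
        f.write(html)                             # one file per URL, no big txt file needed

If the pages really do need to pass through a queue, the same loop can put each html string onto a Queue.Queue and a separate consumer can do the writing; for a straightforward crawl, the direct write above is usually enough.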

Regards,
Subhabrata Banerjee. 


