parsing and searching big text files
Darrell Gallion
darrell at dorb.com
Fri Dec 8 22:19:27 EST 2000
Also untested:
import re

def scanGaborsFile(file, scanFor='John Doe', separator=':'):
    buf = open(file).read()  # slurp the whole file; cheap at ~3 MB
    # \012 is a newline; escape the name in case it contains regex metachars
    res = re.findall("(?:^|\012)(.*%s.*)" % re.escape(scanFor), buf)
    return [i.split(separator) for i in res]
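A quick usage sketch against Gabor's sample line (the temp-file setup and the
second data row are mine, just for illustration):

```python
import re, os, tempfile

def scanGaborsFile(file, scanFor='John Doe', separator=':'):
    buf = open(file).read()
    # \012 is a newline; escape the name in case it contains regex metachars
    res = re.findall("(?:^|\012)(.*%s.*)" % re.escape(scanFor), buf)
    return [i.split(separator) for i in res]

# Hypothetical sample data: three colon-separated names per line.
fd, fname = tempfile.mkstemp()
os.write(fd, b"Foo Bar:John Doe:Bill\nAnn Lee:Bob Ray:Carol\n")
os.close(fd)
records = scanGaborsFile(fname, 'John Doe')
os.remove(fname)
# records -> [['Foo Bar', 'John Doe', 'Bill']]; the "other" two names
# are simply the columns that are not the search name.
```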
--Darrell
----- Original Message -----
From: "Gabor Gludovatz"
> I have a big text file which contains 3 variable-length columns with
> names. I have to search this file for names in either column and have to
> show the other 2 names, and I have to do this very fast!
>
> The text file is about 3 megs long. Does someone know a method to do this
> quickly?
>
> Which is faster: should I read the whole text file into memory and
> parse and search it there, or should I read it from disk line by line
> and parse it?
> The latter seemed to be very slow.
>
> Here is an example line from this text file:
> Foo Bar:John Doe:Bill
>
> If I look for, for example, John Doe, the function should return Foo Bar
> and Bill and all the other records which contain John Doe.
>
> Again: I have to do it very fast.
>
>
> --
> Gabor Gludovatz <ggabor at sopron.hu> http://www.sopron.hu/~ggabor/