RSS aggregator with curses and feedparser

Roberto Bechtlufft robertobech at gmail.com
Sun Sep 24 09:44:39 EDT 2006


Hi, I'm new around here... I'm a Python hobbyist, and I'm far from
being a professional programmer, so please be patient with me...

I'm working on my first Python program: a curses-based RSS aggregator.
It's basically a clone of snownews, one of my very favorite programs,
but I want to add some functionality. Whenever you update a feed in
snownews, it discards all the previous topics, even ones you haven't
read yet, and keeps only the current topics. I want my program to
actually aggregate feeds. I also want it to be able to read Atom
feeds, which is easy since I'm using feedparser. I see that liferea
keeps a cache file for each feed it downloads, storing all of its
topics. I'm doing the same here.
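Here's a minimal sketch of the per-feed cache idea, assuming one pickle
file per feed keyed by an entry identifier (the directory name, file
layout, and function names are all my own invention, not anything
snownews or liferea actually does):

```python
import os
import pickle

# Hypothetical location for the per-feed cache files.
CACHE_DIR = os.path.expanduser("~/.myaggregator/cache")

def load_cache(feed_name):
    """Return the dict of previously seen entries for this feed,
    keyed by whatever identifier we choose for entries."""
    path = os.path.join(CACHE_DIR, feed_name + ".cache")
    if os.path.exists(path):
        with open(path, "rb") as f:
            return pickle.load(f)
    return {}

def save_cache(feed_name, entries):
    """Write the accumulated entries back to the feed's cache file."""
    os.makedirs(CACHE_DIR, exist_ok=True)
    path = os.path.join(CACHE_DIR, feed_name + ".cache")
    with open(path, "wb") as f:
        pickle.dump(entries, f)
```

Merging freshly fetched entries into the loaded dict before saving is
what makes the program aggregate instead of overwrite.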

A question: how do I tell my program whether a given entry has already
been downloaded? Should I use the entry's date tag or its link tag?
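For what it's worth, here is a sketch of the approach I'm considering,
assuming entries are the dict-like objects feedparser returns (where an
Atom <id> or RSS <guid>, when present, shows up under the 'id' key).
Dates seem like a poor key, since feeds often bump an entry's date
without the entry being new:

```python
def entry_key(entry):
    """Pick a stable identifier for a feed entry: prefer the feed's
    own id/guid, and fall back to the link if there is none."""
    return entry.get("id") or entry.get("link")

def new_entries(cached, fetched):
    """Return the fetched entries whose key is not already cached.
    `cached` is a dict keyed by entry_key; `fetched` is a list of
    entries as parsed from the feed."""
    seen = set(cached)
    return [e for e in fetched if entry_key(e) not in seen]
```

A quick example: if the cache already holds the key
"http://example.com/1", then only an entry with a different id or link
comes back as new.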




More information about the Python-list mailing list