feed is a program for parsing RSS feeds and converting them into webpages or other RSS feeds. It can be used to produce a webpage with a summary of your RSS feeds, an RSS feed built from other feeds, or a few other things. Examples of its use can be found here and here
feed is a program for generating webpages from RSS feeds. It reads one or more RSS feeds through stdin
and writes the result through stdout. So a logical use is to keep a list of feeds you read daily, like this one
here, and have cron download them with wget and pass the result to the program like this:
wget -i list -O - | feed > output.html
Yes, it really is that simple. Add this line to cron and get your news every day. My advice is to run it only once or twice a day, as many sites, including Slashdot, will block you if you hit them too often. If you do get blocked, you can wait 48 hours and see if they unblock you, or just tone down the number of times you connect to their site.
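As a concrete illustration, a crontab entry for this might look as follows. The paths, filenames, and schedule here are only placeholders; adjust them to wherever your feed list and output page actually live:

```shell
# Illustrative crontab entry: once a day at 07:00, fetch every URL in
# the list file, pipe the combined feeds into feed, and write the page.
# (Paths are assumptions, not part of feed itself.)
0 7 * * * wget -i $HOME/feeds/list -O - | feed > $HOME/public_html/news.html
```

Running it once daily, as above, also keeps you well under the request rates that get you blocked.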
Why doesn't feed download the feeds itself, rather than having wget do it? The system feed was supposed to run on lies behind a proxy, and urllib's proxy support is non-existent. Keeping the download step separate also helped with testing. If you wanted to add support for such a thing, it could be done in a hacky way, or in quite a nice way. If you want such a version, drop me a line.
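For what it's worth, a sketch of the "nice way" might look like the following. This is not part of feed; it assumes a modern Python where urllib.request does support proxies via ProxyHandler (the complaint above dates from older urllib), and the function names are my own:

```python
# Hypothetical sketch of feed fetching URLs itself instead of relying
# on wget. Assumes modern Python's urllib.request, whose ProxyHandler
# provides the proxy support the old urllib lacked.
import urllib.request


def make_opener(proxy=None):
    """Build a URL opener, optionally routed through an HTTP proxy."""
    if proxy:
        handler = urllib.request.ProxyHandler({"http": proxy, "https": proxy})
        return urllib.request.build_opener(handler)
    return urllib.request.build_opener()


def fetch_feeds(urls, proxy=None):
    """Download each feed URL in turn and return the raw bodies."""
    opener = make_opener(proxy)
    bodies = []
    for url in urls:
        with opener.open(url) as resp:
            bodies.append(resp.read())
    return bodies
```

The output of fetch_feeds could then be concatenated and handed to the same parsing code that currently reads stdin, so the wget pipeline would keep working unchanged.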