This is a small but flexible feed aggregator.
It scans news feeds (RSS & Atom) and sends the entries via e-mail. The user can configure each feed to be sent immediately or as part of a daily digest.
=========
Atomstrom
=========

Atomstrom is a small but flexible feed aggregator.

Copyright (C) 2013 Ronald Schaten <ronald@schatenseite.de>

See <https://dev.0x50.de/projects/atomstrom/> for current versions and
development info.

Idea
----

The idea of Atomstrom is based on the great little aggregator rss2email
<http://www.allthingsrss.com/rss2email/>, a small tool written in Python
that fetches feeds and sends the received entries via mail. It is a fine
program, and I used it for several years.

Installation
------------

Atomstrom doesn't need any installation. Just copy the included
atomstrom.conf.sample to atomstrom.conf and make the appropriate settings.

If you use MySQL, you'll have to create a database first. The database
structure will be created by Atomstrom on its first launch.

Configuration
-------------

For now, feeds are configured by making the right settings directly in the
feed table of the database (see the example at the end of this file). The
settings are as follows:

* id: Set automatically by the database; you don't have to change this.
* url: URL of the feed you'd like to fetch.
* frequency: Minutes between fetching the feed.
* keepdaysafterlastfetch: Entries are deleted once they have not appeared
  in the feed for n days.
* daily: Entries won't be sent immediately; they are included in the daily
  digest.
* resolveredirects: Some feeds use URL forwarders to create usage stats.
  Enable this if the forwarder URL is ugly.
* readability: Use readability to fetch the URL and extract the part that
  is probably the entry content.
* fullpage: Fetch the full page from the URL.
* contentcolumn: Can be one of summary, content, fullpage or readability.
  It selects the column whose content will be sent.
* html2textcontent: Convert the content from HTML to plain text.
* html2textignorelinks: Ignore links when converting to text.
* html2textignoreimages: Ignore images when converting to text.
* enabled: Marks whether this feed is enabled.

The on/off options have to be NULL to be disabled, 1 otherwise.

Usage
-----

I use two cronjobs to fetch and send feeds:

    */5 * * * * cd /atomstrom-directory/ && ./atomstrom.py -fs > /dev/null 2>&1
    5 6 * * * cd /atomstrom-directory/ && ./atomstrom.py -d > /dev/null 2>&1

So every five minutes, all due feeds are fetched and single mails are sent.
Every day at 06:05, the daily digest is sent.

Command Line Arguments
----------------------

The following switches can be used from the command line interface:

    -h, --help          show this help message and exit
    -f, --fetch         fetch all feeds
    -s, --single        send single mails
    -d, --daily         send daily digest
    -l, --list          list all configured feeds
    -e ID, --delete ID  delete feed <ID> from configuration
    -r ID, --reset ID   reset data for feed <ID>

Note that it is not yet possible to add or configure feeds from the command
line. Maybe this feature will be included in some future version.

Thanks
------

I'd like to thank the creators of the modules I was able to use for this
project:

* Universal Feed Parser: <http://code.google.com/p/feedparser/>
* SQLAlchemy: <http://www.sqlalchemy.org/>
* python-readability: <https://github.com/buriy/python-readability>
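
Example: adding a feed
----------------------

This is a minimal sketch of how a feed could be added by inserting a row
into the feed table, as described in the Configuration section. It is an
illustration, not part of Atomstrom itself: it assumes a MySQL database
called "atomstrom", a table named "feed" with the columns listed above, the
MySQLdb driver, and made-up credentials and a made-up feed URL. Adjust all
of these to match your atomstrom.conf; the actual table and column layout
is created by Atomstrom on its first launch.

    #!/usr/bin/env python
    # Sketch: insert one feed row directly into the database.
    # Table and column names follow the Configuration section above.
    import MySQLdb

    conn = MySQLdb.connect(host="localhost", user="atomstrom",
                           passwd="secret", db="atomstrom")
    cur = conn.cursor()
    cur.execute(
        "INSERT INTO feed (url, frequency, keepdaysafterlastfetch, daily, "
        "resolveredirects, readability, fullpage, contentcolumn, "
        "html2textcontent, html2textignorelinks, html2textignoreimages, "
        "enabled) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)",
        ("https://example.org/feed.atom",  # url (example value)
         60,         # frequency: fetch every 60 minutes
         30,         # keepdaysafterlastfetch: keep entries for 30 days
         None,       # daily: NULL = send single mails immediately
         None,       # resolveredirects: NULL = off
         None,       # readability: NULL = off
         None,       # fullpage: NULL = off
         "summary",  # contentcolumn: send the feed's own summary
         1,          # html2textcontent: convert HTML to text
         None,       # html2textignorelinks: NULL = keep links
         None,       # html2textignoreimages: NULL = keep images
         1))         # enabled: 1 = feed is active
    conn.commit()
    conn.close()

After the row is in place, the next fetch run (./atomstrom.py -f) should
pick the feed up; ./atomstrom.py -l lists the configured feeds so you can
check the result.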