Add a custom sfeedrc for getting only new posts

Jaidyn Ann 2023-11-17 16:04:08 -06:00
parent 6641b7c63c
commit 264f8a4a94
2 changed files with 71 additions and 14 deletions


@@ -20,20 +20,15 @@ You've done it!
 ## Configuration
 ### sfeed
-We need to create a config file and feed directory for sfeed — they can be anywhere, your choice.
+We need to create a config file and feed directory for sfeed_update.
+You can use the sfeedrc.example file in this repo as a base for your own config file.
 ```
 $ mkdir ~/.config/sfeed/
-$ cat > ~/.config/sfeed/config <<EOF
-sfeedpath="$HOME/.config/sfeed/"
-feeds() {
-	feed "Planet GNU" "https://planet.gnu.org/rss20.xml" "https://planet.gnu.org" "UTF-8"
-	feed "Tiriftjo" "https://tirifto.xwx.moe/en/news.atom" "https://tirifto.xwx.moe" "UTF-8"
-}
-EOF
+$ cp sfeedrc.example ~/.config/sfeedrc
 ```
-You can read up more on sfeed's configuration in its documentation, sfeedrc(5).
+You need to edit the example sfeedrc to add in your own Atom/RSS feeds, or to change the feed path.
+You can read up more on sfeed's configuration in its man-page, sfeedrc(5).
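For instance, the part of the copied sfeedrc you would edit is the feeds() list; a sketch, where the feed names and URLs are invented examples:

```shell
# Made-up feeds() list for ~/.config/sfeedrc; replace with your own feeds.
# Arguments: name, feed URL, base URL, encoding.
feeds() {
	feed "My Blog" "https://example.org/atom.xml" "https://example.org" "UTF-8"
	feed "Some News Site" "https://news.example.com/rss.xml" "https://news.example.com" "UTF-8"
}
```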
### Mastodon
@@ -55,12 +50,14 @@ Whenever you use sfeed_mastodon, make sure that this token is stored in the envi
 ## Usage
 ```
 $ export FEDI_AUTH="yourAuthorizationTokenHere"
-$ sfeed_update ~/.config/sfeed/config | sfeed_mastodon https://yourServer.here
+$ sfeed_update ~/.config/sfeedrc
+$ cat ~/.config/sfeed/* | sfeed_mastodon https://yourServer.here
 ```
-It's that simple. It's safe to run this command several times in a row — feed entries that have
-already been posted won't be reposted. You can even add this to your crontab to mirror an Atom/RSS
-feed automatically.
+It's that simple. It's safe to run these commands several times in a row — feed entries that have
+already been posted won't be reposted, if you use our example sfeedrc.
+To automatically mirror an Atom/RSS feed, you can put these commands into a script and put it in your crontab.
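As a sketch, such a wrapper script might look like this; every path, URL, and token below is a placeholder to change for your setup (the script is written to /tmp here only for illustration):

```shell
# Write a hypothetical wrapper script; adjust all paths and values.
cat > /tmp/sfeed-mirror.sh <<'EOF'
#!/bin/sh
export FEDI_AUTH="yourAuthorizationTokenHere"
sfeed_update "$HOME/.config/sfeedrc"
cat "$HOME"/.config/sfeed/* | sfeed_mastodon https://yourServer.here
EOF
chmod +x /tmp/sfeed-mirror.sh

# A matching crontab(5) entry, mirroring once an hour, could be:
#   0 * * * * $HOME/bin/sfeed-mirror.sh
```

Note the `export`: without it, FEDI_AUTH would not be visible to sfeed_mastodon.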
### Templates

sfeedrc.example (new file)

@@ -0,0 +1,60 @@
# This is an sfeedrc(5) configuration file for sfeed_update(1).
# The key difference is that it truncates your feed files at every update,
# keeping only new posts. This saves you the work of filtering out old posts.

# You probably want to EDIT this.
sfeedpath="$HOME/.config/sfeed_mastodon/"

# You probably want to EDIT this.
# This contains a list of all your feeds, in the format:
#   feed NAME URL DOMAIN ENCODING
feeds() {
	feed "Planet GNU" "https://planet.gnu.org/rss20.xml" "https://planet.gnu.org" "UTF-8"
	feed "Tiriftjo" "https://tirifto.xwx.moe/en/news.atom" "https://tirifto.xwx.moe" "UTF-8"
}

# This overrides sfeed_update's default merge() function.
# It makes it so that only new and unseen posts are put in the feed file.
# This is done by storing the date of the latest post in an extended attribute
# (so it needs the attr(1) tool and a filesystem with xattr support),
# for comparison during the next update.
merge() {
	local oldfile="$2"
	local newfile="$3"

	local previous_max_date="$(attr -q -g sfeed_latest "$oldfile" 2> /dev/null)"
	if test -z "$previous_max_date"; then
		previous_max_date=0
	fi

	# Update the date of the last-processed post, unless the fetch was empty.
	local latest_date="$(latest_date "$newfile")"
	if test -n "$latest_date"; then
		attr -q -s sfeed_latest -V "$latest_date" "$oldfile" 2> /dev/null
	fi

	# Output only new and unprocessed posts.
	after_date "$newfile" "$previous_max_date"
}

# Given an sfeed file, this returns the date of the latest post (in seconds
# since the UNIX epoch).
latest_date() {
	local file="$1"
	awk -F '\t' \
		'$1 > latest { latest = $1 } END { print latest }' \
		"$file"
}

# This outputs only the lines of an sfeed file with a date after the given
# min_date (in seconds since the UNIX epoch).
after_date() {
	local file="$1"
	local min_date="$2"
	awk -F '\t' -v min_date="$min_date" \
		'$1 > min_date { print $0 }' \
		"$file"
}