[–] sakuramboo 5 points (+5|-0) ago (edited ago)

Something like this...


#!/bin/bash

# Base URL and backup directory (the original left these undefined)
BASE="https://voat.co"
BACKUP="$HOME/voat-backup"

# Pull the post links off the /v/all/new page
LINKS=$(curl -s "${BASE}/v/all/new" | grep 'class="comments' | awk '{print $2}' | cut -d'"' -f2)
DATE=$(date +%Y-%m-%d-%H:%M)

# Create the backup dir if it does not exist yet
if [ ! -e "$BACKUP" ]; then
     mkdir "$BACKUP"
fi

# Save each post; links start with /, so no extra slash after $BASE
for i in ${LINKS}; do
     FILENAME=$(echo "${i}" | cut -d"/" -f3)
     curl -s -o "${BACKUP}/${FILENAME}" "${BASE}${i}"
done

tar czf "voat.${DATE}.tar.gz" "$BACKUP"

Off the top of my head, so there are probably some errors. Toss it in cron to run every 10 minutes or so, so it keeps getting the front page of /v/all.
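The cron entry could look something like this (the script path is just a placeholder for wherever you save it):

```shell
# Run the archiver every 10 minutes; path and log location are assumptions.
*/10 * * * * /home/user/bin/voat-archive.sh >> /home/user/voat-archive.log 2>&1
```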

But, you would need to manually clean up old backups. That could be added to the script, too.
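The cleanup could be a single find command, something like this (ARCHIVE_DIR here is a stand-in for wherever the tarballs end up):

```shell
# Prune archives older than 7 days. ARCHIVE_DIR is a hypothetical
# location; adjust to wherever the script writes its tarballs.
ARCHIVE_DIR="${ARCHIVE_DIR:-$HOME}"
find "$ARCHIVE_DIR" -maxdepth 1 -name 'voat.*.tar.gz' -mtime +7 -delete
```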

EDIT: Also, this just archives the front page; it doesn't check for changes. That would be manual, though it could be scripted, too. It could probably be made even simpler by just looking for voat links and saving them without actually downloading the posts, if all you care about is disappearing posts.
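The links-only version would be the same pipeline, just appended to a running log instead of downloading each post; the log path here is a hypothetical:

```shell
# Append the current front-page links to a running log, then
# dedupe it in place; LOGFILE is a hypothetical location.
BASE="https://voat.co"
LOGFILE="${LOGFILE:-$HOME/voat-links.log}"
curl -s "${BASE}/v/all/new" \
    | grep 'class="comments' \
    | awk '{print $2}' | cut -d'"' -f2 >> "$LOGFILE"
sort -u -o "$LOGFILE" "$LOGFILE"
```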

[–] Morbo 1 point (+1|-0) ago

It seems like you suspect that when Voat goes down, it's in order to remove certain posts. But posts could be removed without taking the site down, and people probably wouldn't notice unless it was an already hot or controversial post. I don't think you're going to find what you're looking for. There are easier ways of disappearing content that don't require taking the site down, even if only as a distraction technique.

[–] JunOS ago 

If the site is down, no active script will help, since any page it's looking for is gone and it's after the fact. You'd need to have it running constantly, and then when the site goes down, look at the previous hour's archives.