True, my pages don't change often, but when they do I would like the latest revision to be added to the archive.
Using a cron job to schedule a submission would work, but I'd need a means to walk through every single page on my site and submit them individually. The Internet Archive does not provide a direct way to submit and crawl an entire site.
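For what it's worth, the cron-job approach can be sketched in a few lines of stdlib Python: read the site's sitemap and request a Wayback Machine capture for each page. This is only a sketch under assumptions, namely that your site publishes a sitemap.xml and that the Save Page Now endpoint (https://web.archive.org/save/&lt;url&gt;) stays as it is today; the domain is a placeholder.

```python
# Sketch of a cron-driven archiver: read the sitemap, then ask the
# Wayback Machine's Save Page Now endpoint to capture each page.
from urllib.request import urlopen
from xml.etree import ElementTree

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def page_urls_from_sitemap(xml_text):
    """Extract every <loc> URL from a sitemap document."""
    root = ElementTree.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

def save_endpoint(url):
    """Build the Save Page Now URL that triggers a capture of `url`."""
    return "https://web.archive.org/save/" + url

def archive_site(sitemap_url):
    with urlopen(sitemap_url) as resp:
        xml_text = resp.read()
    for url in page_urls_from_sitemap(xml_text):
        # Requesting the save endpoint is enough to trigger the capture.
        urlopen(save_endpoint(url))

if __name__ == "__main__":
    archive_site("https://example.com/sitemap.xml")  # your site here
```

Point cron at that script (say, weekly) and every sitemap entry gets resubmitted on schedule.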
Exactly. Saving after every minor edit doesn’t make sense.
The code snippet you provided gives me something to tinker with. Perhaps I just need to add a checkbox to the editing panel, so I can choose whether or not to submit an update to the archive when a change is published.
It's not all that hard to get all the URLs… you can use the wget terminal command for this kind of thing.
It's not directly obvious, but from googling around earlier I read that you can submit a text file with a list of links into that submission form and it will add them all; how true that is, I don't know.
As an alternative, you could also put a button on each page to trigger saving to the Wayback Machine manually, or a button in the dashboard to trigger saving of the whole site.
The hook alternative has the downside that it only reacts to content changes. If your content stays the same but your design changes, a hook is of no use, and a cron job or a manual trigger would be more useful.
The downside of that checkbox is that you'd have to uncheck it the next time you save, then check it again when you want to submit the next version. Otherwise you'd be back to submitting every comma change.