Database Backup - MediaWiki

I got MediaWiki up and running through Docker, with MariaDB running in a separate container. I've finally invested some time in populating the database; is there anything special I need to do to back it up?

I was thinking I could just copy the folders that I made persistent for the DB, but is that enough for an actual database backup?

If it is, Syncthing is probably how I'm going to do it, so if there are any words of wisdom or anything to look out for, please let me know.

Thank you.

I don’t know about words of wisdom, but I’ve found MediaWiki great for capturing my information. However, for updating / upgrading / disaster recovery I’m at a complete loss; the software is an utter mess and not user friendly at all.

I’m running the whole instance in a VM, which I back up daily.

Probably best to do all your testing now before you become too dependent.

I don’t use it in Docker but as its own VM. Here is a basic bash script you can run that will export the database and the images folder that contains all the uploaded media. Backing up the whole VM produces a much larger file, so we only do that when there are OS / app changes. We have our system backing up the database / media hourly and sending it off-site.

#!/bin/bash
# Daily rotating backup: `date +"%A"` puts the weekday name in each
# file name, so you keep a rolling seven days of backups.
SQLFILE=/root/wikibackup/wiki_backup.`date +"%A"`.sql
IMAGESFILE=/root/wikibackup/wikiFiles_backup.`date +"%A"`.tar.gz
ZIPFILE=/root/wikibackup/wiki_Backup.`date +"%A"`.zip   # not used below

DBSERVER=127.0.0.1
DATABASE=wiki
USER=wiki
PASS="HPbSgC8G5BVaUbsasdqwTE"

# Dump the wiki database to a plain SQL file.
mysqldump --opt --host=${DBSERVER} --user=${USER} --password=${PASS} ${DATABASE} > ${SQLFILE}
# Archive the images directory, which holds all the uploaded media.
tar czvf ${IMAGESFILE} /var/www/html/images/
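
Since the original question is about a Dockerized setup with MariaDB in its own container, a rough adaptation of the same two steps might look like the sketch below; the container name (mariadb), the credentials, and the host path for the images bind mount are all assumptions you would need to replace with your own values.

#!/bin/bash
# Sketch only: dump the database from inside the MariaDB container,
# then archive the images volume from the host. Container name,
# credentials, and paths are placeholders.
BACKUPDIR=/root/wikibackup
DAY=`date +"%A"`

# mysqldump runs inside the container; its stdout is redirected to a file on the host.
docker exec mariadb mysqldump --opt --user=wiki --password=YOUR_DB_PASSWORD wiki > ${BACKUPDIR}/wiki_backup.${DAY}.sql
# Archive the host directory that is bind-mounted as the wiki's images folder.
tar czf ${BACKUPDIR}/wikiFiles_backup.${DAY}.tar.gz /path/to/mediawiki/images/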

Nice, thanks @LTS_Tom.

Thanks, Tom. I knew it had to be more than just backing up directories. I will give this a shot and see if it works out by spinning up another wiki and using the backed-up data.
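
(For reference, restoring onto a fresh instance is roughly the reverse of the backup script above, assuming the same database name, user, and paths; Monday's files are used here only as an example.)

# Load the SQL dump back into an empty wiki database...
mysql --user=wiki --password=YOUR_DB_PASSWORD wiki < /root/wikibackup/wiki_backup.Monday.sql
# ...and unpack the media files back under the web root (tar strips the
# leading "/" when creating the archive, so extract with -C /).
tar xzvf /root/wikibackup/wikiFiles_backup.Monday.tar.gz -C /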
