I use different tools like OpenHAB, InfluxDB, Grafana, Node-RED, MongoDB, etc. in my Home Automation System. There is always a chance that a tool may crash, the server may go down, the HDD may fail, or some other cause may lead to data loss. This happened to me earlier, and I had to rebuild my setup from scratch, but it taught me a lesson: backups are really important. So I went ahead and wrote a simple shell script that performs the backup tasks automatically, roughly every 10 hours. Moreover, the files are pushed to a private GitHub repository, so if I ever want to check a previous version of some config file it will always be there, which happens often as I tend to change and experiment every now and then. This post covers the script and the backup processes for the different tools I use in my Automation System.
To make things easy, I copy backups from each tool into separate folders inside my OpenHAB conf folder (for me it is /srv/openhab2-conf/, as I am using Openhabian), which holds all my OpenHAB configs, and then I do a git commit and push to GitHub.
As per the Node-RED official docs, which can be found here, the following are the critical files to back up:
flows_*.json & flows_*_cred.json : These are the flows and any credentials stored in Node-RED.
.config.json : Current configuration being used.
settings.js : User global settings for Node-RED.
package.json : Defines the extra node modules used.
lib (Folder) : This contains snippets of flows and JS code saved from the Node-RED admin UI as part of the “Library” feature.
.sessions.json (If exists) : Session information.
nodes (Folder, if exists) : Manually installed nodes.
So backing up Node-RED is pretty easy: just copy the files to a secondary location. I have created a folder
.nodered, which holds all the Node-RED backed-up files. To copy the files we can use the following commands:
cd ~/.node-red
sudo cp flows_*.json /srv/openhab2-conf/.nodered/
sudo cp flows_*_cred.json /srv/openhab2-conf/.nodered/
sudo cp settings.js /srv/openhab2-conf/.nodered/
sudo cp package.json /srv/openhab2-conf/.nodered/
sudo cp .config.json /srv/openhab2-conf/.nodered/
sudo cp -R lib /srv/openhab2-conf/.nodered/
sudo cp .sessions.json /srv/openhab2-conf/.nodered/
sudo cp -R nodes /srv/openhab2-conf/.nodered/
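The repeated cp calls above could also be wrapped in a small helper. This is just a sketch of my own (the `backup_copy` name is made up, not part of the original script); it creates the destination folder if needed and handles files and folders alike:

```shell
#!/bin/sh
# Hypothetical helper, not part of the original script:
# copies a file or a folder into a backup directory, creating it if needed.
backup_copy() {
    src="$1"
    dest="$2"
    mkdir -p "$dest"
    cp -R "$src" "$dest"/
}

# Example usage (paths as in this article):
# backup_copy ~/.node-red/flows.json /srv/openhab2-conf/.nodered
# backup_copy ~/.node-red/lib        /srv/openhab2-conf/.nodered
```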
So, yes, that's pretty much straightforward.
In case you want to restore, you can simply copy the backed-up files to the
~/.node-red directory, stop Node-RED, and run
npm install from the same directory. Then start Node-RED again. However, if
package.json does not have information about the modules you used, you need to install those manually.
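Put together, a restore could look like the following sketch. It assumes Node-RED runs as a systemd service named `nodered` (which the official install script sets up); adjust to however you run it:

```shell
# Sketch of a Node-RED restore; the 'nodered' service name is an assumption.
sudo systemctl stop nodered
cp -R /srv/openhab2-conf/.nodered/. ~/.node-red/
cd ~/.node-red
npm install            # reinstalls the modules listed in package.json
sudo systemctl start nodered
```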
Backing up Grafana is also just a matter of copying files. I have a folder
.grafana inside the OpenHAB conf folder, to which the script copies the files from the Grafana directories. Grafana uses SQLite as its database, stored as grafana.db. Apart from that, settings are saved in the grafana.ini file. So we basically copy the above two files; I also copy the Grafana folder itself, but that is not strictly required. To copy we can use the following commands:
sudo cp /etc/grafana/grafana.ini /srv/openhab2-conf/.grafana/
sudo cp /var/lib/grafana/grafana.db /srv/openhab2-conf/.grafana/
sudo cp -R /usr/share/grafana /srv/openhab2-conf/.grafana/
This is also easy, right? When it comes to restoring, it can be done by simply copying the .db file back and restarting Grafana, while also making sure grafana.ini is the same.
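For completeness, a restore might look like this sketch (the `grafana-server` service name and the `grafana` user/group match the Debian package defaults, but verify them on your system):

```shell
# Sketch of a Grafana restore; paths and ownership assume a Debian-style install.
sudo systemctl stop grafana-server
sudo cp /srv/openhab2-conf/.grafana/grafana.db /var/lib/grafana/
sudo cp /srv/openhab2-conf/.grafana/grafana.ini /etc/grafana/
sudo chown grafana:grafana /var/lib/grafana/grafana.db
sudo systemctl start grafana-server
```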
If you want to learn about backing up and restoring InfluxDB, you can click here.
influxd backup does the job. It takes parameters like the server address and port, and the destination where the backups will be stored; in my case that is the
.influxdb directory inside the OpenHAB conf folder. Also, if you want to back up a particular database, you can pass the database name with the -db flag. The commands used to back up are:
export INFLUX_USERNAME=<admin_username>
export INFLUX_PASSWORD=<admin_password>
influxd backup -portable -host localhost:8088 /srv/openhab2-conf/.influxdb/
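The matching restore uses influxd restore with the same -portable flag. A sketch (restoring into a fresh database; restoring over an existing one needs extra steps, see the InfluxDB docs):

```shell
# Sketch: restore a portable backup taken with 'influxd backup -portable'.
influxd restore -portable -host localhost:8088 /srv/openhab2-conf/.influxdb/

# To restore a single database under a new name:
# influxd restore -portable -db <database_name> -newdb <new_database_name> /srv/openhab2-conf/.influxdb/
```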
Usage of MongoDB is optional; though it can be used with OpenHAB, I use InfluxDB for persistence, as it is best suited for time-series data. MongoDB is in place to store some other data around my setup. Backing up MongoDB is done using mongodump. The dumps are stored in the
.mongodb folder inside the OpenHAB conf directory.
mongodump -h localhost -d <database_name> -u <user_name> -p <password> -o /srv/openhab2-conf/.mongodb
To restore from the dumps, mongorestore can be used.
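A sketch of the matching restore, mirroring the mongodump flags used above (the trailing path points at the dump of that particular database):

```shell
# Sketch: restore the dump taken with mongodump above.
mongorestore -h localhost -d <database_name> -u <user_name> -p <password> /srv/openhab2-conf/.mongodb/<database_name>
```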
As mentioned above, all the OpenHAB files (.items, .sitemaps, .things, etc.) are stored inside the conf folder, and we have also copied all the backups from the other tools to the same folder. So it's now just a matter of committing the changes and pushing to GitHub. (Of course, the conf folder has to be a local git repo, and the GitHub remote has to be added.) For automatic commits, I use the date and time as the commit message. I also regularly copy the HABPanel configs to a folder .habpanels inside the OpenHAB conf folder.
To achieve this we can use the following commands:
sudo chmod 777 -R /srv/openhab2-conf/
# Back up the OpenHAB conf folder
cd /srv/openhab2-conf/
git add .
#git checkout master
currentDate=$(date)
git commit -m "Auto commit on $currentDate"
# Note: a GitHub personal access token is safer here than the account password.
git push https://<github_username>:<github_password>@github.com/<username>/<repository> --all
Yes, that's all about the script; the complete script can be obtained from here.
You can pretty much use crontab or any other scheduling mechanism in Linux to run the script periodically and perform the backup, but I personally use Node-RED. This also lets me build a few automations around it, like sending a Telegram message when the backup is done. You can customize it to do anything. I have added some OpenHAB items to keep track of the backup cycle.
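If you prefer crontab, an entry like the following sketch would run the script at 00:00, 10:00, and 20:00 (cron cannot express an exact rolling 10-hour period; the script path and log file here are hypothetical):

```
# crontab -e
0 */10 * * * /home/openhabian/backup.sh >> /tmp/backup.log 2>&1
```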
The Node-RED flow looks like,
Basically, the inject node triggers the exec node every 10 hours, which runs the backup script. The above flow definition can be downloaded from here.
So that’s all, folks. I hope this helps you in some way.