I need a different approach to backing up the PostgreSQL database so that it captures just the configuration data for Zabbix. I do not want to back up the transient data in the history and trends tables.
The current backups are more than 800 MB in size. I need to upload them to a datastore that has a size limit, so I need to reduce the size, but I want to make sure we keep the required schema objects to ensure a successful restore. The script is using the following options:
pg_dumpall -U postgres --globals-only --file=$backupdir/globals.sql
and then
zip -j $backupdir/backup.$index.zip $backupdir/*.sql
The purpose here is to come back up with a working Zabbix setup in case a crash occurs. In other words, what are the required schema objects that Zabbix depends on, which need to be backed up to ensure a perfect restore, ignoring all the monitored data in history and trends? Thanks for your help.
Literally, just don't back up any tables that have "history" or "trends" in their name. If you back up all the other tables, you'll get all the configuration data that exists in the database.
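One way to sketch this with the tools already used in the script is `pg_dump`'s `--exclude-table-data` option, which keeps the table definitions (so the schema restores cleanly) but skips their row data. The database name `zabbix` and the `history*`/`trends*` patterns below are assumptions; adjust them to match your installation.

```shell
#!/bin/sh
# Sketch: dump globals, then the zabbix database with full schema
# but no row data for history/trends tables. Paths and names are
# illustrative, not taken from the original script.
backupdir=/var/backups/zabbix
index=$(date +%Y%m%d)

# Roles, tablespaces, and other cluster-wide objects.
pg_dumpall -U postgres --globals-only --file="$backupdir/globals.sql"

# All schema objects, but exclude row data for tables whose names
# match the patterns (history, history_uint, trends, trends_uint, ...).
pg_dump -U postgres -d zabbix \
  --exclude-table-data='history*' \
  --exclude-table-data='trends*' \
  --file="$backupdir/zabbix.sql"

zip -j "$backupdir/backup.$index.zip" "$backupdir"/*.sql
```

Because the table definitions are still in the dump, a restore gives you empty history/trends tables that Zabbix can start writing to immediately, while all configuration tables come back with their data.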