Hi,
I have a small issue regarding the upgrade to 6.0. We have a pretty big PostgreSQL database (around 600 GB in size, and an estimate of 5.5 billion records).
We would like to update to Zabbix 6.0 LTS.
I have tested the upgrade and we can run 6.0 using the 5.0 database structure, but we would really like to follow best practices.
I have read the upgrade guide for PostgreSQL here:
My questions are:
1. Is it really necessary to export the data to a CSV file, load it into a TMP table, and then move it into the newly created table that has the primary keys? Can't we just copy the data directly from the old table into the new table with the PKs already created on it?
2. We have tried various tests migrating the data between tables, and even after a few days of waiting, none of them completed successfully. I have thought of another approach: removing the duplicate records from the old tables and then creating the PKs in place. Would this also work? And if so, would it have any negative consequences?
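To make the two approaches concrete, here is roughly what I have in mind (a sketch only; the table and column names are assumed from the Zabbix history schema, and `history_old` stands for the renamed 5.0 table, so adjust as needed):

```sql
-- Approach 1: copy directly from the old table into the new table that
-- already has the primary key, letting PostgreSQL skip duplicate keys
-- instead of aborting the whole insert.
INSERT INTO history (itemid, clock, ns, value)
SELECT itemid, clock, ns, value
FROM history_old
ON CONFLICT (itemid, clock, ns) DO NOTHING;

-- Approach 2: deduplicate the old table in place, then add the PK to it.
-- ctid identifies the physical row, so exactly one copy of each
-- duplicate survives the delete.
DELETE FROM history a
USING history b
WHERE a.ctid < b.ctid
  AND a.itemid = b.itemid
  AND a.clock  = b.clock
  AND a.ns     = b.ns;

ALTER TABLE history ADD PRIMARY KEY (itemid, clock, ns);
```

Is either of these safe on a table of this size, or is the CSV detour in the guide there for a reason?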
Thank you