Dear Zabbix Users,
Is it possible to analyze which data/items take up the most space in the HistoryCache?
I am asking this because we have the problem that the HistoryCache keeps filling up until it is full. Then the Zabbix server stops working correctly: it thinks that hosts are unavailable and sends out alarm mails, so I have disabled all trigger actions for the moment.
Stopping the Zabbix server leads to hours and hours of history syncing entries in the log; it even starts with negative percentage values. As it takes too long, I have to kill -9 all zabbix_server processes. Then I can start the server again, just to see it fill up its HistoryCache again. I have already raised HistoryCacheSize to 1 GB and HistoryIndexCacheSize to 256 MB, but that does not help. Raising the number of DB syncers does not help either. I noticed that most of the time only one dbsyncer is working, and it takes 100% of one core.
The database is PostgreSQL 12, and pg_activity does not show any problems as far as I can see.
Why is there only one dbsyncer working? Could it be that something is blocking the others from doing their work?
I suspect one of the proxies is the culprit. It may be pushing too much data to the server, but how can I verify this? Is it possible to find out which items make up most of the history cache? I know there is the proxy overview page with the number of items and required performance per proxy, but that apparently does not help me here.
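In the meantime I was thinking about at least identifying which items push the most values. The rough Python sketch below (assuming the standard Zabbix 4.0 history/items/hosts schema and psycopg2; the connection parameters are placeholders) counts the rows written to each history table during the last hour, grouped by host and item key. It only shows what actually reached the DB, not what is still sitting in the HistoryCache, but would that be a reasonable way to find the noisy items and the proxy behind them?

#!/usr/bin/env python3
# Sketch: find the items that wrote the most values in the last hour.
# Assumes the standard Zabbix 4.0 schema; connection details are placeholders.
import psycopg2

HISTORY_TABLES = ["history", "history_uint", "history_str", "history_text", "history_log"]

conn = psycopg2.connect(dbname="zabbix", user="zabbix", password="***", host="localhost")
cur = conn.cursor()

for table in HISTORY_TABLES:
    # Count values per host/item key written during the last 3600 seconds.
    cur.execute(f"""
        SELECT h.host, i.key_, COUNT(*) AS cnt
        FROM {table} t
        JOIN items i ON i.itemid = t.itemid
        JOIN hosts h ON h.hostid = i.hostid
        WHERE t.clock > EXTRACT(EPOCH FROM NOW())::int - 3600
        GROUP BY h.host, i.key_
        ORDER BY cnt DESC
        LIMIT 20
    """)
    print(f"--- top items in {table} (last hour) ---")
    for host, key, cnt in cur.fetchall():
        print(f"{cnt:>8}  {host}  {key}")

cur.close()
conn.close()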
We are running Zabbix 4.0.27
PostgreSQL 12
Zabbix server and DB on the same host. The host is a VM with 32 GB RAM and 8 cores.
StartPollers=30
StartIPMIPollers=1
StartPollersUnreachable=15
StartTrappers=10
StartPingers=3
StartDiscoverers=3
StartHTTPPollers=3
JavaGateway=127.0.0.1
StartJavaPollers=1
SNMPTrapperFile=/var/log/snmptrap/snmptrap.log
CacheSize=256M
HistoryCacheSize=1G
HistoryIndexCacheSize=256M
TrendCacheSize=16M
ValueCacheSize=512M
Timeout=4
AlertScriptsPath=/usr/lib/zabbix/alertscripts
ExternalScripts=/usr/lib/zabbix/externalscripts
LogSlowQueries=3000
StartProxyPollers=5
TIA
Timo