Hello community,
within the log file of the Zabbix server, I can see a lot of messages like this:

Code:
query failed: [1062] Duplicate entry 'XXX' for key 'XXX'

For today there were already 13696 failed queries:

Code:
root@zabbix:/home/steffen# grep 'query failed.*Duplicate entry' /var/log/zabbix/zabbix_server.log | wc -l
13696

I tried to filter them to show only the unique items:

Code:
root@zabbix:/var/log/zabbix# grep 'query failed.*Duplicate entry' /var/log/zabbix/zabbix_server.log | awk '{print $12}' | sed "s/'//g" | sort | uniq | wc -l
61
root@zabbix:/var/log/zabbix#

This issue seems to have started only recently (judging by the unique entries, and assuming I am filtering them correctly):

Code:
root@zabbix:/var/log/zabbix# grep 'query failed.*Duplicate entry' /var/log/zabbix/zabbix_server.log | awk '{print $12}' | sed "s/'//g" | sort | uniq | wc -l
61
root@zabbix:/var/log/zabbix# grep 'query failed.*Duplicate entry' /var/log/zabbix/zabbix_server.log.1 | awk '{print $12}' | sed "s/'//g" | sort | uniq | wc -l
89
root@zabbix:/var/log/zabbix# grep 'query failed.*Duplicate entry' /var/log/zabbix/zabbix_server.log.2 | awk '{print $12}' | sed "s/'//g" | sort | uniq | wc -l
71
root@zabbix:/var/log/zabbix# grep 'query failed.*Duplicate entry' /var/log/zabbix/zabbix_server.log.3 | awk '{print $12}' | sed "s/'//g" | sort | uniq | wc -l
1
root@zabbix:/var/log/zabbix# grep 'query failed.*Duplicate entry' /var/log/zabbix/zabbix_server.log.4 | awk '{print $12}' | sed "s/'//g" | sort | uniq | wc -l
1
root@zabbix:/var/log/zabbix#

How can I solve these duplicate entries? Is there some tool/script to see what the issue actually is, or do I need to look up every query manually and see where the error comes from?
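One way to narrow this down without reading every query by hand is to group the failures by index name rather than by entry value. A minimal sketch, assuming the log lines follow the query failed: [1062] Duplicate entry '…' for key '…' format shown above (count_dup_keys is just a hypothetical helper name):

```shell
# count_dup_keys: read zabbix_server.log lines on stdin, extract the
# index name from each "Duplicate entry ... for key '...'" message,
# and print each key with its count, most frequent first.
count_dup_keys() {
  grep "query failed.*Duplicate entry" \
    | sed "s/.*for key '\([^']*\)'.*/\1/" \
    | sort | uniq -c | sort -rn
}
```

Feed it the current log, e.g. count_dup_keys < /var/log/zabbix/zabbix_server.log; the keys with the highest counts tell you which table/index to investigate first. Extracting the quoted key name with sed is a bit more robust than awk '{print $12}', since it does not depend on the exact number of fields in the message.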
Thanks!