Hello,
I have to make some advanced checks on our storage (NetApp).
The only way to retrieve these kinds of stats is to run a specific command directly on the storage CLI; there are no SNMP values for these stats.
I've made a script to use for discovery (external check type) that produces a JSON file; all the items are collected successfully.
E.g. collect_netapp_stats.pl["{$STORAGE}","Volume"], where "Volume" is the type of stats I want to collect. A JSON example follows:
{
  "{#TYPE}": "Volume",
  "{#NAME}": "vol_test",
  "{#VALUE}": "avg_latency",
  "{#OUTPUT}": "12.76us"
}
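For reference, a Zabbix LLD discovery rule expects the script's output to wrap rows like the one above in a top-level "data" array, roughly like this (a sketch built only from the example values above):

```json
{
  "data": [
    {
      "{#TYPE}": "Volume",
      "{#NAME}": "vol_test",
      "{#VALUE}": "avg_latency",
      "{#OUTPUT}": "12.76us"
    }
  ]
}
```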
Every item prototype has the following key, with external check type:
return_netapp_stats.bash["{$STORAGE}","Volume","{#NAME}","avg_latency"]
Then, for EVERY SINGLE ITEM, the script "return_netapp_stats.bash" must be run, and it returns the correct value for that item.
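For context, a minimal sketch of what such a per-item lookup might do, assuming the discovery run leaves its JSON on disk (the cache path, the flat one-object-per-line layout, and the function name are my assumptions, not the real script):

```shell
#!/bin/bash
# Hypothetical sketch of a per-item lookup like return_netapp_stats.bash:
# it only greps one value out of the JSON the discovery script already wrote.
return_netapp_stats() {
    local storage="$1" type="$2" name="$3" counter="$4"
    local cache="/tmp/netapp_stats_${storage}.json"   # assumed cache path
    # Pull the {#OUTPUT} field of the matching entry (no jq dependency).
    grep -o "\"{#TYPE}\":\"$type\",\"{#NAME}\":\"$name\",\"{#VALUE}\":\"$counter\",\"{#OUTPUT}\":\"[^\"]*\"" "$cache" \
        | sed 's/.*{#OUTPUT}":"//; s/"$//'
}

# Demo cache file shaped like the discovery example above.
cat > /tmp/netapp_stats_demo.json <<'EOF'
{"{#TYPE}":"Volume","{#NAME}":"vol_test","{#VALUE}":"avg_latency","{#OUTPUT}":"12.76us"}
EOF

return_netapp_stats demo Volume vol_test avg_latency   # prints 12.76us
```

The point is that every item poll pays the cost of a new process just to re-read a file that already holds all the values.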
The problem is that we have many items to collect, and this affects item collection (the script exits with a timeout), but above all the Zabbix server's performance, with a large increase in running processes (and, obviously, in CPU load).
Is there a way to collect all the item data in one go? I have all the data in the JSON file, but the script returns a single value per item. Isn't it possible to collect ALL the data directly from the JSON file, without calling the script for every single value?
Maybe something like the zabbix_agent item type would work:
Volume.vol_test.avg_latency
?
Do you have any ideas?
Thank you very much,
Roberto
