When I am checking logfiles, I have several topics to cover:
1. Searching for keywords in the logfile (e.g. error|failed|severe), considered a problem
2. Extracting other information to check for critical data (e.g. the number of requests logged within a particular time interval)
3. Checking performance (time between certain entries, e.g. START ... and FINISHED ...)
4. The logfile should have a timestamp not older than n hours ago.
I suspect only 1. and 2. can be done, while 4. is probably not done with the log monitoring feature itself anyway; I could use vfs.file.time instead, for example as proposed here: https://www.zabbix.com/forum/zabbix-...time-is-recent).
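For 1. and 4., I imagine item keys roughly like this (the path, the regexp, and the 3600 s threshold are just examples, not my real values):

    log[/var/log/myapp/app.log,"error|failed|severe"]
    vfs.file.time[/var/log/myapp/app.log,modify]

with a trigger on the second item such as {host:vfs.file.time[/var/log/myapp/app.log,modify].fuzzytime(3600)}=0 (old-style trigger syntax) so it fires when the file has not been modified for more than an hour.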
And the other question is: if I have several items checking against the same log file, would the Zabbix agent read the same file multiple times, or would it recognize that several items request the content of the same file and therefore read it only once?
I am currently not sure whether it would be better to create a UserParameter with an external script for these checks, but then I would have to do the work of avoiding a read of the whole file myself (see the sketch below). I mean, log monitoring already does this; however, with several items all accessing the same file, I am not sure the built-in feature would still have the better performance if each item causes another file read (even if it is only a part of it).
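This is roughly what I mean by doing the work myself: a sketch of a script behind such a UserParameter that remembers the last read position in a state file, so each run only scans the newly appended part of the log (all paths, the pattern, and the parameter name are made up):

    #!/usr/bin/env python3
    # Sketch for: UserParameter=custom.logcheck[*],/usr/local/bin/logcheck.py $1
    # Counts new lines matching a pattern since the previous run.
    import os
    import re
    import sys

    LOGFILE = sys.argv[1] if len(sys.argv) > 1 else "/var/log/myapp/app.log"
    STATEFILE = "/var/tmp/logcheck.offset"  # byte offset where the previous run stopped
    PATTERN = re.compile(rb"error|failed|severe")

    # Load the previous offset; start at 0 on the first run.
    try:
        with open(STATEFILE) as f:
            offset = int(f.read().strip() or 0)
    except FileNotFoundError:
        offset = 0

    # If the file shrank, it was rotated or truncated: re-read from the start.
    if os.path.getsize(LOGFILE) < offset:
        offset = 0

    matches = 0
    with open(LOGFILE, "rb") as f:  # binary mode, so offsets are plain byte positions
        f.seek(offset)
        for line in f:
            if PATTERN.search(line):
                matches += 1
        offset = f.tell()

    # Persist the new offset for the next invocation.
    with open(STATEFILE, "w") as f:
        f.write(str(offset))

    print(matches)  # the item value Zabbix receives

But each such script needs its own state handling, and with several items on the same file it would still read the new part once per item, which is exactly the duplication I am worried about.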
What do you think?