I really searched hard for this, but I couldn't find a suggestion that points me in the right direction yet.
I have some log files I need to monitor for certain strings such as 'ERROR' or 'FATAL'. That is no problem at all as long as those strings occur within the first 64 kB, which brings me to my problem: I also need triggers to fire when a matching string appears beyond the 64 kB mark, in which case I get the following message in the agentd log:

Code:
Only the first 64 kB will be analyzed, the rest will be ignored while Zabbix agent is running

At the moment I have to deal with record logs that sometimes hold more than ~512 kB per row. I know what you're thinking -> the logging should be optimized in the first place, and I assure you that will happen, but right now I have to work with records of that size.
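For illustration, this is roughly what I have in mind for the cron-driven helper script: a minimal POSIX-shell sketch, where all file paths and the function name are placeholders of my own, not anything Zabbix ships. It only reads the bytes appended since the previous run, so the side log that Zabbix actually watches stays small.

```shell
#!/bin/sh
# Sketch of the "cron job + helper script" idea (all paths are placeholders).
# Appends ERROR/FATAL lines that arrived since the previous run to a small
# side log; Zabbix then monitors the side log instead of the large record log.

filter_new_lines() {
    src="$1"    # the large record log
    dst="$2"    # the small log Zabbix watches
    state="$3"  # file remembering the byte offset reached on the last run

    last=$(cat "$state" 2>/dev/null || echo 0)
    size=$(wc -c < "$src")

    # If the source log was rotated (it shrank), start over from byte 0.
    [ "$size" -lt "$last" ] && last=0

    # Read only the newly appended bytes and keep the interesting lines.
    tail -c +"$((last + 1))" "$src" | grep -E 'ERROR|FATAL' >> "$dst"

    echo "$size" > "$state"
}

# Example cron entry (every 10 minutes), assuming the script is installed
# as /usr/local/bin/logfilter.sh and calls filter_new_lines with its args:
#   */10 * * * * /usr/local/bin/logfilter.sh /var/log/app/records.log \
#       /var/log/app/records-errors.log /var/tmp/records.offset
```

The offset file is what keeps the script cheap on each run; without it, every invocation would re-scan (and re-report) the whole record log.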