Monitoring of big logfiles

  • buergi
    Junior Member
    • Aug 2008
    • 12

    #1

    Monitoring of big logfiles

    I am monitoring centralized syslog files. I configured my items so that they filter out all log messages for IPMI. Does anyone know how the filter for the logfiles works, and whether it is a problem if a logfile is big (for example more than 1 GB)?
  • exkg
    Senior Member
    Zabbix Certified Trainer
    Zabbix Certified Specialist
    • Mar 2007
    • 718

    #2
    Hmmm,


    Does this have some relation with:

    - [DEV-137] increased max number of log file lines sent per second to 100 (Alexei)


    []s,
    Luciano
    --
    Luciano Alves
    www.zabbix.com
    Brazil | México | Argentina | Colômbia | Chile
    Zabbix Performance Tuning


    • buergi
      Junior Member
      • Aug 2008
      • 12

      #3
      No, I want to know: if a logfile is big, like 1 GB, does this cost me performance on my host when I search for specific values?

      ~buergi


      • trikke
        Senior Member
        • Aug 2007
        • 140

        #4
        Hi Buergi,

        You will have a lot of traffic and load the first time the monitoring for the logfile is activated, as Zabbix has to go through the whole logfile (starting from record/line 1 to the end of the file). After this "initial" load, subsequent logfile checks will go from the last logfile record seen to the "new" end of the file. So basically only the changes since the last run, and nearly no traffic.
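        The incremental behaviour described here can be sketched in Python. This is only an illustration of the general technique (a saved byte offset, re-read from the start on truncation), not Zabbix's actual implementation:

```python
import os

def read_new_lines(path, offset):
    """Return the lines appended since `offset`, plus the new offset.

    On the first call (offset 0) the whole file is read -- the "initial"
    load described above. If the file is now smaller than the saved
    offset (i.e. it was rotated or truncated), reading restarts from the
    beginning of the file.
    """
    if os.path.getsize(path) < offset:
        offset = 0
    with open(path, "r") as f:
        f.seek(offset)
        new_lines = f.readlines()
        return new_lines, f.tell()
```

        On each later run only the bytes past the saved offset are read, which is why subsequent checks generate almost no traffic regardless of the file's total size.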

        Greets
        Patrick


        • buergi
          Junior Member
          • Aug 2008
          • 12

          #5
          Thank you!

          greets buergi


          • buergi
            Junior Member
            • Aug 2008
            • 12

            #6
            Is it normal that the server reads all of the logfile after it was restarted?


            • trikke
              Senior Member
              • Aug 2007
              • 140

              #7
              No! The last-logfile position should be stored in the DB (items table). Restarting the server should not trigger a complete "read/reload" of the file.


              • buergi
                Junior Member
                • Aug 2008
                • 12

                #8
                Ok. Maybe it sends only the last values of the logfile if I restart my Zabbix server?


                • Alexei
                  Founder, CEO
                  Zabbix Certified Trainer
                  Zabbix Certified Specialist
                  Zabbix Certified Professional
                  • Sep 2004
                  • 5654

                  #9
                  Yes, the ZABBIX agent will send only new entries to the server! It won't send 1 GB of information all over again.

                  Also, remember that you may use filtering on the agent side in order to make it much more efficient. Use the item key:

                  log["/var/log/huge_logfile","IPMI"]

                  Note the presence of the second parameter. It specifies a regular expression, so the agent will send only lines containing the string "IPMI".
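                  Agent-side filtering like this amounts to a regular-expression match applied to each new line before it is sent. A minimal Python sketch of the idea (the sample log lines are invented; the "IPMI" pattern is the one from the item key above):

```python
import re

def filter_lines(lines, pattern):
    """Keep only lines matching `pattern`, the way the second log[]
    parameter keeps the agent from sending non-matching lines at all."""
    rx = re.compile(pattern)
    return [line for line in lines if rx.search(line)]

entries = [
    "Jun 14 06:31:46 host1 IPMI BMC: temperature warning",
    "Jun 14 06:31:47 host1 sshd[4711]: session opened",
]
matched = filter_lines(entries, "IPMI")
# Only the first (IPMI) line survives; the sshd line is dropped
# before it would ever reach the server.
```

                  Because the non-matching lines are discarded on the agent, neither bandwidth nor server-side history is spent on them.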
                  Alexei Vladishev
                  Creator of Zabbix, Product manager
                  New York | Tokyo | Riga
                  My Twitter


                  • buergi
                    Junior Member
                    • Aug 2008
                    • 12

                    #10
                    I am already doing so; I am filtering the logfile with a regular expression.

                    Thank you guys!

                    Buergi


                    • tekknokrat
                      Senior Member
                      • Sep 2008
                      • 140

                      #11
                      Problem with massively growing logfiles

                      It seems there is a problem with the monitoring of huge, growing log files. I have some log[] items checked against one logfile that grows to 20 GB, and the agent doesn't manage to keep up.

                      I have now installed SEC for parsing the logfile and some checks.

                      Below there are 2 history values of the agent and the trapper (SEC):

                      Code:
                      logWatchSCG (/var/log/172.27.42.33/syslog) regex:test1 	14 Jun 08:37:51 	Jun 14 06:31:46 172. ...
                      logWatchSCG trap.match (test1) 	14 Jun 11:16:45 	Jun 14 09:10:45 172. ...
                      The first is the Zabbix agent item, the second the trapper from the SEC script.
                      The Zabbix agent lags three hours behind, and I am not sure whether it will find a further match today.

                      This is another example, where the agent matches an entry 20 h in the past. I must admit I restarted the agent the day before and also deleted the proxy database for test reasons on that day, but there is still at least a ten-hour lag for the agent.

                      Code:
                      logWatchSCG (/var/log/172.27.42.33/syslog) regex:test2 	14 Jun 06:47:08 	Jun 13 09:20:20 172. ...


                      • ldunston
                        Junior Member
                        • May 2009
                        • 12

                        #12
                        This may be unrelated, but I was having an issue with large (1+ GB) Apache log files where a small delay between the Zabbix timestamp under Latest data and the timestamp on the Apache log entries would start to skew and grow larger as the Apache servers got busy (upwards of 30+ minutes). It turns out I was using an older 1.4 version of the agent, and upgrading to 1.6.4 fixed the problem.

                        I remember coming across a forum post about a global variable that constrains the number of log entries the Zabbix agent will parse per second; it was increased from 10 to 100 in the 1.6.4 release and is tunable even higher. I suspect that's what was causing my issue.

                        Hope that helps someone else.
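                        The per-second limit Les describes corresponds to the MaxLinesPerSecond option in the agent configuration file of later releases; the exact name, default, and allowed range should be verified against the documentation for your agent version. A sketch:

```
# zabbix_agentd.conf (excerpt)
# Maximum number of new log lines the agent sends to the server per
# second -- raise it if a busy logfile outpaces the default.
MaxLinesPerSecond=100
```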

                        --Les

