Read from file discovery rule
  • ijonjic
    Junior Member
    • Nov 2023
    • 10

    #1

    Read from file discovery rule

    Hi all,

I need help creating a template in Zabbix. Basically, what I want to achieve is the following:

    I have a file on my host (a Linux server). The file is generated (and parsed) by a local script run by cron every 15 minutes. The script checks some components of the database system and writes the names of the components whose status is "failed" to the file. The file has the following format:

    Name of the Component 1
    Name of the Component 2
    Name of the Component 3
    etc.

    I want to make a Zabbix template that has one discovery rule, one item prototype and one trigger prototype.

    The discovery rule needs to point to the file and read it line by line.
    One discovered item on the host = one line in the file (all discovered items need to have the value "Down").
    A trigger needs to be raised for every discovered item.

    Additional info:

    Zabbix version: 6.0.23.
    The target host already has the Zabbix agent installed.

    Help would be much appreciated. Thank you in advance.
  • cyber
    Senior Member
    Zabbix Certified Specialist, Zabbix Certified Professional
    • Dec 2006
    • 4807

    #2
    Why discovery? Isn't a simple log item with "PROBLEM event generation mode" set to "Multiple" enough?

    Creating items from your file would not be a big issue (it would be easier if your output were JSON formatted), but where should the values come from? It's not possible to force that value to "down"...
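    For context, cyber's log-item approach could be sketched roughly like this (the file path, host and key are placeholders, not from the thread; trigger syntax per Zabbix 6.0):

    ```text
    Item:     log[/path/to/failed_components.txt]    (Zabbix agent (active), type of information: Log)
    Trigger:  length(last(/Host/log[/path/to/failed_components.txt]))>0
              with "PROBLEM event generation mode" set to Multiple
    ```

    With multiple-event generation, each new line arriving in the log item raises its own PROBLEM event.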


    • ijonjic
      Junior Member
      • Nov 2023
      • 10

      #3
      I thought that discovery would be a perfect fit, since the number of components (lines in the file) changes. So you are suggesting that I make a template with a simple item that reads the file as a log? Could you maybe write down the steps for how to set that up?

      Thank you,

      IJ


      • PeterZielony
        Senior Member
        • Nov 2022
        • 146

        #4
        You could have a PowerShell script that reads the file and outputs JSON for LLD. I've got something more complex, where I have a config file with sections like [companyname] and different settings, but you can simplify it of course: you need a script that reads each line for the discovery rule and produces a specific JSON output. My example might give you an idea of what is what.
        https://www.zabbix.com/documentation...stom-lld-rules


        Use UserParameter like this:
        UserParameter=Config.discovery,powershell -File "C:\Program Files\Zabbix Agent 2\zabbix_agent2.d\scripts\ParseINI.ps1"

        Whether you need it depends on what you want to do with those values afterwards. Do you need to do something with them later, like dynamically creating items that monitor jobs with those names?

        UserParameter:
        https://www.zabbix.com/documentation...userparameters
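        Since the OP's host is Linux, the same UserParameter idea could be sketched in bash rather than PowerShell. This is a minimal sketch: the script path, the `component.discovery` key and the `{#COMPONENT}` macro are assumptions for illustration, not from the thread.

        ```shell
        #!/bin/bash
        # Sketch: turn a one-component-per-line file into Zabbix LLD JSON.
        # Hypothetical wiring in the agent config:
        #   UserParameter=component.discovery,/usr/local/bin/component_lld.sh /path/to/file
        lld_from_file() {
            local file="$1" out='{"data":[' sep='' line esc
            while IFS= read -r line; do
                [ -z "$line" ] && continue
                esc=${line//\\/\\\\}   # escape backslashes for JSON
                esc=${esc//\"/\\\"}    # escape double quotes for JSON
                out+="${sep}{\"{#COMPONENT}\":\"${esc}\"}"
                sep=','
            done < "$file"
            printf '%s]}\n' "$out"
        }

        # Demo with two failed components
        printf 'Component A\nComponent B\n' > /tmp/failed_components.txt
        lld_from_file /tmp/failed_components.txt
        ```

        Each line becomes one LLD row, so item and trigger prototypes can be created per component name.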

        This is my PS script. It takes a config like the following (from memory; the script ignores # as comments):

        Code:
        [Server]
        ...some settings for servers, which the PS script will ignore
        [Site1]
        CustID=1
        DBHost=URL
        DBName=DATABASENAME1
        [Site2]
        CustID=2
        DBHost=URL
        DBName=DATABASENAME2
        Code:
        function Get-IniFile {
            param(
                [parameter(Mandatory = $true)] [string] $filePath,
                [string] $anonymous = 'NoSection',
                [switch] $comments,
                [string] $commentsSectionsSuffix = '#',
                [string] $commentsKeyPrefix = 'Comment'
            )
            $ini = @{}

            switch -regex -file ($filePath) {
                "^\[(.+)\]$" {
                    # Section header
                    $section = $matches[1]
                    $ini[$section] = @{}
                    $CommentCount = 0
                    if ($comments) {
                        $commentsSection = $section + $commentsSectionsSuffix
                        $ini[$commentsSection] = @{}
                    }
                    continue
                }
                "^(;.*)$" {
                    # Comment line
                    if ($comments) {
                        if (!($section)) {
                            $section = $anonymous
                            $ini[$section] = @{}
                        }
                        $value = $matches[1]
                        $CommentCount = $CommentCount + 1
                        $name = $commentsKeyPrefix + $CommentCount
                        $commentsSection = $section + $commentsSectionsSuffix
                        $ini[$commentsSection][$name] = $value
                    }
                    continue
                }
                "^(.+?)\s*=\s*(.*)$" {
                    # Key = value pair
                    if (!($section)) {
                        $section = $anonymous
                        $ini[$section] = @{}
                    }
                    $name, $value = $matches[1..2]
                    $ini[$section][$name] = $value
                }
            }
            return $ini
        }
        $iniFile = "someinifile.ini"
        $y = Get-IniFile $iniFile

        $jsondirlist = "{`n"
        $jsondirlist += " `"data`":["
        $jsondirlist
        # total count of keys
        $total = $y.Keys.Count
        foreach ($Item in $y.Keys) {
            $settings = ""
            foreach ($Name in $y.Item($Item)) {
                if ($Name.Item("CustID") -eq $string -and $Item -eq "Server") {
                    $settings = "{ ""{#SERVER}"" : " + """$Item""" + " }"
                }
                else {
                    $settings = "{ ""{#COMPANYNAME}"" : " + """$Item""" + ", ""{#ID}"" : """ + $Name.Item("CustID").PadLeft(10,'0') + """" + ", ""{#USER}"" : """ + $Name.Item("DBUser") + """, ""{#DATABASENAME}"" : """ + $Name.Item("DBName") + """, ""{#DBHOST}"" : """ + $Name.Item("DBHost") + """ }"
                }
            }
            # position of this key among all keys
            $keyPosition = $($y.keys).indexOf($Item) + 1
            if ($keyPosition -ne $total) {
                $settings += ","
            }
            # if this is the last key, add the closing brackets
            else {
                $settings += "]`n}"
            }
            Write-Host $settings
        }
        Last edited by PeterZielony; 06-12-2023, 17:15.

        Hiring in the UK? Drop a message


        • Brambo
          Senior Member
          • Jul 2023
          • 245

          #5
          If you already have a script preprocessing your output, why not use zabbix_sender and push the file over?
          The layout of the file should then look something like this:
          Code:
          <your host ip> Keyname {"data":[{"{#lldmacro1}":"valuenumber 1","{#lldmacro2}":"parameter 2"},{"{#lldmacro1}":"valuenumber 33","{#lldmacro2}":"parameter 33"}]}
          <your host ip> Keyname.[valuenumber 1] parameter_2
          <your host ip> Keyname.[valuenumber 33] parameter_33
          The pushed-over value should not contain any spaces...
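          A concrete (hypothetical) sketch of building such a sender file on the Linux host, with placeholder host, key and macro names:

          ```shell
          #!/bin/bash
          # Sketch: build an input file for zabbix_sender ("<host> <key> <value>" per line).
          # The host name and key names are placeholders, not from the thread.
          host="my-linux-host"

          {
              printf '%s component.discovery {"data":[{"{#COMPONENT}":"ComponentA"}]}\n' "$host"
              printf '%s component.status[ComponentA] failure\n' "$host"
          } > /tmp/sender_input.txt

          # Push it to the server (-z = Zabbix server, -i = input file):
          # zabbix_sender -z zabbix.example.com -i /tmp/sender_input.txt
          ```

          The first line feeds a trapper-type discovery rule; the second feeds a trapper item prototype created from it.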
          Last edited by Brambo; 07-12-2023, 14:16.


          • ijonjic
            Junior Member
            • Nov 2023
            • 10

            #6
            Hi PeterZielony,

            Let me add some more details:

            So I have a connectivity_check.sh script that checks some components of an Adobe Connect on-premise infrastructure. The script (run by a cron job every 15 minutes) has the following output:

            Network connectivity test from node: Signaling Node ---> Media Server Node1 (SIP Connection) : failure
            Network connectivity test from container: ncc-debugger ---> Media Server Node2 (SIP Connection) : success
            webrtc-debugger attached successfully with apigw : failure


            Another script parses the above output down to this (keeping only the lines that are "failure"):

            Network connectivity test from node: Signaling Node ---> Media Server Node1 (SIP Connection)
            webrtc-debugger attached successfully with apigw


            This output is saved to the connectivity_check_output.txt file, which is overwritten with new info every 15 minutes.

            What I want to achieve is:

            Create a template with a discovery rule that reads the file above and creates items and triggers. In this example, the discovery rule needs to create 2 items and 2 triggers:

            ITEMS:
            Network connectivity test from node: Signaling Node ---> Media Server Node1 (SIP Connection) (with value "failure" or similar; true/false is also good if possible)
            webrtc-debugger attached successfully with apigw (with value "failure" or similar; true/false is also good if possible)

            TRIGGERS(PROBLEMS):
            Network connectivity test from node: Signaling Node ---> Media Server Node1 (SIP Connection)
            webrtc-debugger attached successfully with apigw

            Also, I need to have a trigger for the OK event, when the item is no longer in the file (problem solved).

            I'm not sure if this is possible...
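            The parse step described above could be sketched like this (sample data inlined for the demo; the real script and paths will differ):

            ```shell
            #!/bin/bash
            # Sketch of the parsing step: keep only the lines ending in ": failure"
            # and strip the status suffix. Paths are illustrative.
            raw="/tmp/connectivity_check_raw.txt"
            out="/tmp/connectivity_check_output.txt"

            printf '%s\n' \
              'Network connectivity test from node: Signaling Node ---> Media Server Node1 (SIP Connection) : failure' \
              'Network connectivity test from container: ncc-debugger ---> Media Server Node2 (SIP Connection) : success' \
              'webrtc-debugger attached successfully with apigw : failure' > "$raw"

            # Keep failed checks only, drop the trailing " : failure"
            grep ' : failure$' "$raw" | sed 's/ : failure$//' > "$out"
            cat "$out"
            ```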


            • PeterZielony
              Senior Member
              • Nov 2022
              • 146

              #7
              Originally posted by ijonjic (quoting post #6 above)

              Everything is possible; sorry for the late reply, I've been very busy lately.
              The only question is how you structure it.

              I can't help you with the script and write it for you, but maybe a different approach?

              You could let the Zabbix agent discover the node list, so in the Zabbix frontend you will see all nodes, say once every hour. You need to think of a method for how to do it (based on a global file, an API, or something else that can return the list of media server nodes). This is a custom low-level discovery rule, and you need to follow the JSON structure for low-level discovery to work.


              Then, for each node in the discovery, you set an item prototype that checks each node separately based on a value from your discovery. I'm not sure what exactly your initial check script does (is it an API call, a service state, a check whether a file exists...?).
              Then a trigger prototype checks whether the node is in a failed state, with a recovery expression for when it comes back as "success"/OK.

              At each step you can use preprocessing to manipulate the returned input.

              This way you will have all nodes checked and listed dynamically in Zabbix, and you will be notified if something goes wrong.



              Or you could go the hard, not very optimal way and do discovery based on a file that contains only problems. In Zabbix, discovery is used more to create a list of "points" (or nodes) that need monitoring, with a set of items attached for checking, so Zabbix can check them separately.

              You also don't want Zabbix to overdo this discovery if your data changes every 15 minutes, and you would need to think about how to cleanse the data. With your approach you won't have a history of each node's status: an item is destroyed when it stops reporting issues. Zabbix doesn't keep a history of item creation, and checking for historical data would be a nightmare, if not impossible.


              In my example I'm using a discovery script to check database points, and then for each DB I have item prototypes and triggers that check various things, like whether a cache folder named after the custid exists on the cache server, how many files are inside, the growth rate (as a calculated item), etc. If my INI file changes, Zabbix dynamically allocates checks for new customers and keeps all historical information in case I need to check something from last week.



              Take a step back and start small:
              1) Figure out how the Zabbix agent (or the Zabbix proxy/server itself) can check the state of an Adobe component on a single node, either via built-in functions or by writing your own tiny script that accepts a macro as its value(s) (see UserParameter and how to pass a value to it from Zabbix).
              2) Figure out how to create a trigger for the item you created in step 1.
              3) Figure out how to get the list of nodes that need checking. It could be a config file, an API, etc., containing the info required for a connection (IP address, name or other essential data needed to make a standalone query to the node); from that, create a discovery rule. This is custom LLD, and there are a few ways to create one; if custom, you have to follow the JSON structure. You can also use a UserParameter as the discovery source: a static item that, when run, returns JSON in the specific format.
              4) Redefine the item from steps 1 and 2 using macros, as an item and trigger prototype, so that they are created from the discovery rule (remember the JSON I talked about from the discovery rule? It contains value pairs like {#VALUENAME}:{ACTUAL_VALUE}). You could also use the output from your first .sh script, which has all the info; but again, I can't write the script for you. This will require some testing with live data, and the data would have to be simply represented (or you will have a headache when something changes). Relying on missing data as the "OK" state (nodata()) is bad practice, trust me.
              Essentially you want Zabbix to know both states, OK and fail, for all sites; based on an unstructured file it could get messy, and from experience I would rather let Zabbix do all checks directly against the "nodes". It's cleaner this way.


              "Essentially you want zabbix to know both states, for all sites - either OK or fail" - this is important. Once you have this in Zabbix, the rest is easy.


              5) Play with it, use preprocessing at each step, and read the documentation for each step; the docs make more sense when you are focusing on small problems.
              6) Enjoy a nice template and forget about manual checking. If something changes, it's better to modify one step than a whole script.

              With this approach you have more flexibility. If you get another requirement to check something on all nodes, for example "check if we get ping responses", you just add another item prototype; if you have the IP address from LLD, you simply use it in the new item prototype and all pings will be checked, since the macro is already set up with the required data.
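              Step 3 could start from something as simple as a nodes file. A minimal bash sketch (the file format and the {#NODENAME}/{#NODEIP} macros are made up for illustration):

              ```shell
              #!/bin/bash
              # Sketch for step 3: a custom LLD source that lists nodes from a "name,ip" file.
              # The CSV format and macro names are assumptions, not from the thread.
              node_lld() {
                  local file="$1" out='[' sep='' name ip
                  while IFS=',' read -r name ip; do
                      out+="${sep}{\"{#NODENAME}\":\"${name}\",\"{#NODEIP}\":\"${ip}\"}"
                      sep=','
                  done < "$file"
                  printf '%s]\n' "$out"
              }

              # Demo with two nodes
              printf 'signaling-node,10.0.0.1\nmedia-node1,10.0.0.2\n' > /tmp/nodes.csv
              node_lld /tmp/nodes.csv
              ```

              Zabbix 6.0 accepts a plain JSON array as LLD output, so the older {"data":[...]} wrapper is optional here.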

              I hope this makes sense
              Last edited by PeterZielony; 10-12-2023, 04:19.



              • ijonjic
                Junior Member
                • Nov 2023
                • 10

                #8
                Hi PeterZielony,

                no worries.


                You can find the full output of my connectivity_check.sh script attached. Real IPs and hosts are masked.

                The connectivity_check.sh script is made by Adobe, I think, and apparently it was created to be executed on the CLI, not to be used for any further processing.

                But there are also multiple servers that I want to monitor, and each of them has a different role in the system: some are gateways, some are used for the application, and others for saving the recordings. Each of these servers has a different number of TESTS, but all of those tests are a subset of the attached output.

                I don't know how to create monitoring for all servers so that I don't need to rely on a fixed number of tests, or on the possibility that Adobe will add more tests in the future.

                In addition, there is another possibility: reading the log file of the connectivity_check.sh script. In that log file every test is represented like this:

                {"timestamp": "2023-12-04T13:45:20.875019", "test": 59, "status": "success", "message": " Redis connection [API - x.x.x.x 6379]", "stdout": "", "stderr": ""}

                Maybe this can help.
                Attached Files


                • PeterZielony
                  Senior Member
                  • Nov 2022
                  • 146

                  #9
                  Then, if you don't have control over the tests... Create a template with a standard static item to read the full file:
                  https://www.zabbix.com/documentation/current/en/manual/config/items/itemtypes/log_items

                  Create a discovery that points to a script (UserParameter). This UserParameter script needs to return LLD JSON containing unique parameters that identify each test (unless you read the JSON log; see note 3 below).
                  You will have to write the script to do it (use ChatGPT if you don't know how to create it), but you need to follow the custom LLD JSON documentation.
                  A simplified example of the output (add more keys to each JSON block if you need them):
                  [
                  { "{#UNITTEST}":"TEST 1" },
                  { "{#UNITTEST}":"TEST 2" }
                  ]
                  https://www.zabbix.com/documentation...stom-lld-rules

                  Quick ChatGPT script (not tested):
                  Code:
                  #!/bin/bash

                  # Assuming the input file is named "log.txt"
                  input_file="log.txt"

                  # Keep only test lines with a success/failure status, then pull out
                  # the "[TEST n]" tag and format one JSON object per test
                  json_output=$(grep -P ':\s+(success|failure)\s*$' "$input_file" | \
                      grep -oP '\[TEST\s+\d+\]' | tr -d '[]' | \
                      awk '{printf "{ \"{#UNITTEST}\":\"%s\" },", $0}')

                  # Remove the trailing comma and wrap the output in square brackets
                  json_output="[${json_output%,}]"

                  # Print the final JSON output
                  echo "$json_output"
                  output:
                  Code:
                  [
                    { "{#UNITTEST}":"TEST 1" },
                    { "{#UNITTEST}":"TEST 2" }
                  ]
                  { "{#UNITTEST}":"TEST 1" } will be passed to each item/trigger prototype. If you need it, each test can carry more values, like this:
                  { "{#UNITTEST}":"TEST 1", "{#UNITTESTDESCRIPTION}":"description", "{#UNITIP}":"127.0.0.1", "{#UNITPORT}":"123" }
                  Note: when sending JSON, remove empty spaces.
                  Note 2: using JSONPath ({#FSNAME} → $.fsname and {#FSTYPE} → $.fstype): as per the documentation you don't need the "#MACRO" naming in the script; you can map the macros later with JSONPath preprocessing. But if I were creating the script, I would map them at the script level already (that's my personal choice).
                  Note 3: {"timestamp": "2023-12-04T13:45:20.875019", "test": 59, "status": "success", "message": " Redis connection [API - x.x.x.x 6379]", "stdout": "", "stderr": ""} - you could use this line to build the discovery too, but the overall structure will look slightly different: instead of a UserParameter running your script, you could read the file with a log item.
                  https://www.zabbix.com/documentation...persistent-dir
                  Then map the macros as per note 2. You will have to create an item prototype keyed on the macro mapped to the "test" value, as it seems to be the most unique one, and I guess it should stay the same for each test run.

                  [{"{#UNITTEST}":"TEST 1","{#UNITTESTDESCRIPTION}":"description","{#UNITIP}":"127.0.0.1","{#UNITPORT}":"123"},{"{#UNITTEST}":"TEST 1","{#UNITTESTDESCRIPTION}":"description","{#UNITIP}":"127.0.0.1","{#UNITPORT}":"123"}]
                  formatted json can look like this:
                  [
                  {
                  "{#UNITTEST}": "TEST 1",
                  "{#UNITTESTDESCRIPTION}": "description",
                  "{#UNITIP}": "127.0.0.1",
                  "{#UNITPORT}": "123"
                  }
                  ,
                  {
                  "{#UNITTEST}": "TEST 1",
                  "{#UNITTESTDESCRIPTION}": "description",
                  "{#UNITIP}": "127.0.0.1",
                  "{#UNITPORT}": "123"
                  }

                  ]





                  Create dependent items as item prototypes (pointing to the static item created first) that capture each line for a given type of test (TEST 1, TEST 2, etc.), using a string filter on the {#UNITTEST} macro; this will separate the tests into individual items. If you use your JSON (note 3), then you simply use JSONPath to filter the results.

                  Then create a trigger prototype that looks at the filtered results for "failure" from each TEST.
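                  A rough sketch of how these pieces could fit together in the template (template name, keys and macros are illustrative; trigger syntax follows Zabbix 6.0 conventions):

                  ```text
                  Master item:       log[/path/to/connectivity_check.log]        (type of information: Log)
                  Discovery rule:    UserParameter-based item returning LLD JSON (one row per test)
                  Item prototype:    test.status[{#UNITTEST}]                    (dependent on the log item)
                                     preprocessing: JSONPath $.status  (or a regex filter on {#UNITTEST})
                  Trigger prototype: last(/TemplateName/test.status[{#UNITTEST}])="failure"
                                     recovery: last(/TemplateName/test.status[{#UNITTEST}])="success"
                  ```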

                  That is how I would do it, but I can't write the LLD script for you. This should give you a general overview of how to achieve it.

                  If you can't get the list of nodes and types, you will need to create a separate discovery for each type of server. Each server type would also have to have a separate file if you have more servers.

                  It doesn't sound very easy if those are Docker images or whatever else. I don't know why you don't monitor each node with a Zabbix agent from the inside, rather than connecting to them. And to be honest, low-level discovery is overkill here anyway. You could use your script that filters "failure" and send those lines to a generic trapper item with zabbix_sender. But again, you don't want to close a problem based on nodata(); that is bad practice and not very efficient.

                  Overall, everything is possible, but I'm not sure this method is worth doing: one thing will change and you'll have to rethink the whole design. If you read all of the above, you now have 2 methods to achieve it, both very similar, giving the same result. There are more ways to do it, but for this task they are not worth mentioning.
                  Last edited by PeterZielony; 11-12-2023, 13:48.



                  • ijonjic
                    Junior Member
                    • Nov 2023
                    • 10

                    #10
                    Thank you PeterZielony.

                    I've prepared the log file to look like this:

                    {"TestNo": "TEST 60","Message": "webrtc-debugger de-attached successfully","Status": "Success"}
                    {"TestNo": "TEST 61","Message": "webrtc-debugger removed successfully","Status": "Success"}


                    I have managed to do the following:

                    - Created a template (TemplateAdobeConnect)
                    - Created an item inside the template:
                    Name: READ_LOG_FILE
                    Type: Zabbix agent (active)
                    Key: log[{$FILE_PATH}] - {$FILE_PATH} is a host-level macro which points to the log file location on that specific host.
                    Type of information: Log
                    Interval: 1m

                    I'm getting the following output:

                    (screenshot attached: Item_ouput_screenshot.png)

                    - Created a discovery rule:
                    Name: READ_LOG_FILE_Discovery
                    Type: Dependent item
                    Key: read_log_file_discovery
                    Master item: TemplateAdobeConnect: READ_LOG_FILE

                    Added 3 LLD Macros:
                    {#MESSAGE} --> $.Message
                    {#STATUS} --> $.Status
                    {#TESTNUMBER} --> $.TestNo

                    Created Item prototype:
                    Name: {#MESSAGE}
                    Type: Dependent item
                    Key: log.item.prototype
                    Type of information: Log
                    Master item: TemplateAdobeConnect: READ_LOG_FILE

                    I'm not getting any discovery; clearly I'm missing something. Please help if you can.

                    Once more, thank you.
                    Last edited by ijonjic; 12-12-2023, 13:19.


                    • PeterZielony
                      Senior Member
                      • Nov 2022
                      • 146

                      #11
                      Can you export the template and share it here with a sample file? I'll try to look at it, possibly tomorrow or the day after.



                      • ijonjic
                        Junior Member
                        • Nov 2023
                        • 10

                        #12
                        No worries, please take a look at it when you catch some free time.
                        Export and sample file attached.

                        zbx_export_templates.txt is originally a YAML file, so please just change the extension from .txt to .yaml (if you have a problem importing), as I'm not allowed to upload the .yaml version.

                        Thanks in advance.

                        Best regards
                        Attached Files


                        • PeterZielony
                          Senior Member
                          • Nov 2022
                          • 146

                          #13
                          Originally posted by ijonjic (quoting post #12 above)
                          Does the output in the sample log file come from the Adobe .sh script, or is it based on the original output file and constructed into JSON?



                          • ijonjic
                            Junior Member
                            • Nov 2023
                            • 10

                            #14
                            The sample log file comes from my script, which additionally parses the Adobe script output.

                            The original output looks like this:

                            [TEST 45] --> webrtc-debugger de-attached successfully : success



                            • PeterZielony
                              PeterZielony commented
                              Editing a comment
                              Can you upload this script too? I guess we would need it to run the discovery using a UserParameter.

                              I guess I was wrong about it. I mean, the discovery has to read the full file that is sent to it; instead of writing to a file, this script needs to simply output the JSON. When using discovery with a UserParameter, the Zabbix agent will capture that output and pass it back to the discovery for processing.

                              It seems we can't use a dependent item as the discovery, since the log item delivers a single line at a time and the discovery needs to read it all in one go.
                              Last edited by PeterZielony; 14-12-2023, 13:20.
                          • ijonjic
                            Junior Member
                            • Nov 2023
                            • 10

                            #15
                            Sure, here you go...
                            Attached Files



                            • PeterZielony
                              PeterZielony commented
                              Editing a comment
                              I mean the script that converts this to JSON.

                            • ijonjic
                              ijonjic commented
                              Editing a comment
                              Sorry... uploaded now.