Help with a refresh of LLD logic

This topic has been answered.
  • Rodolfo Rojas
    Junior Member
    • Jun 2023
    • 8

    #1

    Help with a refresh of LLD logic

    Hi all, I am stuck trying to get a Python script to work. I need to collect metrics from a query against a remote server and send them through zabbix_sender to a prototype item in Zabbix. Everything seemed fine, but the problem is that when the values arrive they end up in the item keys (creating new items) instead of being stored as item values. In other words, I am connecting correctly and receiving the metrics I want, but I am not assigning them as values to my items. I suspect I am not passing the received values to my macros, but I don't know where to make that assignment. I have checked my code many times and reviewed the LLD logic, but I am at a dead end. Can someone please help me?

    Code:
    from pyzabbix import ZabbixMetric, ZabbixSender
    from fabric import Connection
    import time
    import datetime
    import pytz
    import re
    import yaml
    import sys
    import json
    def sendZabbix(server, port, data):
        # Zabbix server address and port
        zabbix_server = server
        zabbix_port = port
        # Create a dictionary to hold the discovery data
        discovery_data = {'data': [],}
        for i in data:
            discovery_item = {
                '{#REPLID}': i[0],
                '{#REPLNAME}': i[1],
                '{#REPLCLUSTER}': i[2],
                '{#REPLFROZENDURATION}': str(i[4].total_seconds()),
                '{#REPLMASTER}': i[5],
                '{#REPLSTATUS}': i[6],
            }
            discovery_data['data'].append(discovery_item)
        # Convert the dictionary to a JSON string
        json_data = json.dumps(discovery_data)
        # Create a ZabbixMetric for the discovery rule ('host' is the global set in main())
        discovery_metric = ZabbixMetric(host, 'repl.discovery', json_data)
        # Create a ZabbixSender instance and send the discovery metric
        zabbix_sender = ZabbixSender(zabbix_server=zabbix_server, zabbix_port=10051)
        print(discovery_metric)
        result = zabbix_sender.send([discovery_metric])
        print(result)
        # Show result
        if result.failed:
            print(f"Error sending discovery metrics: {result.failed}")
        else:
            print("Discovery metrics sent successfully.")
          
    def runnerReplic(host, user, pssw):
        # Establish an SSH connection
        conn = Connection(host=host, user=user, connect_kwargs={"password": pssw})
        # Execute command
        timezone = conn.run("showtimezone -delim :", hide=True)
        frezee_time = conn.run("lsrcconsistgrp -delim ,", hide=True)
        # Print output
        print(frezee_time.stdout)
        match = re.search(r'\d+:(\w+/\w+)', timezone.stdout)
        # Close SSH
        conn.close()
        original_timezone = pytz.timezone("UTC")
    
        target_timezone = pytz.timezone(match.group(1))
    
        # Split the lines
        lineas = (frezee_time.stdout).split('\n')
        data = []
        # Iterate through the lines, skipping the first one (which contains the headers)
        for linea in lineas[1:]:
            elementos = linea.split(',')  # Split the line into comma-separated fields
            if len(elementos) >= 2:
                id = elementos[0]
                name = elementos[1]
                cluster_name = elementos[5]
                master_name = elementos[3]
                status = elementos[6]
                freeze_times = target_timezone.localize(datetime.datetime.strptime(elementos[-1], "%Y/%m/%d/%H/%M/%S"))
                freeze_times = freeze_times.astimezone(original_timezone)
                now = datetime.datetime.now(original_timezone).replace(microsecond=0)
                data.append([id,name,cluster_name,freeze_times,now-freeze_times,master_name,status])
        return data
    def main():
        global host
        host = sys.argv[1]
        # Credentials from the YAML
        conf = yaml.load(open('path/configuration.yml'), Loader=yaml.SafeLoader)
        user = conf['svc']['user']
        password = conf['svc']['password']
        data = runnerReplic(host, user, password)
        zabbix_server = 'localhost'
        port = 10050
        sendZabbix(zabbix_server, port, data)
    if __name__ == "__main__":
        main()
    
    [Attachment: 2023-10-25 14_26_01-Configuration of items.png]
    [Attachment: 2023-10-27 08_50_32.png]
  • Answer selected by Rodolfo Rojas at 07-11-2023, 19:23 (see post #4 below).

    • ISiroshtan
      Senior Member
      • Nov 2019
      • 324

      #2
      Can you remove the extra [ from your item prototype keys and try again? What I mean is, instead of
      [{#REPLID}_[{#REPLNAME}]
      use
      [{#REPLID}_{#REPLNAME}]
      And make a similar change for all item prototypes except Freeze Time, where you don't have that mistake.


      • Rodolfo Rojas
        Junior Member
        • Jun 2023
        • 8

        #3
        Thanks for your feedback. I corrected it, but that is not the problem itself. The problem is that I am receiving data and putting it into the new item prototypes, i.e. I am taking the data and assigning it to the macros (which is expected), but I don't see anything in Latest data.
        I guess I am not setting which value each item should take, so I suspect the problem is in my script. I made some modifications to assign each type of value to my macros, and now I get this error: Cannot find the "data" array in the received JSON object.

        [Attachment: 2023-10-30 11_49_49 zabbix_test.py - Visual Studio Code.png]
        [Attachment: 2023-10-30 11_41_19-.png]
        [Attachment: 2023-10-30 11_48_44-Configuration of discovery rules.png]


        • ISiroshtan
          Senior Member
          • Nov 2019
          • 324

          #4

          OK, first of all, if you want to use a trapper discovery rule and then trapper items, I would advise separating them into two different scripts. The discovery trapper will not fill items with data. Its aim is to receive JSON with the fields that you want to use in item names and item keys. This JSON can take 2 possible formats:

          1. Old style: in this case you are expected to send a JSON object with a "data" field containing an array, which will be iterated to create items from the prototypes. I.e.:
          Code:
          { "data":[
          { "Name":"/", "Type":"rootfs" },
          { "Name":"/sys", "Type":"sysfs" },
          { "Name":"/proc", "Type":"proc" },
          { "Name":"/dev", "Type":"devtmpfs" }]}
          2. New style, where you just need to feed a JSON array:
          Code:
          [
          { "Name":"/", "Type":"rootfs" },
          { "Name":"/sys", "Type":"sysfs" },
          { "Name":"/proc", "Type":"proc" },
          { "Name":"/dev", "Type":"devtmpfs" }]

          Any other combination, like providing a JSON object (not an array) without a "data" field, will result in an error.
          In both approaches above, Zabbix will iterate through the array, and for each element of the array it will create a new set of items based on the existing prototypes. So if you have 5 item prototypes, it will create (discover) 5 items FOR EACH object inside the JSON array.
          You MUST define some form of LLD macro and use it in the item prototype key so that the key is unique for each JSON element (item names and trigger names can be duplicates as far as I remember, so using an LLD macro there is up to you). Normally you would run this discovery trapper on a long interval, like once an hour or even less often, depending on how often you expect these items to change (how often new replications are added/removed/stopped/started on your system).
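          To make that first step concrete, here is a minimal sketch of a separate discovery script. It assumes the discovery rule key is repl.discovery, the host argument matches the host name configured in Zabbix, and rows has the layout produced by your runnerReplic(); it uses the old-style "data" wrapper, which the error in post #3 suggests your server expects. Adjust the macro names to your template.
          Code:
          # Minimal discovery sketch (old-style payload with a "data" wrapper).
          # Assumed: discovery rule key 'repl.discovery', rows as returned by runnerReplic().
          import json
          from pyzabbix import ZabbixMetric, ZabbixSender

          def send_discovery(zabbix_server, host, rows):
              # One object per replication, carrying only the LLD macros
              # referenced in the item prototype names and keys.
              discovery = {'data': [{'{#REPLID}': r[0], '{#REPLNAME}': r[1], '{#REPLCLUSTER}': r[2]} for r in rows]}
              metric = ZabbixMetric(host, 'repl.discovery', json.dumps(discovery))
              result = ZabbixSender(zabbix_server=zabbix_server, zabbix_port=10051).send([metric])
              print(result)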


          Now the second step is feeding values to these newly created items through the trapper. When you feed values, you need to use the newly discovered item keys, not the key of the discovery rule itself! And you don't need to feed JSON as the value; instead you push a single value representing what that specific item should hold. So you would send something like {"host": "172.x.x.x", "key": "repl.frozen[229.0]", "value": "192.0"} to feed the value "192.0" into the discovered item with key repl.frozen[229.0] on host 172.x.x.x.
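          For that second step, a sketch along these lines could go in the second script, assuming the item prototypes are keyed like repl.frozen[{#REPLID}] and repl.status[{#REPLID}] (those key names are only examples; use whatever your prototypes actually define):
          Code:
          # Value-feeding sketch: one plain value per discovered item key, no JSON wrapping.
          # Assumed prototype keys: 'repl.frozen[{#REPLID}]' and 'repl.status[{#REPLID}]'.
          from pyzabbix import ZabbixMetric, ZabbixSender

          def send_values(zabbix_server, host, rows):
              metrics = []
              for r in rows:
                  repl_id = r[0]
                  # r[4] is the frozen duration (a timedelta), r[6] is the replication status.
                  metrics.append(ZabbixMetric(host, f'repl.frozen[{repl_id}]', r[4].total_seconds()))
                  metrics.append(ZabbixMetric(host, f'repl.status[{repl_id}]', r[6]))
              result = ZabbixSender(zabbix_server=zabbix_server, zabbix_port=10051).send(metrics)
              print(result)
          Run the discovery script on the long interval and this one on your normal polling interval; host must match the host name configured in Zabbix, and 10051 is the server trapper port.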
          Last edited by ISiroshtan; 30-10-2023, 20:44.


          • Rodolfo Rojas
            Junior Member
            • Jun 2023
            • 8

            #5
            Thank you ISiroshtan, I was able to solve the problem.

