Monitoring cloud storage (Strato HIDrive)

This topic has been answered.
  • AusDieMaus
    Junior Member
    • May 2019
    • 3

    #1

    Monitoring cloud storage (Strato HIDrive)

    Hello all,

    I have cloud storage (Strato HiDrive) that I would like to monitor, in particular the total size of the stored files, so that a trigger can raise a warning when capacity runs low.

    I can operate the cloud storage via the following protocols:

    Encrypted:
    SFTP
    FTPS
    WebDAV (HTTPS)
    rsync over SSH
    OpenVPN
    SCP

    Not encrypted:
    FTP
    SMB/CIFS
    WebDAV (HTTP)



    Since there is no template for this exact storage, I wanted to ask for some input/ideas on how I could implement this cloud storage capacity monitoring.

    Thank you very much for the help.
    Many greetings.
  • Answer selected by AusDieMaus at 04-04-2022, 16:29.

    • AusDieMaus
      Junior Member
      • May 2019
      • 3

      #2
      Hello all,

      I have now managed to query the Strato HiDrive cloud storage so that I can raise a warning when the limits are about to be exceeded.
      I'll describe it briefly here in case someone wants to do something similar and needs help.

      Since Strato does not offer SSH access and the SCP functionality is heavily restricted, I had to implement the whole thing over SFTP.
      According to my research, the WebDAV, rsync and OpenVPN protocols were not suitable for this either.

      I implemented it as a Bash script; something similar should be possible on Windows via PowerShell, but for my purposes the Linux route was more convenient.
      This does mean you need a Linux machine running the Zabbix agent on which the script can be executed.


      For the SFTP access the script uses the lftp client, because the classic sftp client offers no way to pass a password directly from a script; lftp does, via the LFTP_PASSWORD environment variable together with --env-password.

      The script takes a directory inside the HiDrive as its argument and returns the combined size of all files in that directory (including subdirectories).


      Step 1)
      Install LFTP Client (under Debian: apt-get install lftp)

      Step 2)
      Create script for reading the HIdrive directory with the following content:
      ----------------------------------------------------------------------------------------------------------------
      #!/bin/bash
      # Sum the size (in bytes) of all files below the given HiDrive directory.
      # Usage: ./filename.sh /XXX_FOLDER/XXX_SUB_FOLDER/

      username="your_username"
      server="sftp.hidrive.strato.com"
      folder="$1"

      # lftp reads the password from LFTP_PASSWORD when started with --env-password,
      # so it never appears on the command line or in the process list.
      export LFTP_PASSWORD="your_supersafe_password"

      # 'find -l' lists all files recursively with the size in the third column;
      # awk sums those sizes and prints the total.
      lftp --env-password "sftp://$username@$server:$folder" -e "find -l; bye" | awk '{sum += $3} END {print sum}'
      unset LFTP_PASSWORD
      ----------------------------------------------------------------------------------------------------------------

      Step 3)
      Make the script executable (under Debian: chmod +x filename)
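
      For illustration, a manual run might then look like this (script name, directory and the output value are hypothetical placeholders):
      ----------------------------------------------------------------------------------------------------------------
      # Hypothetical example: sum up everything below /XXX_FOLDER/XXX_SUB_FOLDER/
      ./filename.sh "/XXX_FOLDER/XXX_SUB_FOLDER/"
      # Prints a single number, the total size in bytes, e.g. 1530082950932
      ----------------------------------------------------------------------------------------------------------------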

      Step 4)
      Register the script in the Zabbix agent's config file:
      UserParameter=hidrive.check[*],/folder/filename.sh $1
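
      Depending on the package, the agent config (on Debian usually /etc/zabbix/zabbix_agentd.conf) contains an Include line for a drop-in directory such as /etc/zabbix/zabbix_agentd.d/*.conf; check which path yours uses. A sketch, assuming that layout:
      ----------------------------------------------------------------------------------------------------------------
      # Assumption: the agent includes /etc/zabbix/zabbix_agentd.d/*.conf by default
      echo 'UserParameter=hidrive.check[*],/folder/filename.sh $1' > /etc/zabbix/zabbix_agentd.d/hidrive.conf
      ----------------------------------------------------------------------------------------------------------------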

      Step 5)
      Restart the Zabbix service (under Debian: service zabbix-agent restart)

      Step 6)
      Run the script manually once as a local Linux user and accept the server's host key into 'known_hosts' (under /home/user/.ssh/).
      Create the file 'ssh_known_hosts' in the directory '/etc/ssh/'.
      Adjust the permissions of 'ssh_known_hosts' (under Debian: chmod 644 /etc/ssh/ssh_known_hosts).
      Copy the corresponding server entry from the user's 'known_hosts' into 'ssh_known_hosts' in '/etc/ssh/'.

      This is necessary because the Zabbix agent runs as a nologin user without its own known_hosts file; without the system-wide entry the item times out in Zabbix, since the SSH host key is unknown and cannot be confirmed interactively.
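
      Alternatively, the host key can be fetched non-interactively with ssh-keyscan; a sketch, assuming you verify the fingerprint out of band before trusting it:
      ----------------------------------------------------------------------------------------------------------------
      # Fetch the server's host key and append it to the system-wide known-hosts file
      ssh-keyscan sftp.hidrive.strato.com >> /etc/ssh/ssh_known_hosts
      chmod 644 /etc/ssh/ssh_known_hosts
      ----------------------------------------------------------------------------------------------------------------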

      Step 7)
      Create a new template in Zabbix.

      Step 8)
      Create new items (one item per directory/customer to be read):
      Name: HIdrive_check XXX (XXX is replaced by the name of the directory/customer)
      Type: Zabbix agent
      Key: hidrive.check[/XXX_FOLDER/XXX_SUB_FOLDER/] (/XXX_FOLDER/XXX_SUB_FOLDER/ is replaced by the required directory structure)
      Unit: B
      Update interval: any (e.g.: 10800s)
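
      To check that the item key actually returns data, you can query the agent directly with zabbix_get (a sketch; the host placeholder is whatever address your agent listens on, and the agent must accept connections from the machine you run this on):
      ----------------------------------------------------------------------------------------------------------------
      # Should print the byte total produced by the script
      zabbix_get -s AGENT_HOST_OR_IP -k 'hidrive.check[/XXX_FOLDER/XXX_SUB_FOLDER/]'
      ----------------------------------------------------------------------------------------------------------------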

      Step 9)
      Create macros in the template (values can be adjusted as needed):
      1.
      Macro: {$FREE_CLOUD_HARD_WARNING}
      Value: 5
      Description: Hard warning free cloud disk space in percent

      2.
      Macro: {$FREE_CLOUD_SOFT_WARNING}
      Value: 10
      Description: Soft warning free cloud disk space in percent

      3.
      Macro: {$TOTAL_CLOUD_XXX}
      Value: 2199023255552
      Description: Total cloud disk space quota in bytes
      (XXX is replaced by the name of the directory/customer; the value must correspond to the limit in bytes set in the HIdrive settings, here e.g.: 2TB)

      For each additional directory, create another macro with its own individual {$TOTAL_CLOUD_XXX} value.


      Step 10)
      For each item, create one trigger each for the soft warning and the hard warning (XXX is replaced by the name of the directory/customer, and /XXX_FOLDER/XXX_SUB_FOLDER/ by the required directory structure).

      Soft-Warning:
      Name: Free (HiDrive) cloud disk space is less than {$FREE_CLOUD_SOFT_WARNING}% on company XXX
      Severity: Warning
      Expression: (100-(last(/template HIdrive/hidrive.check[/XXX_FOLDER/XXX_SUB_FOLDER/])/{$TOTAL_CLOUD_XXX}*100))<{$FREE_CLOUD_SOFT_WARNING}

      Hard-Warning:
      Name: Free (HiDrive) cloud disk space is less than {$FREE_CLOUD_HARD_WARNING}% on company XXX
      Severity: High
      Expression: (100-(last(/template HIdrive/hidrive.check[/XXX_FOLDER/XXX_SUB_FOLDER/])/{$TOTAL_CLOUD_XXX}*100))<{$FREE_CLOUD_HARD_WARNING}
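
      To illustrate how the expression evaluates, take a hypothetical reading: with 2000000000000 bytes used against the 2199023255552-byte quota, free space is 100 - 2000000000000/2199023255552*100 ≈ 9.05%, below the 10% soft limit but still above the 5% hard limit, so only the soft warning fires. The same arithmetic on the shell:
      ----------------------------------------------------------------------------------------------------------------
      # Hypothetical values; prints ~9.05 (free space in percent)
      awk 'BEGIN { used = 2000000000000; total = 2199023255552; print 100 - used/total*100 }'
      ----------------------------------------------------------------------------------------------------------------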


      DONE!
      Now all configured HiDrive directories are checked regularly, and a warning is raised when free space falls below the defined limits.


      Maybe someone here needs this and it saves them some research time.
      Many greetings.
