Zabbix API - Need help with a workaround - Reached max JSON size limit?

  • diveratis
    Junior Member
    • Aug 2013
    • 4

    #1


    Using:
    Python 2.4
    Zabbix 2.0.5
    RHEL 5

    Previously I had an issue with importing groups and templates linked to hosts, and I found that I should have used templateLinkage instead of templates.

    However, it seems that after adding templateLinkage, when I call configuration.import on XML files larger than a certain size, the script outputs a JSON error:

    Code:
    Traceback (most recent call last):
        File "/usr/lib64/python2.4/site-packages/simplejson/__init__.py", line 267, in load
           parse_constant=parse_constant, **kw)
        File "/usr/lib64/python2.4/site-packages/simplejson/__init__.py", line 307, in loads
           return _default_decoder.decode(s)
        File "/usr/lib64/python2.4/site-packages/simplejson/decoder.py", line 335, in decode
           obj, end = self.raw_decode(s, idx=_w(s, 0).end())
        File "/usr/lib64/python2.4/site-packages/simplejson/decoder.py", line 353, in raw_decode
           raise ValueError("No JSON object could be decoded")
    ValueError: No JSON object could be decoded
    This doesn't appear with smaller .xml files that I've exported via the web server. If I break the larger .xml files into smaller files, they all work fine individually.

    Is there a limit on the size of the XML file I can pass in via the API, and if so, is there any way I can work around it without having to break larger files into smaller ones?

    Unfortunately, given my inexperience with most of this, I'm having trouble visualizing how to get around this issue. Right now all I can think of is parsing the entire XML file at once and traversing the string until I've reached a certain number of characters and found a </host>, since most of my export files end in
    Code:
            .
            .
            </host>
        </hosts>
    </export_zabbix>
    split the string there and append </hosts>\n</export_zabbix> so that each piece follows the standard Zabbix export template, and then call configuration.import on each of the split strings. This seems like a really awful workaround...
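    A rough sketch of that chunking idea (shown in modern Python for readability; the original script ran on Python 2.4). The tag names follow the export tail quoted above, and hosts_per_chunk is an arbitrary assumption to tune against whatever size limit is being hit:

    ```python
    def split_export(xml_text, hosts_per_chunk=50):
        # Everything up to and including <hosts> is reused as the header
        # of every chunk (root element plus any metadata before the hosts).
        head, sep, rest = xml_text.partition("<hosts>")
        head += sep
        # Separate the host entries from the closing </hosts>...</export_zabbix>.
        hosts_xml, tail_rest = rest.rsplit("</hosts>", 1)
        hosts = [h + "</host>" for h in hosts_xml.split("</host>") if h.strip()]
        # Re-wrap each slice of hosts so it is a complete export document again.
        return [
            head + "".join(hosts[i:i + hosts_per_chunk]) + "</hosts>" + tail_rest
            for i in range(0, len(hosts), hosts_per_chunk)
        ]
    ```

    Each returned string would then be fed to its own configuration.import call.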

    Searching the Zabbix forums on JSON issues, I came across this post:

    Which seems like it could be related to my problem. However, I don't have access to the source files, so unless there's some way to pass in an environment variable to change MAX_BUFFER_LEN [assuming this post is related to my issue], I'm out of luck.
  • BDiE8VNy
    Senior Member
    • Apr 2010
    • 680

    #2
    As far as I know, MAX_BUFFER_LEN is used only in the processing of values.
    Have you already inspected the web server log files?

    Maybe one of these has been exceeded:
    Code:
    grep ^\ *php_value /etc/httpd/conf.d/zabbix.conf
        php_value max_execution_time 300
        php_value memory_limit 384M
        php_value post_max_size 16M
        php_value upload_max_filesize 2M
        php_value max_input_time 300


    • diveratis
      Junior Member
      • Aug 2013
      • 4

      #3
      Thanks for the reply. Your advice gave me a great place to start looking, but unfortunately no immediate solution was found.

      I looked at my /var/log/httpd/error_log file and noticed a bunch of PHP Fatal Errors that all mentioned

      Code:
      PHP Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 76 bytes) in /usr/share/zabbix/include/classes/debug/CProfiler.php on line 232.
      I also get some memory size limit errors pointing to /usr/share/zabbix/include/db.inc.php, line 572.

      Line 232 in CProfiler.php refers to:
      Code:
      public function profileSql($time, $sql){
         global $USER_DETAILS;
         if ((!is_null($USER_DETAILS) && isset ($USER_DETAILS['debug_mode']) && ($USER_DETAILS['debug_mode'] == GROUP_DEBUG_MODE_DISABLED))){
            return;
         }
         
         $time = round($time, 6);
         $this->sqlTotalTime+= $time;
         $this->sqlQueryLog[] = array(
            $time,
            $sql,
         array_slice(debug_backtrace(), 1)
          );
      }
      Line 572 in db.inc.php refers to:
      Code:
      function DBfetch(&$cursor, $convertNulls = true){
      .
      .
      .
         switch ($DB['TYPE']){
            case ZBX_DB_MYSQL:
                if (!$result = mysql_fetch_assoc($cursor)) {
                  mysql_free_result($cursor);
               }
               break;
      .
      .
      I spent a little time Googling this memory error, which led me to changing memory_limit in /etc/php.ini and then restarting the httpd service. But regardless of what value I set memory_limit to [I've tried 32M, 128M, 256M, 512M], I still get this memory size limit error.


      If you can provide some assistance for this issue, that'd be great.

      For the time being, I ended up going with the terrible workaround of splitting the large XML file into smaller strings, running a regex on each string for XML opening and closing tags, and individually calling configuration.import on each string. That seems to work for me at the moment.
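      For reference, each chunk is then sent as its own configuration.import call. A hedged sketch of building that JSON-RPC payload (shown in modern Python; the rules block, the createMissing/updateExisting choices, and the auth token handling are assumptions to adapt to your setup):

      ```python
      import json

      def build_import_request(xml_chunk, auth_token, request_id=1):
          # JSON-RPC 2.0 envelope for Zabbix's configuration.import method.
          # The rules below are an assumption: create missing hosts, update
          # existing ones, and link templates via templateLinkage, as
          # discussed earlier in this thread.
          return json.dumps({
              "jsonrpc": "2.0",
              "method": "configuration.import",
              "params": {
                  "format": "xml",
                  "rules": {
                      "hosts": {"createMissing": True, "updateExisting": True},
                      "templateLinkage": {"createMissing": True},
                  },
                  "source": xml_chunk,
              },
              "auth": auth_token,
              "id": request_id,
          })
      ```

      Each payload gets POSTed to api_jsonrpc.php with Content-Type: application/json-rpc; the auth_token here stands in for the result of a prior user.login call.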
