Using:
Python 2.4
Zabbix 2.0.5
RHEL 5
Previously I had an issue with importing groups and templates linked to hosts, and I found that I should have used templateLinkage instead of templates.
However, after adding templateLinkage, when I call configuration.import on XML files larger than a certain size, the script fails with a JSON error:
Code:
Traceback (most recent call last):
  File "/usr/lib64/python2.4/site-packages/simplejson/__init__.py", line 267, in load
    parse_constant=parse_constant, **kw)
  File "/usr/lib64/python2.4/site-packages/simplejson/__init__.py", line 307, in loads
    return _default_decoder.decode(s)
  File "/usr/lib64/python2.4/site-packages/simplejson/decoder.py", line 335, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib64/python2.4/site-packages/simplejson/decoder.py", line 353, in raw_decode
    raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded
This error doesn't appear on the smaller .xml files I've exported via the webserver, and if I break the larger .xml files into smaller ones, each of them imports fine on its own.
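For what it's worth, that ValueError only means simplejson was handed something that isn't JSON at all, which on oversized requests is often an empty, truncated, or HTML error response rather than an API reply. A small wrapper like the one below (a sketch; the function name is mine, and it assumes you call loads() on the raw response body yourself) would at least show what the server actually sent back:

```python
# Diagnostic sketch: decode the API response, but on failure surface the
# raw body instead of only "No JSON object could be decoded".
try:
    import simplejson as json   # Python 2.4, as in the traceback above
except ImportError:
    import json                 # newer Pythons ship json in the stdlib

def decode_api_response(body):
    """Return the decoded JSON response, or raise showing the raw body."""
    try:
        return json.loads(body)
    except ValueError:
        # On oversized requests the body is often empty or a PHP/HTML
        # error page rather than JSON; show the start of it.
        snippet = body[:200]
        raise ValueError('Server did not return JSON; first 200 bytes: %r'
                         % snippet)
```

If the snippet turns out to be empty or a PHP error, the problem is on the server side of the request, not in the script's JSON handling.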
Is there a limit on the size of the XML file I can pass in via the API, and if so, is there any way to work around it without breaking larger files into smaller ones?
Unfortunately, given my inexperience with most of this, I'm having trouble seeing a way around the issue. Right now all I can think of is to read the entire XML file at once and traverse the string until I've passed a certain number of characters and found a </host>, since most of my export files end in
Code:
.
.
</host>
</hosts>
</export_zabbix>
then split the string there and append </hosts>\n</export_zabbix> to each piece so that it follows the standard Zabbix export layout, and finally call configuration.import on each piece. This seems like a really awful workaround...
Searching the Zabbix forums for JSON issues, I came across a post that seems like it could be related to my problem. However, I don't have access to the source files, so unless there's some way to pass in an environment variable to change MAX_BUFFER_LEN [assuming that post is related to my issue], I'm out of luck.
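The splitting workaround described above can be sketched roughly as follows. This is only an illustration under assumptions taken from the export tail shown in the post: everything before the first <host> is treated as a header to repeat in each piece, each piece is closed with </hosts>\n</export_zabbix>, and the function name and chunk size are mine, not anything Zabbix provides.

```python
# Sketch of the split-on-</host> workaround: cut one large Zabbix export
# string into several smaller, well-formed export strings.

CHUNK_CHARS = 500000  # rough target size per piece; tune to your limit

def split_export(xml_text, chunk_chars=CHUNK_CHARS):
    start = xml_text.index('<host>')            # header ends at first <host>
    header = xml_text[:start]
    end = xml_text.rindex('</host>') + len('</host>')
    body = xml_text[start:end]                  # all <host>...</host> entries
    footer = '</hosts>\n</export_zabbix>'       # tail as shown in the post

    pieces = []
    pos = 0
    while pos < len(body):
        # Cut at the first </host> at or after the target chunk size, so
        # no <host> element is ever split in the middle.
        cut = body.find('</host>', pos + chunk_chars)
        if cut == -1:
            cut = len(body)
        else:
            cut += len('</host>')
        pieces.append(header + body[pos:cut] + footer)
        pos = cut
    return pieces

# Each returned piece can then be passed to configuration.import in turn,
# keeping every individual request under whatever buffer limit the server
# enforces.
```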