Hi all,
I am running into an issue I cannot seem to solve and would like your help on it.
I have a robots.txt file on a website that I'd like to check with Zabbix. To that end I have created an item that pulls the file via a custom UserParameter key. So far so good; the item data looks like this:
User-agent: *
Disallow: /
Sitemap: https://www.someurlhere.com/sitemap.xml
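For completeness, the key behind the item is backed by a UserParameter in the agent configuration roughly along these lines (the curl command is just a sketch of my setup, not the exact line):

```
UserParameter=robots.check[*],curl -fsS "$1"
```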
Now I'd like to make sure this data never changes (sometimes people accidentally overwrite it), so I have created a trigger like this:
{Host:robots.check[https://www.someurlhere.com/robots.txt].regexp("User-agent: Disallow: / Sitemap: https://www.someurlhere.com/sitemap.xml")}=0
But that does not work. What I need is a trigger that fires whenever the item text differs from this static text. I have tried adding line breaks, escaping characters, and many other things, but it never triggers correctly.
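To illustrate what I suspect is going wrong: the item value contains real newlines, so a pattern written with spaces between the directives can never match it. Here is a quick Python sketch of the regex behavior (Python is only used to demonstrate; the exact flags Zabbix's regexp() applies may differ):

```python
import re

# What the item actually returns: the robots.txt body with real newlines.
item_value = (
    "User-agent: *\n"
    "Disallow: /\n"
    "Sitemap: https://www.someurlhere.com/sitemap.xml"
)

# A pattern with spaces between the directives never matches, because the
# item value separates them with newlines, not spaces.
flat = re.search(r"User-agent: \* Disallow: / Sitemap:", item_value)
print(flat)  # -> None

# With explicit \n (and regex metacharacters escaped) the same content matches.
pattern = (
    r"User-agent: \*\n"
    r"Disallow: /\n"
    r"Sitemap: https://www\.someurlhere\.com/sitemap\.xml"
)
print(re.search(pattern, item_value) is not None)  # -> True
```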
Can someone point me in the right direction?
Many thanks in advance.