I was wondering if someone could share their experience with monitoring services running inside Docker containers.
We have a couple dozen containers running various services (nginx, php-fpm, mysql) and would like to use Zabbix to alert on them. We have working setups for our traditional, non-containerized services, but are feeling a bit in the dark on how to go about this for containers.
Our options, as far as we can tell, are:

1. Install a single "super agent" on the bare-metal host that connects to the containers and performs the various checks. The problem is that we do not have access to the networks the containers run on, so we would need to map each service to an external port (say mariadb's 3306, for example). This could get ugly quickly.

2. Install an agent in each container and configure every container by hand. This seems time-consuming and a potential performance hit, and we would lose the benefit of pre-made images (e.g. the official mariadb image from hub.docker.com); we would have to build and maintain our own images every time a new release came out.

3. Perform server-side monitoring only. Again, we would have to expose specific ports on our network to reach those services (a rough sketch of what we mean is below).
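To illustrate what we mean for options 1 and 3: the idea would be to publish the container's port on the host (e.g. docker run -p 13306:3306 mariadb, where 13306 is just a made-up host port) and have Zabbix check it from outside the container. A minimal sketch of such a check, assuming it would be wired in as a Zabbix external check or UserParameter script (the script name, host and port below are purely illustrative):

#!/usr/bin/env python3
# check_tcp.py - rough sketch of an "outside the container" service check.
# Assumes the container port has been published to the host, e.g.
#   docker run -p 13306:3306 mariadb   (13306 is an arbitrary example port)
# Prints 1 if the TCP port accepts a connection, 0 otherwise, i.e. the kind
# of 0/1 value a Zabbix item and trigger could alert on.

import socket
import sys

def port_is_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    host = sys.argv[1] if len(sys.argv) > 1 else "127.0.0.1"
    port = int(sys.argv[2]) if len(sys.argv) > 2 else 13306
    print(1 if port_is_open(host, port) else 0)

For plain reachability, Zabbix's built-in net.tcp.service simple check should do much the same thing without any script; the sketch is only meant to show where the check would sit relative to the container, not an actual deeper service check (replication status, slow queries, php-fpm pool stats, etc.), which is the part we are unsure about.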
Am I missing something? There is plenty of information online about monitoring Docker containers, but it is almost entirely about CPU usage, memory and so on, and says very little about monitoring the actual services inside the containers themselves.