Datadog System Installation and Configuration through Docker
Table of Contents
System Installation
https://docs.datadoghq.com/getting_started/agent/?tab=datadogussite
DD_API_KEY=1234567890 bash -c "$(curl -L https://raw.githubusercontent.com/DataDog/datadog-agent/master/cmd/agent/install_mac_os.sh)"
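The one-liner above installs the Agent directly on macOS (the API key shown is a placeholder; substitute your own). Assuming the installer has put the datadog-agent binary on your PATH, a quick way to confirm the Agent is running and reporting:
datadog-agent status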
System through Docker
DD_API_KEY=1234567890 bash -c "$(curl -L https://raw.githubusercontent.com/DataDog/datadog-agent/master/cmd/agent/install_script.sh)"
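The script above performs the equivalent direct install on a Linux host. To run the Agent through Docker instead, the official datadog/agent image can be started with the same mounts used in the Compose file further below; a minimal sketch (substitute your own API key):
docker run -d --name datadog-agent \
  -v /var/run/docker.sock:/var/run/docker.sock:ro \
  -v /proc/:/host/proc/:ro \
  -v /sys/fs/cgroup/:/host/sys/fs/cgroup:ro \
  -e DD_API_KEY=1234567890 \
  datadog/agent:latest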
Docker System Installation
https://app.datadoghq.com/account/settings#agent/debian
https://hub.docker.com/r/datadog/docker-dd-agent/dockerfile/
FROM alpine:3.6
MAINTAINER Datadog <package@datadoghq.com>
ARG AGENT_VERSION_ARG=5.31.0
ENV DD_HOME=/opt/datadog-agent \
    # prevent the agent from being started after install
    DD_START_AGENT=0 \
    DOCKER_DD_AGENT=yes \
    PYCURL_SSL_LIBRARY=openssl \
    AGENT_VERSION=$AGENT_VERSION_ARG \
    INTEGRATIONS_VERSION=$AGENT_VERSION_ARG \
    DD_ETC_ROOT="/opt/datadog-agent/agent" \
    PATH="/opt/datadog-agent/venv/bin:/opt/datadog-agent/agent/bin:$PATH" \
    PYTHONPATH="/opt/datadog-agent/agent" \
    DD_CONF_LOG_TO_SYSLOG=no \
    NON_LOCAL_TRAFFIC=yes \
    DD_SUPERVISOR_DELETE_USER=yes \
    DD_CONF_PROCFS_PATH="/host/proc"
# Install minimal dependencies
RUN apk add -qU --no-cache coreutils curl curl-dev python-dev tar sysstat tini
# Install build dependencies
ADD https://raw.githubusercontent.com/DataDog/dd-agent/master/packaging/datadog-agent/source/setup_agent.sh /tmp/setup_agent.sh
RUN apk add -qU --no-cache -t .build-deps gcc musl-dev postgresql-dev linux-headers libffi-dev \
    # Install the agent
    && sh /tmp/setup_agent.sh \
    # Clean build dependencies
    && apk del -q .build-deps \
    && rm /tmp/setup_agent.sh
# Add healthcheck script
COPY probe-alpine.sh $DD_HOME/probe.sh
# Configure the Agent
# and make healthcheck script executable
RUN cp ${DD_ETC_ROOT}/datadog.conf.example ${DD_ETC_ROOT}/datadog.conf \
    && chmod +x $DD_HOME/probe.sh
# Add Docker check
COPY conf.d/docker_daemon.yaml "${DD_ETC_ROOT}/conf.d/docker_daemon.yaml"
# Add install and config files
COPY entrypoint.sh /entrypoint.sh
COPY config_builder.py /config_builder.py
# Extra conf.d and checks.d
VOLUME ["/conf.d", "/checks.d"]
# Expose DogStatsD port
EXPOSE 8125/udp
# Healthcheck
HEALTHCHECK --interval=5m --timeout=3s --retries=1 \
    CMD ./probe.sh
ENTRYPOINT ["/sbin/tini", "-g", "--", "/entrypoint.sh"]
WORKDIR $DD_HOME
CMD ["supervisord", "-c", "agent/supervisor.conf"]
Dockerfile
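To try this image locally, a build-and-run sketch: the tag dd-agent-alpine is arbitrary, the files referenced by the COPY instructions (probe-alpine.sh, entrypoint.sh, config_builder.py, conf.d/docker_daemon.yaml) are assumed to sit next to the Dockerfile, and this legacy Agent 5 image is assumed to read its key from the API_KEY variable:
docker build -t dd-agent-alpine .
docker run -d --name dd-agent-alpine \
  -v /var/run/docker.sock:/var/run/docker.sock:ro \
  -v /proc/:/host/proc/:ro \
  -v /sys/fs/cgroup/:/host/sys/fs/cgroup:ro \
  -e API_KEY=1234567890 \
  -p 8125:8125/udp \
  dd-agent-alpine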
Docker Compose
https://docs.docker.com/compose/
https://github.com/DataDog/trace-examples/blob/master/python/docker-compose.yml
https://github.com/y-ohgi/datadog-examples/blob/655af48a104827f468424a635bfd9b801dcabec0/express/docker-compose.yml
version: "3.4" services: agent: image: datadog/agent:latest volumes: - /var/run/docker.sock:/var/run/docker.sock:ro - /proc/:/host/proc/:ro - /sys/fs/cgroup/:/host/sys/fs/cgroup:ro environment: - DD_API_KEY=${DATADOG_API_KEY} - DD_APM_ENABLED=true # - DD_LOG_LEVEL=debug ports: - 8126:8126 django2-apache2: build: context: django_demo_app dockerfile: Dockerfile_wsgi ports: - '8020:80' environment: DATADOG_TRACE_AGENT_HOSTNAME: agent SOME_RANDOM_VALUE: hey_you volumes: - $PWD/django_demo_app:/var/www/html/app
Django
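With the API key exported under the name the Compose file expects (DATADOG_API_KEY), bringing the stack up and tailing the Agent's logs looks like this; the service name agent matches the example above:
export DATADOG_API_KEY=1234567890
docker-compose up -d
docker-compose logs -f agent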
Terraform
https://www.terraform.io/docs/providers/datadog/index.html
This can also be done through Datadog's Amazon Web Services integration:
https://app.datadoghq.com/account/settings#integrations/amazon_web_services
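A typical workflow with the Datadog Terraform provider, assuming the monitors and dashboards are defined in .tf files in the current directory and the credentials are supplied through the environment variables the provider reads (DATADOG_API_KEY and DATADOG_APP_KEY; the values below are placeholders):
export DATADOG_API_KEY=1234567890
export DATADOG_APP_KEY=0987654321
terraform init
terraform plan
terraform apply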
Fargate
https://www.datadoghq.com/blog/monitor-aws-fargate/
{
"family": "redis-datadog",
"networkMode": "awsvpc",
"containerDefinitions": [
{
"name": "redis",
"image": "redis:latest",
"essential": true,
"dockerLabels": {
"com.datadoghq.ad.instances": "[{\"host\": \"%%host%%\", \"port\": 6379}]",
"com.datadoghq.ad.check_names": "[\"redisdb\"]",
"com.datadoghq.ad.init_configs": "[{}]"
}
}, {
"name": "datadog-agent",
"image": "datadog/agent:latest",
"essential": true,
"environment": [
{
"name": "DD_API_KEY",
"value": "$YOUR_API_KEY"
},
{
"name": "ECS_FARGATE",
"value": "true"
}
]
}
],
"requiresCompatibilities": [
"FARGATE"
],
"cpu": "256",
"memory": "512"
}
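Saving the task definition above to a file (the name redis-datadog.json here is arbitrary), it can be registered and launched with the AWS CLI; the cluster, subnet, and security group values below are placeholders:
aws ecs register-task-definition --cli-input-json file://redis-datadog.json
aws ecs run-task --cluster my-fargate-cluster \
  --launch-type FARGATE \
  --task-definition redis-datadog \
  --network-configuration "awsvpcConfiguration={subnets=[subnet-12345678],securityGroups=[sg-12345678],assignPublicIp=ENABLED}"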