Create an Ansible playground with Docker

Thursday, February 29, 2024

Dive into the world of Ansible by building your own playground environment with Docker. This blog post will guide you through the steps to set up a Docker environment. Get ready to explore, experiment, and enhance your Ansible skills in a controlled and dynamic environment.

Why use Ansible with Docker?

Using Ansible inside Docker containers offers several advantages:

  1. Local Environment Convenience: By encapsulating Ansible within Docker containers locally, you mitigate the risk of inadvertently disrupting external systems. Enjoy the freedom to experiment and iterate without concerns about impacting servers utilized by others.
  2. Effortless Installation and Reinstallation: Docker simplifies the deployment of Ansible, allowing for swift installation and reinstallation as needed.
  3. Ideal for Testing Scenarios: Docker containers make it easy to test your Ansible playbooks against disposable targets before running them against real servers.
  4. Seamless Integration with CI/CD Pipelines: Automating playbook tests is effortless when combined with CI/CD pipelines such as GitLab CI/CD or Jenkins. Dockerized Ansible playbooks fit seamlessly into your automated testing workflows, enhancing efficiency and reliability.

Let’s build the Ansible environment

Set up the Docker images

First, let’s create two Docker images for two distinct environments:

  1. The first environment will have Ansible installed
  2. The second environment will serve as the target for Ansible playbooks

Let’s install Ansible:

FROM alpine:latest
# Ansible itself, plus the OpenSSH client used to reach the targets
RUN apk add ansible openssh
# Keep the container alive so we can exec into it
CMD ["tail", "-f", "/dev/null"]

And now let’s set up the target image with openssh-server and python3:

FROM alpine:latest
# sshd so Ansible can connect, python3 so Ansible modules can run
RUN apk add openssh-server python3
# Create the "ansible" user and set its password (this also unlocks the account)
RUN adduser -D ansible && echo "ansible:ansible" | chpasswd
# Generate the SSH host keys
RUN ssh-keygen -A
CMD ["/usr/sbin/sshd", "-D"]

Import the playbooks

Next, let’s create the necessary files within a directory named code_ansible/. We’ll then mount this directory to the Ansible container.
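Before wiring everything together, it helps to see the expected layout. A minimal sketch — the name Dockerfile.target is an assumption; the other file names appear in the compose file and commands of this post:

```shell
# Sketch of the project layout; Dockerfile.target is an assumed name,
# the others are referenced elsewhere in this post.
mkdir -p code_ansible
touch Dockerfile.ansible Dockerfile.target docker-compose.yml
touch code_ansible/hosts code_ansible/playbook.yml
find . -type f | sort
```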

Here’s an example of what a hosts file might look like (the group name matches the playbook’s hosts: line, and the host name matches the compose service):

[targets]
target
And here is an example playbook:

- hosts: targets
  remote_user: ansible
  tasks:
    - name: My first playbook
      debug:
        msg: "A super debug message !"
    - name: Create a file called "hello.txt" on all the servers
      copy:
        dest: /tmp/hello.txt
        content: My super content !
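The copy task above simply writes a fixed string to a file on each target; its local equivalent is a one-liner, shown here only to illustrate the task’s effect:

```shell
# Reproduce the playbook's copy task by hand (same dest and content)
printf '%s' 'My super content !' > /tmp/hello.txt
cat /tmp/hello.txt
```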

Run the containers

Let’s create the containers with a docker-compose.yml file:

version: '3.8'
services:
  ansible:
    image: ansible
    build:
      context: .
      dockerfile: Dockerfile.ansible
    volumes:
      - ./code_ansible:/code_ansible/
    working_dir: /code_ansible/
  target:
    image: target
    build:
      context: .
      dockerfile: Dockerfile.target

Finally, we’ll start the containers, generate an SSH key pair in the Ansible container, and push the public key to the targets so Ansible can connect without a password.

docker-compose down
docker-compose up -d --build
# Optional: run several target containers
# docker-compose scale target=3
docker-compose exec ansible mkdir -p "/root/.ssh/"
docker-compose exec ansible ssh-keygen -t rsa -b 4096 -N "" -f "/root/.ssh/id_rsa"
SSH_PUBLIC_KEY=$(docker-compose exec ansible cat "/root/.ssh/id_rsa.pub" | tr -d '\r\n')
docker-compose exec target sh -c "mkdir -p /home/ansible/.ssh/"
docker-compose exec target sh -c "echo '$SSH_PUBLIC_KEY' > /home/ansible/.ssh/authorized_keys"
docker-compose exec target sh -c "chown ansible:ansible /home/ansible/.ssh/authorized_keys"
# Avoid the host-key prompt on the first SSH connection:
# TASK [Gathering Facts]
# The authenticity of host 'target' can't be established.
# ED25519 key fingerprint is SHA256:AdO48xg01Oe7sslgmC6/SJoN7AzR1fCF0cz0lVzBSpM.
# This key is not known by any other names.
# Are you sure you want to continue connecting (yes/no/[fingerprint])? no
docker-compose exec ansible sh -c "echo 'Host *
StrictHostKeyChecking no' > /root/.ssh/config"
docker-compose exec ansible sh -c "echo 'ansible-playbook -i hosts playbook.yml'"
docker-compose exec ansible sh
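One subtlety in the script above: docker-compose exec may emit carriage returns, so the captured public key is piped through tr -d '\r\n' to keep it on a single clean line. A quick illustration with a fake key string:

```shell
# Simulate CRLF-terminated command output and strip the line endings,
# as done when capturing the public key above (the key string is fake).
raw=$(printf 'ssh-rsa AAAAB3Nza fake@example\r\n')
clean=$(printf '%s' "$raw" | tr -d '\r\n')
printf '%s' "$clean"
```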

Once the SSH keys are generated and configured, your Ansible setup will be ready to manage the target environments effectively.
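As an alternative to the SSH client config written above, host-key checking can also be disabled on the Ansible side with an ansible.cfg placed next to the playbook. The host_key_checking setting is standard Ansible configuration; treat the file location as a sketch:

```shell
# Write an ansible.cfg that disables host-key checking for this playground.
# Ansible reads ansible.cfg from the current directory, and code_ansible/
# is the working directory inside the Ansible container.
mkdir -p code_ansible
cat > code_ansible/ansible.cfg <<'EOF'
[defaults]
host_key_checking = False
EOF
cat code_ansible/ansible.cfg
```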


In conclusion, the combination of Ansible and Docker offers a potent solution for managing and automating tasks with ease. By following these steps, you can harness the full potential of both tools, empowering your infrastructure with efficiency and reliability.
