Running local Kibana and Elasticsearch in containers with Docker: a technical guide
For such developers, I’ve created a clean, minimal docker-compose file that you can use to spin up an Elasticsearch and a Kibana instance in your local environment with a single command.
Link to repo: https://github.com/priyank-R/technical_guides/tree/master/Elasticsearch_Kibana_Docker_Compose
Prerequisites
Make sure you have Docker Desktop installed on your Mac or Windows machine and that you’re able to execute the docker-compose command.
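You can verify both from a terminal; each command should print a version string if the installation is healthy:

```sh
# Confirms the Docker engine is installed and on PATH
docker --version

# Confirms the Compose CLI works (v1 syntax, matching the commands in this guide)
docker-compose --version
```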
This is the docker-compose file:
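The exact file lives in the repo linked above; the sketch below is a minimal illustration of its shape, assuming the official Elastic images, a single-node cluster, and variable names matching the .env sketch further down (image tags, service names, and ports here are illustrative, not the repo’s verbatim contents):

```yaml
version: "3.8"
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:${ELASTIC_VERSION}
    environment:
      # A single-node cluster is enough for local development
      - discovery.type=single-node
      # Enable security so clients must authenticate over HTTP
      - xpack.security.enabled=true
      # Bootstrap password for the built-in elastic user
      - ELASTIC_PASSWORD=${ELASTIC_PASSWORD}
    ports:
      - "9200:9200"

  kibana:
    image: docker.elastic.co/kibana/kibana:${ELASTIC_VERSION}
    environment:
      # Classic docker-compose names containers <project>_<service>_<index>,
      # so the elk_stack prefix comes from the -p flag used below
      - ELASTICSEARCH_HOSTS=http://elk_stack_elasticsearch_1:9200
      - ELASTICSEARCH_USERNAME=${ELASTIC_USERNAME}
      - ELASTICSEARCH_PASSWORD=${ELASTIC_PASSWORD}
    ports:
      - "5601:5601"
    depends_on:
      - elasticsearch
```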
This is the .env file required for the docker-compose to run:
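As with the compose file, the authoritative version is in the repo; a minimal .env matching the sketch above could look like this (variable names and values are illustrative placeholders):

```sh
# Elastic Stack image tag to pull (illustrative)
ELASTIC_VERSION=7.17.9

# Credentials that Kibana and your clients will authenticate with (illustrative)
ELASTIC_USERNAME=elastic
ELASTIC_PASSWORD=changeme
```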
Finally, the command to start the containers within a stack:
```sh
docker-compose -f docker-compose.yaml --env-file .env -p elk_stack up -d
```
Breakdown of the docker-compose command and the docker-compose.yaml file:
- For the docker-compose up command, we’re passing four main arguments: -f is the file we want to execute, --env-file is the environment file used to substitute the environment variables referenced in docker-compose.yaml, -p is the name of the stack (project) that will be created in Docker, and -d runs the containers in detached mode, so they keep running in the background after the command returns.
- The stack name is important here: classic docker-compose names containers <project>_<service>_<index>, so if you change the stack name, you will also have to update the ELASTICSEARCH_HOSTS value to match (e.g., -p my_stack would make the Elasticsearch container my_stack_elasticsearch_1). You cannot use the container’s local IP directly because you won’t know it until the container is actually running.
- We’re enabling HTTP security in our Elasticsearch instance, so you will be expected to pass the username and password (found in the .env file) whenever you connect, as in the quick check below.
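Once the containers are up, you can sanity-check the secured endpoint with basic auth (the credentials here assume the illustrative .env values from earlier):

```sh
# Should return cluster name, version, and tagline as JSON once healthy
curl -u elastic:changeme http://localhost:9200
```

Kibana should then be reachable at http://localhost:5601, where the same credentials work on the login screen.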
Priyank Rupareliya is a Senior Software Engineer focused on architecting solutions around Cloud, DevOps, Containerization, Backend, and Frontend.