- Docker installed
- Docker Compose installed
- HTTPie installed - https://httpie.org/
- Execute `./build-docker-images`
This will build the following images:
- Fluentd Docker image: includes the plugins that are used during the demo
- Apache Httpd Docker image: a modified version of the httpd image that logs in the combined log format
- Spring Boot Demo Docker image: a small Spring Boot app that generates some logging output
- Start Fluentd using docker-compose > dc up
- Show docker-compose.yml > including portmapping
- Show Fluent config > fluent.conf
- Navigate to http://localhost:24220/api/plugins.json > show the default healthcheck endpoint served by the monitor_agent plugin (see the sketch below)
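  The healthcheck endpoint is provided by Fluentd's built-in `monitor_agent` input plugin. A minimal sketch of the relevant `fluent.conf` block (the port matches the URL above; the exact config in this repo may differ):

  ```
  <source>
    @type monitor_agent
    bind 0.0.0.0
    port 24220
  </source>
  ```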
- Start Fluentd using docker-compose > dc up
- Show docker-compose.yml > including portmapping
- Explain the Docker logging driver > docker-compose.yml
- Without the links, the echo container would stop because the Fluentd container is not yet available; avoid this with the `fluentd-async-connect` option of the log driver (see the compose sketch below)
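  A minimal sketch of what the `logging` section in `docker-compose.yml` can look like for a container that ships its stdout/stderr to Fluentd. The service name, image name, and tag below are made up; the real file in this repo may use different values:

  ```yaml
  version: "3"
  services:
    httpd:                               # hypothetical service name
      image: fluentd-demo/httpd          # hypothetical image name
      ports:
        - "80:80"
      logging:
        driver: fluentd                  # send container stdout/stderr to Fluentd
        options:
          fluentd-address: localhost:24224
          fluentd-async-connect: "true"  # don't stop the container when Fluentd is unreachable
          tag: httpd.access              # hypothetical tag, referenced in fluent.conf
  ```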
- Explain the default fields that Fluentd adds, like (example record below):
  - container_name
  - source
  - log
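  For illustration, a record arriving from the Docker fluentd logging driver looks roughly like this (all values below are made up):

  ```json
  {
    "container_name": "/httpd",
    "source": "stdout",
    "log": "172.17.0.1 - - [10/Oct/2019:13:55:36 +0000] \"GET / HTTP/1.1\" 200 45"
  }
  ```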
- Show Fluent config > fluent.conf
- Stop Fluentd, enable the dummy data source in the config, and restart to show dummy data flowing (see the sketch below)
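  For reference, the relevant parts of `fluent.conf` at this point are roughly shaped like the sketch below: a `forward` input for the Docker log driver, an optional `dummy` input for test records, and a catch-all `stdout` match. The tag and message values are assumptions, not copied from the repo:

  ```
  # Receive events from the Docker fluentd logging driver
  <source>
    @type forward
    bind 0.0.0.0
    port 24224
  </source>

  # Enable this block to generate test events without real traffic
  <source>
    @type dummy
    tag dummy.data            # assumed tag
    dummy {"message":"dummy"}
  </source>

  # Print everything that arrives to Fluentd's own log
  <match **>
    @type stdout
  </match>
  ```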
- Start Fluentd using docker-compose > dc up
- Explain fluent.conf with the added `<filter>` section (see the sketch below)
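  One plausible shape for such a filter, assuming it parses the Apache combined log line that the log driver puts in the `log` field; the tag pattern is an assumption:

  ```
  <filter httpd.access>
    @type parser
    key_name log             # parse the raw access-log line shipped by the log driver
    reserve_data true        # keep the original fields (container_name, source, ...)
    <parse>
      @type apache2          # Apache combined log format
    </parse>
  </filter>
  ```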
- Generate some data by navigating to http://localhost/ or by executing `./generate-load.sh`
- Show the results in Fluentd log
- Start Fluentd using docker-compose > dc up
- Exec into the container that runs MongoDB: `docker exec -it mongodb bash`
- Start the Mongo shell: `mongo`
- Run `show dbs`
- See that the `fluentd` database is not there yet
- Generate some data by navigating to http://localhost/ or by executing `./generate-load.sh`
- Back in the Mongo shell, run:
  - `use fluentd`
  - `db.test.find()`
  - `db.test.count()`
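  The data reaches MongoDB through the `fluent-plugin-mongo` output. A minimal sketch, using the `fluentd` database and `test` collection queried above; the tag pattern and flush interval are assumptions:

  ```
  <match httpd.access>
    @type mongo
    host mongodb             # the MongoDB container/service name
    port 27017
    database fluentd
    collection test
    <buffer>
      flush_interval 10s     # flush often so records show up quickly during the demo
    </buffer>
  </match>
  ```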
- Start Fluentd using docker-compose > dc up
- Generate some data by navigating to http://localhost/ or by executing `./generate-load.sh`
- Explain docker-compose and show the 2 added services
- Elasticsearch
- Kibana
- Show fluent.conf and explain the added `<match>` with `@type copy` (see the sketch after this list)
- Stdout
- Elasticsearch
- File output
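  A minimal sketch of such a copy match; the hostnames, tag pattern, and file path are assumptions, and the real fluent.conf may differ:

  ```
  <match httpd.access>
    @type copy
    <store>
      @type stdout                     # keep printing to the Fluentd log
    </store>
    <store>
      @type elasticsearch              # requires fluent-plugin-elasticsearch
      host elasticsearch
      port 9200
      logstash_format true             # daily logstash-* indices, easy to pick up in Kibana
    </store>
    <store>
      @type file
      path /fluentd/log/httpd-access   # assumed path inside the Fluentd container
    </store>
  </match>
  ```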
- Navigate to http://localhost:5601/ and show the data that flows in
- Create a pie visualization
- Split Slices > Aggregation > Terms > Field > agent
- Start Fluentd using docker-compose > dc up
- Explain the Spring Boot application
- Navigate to http://localhost:8080/ and generate some load showing the log entry
- Navigate to http://localhost:5601/ and show the data that flows in
- Show the filtering possibilities in Kibana
- Start Fluentd using docker-compose > dc up
- Explain the Log Collector / Log Aggregator
- Show the Fluentd config for Collector / Aggregator
- Generate load by executing `./generate-load.sh`
- Stop Log Aggregator 1 > `docker stop fluentd-aggregator-1`
- See data flowing to Log Aggregator 2
- Start Log Aggregator 1 > `docker start fluentd-aggregator-1`
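  The collector typically sends events to the aggregators with the `forward` output, which routes traffic to the remaining server when one goes down. A sketch under the assumption that the second aggregator is named `fluentd-aggregator-2` and that both listen on the default forward port:

  ```
  <match **>
    @type forward
    <server>
      name aggregator-1
      host fluentd-aggregator-1
      port 24224
    </server>
    <server>
      name aggregator-2
      host fluentd-aggregator-2        # assumed container/host name
      port 24224
    </server>
    # When one server is unreachable, events are routed to the other
  </match>
  ```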