Introduction
Transaction logs generated by applications can be turned into rich, easy-to-read dashboards with the ELK Stack: Elasticsearch, Logstash, and Kibana. In this guide, we'll go over how to use Logstash to ingest and transform transaction logs, making them accessible for analysis in Elasticsearch and visualization in Kibana. By the end, you'll know how to leverage ELK for practical log management and insights.
Table of Contents
- Introduction
- Prerequisites
- Step 1: Set Up the ELK Stack using Docker Compose
- Step 2: Configure Logstash to Process Transaction Logs
- Step 3: Run the Docker Compose
- Step 4: Prepare Sample Transaction Logs
- Step 5: Verify Data in Elasticsearch
- Step 6: Create Visualization in Kibana
Prerequisites
- Docker and Docker Compose installed on your system
Step 1: Set Up the ELK Stack using Docker Compose
Let’s start by setting up the ELK stack using Docker Compose. This setup will bring up Elasticsearch, Logstash, and Kibana in a single networked environment.
- Create a docker-compose.yml file with the following content:
version: '3'
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:8.15.3
    container_name: elasticsearch
    environment:
      - discovery.type=single-node
      - bootstrap.memory_lock=true
      - "ES_JAVA_OPTS=-Xms1g -Xmx1g"
    ulimits:
      memlock:
        soft: -1
        hard: -1
    ports:
      - "9200:9200"
    networks:
      - elk
  logstash:
    image: docker.elastic.co/logstash/logstash:8.15.3
    container_name: logstash
    volumes:
      - ./logstash_pipeline.conf:/usr/share/logstash/pipeline/logstash_pipeline.conf
    ports:
      - "5044:5044"
    networks:
      - elk
  kibana:
    image: docker.elastic.co/kibana/kibana:8.15.3
    container_name: kibana
    ports:
      - "5601:5601"
    networks:
      - elk

networks:
  elk:
    driver: bridge
- Create a Logstash Configuration File: Next to the docker-compose.yml file, create a file called logstash_pipeline.conf. This will contain the configuration to read and transform transaction logs.
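At this point the project directory should contain just these two files. If you want a quick sanity check before moving on, Docker Compose can parse and validate the file without starting anything; the commands below are a minimal sketch of that check.

# Both files sit side by side in the project directory
ls
# docker-compose.yml  logstash_pipeline.conf

# Optional: validate the compose file without starting any containers
docker compose config --quiet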
Step 2: Configure Logstash to Process Transaction Logs
Logstash is an event processing engine. You can think of Logstash as establishing a pipeline for your events: events are received, processed, and then shipped somewhere else.
Pipeline = input + (filter) + output
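In configuration terms, every pipeline file follows that same three-block shape. The skeleton below is only a generic sketch of the structure, not the file we will actually use:

input {
  # where events come from (files, Beats, TCP, ...)
}
filter {
  # optional transformations: parse JSON, rename fields, set @timestamp, ...
}
output {
  # where events go: Elasticsearch, stdout, ...
}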
We'll set up Logstash to parse and transform JSON log data. Here's the logstash_pipeline.conf configuration:

input {
  file {
    path => "logs/transaction_logs*.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"  # Ensures Logstash reads the file from the beginning
    codec => "json"
  }
}

filter {
  json {
    source => "message"
    target => "parsed"
    remove_field => ["message"]
  }
  date {
    match => ["[parsed][timestamp]", "yyyy-MM-dd HH:mm:ss"]
    target => "@timestamp"
  }
  mutate {
    rename => { "[parsed][logs]" => "transaction" }
  }
}

output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]
    index => "transaction_logs"
  }
  stdout { codec => rubydebug }
}
The configuration tells Logstash to:
- Input: Read the log files matching the configured path and decode them as JSON
- Filter: Parse the JSON, extract and rename fields, and convert the timestamp
- Output: Send data to Elasticsearch under the transaction_logs index
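Before starting the whole stack, you can optionally ask Logstash to check the pipeline syntax and exit. This is only a sketch that reuses the image from the compose file; the exact flags may vary slightly between versions.

# Validate the pipeline configuration without starting a long-running Logstash
docker run --rm \
  -v "$PWD/logstash_pipeline.conf:/usr/share/logstash/pipeline/logstash_pipeline.conf" \
  docker.elastic.co/logstash/logstash:8.15.3 \
  logstash -f /usr/share/logstash/pipeline/logstash_pipeline.conf --config.test_and_exit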
Step 3: Run the Docker Compose
Now start the entire ELK stack:
docker compose up -d
Docker Compose will launch the ELK stack, with Elasticsearch available on http://localhost:9200 and Kibana on http://localhost:5601.
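A few quick checks help confirm that everything came up as expected. The commands below are a small sketch; note that if Elasticsearch's X-Pack security is left at its 8.x defaults, the curl call may require HTTPS, the elastic user's password, and curl's -k flag.

# All three containers should be listed as running
docker compose ps

# Elasticsearch answers with a small JSON document describing the node
curl http://localhost:9200

# Follow Logstash's output; events printed by the stdout { codec => rubydebug }
# output will appear here once logs are ingested
docker compose logs -f logstash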
Step 4: Prepare Sample Transaction Logs
Save transaction logs in JSON format in the path specified in logstash_pipeline.conf. Here's a sample of what the logs might look like:

2024-10-30 18:53:25 INFO [IS <bni.hanifa#KomiCommonLogging.services:doLogging>] trxlogger ~ {"logs":{"REFERENCE_NO":"000000021357","TYPE":"INQUIRY","LAYER":"CHANNEL","CHANNEL_NAME":"ECP","COMPANY_CODE":"0010000007","COMPANY_NAME":"Three Postpaid","LOCAL_IP":"192.168.244.57","RESPONSE_CODE":"14","RESPONSE_MESSAGE":"Invalid Billing Number"}}
2024-10-30 18:53:55 INFO [IS <bni.hanifa#KomiCommonLogging.services:doLogging>] trxlogger ~ {"logs":{"REFERENCE_NO":"000000021357","TYPE":"INQUIRY","LAYER":"BILLER","CHANNEL_NAME":"ECP","COMPANY_CODE":"0010000007","COMPANY_NAME":"Three Postpaid","LOCAL_IP":"192.168.244.57","RESPONSE_CODE":"BMCS-00","RESPONSE_MESSAGE":"Success"}}
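One way to produce such a file is sketched below. The file name is only an example that matches the transaction_logs*.log glob, and the sketch assumes the ./logs directory is mounted into the Logstash container (for instance with an extra ./logs:/usr/share/logstash/logs volume entry, which is not shown in the compose file above) so the file input can actually see it.

# Write the two sample lines into a file that matches the configured glob
mkdir -p logs
cat > logs/transaction_logs_sample.log <<'EOF'
2024-10-30 18:53:25 INFO [IS <bni.hanifa#KomiCommonLogging.services:doLogging>] trxlogger ~ {"logs":{"REFERENCE_NO":"000000021357","TYPE":"INQUIRY","LAYER":"CHANNEL","CHANNEL_NAME":"ECP","COMPANY_CODE":"0010000007","COMPANY_NAME":"Three Postpaid","LOCAL_IP":"192.168.244.57","RESPONSE_CODE":"14","RESPONSE_MESSAGE":"Invalid Billing Number"}}
2024-10-30 18:53:55 INFO [IS <bni.hanifa#KomiCommonLogging.services:doLogging>] trxlogger ~ {"logs":{"REFERENCE_NO":"000000021357","TYPE":"INQUIRY","LAYER":"BILLER","CHANNEL_NAME":"ECP","COMPANY_CODE":"0010000007","COMPANY_NAME":"Three Postpaid","LOCAL_IP":"192.168.244.57","RESPONSE_CODE":"BMCS-00","RESPONSE_MESSAGE":"Success"}}
EOF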
Step 5: Verify Data in Elasticsearch
To ensure the logs are correctly ingested, you can query Elasticsearch for data in the transaction_logs index:

curl -X GET "localhost:9200/transaction_logs/_search?pretty"
If everything is set up correctly, you should see transaction entries structured according to the Logstash pipeline.
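A couple of further checks can also be useful here. The examples below are a sketch; the transaction.RESPONSE_CODE field name assumes the rename performed in the Logstash filter, so adjust it if your mapping differs.

# How many documents have been indexed so far?
curl -X GET "localhost:9200/transaction_logs/_count?pretty"

# Search for failed inquiries by response code
curl -X GET "localhost:9200/transaction_logs/_search?pretty" \
  -H 'Content-Type: application/json' \
  -d '{ "query": { "match": { "transaction.RESPONSE_CODE": "14" } } }'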
Step 6: Create Visualization in Kibana
- Open Kibana by navigating to http://localhost:5601 in your web browser. Log in using the credentials you set up for the elastic user during installation.
- In the main menu, go to Management > Stack management > Data Views.
- Click Create data view.
- Enter the index pattern that matches your data. For example, if your Logstash pipeline outputs data to an index like transaction_logs, type transaction_logs.
- Click Timestamp Field and select the timestamp field (e.g., @timestamp) if available. Kibana uses this field for time-based visualizations.
- Click Save data view to Kibana to save it.
- Go to Discover to verify that your data is coming through correctly. Select your newly created data view (e.g., Transaction Logs). Use this section to get a quick view of the log entries and test filters before creating visualizations. (If you'd rather script the data view creation than click through the steps above, see the sketch after this list.)
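For repeatable setups, the same data view can be created through Kibana's data views API instead of the UI. The call below is a sketch that assumes Kibana is reachable on localhost without authentication; add credentials and adjust the URL if security is enabled.

# Create the Transaction Logs data view via the Kibana API
curl -X POST "localhost:5601/api/data_views/data_view" \
  -H 'kbn-xsrf: true' \
  -H 'Content-Type: application/json' \
  -d '{
    "data_view": {
      "title": "transaction_logs",
      "name": "Transaction Logs",
      "timeFieldName": "@timestamp"
    }
  }'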