Using Logstash to Transform Transaction Logs Into Visualizations

Introduction

The transaction logs generated by applications can be turned into insightful dashboards with the ELK Stack (Elasticsearch, Logstash, and Kibana). In this guide, we'll go over how to use Logstash to ingest and transform transaction logs, making them available for analysis in Elasticsearch and visualization in Kibana. By the end, you'll know how to leverage ELK for practical log management and insights.

Table of Contents

  • Prerequisites
  • Step 1: Set Up the ELK Stack using Docker Compose
  • Step 2: Configure Logstash to Process Transaction Logs
  • Step 3: Run the Docker Compose
  • Step 4: Prepare Sample Transaction Logs
  • Step 5: Verify Data in Elasticsearch
  • Step 6: Create Visualization in Kibana

Prerequisites

  • Docker and Docker Compose installed on your system

Step 1: Set Up the ELK Stack using Docker Compose

Let’s start by setting up the ELK stack using Docker Compose. This setup will bring up Elasticsearch, Logstash, and Kibana in a single networked environment.
  1. Create a docker-compose.yml file with the following content (an optional tweak for a local, follow-along setup is noted after this list):

version: '3'

services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:8.15.3
    container_name: elasticsearch
    environment:
      - discovery.type=single-node
      - bootstrap.memory_lock=true
      - "ES_JAVA_OPTS=-Xms1g -Xmx1g"
    ulimits:
      memlock:
        soft: -1
        hard: -1
    ports:
      - "9200:9200"
    networks:
      - elk

  logstash:
    image: docker.elastic.co/logstash/logstash:8.15.3
    container_name: logstash
    volumes:
      - ./logstash_pipeline.conf:/usr/share/logstash/pipeline/logstash_pipeline.conf
    ports:
      - "5044:5044"
    networks:
      - elk

  kibana:
    image: docker.elastic.co/kibana/kibana:8.15.3
    container_name: kibana
    ports:
      - "5601:5601"
    networks:
      - elk

networks:
  elk:
    driver: bridge

  2. Create a Logstash configuration file: Next to the docker-compose.yml file, create a file called logstash_pipeline.conf. This will contain the configuration to read and transform the transaction logs.
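This Compose file is enough to bring the stack up, but two optional adjustments help for a local demo. Elasticsearch 8.x enables security (authentication and TLS) by default, so if you want the plain-http, no-credentials workflow used in the rest of this guide, you can disable it. Likewise, the Logstash file input in Step 2 needs to see your log files, so you may want to mount a local logs directory; the ./logs path and mount point below are assumptions, not part of the original setup:

  elasticsearch:
    environment:
      - discovery.type=single-node
      - xpack.security.enabled=false      # demo only: allows http://elasticsearch:9200 without credentials

  logstash:
    volumes:
      - ./logstash_pipeline.conf:/usr/share/logstash/pipeline/logstash_pipeline.conf
      - ./logs:/usr/share/logstash/logs   # assumed host directory holding transaction_logs*.log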

Step 2: Configure Logstash to Process Transaction Logs

Logstash is an event processing engine. You can think of Logstash as a pipeline for your events: events are received, processed, and shipped on to other destinations.
Pipeline = input + (filter) + output
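As a tiny illustration of that idea (separate from this tutorial's setup), the following pipeline receives events from standard input, uppercases the message, and prints the result:

input {
  stdin { }                             # receive: events typed on the console
}

filter {
  mutate { uppercase => ["message"] }   # process: a trivial transformation
}

output {
  stdout { codec => rubydebug }         # ship: print the processed event
}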
We'll set up Logstash to parse and transform JSON log data. Here's the logstash_pipeline.conf configuration:
input {
  file {
    path => "/usr/share/logstash/logs/transaction_logs*.log"   # file input paths must be absolute; assumes the logs are mounted into the container (see the note in Step 1)
    start_position => "beginning"
    sincedb_path => "/dev/null"   # don't remember the read position, so the file is re-read from the beginning on every start
    codec => "json"
  }
}

filter {
  json {
    source => "message"           # parse the JSON text in "message"
    target => "parsed"            # ...into a nested "parsed" object
    remove_field => ["message"]
  }
  date {
    match => ["[parsed][timestamp]", "yyyy-MM-dd HH:mm:ss"]
    target => "@timestamp"        # use the log's own timestamp as the event time
  }
  mutate {
    rename => { "[parsed][logs]" => "transaction" }   # expose the payload under a friendlier name
  }
}

output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]   # the Compose service name resolves on the elk network
    index => "transaction_logs"
  }
  stdout { codec => rubydebug }              # also print each event for debugging
}
The configuration tells Logstash to:
  • Input: Read the log file from the configured path, decoding each line as JSON
  • Filter: Parse the JSON payload, extract and rename fields, and convert the timestamp (illustrated below)
  • Output: Send data to Elasticsearch under the transaction_logs index
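To make the filter stage concrete, here is roughly what happens to one event. The input line and field values are illustrative (based on the sample logs in Step 4, minus their plain-text prefix), and the exact @timestamp depends on the Logstash host's timezone:

# "message" as read from the file (illustrative; assumes a pure-JSON line):
{"timestamp":"2024-10-30 18:53:25","logs":{"REFERENCE_NO":"000000021357","TYPE":"INQUIRY","RESPONSE_CODE":"14"}}

# After the json, date, and mutate filters, the event contains roughly:
{
  "@timestamp": "2024-10-30T18:53:25.000Z",
  "transaction": {
    "REFERENCE_NO": "000000021357",
    "TYPE": "INQUIRY",
    "RESPONSE_CODE": "14"
  },
  "parsed": { "timestamp": "2024-10-30 18:53:25" }
}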

Step 3: Run the Docker Compose

Now start the entire ELK stack:
docker compose up -d
Docker Compose will launch the ELK stack, with Elasticsearch available on http://localhost:9200 and Kibana on http://localhost:5601.
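To confirm everything came up, you can check the container status and ping Elasticsearch (if you kept Elasticsearch security enabled, the curl call will require credentials instead of answering anonymously):

docker compose ps
curl http://localhost:9200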

Step 4: Prepare Sample Transaction Logs

Save transaction logs in JSON format in the path specified in logstash_pipeline.conf. Here's a sample of what the logs might look like:
2024-10-30 18:53:25 INFO [IS <bni.hanifa#KomiCommonLogging.services:doLogging>] trxlogger ~ {"logs":{"REFERENCE_NO":"000000021357","TYPE":"INQUIRY","LAYER":"CHANNEL","CHANNEL_NAME":"ECP","COMPANY_CODE":"0010000007","COMPANY_NAME":"Three Postpaid","LOCAL_IP":"192.168.244.57","RESPONSE_CODE":"14","RESPONSE_MESSAGE":"Invalid Billing Number"}}
2024-10-30 18:53:55 INFO [IS <bni.hanifa#KomiCommonLogging.services:doLogging>] trxlogger ~ {"logs":{"REFERENCE_NO":"000000021357","TYPE":"INQUIRY","LAYER":"BILLER","CHANNEL_NAME":"ECP","COMPANY_CODE":"0010000007","COMPANY_NAME":"Three Postpaid","LOCAL_IP":"192.168.244.57","RESPONSE_CODE":"BMCS-00","RESPONSE_MESSAGE":"Success"}}
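Note that each line above carries a plain-text prefix (timestamp, log level, logger name) before the JSON payload, so the lines are not pure JSON. If your logs look like this, the pipeline from Step 2 needs to strip that prefix first: drop codec => "json" from the file input (so the raw line reaches the filters intact) and add a grok stage before the json filter. The following filter block is a sketch adapted to the sample format above, not part of the original configuration:

filter {
  # Split the line into the leading timestamp and the trailing JSON payload
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:log_time} %{LOGLEVEL:level} .* ~ %{GREEDYDATA:json_payload}" }
  }
  json {
    source => "json_payload"
    target => "parsed"
    remove_field => ["message", "json_payload"]
  }
  date {
    match => ["log_time", "yyyy-MM-dd HH:mm:ss"]
    target => "@timestamp"
  }
  mutate {
    rename => { "[parsed][logs]" => "transaction" }
  }
}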

Step 5: Verify Data in Elasticsearch

To ensure the logs are correctly ingested, you can query Elasticsearch for data in the transaction_logs index:
curl -X GET "localhost:9200/transaction_logs/_search?pretty"
If everything is set up correctly, you should see transaction entries structured according to the Logstash pipeline.
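A quick sanity check is the document count; the number returned depends on how many log lines you provided:

curl -X GET "localhost:9200/transaction_logs/_count?pretty"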

Step 6: Create Visualization in Kibana

  1. Open Kibana by navigating to http://localhost:5601 in your web browser. Log in using the credentials you set up for the elastic user during installation (skip this if you disabled security for the demo).
  2. In the main menu, go to Management > Stack Management > Data Views.
  3. Click Create data view.
  4. Enter the index pattern that matches your data. For example, if your Logstash pipeline outputs data to an index like transaction_logs, type transaction_logs.
  5. Click Timestamp field and select the timestamp field (e.g., @timestamp) if available. Kibana uses this field for time-based visualizations.
  6. Click Save data view to Kibana to save it.
  7. Go to Discover to verify that your data is coming through correctly. Select your newly created data view (e.g., Transaction Logs). Use this section to get a quick view of the log entries and test filters before creating visualizations, for example with the query shown below.
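For example, assuming the pipeline renamed the parsed payload to transaction as configured in Step 2, a KQL query in the Discover search bar that shows only failed inquiries could look like this (field names follow the sample logs; adjust them to your data):

transaction.TYPE : "INQUIRY" and transaction.RESPONSE_CODE : "14"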