Python Application Performance Monitoring (APM) using Elastic

Status
In progress
Created
Nov 6, 2024 07:57 AM
Tags
Description: Set up monitoring for a Python microservices application with a PostgreSQL database using the Elastic Stack. This project introduces you to performance monitoring, logging, and real-time visualization for backend services, helping you gain experience with Elastic APM, Flask microservices, and PostgreSQL integration.
Tech Stack: Python, Flask, PostgreSQL, Elastic APM, Elasticsearch, Kibana, Docker.
Features:
  • Real time monitoring of microservices performance and database queries
  • Service dependency mapping and performance bottleneck identification
  • Custom dashboards in Kibana for visualizing metrics
  • Error logging and alerts for application issues and database errors
Learning Path: Microservice basics, Python with Flask, PostgreSQL setup, Elastic Stack fundamentals, APM integration, Docker containerization
Goal: Ideal for beginners wanting to learn application performance monitoring (APM) and gain practical skills in deploying and monitoring Python microservices with Elastic

Part 1: Architecture Overview

First, let's understand the overall architecture of the system we'll be setting up. It consists of a microservices application built with Python and a PostgreSQL database, plus an Elastic Stack with an APM server configured.
Architecture Breakdown:
  • Application layers:
    • Frontend: A basic web interface
    • Backend: Python-based API that serves as the business logic layer
    • Database: PostgreSQL for storing application data
  • Elastic Stack:
    • Elasticsearch: Stores APM data and other logs
    • Kibana: Provides visualization for real-time monitoring, service dependencies, and dashboards
    • APM Server: Collects and transmits APM metrics from the application
Each service will send data such as request duration, error rates, and database query performance. This data will enable us to set up real-time monitoring and dashboards.
The APM agent will also identify interactions between services and capture error logs.
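To give a concrete idea of what "instrumenting a service" means here, below is a minimal configuration sketch for attaching the Elastic APM Python agent to a Flask service. The service name and server URL are assumptions for illustration; the actual values depend on your deployment:

```python
from flask import Flask
from elasticapm.contrib.flask import ElasticAPM

app = Flask(__name__)

# Agent configuration; adjust these values to your own environment
app.config['ELASTIC_APM'] = {
    'SERVICE_NAME': 'backend',               # hypothetical service name
    'SERVER_URL': 'http://apm-server:8200',  # assumed APM Server address
}

# Once attached, the agent automatically reports request durations and errors
apm = ElasticAPM(app)
```

With this in place, every Flask request is traced automatically; no per-route code is required.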

Part 2: Setting up the Python-PostgreSQL Application

In this part, we're going to build a simple Python-PostgreSQL application. It consists of three layers (frontend, backend, database). We'll run the application with Docker Compose for easy deployment.

2.1 Setup

First, let's create a project folder. You can do this by running the following commands on the application host.
$ mkdir docker-python-sql
$ cd docker-python-sql
$ mkdir frontend
$ mkdir backend
$ mkdir database
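Since we'll run everything with Docker Compose, a docker-compose.yml at the project root ties the three folders together. The sketch below is one possible layout; the frontend port mapping and volume paths are assumptions for illustration, while the database credentials match the Dockerfile used later in this part:

```yaml
services:
  database:
    build: ./database

  backend:
    build: ./backend
    environment:
      POSTGRES_DB: database
      POSTGRES_USER: username
      POSTGRES_PASSWORD: secret
      POSTGRES_HOST: database   # service name doubles as hostname
    volumes:
      - ./input:/input          # HTML files dropped here are picked up
      - ./output:/output        # converted PDFs land here
    depends_on:
      - database

  frontend:
    build: ./frontend
    ports:
      - "8080:80"               # assumed port mapping
```

Running `docker compose up --build` from the project root then starts all three services on a shared network.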

2.2 Database

We're going to use PostgreSQL, a free and open-source relational DBMS that is SQL-compliant.
To instantiate a PostgreSQL database through Docker, we use the official image and create a Dockerfile to set the default username and the initial database setup.
# Use the official PostgreSQL image as the base image
FROM postgres:15.9

# Set the default password for the PostgreSQL user
ENV POSTGRES_PASSWORD=secret

# Set the default username for PostgreSQL
ENV POSTGRES_USER=username

# Set the default database name for PostgreSQL
ENV POSTGRES_DB=database

# Copy the SQL script to initialize the database with fixtures
COPY create_fixtures.sql /docker-entrypoint-initdb.d/create_fixtures.sql
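The create_fixtures.sql script referenced above is not shown in the original write-up. A minimal sketch, assuming only the converted_files table that the backend inserts into later (the id and created_at columns are additions for convenience, not from the source), could look like:

```sql
-- Hypothetical database/create_fixtures.sql
-- Creates the table the backend uses to track converted files
CREATE TABLE IF NOT EXISTS converted_files (
    id         SERIAL PRIMARY KEY,
    file_name  TEXT NOT NULL,
    created_at TIMESTAMP NOT NULL DEFAULT NOW()
);
```

Any .sql file placed in /docker-entrypoint-initdb.d/ is executed by the official PostgreSQL image on first startup, so the table exists before the backend connects.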

2.3 Backend

The backend application will monitor the input directory for new HTML files, convert them to PDFs, save them in the output directory, and track them in the database.
backend/Dockerfile:
FROM python:3.10

# Set work directory
WORKDIR /app

# Install required dependencies
COPY requirements.txt .
RUN pip install -r requirements.txt

# Copy the application code into the container
COPY . .

# Run the automation service
CMD ["python", "app.py"]
backend/requirements.txt :
psycopg2-binary==2.9.7
xhtml2pdf==0.2.11
watchdog==2.1.6  # To monitor the input folder for new HTML files
backend/app.py :
This is the main application; it monitors the input directory and processes any new HTML files:
import os
import time

import psycopg2
from xhtml2pdf import pisa
from watchdog.observers import Observer
from watchdog.events import FileSystemEventHandler

# Database configuration
db_config = {
    'dbname': os.getenv('POSTGRES_DB'),
    'user': os.getenv('POSTGRES_USER'),
    'password': os.getenv('POSTGRES_PASSWORD'),
    'host': os.getenv('POSTGRES_HOST')
}

# Define the directories
input_dir = '/input'
output_dir = '/output'

# Function to convert HTML to PDF
def convert_html_to_pdf(html_file):
    pdf_file = os.path.join(output_dir, os.path.basename(html_file).replace('.html', '.pdf'))
    with open(html_file, "r") as html:
        with open(pdf_file, "wb") as pdf:
            pisa.CreatePDF(html, dest=pdf)
    return pdf_file

# Function to track converted files in the database
def track_converted_file(file_name):
    try:
        conn = psycopg2.connect(**db_config)
        cur = conn.cursor()
        cur.execute("INSERT INTO converted_files (file_name) VALUES (%s);", (file_name,))
        conn.commit()
        cur.close()
        conn.close()
    except Exception as e:
        print(f"Error tracking file: {e}")

# Watchdog event handler
class Watcher(FileSystemEventHandler):
    def on_created(self, event):
        if event.is_directory:
            return
        if event.src_path.endswith(".html"):
            print(f"New HTML file detected: {event.src_path}")
            pdf_file = convert_html_to_pdf(event.src_path)
            track_converted_file(pdf_file)
            print(f"Converted to: {pdf_file}")

# Set up the observer to monitor the input folder
observer = Observer()
observer.schedule(Watcher(), input_dir, recursive=False)
observer.start()
print("Monitoring the input directory for new HTML files...")

try:
    while True:
        time.sleep(1)
except KeyboardInterrupt:
    observer.stop()
observer.join()
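If you want to experiment with the detection step without installing watchdog or xhtml2pdf, the core "find new HTML files" logic can be sketched with the standard library alone. This is a simplified illustration for local testing, not part of the project code:

```python
import os

def find_new_html_files(input_dir, seen):
    """Return paths of HTML files in input_dir that are not yet in `seen`.

    `seen` is a set of file names already processed; it is updated in
    place, so calling this from a polling loop yields each file once.
    """
    new_files = []
    for name in sorted(os.listdir(input_dir)):
        if name.endswith(".html") and name not in seen:
            seen.add(name)
            new_files.append(os.path.join(input_dir, name))
    return new_files

# A polling loop (standing in for the watchdog Observer) would look like:
# seen = set()
# while True:
#     for path in find_new_html_files('/input', seen):
#         print(f"New HTML file detected: {path}")
#     time.sleep(1)
```

The watchdog version above is preferable in production because it reacts to filesystem events instead of polling, but the polling sketch is handy for understanding and unit-testing the flow.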
 
Add Elastic APM Integration
  1. Click Add integration on the policy where your agent is located for the intended APM server.
  2. In the search bar, search for "APM".
  3. Use Elastic APM in Fleet so the APM Server installation is managed by Fleet. Click "APM Integration".
  4. Click "Add Elastic APM".
  5. Specify the host and port of your APM server. By default, the APM server uses port 8200.
  6. Set the agent policy where you will place the APM Server integration. You can either create a new policy or use an existing policy with assigned hosts. Click "Save and deploy changes". You will see how many hosts/agents the APM Server will be installed on.
  7. Go back to Integrations > APM and click "Check APM Server status" to verify the installation status.
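Once the APM Server integration is deployed, the Python agent in the backend needs to know where to send its data. One common approach is to pass the Elastic APM agent's standard environment variables to the backend container through Docker Compose. A sketch (the hostname apm-server and the environment value are assumptions; point ELASTIC_APM_SERVER_URL at the host and port you configured in step 5):

```yaml
  backend:
    environment:
      ELASTIC_APM_SERVICE_NAME: backend                 # name shown in Kibana's APM UI
      ELASTIC_APM_SERVER_URL: http://apm-server:8200    # APM Server from step 5
      ELASTIC_APM_ENVIRONMENT: dev                      # assumed environment label
```

The agent reads these variables at startup, so no code changes are needed to repoint a service at a different APM Server.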