Basic log analytic system & ELK stack

First of all, we need to understand the basic log analytics system. It consists of four main stages:

  1. Generation
  2. Transport
  3. Storage
  4. Analysis

In the past, all of the components mentioned above were configured on a single computer: logs were generated locally and stored on the local hard drive. When an event occurred (an error, a performance issue, an audit, or an analytical need), developers manually downloaded the log files from the relevant server and analyzed them using the "grep" command. Nowadays, each of these components typically runs as a service on its own host.
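The manual workflow described above can be sketched in a few lines of Python. The file contents and log format here are illustrative assumptions, not any fixed standard:

```python
# A minimal sketch of the manual approach: after downloading a log file
# from a server, a developer filters it for lines of interest, much like
# running `grep ERROR app.log` on the command line.

def grep(pattern: str, lines):
    """Return the lines that contain the given pattern (a simple grep)."""
    return [line for line in lines if pattern in line]

# Example log lines (made-up timestamps and messages).
log_lines = [
    "2024-05-01 10:00:01 INFO  Service started",
    "2024-05-01 10:00:05 ERROR Database connection refused",
    "2024-05-01 10:00:09 INFO  Retrying connection",
    "2024-05-01 10:00:12 ERROR Timeout while connecting to database",
]

errors = grep("ERROR", log_lines)
```

This works for one server, but it is exactly the approach that stops scaling once logs come from many hosts, which motivates the centralized pipeline below.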

Log analysis attempts to build an overall picture from the data that a system generates. Creating such logs is known as logging. Log analysis is used for:

  • Debugging
  • Finding errors in the application
  • Auditing
  • Troubleshooting
  • Analyzing system performance
  • Analyzing online user behavior

A system generates a stream of messages in time sequence; these are written to log files and stored on disk, or they can be transferred to a log collector. Most system-generated syslog messages follow the logging standard defined by the Internet Engineering Task Force (IETF). Each syslog message contains the following details:

  • Facility code – identifies the sender application that is logging the message. The table below lists a few facility codes.
  • Severity – indicates the logging level of the message. The table below shows a few severities commonly used in log files.
  • Host/IP address – identifies the sender.
  • Timestamp – indicates when the message was generated.
  • Message – the message content that the system generated.
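As a concrete illustration of the facility and severity fields: in the IETF syslog format, both are packed into a single PRI value at the start of the message, where priority = facility × 8 + severity. A small Python sketch of decoding it (the sample message is illustrative):

```python
# Decode the <PRI> prefix of a syslog message back into its facility
# code and severity, using priority = facility * 8 + severity.

def decode_pri(message: str):
    """Extract (facility, severity) from a syslog message's <PRI> prefix."""
    end = message.index(">")          # PRI is the number inside <...>
    priority = int(message[1:end])
    return priority // 8, priority % 8

# "<34>" encodes facility 4 (security/auth) and severity 2 (critical).
facility, severity = decode_pri("<34>Oct 11 22:14:15 mymachine su: auth failure")
```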

ELK stack

The ELK stack, introduced by Elastic, combines three open-source tools: Elasticsearch, Logstash, and Kibana. Using the ELK stack addresses common issues in log analytics, such as:

  1. Correlating logs from various sources
  2. Handling a large volume of log files

In the ELK stack:

  • Elasticsearch – used to store, search, and index the information
  • Logstash – used to collect data from various sources
  • Kibana – used to present the collected data as information
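To make Elasticsearch's role concrete, a log event is typically shipped to it as a JSON document via its REST indexing API. The sketch below only builds such a document; the field names and the index name "app-logs" mentioned in the comments are illustrative assumptions, not a fixed schema:

```python
# Shape a log event into the JSON body that would be sent to
# Elasticsearch for indexing (e.g. POST /app-logs/_doc). Once indexed,
# every field becomes searchable, which is what Kibana queries against.
import json

def to_es_document(timestamp: str, host: str, severity: str, message: str) -> str:
    """Build a JSON document representing one log event."""
    return json.dumps({
        "@timestamp": timestamp,
        "host": host,
        "severity": severity,
        "message": message,
    })

doc = to_es_document("2024-05-01T10:00:05Z", "web-01", "ERROR",
                     "Database connection refused")
```

In a real pipeline, Logstash performs this transformation (parsing raw lines into structured fields) before forwarding the documents to Elasticsearch.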

The ELK stack is used in many use cases, such as:

  1. Free-text and structured search
  2. Data analytics
  3. Business intelligence
  4. Log analysis and visualization using Kibana

A family of log shippers called Beats is also included in the stack; they push log files to Logstash.
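Conceptually, a log shipper reads new lines from a log file and forwards each one as an event to a collector. The following is a simplified sketch of that idea only; the `send` callback is a stand-in, since real Beats forward events to Logstash over the network:

```python
# A toy log shipper: read log lines and forward each non-empty one as a
# minimal event dict to whatever sink the caller provides.

def ship(lines, send):
    """Forward each non-empty log line as an event; return the count shipped."""
    shipped = 0
    for line in lines:
        line = line.strip()
        if line:
            send({"message": line})
            shipped += 1
    return shipped

# Usage: collect events in a list instead of a network connection.
collected = []
count = ship(["error: disk full\n", "\n", "info: recovered\n"], collected.append)
```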

Implementation of the ELK stack
Basic log analytics system
