Log Management With ELK

Elasticsearch, Logstash and Kibana (ELK) is a log monitoring and management stack. It is a collection of three open-source products for log analysis and visualization. Log management helps developers, system administrators and DevOps engineers troubleshoot systems and make informed business decisions.

ELK is used for centralized logging when investigating problems with servers or applications. It allows us to search all logs in a single place and to correlate issues that appear in different files on different servers.

Log file

A log file is a file that records messages or events that occur in an operating system or application; the system writes these messages to the log file as they happen.

Logstash

Logstash is a log collector that accepts input from various sources across different environments. It collects data from log files, transforms it and ships it to Elasticsearch for indexing. A pipeline consists of three parts – Input, Filter and Output.

  • Input – collects data from sources such as log files
  • Filter – performs transformations and other actions on the data
  • Output – sends the processed data to Elasticsearch

The advantage of Logstash is that it offers a wide range of plugins for connecting to different input sources, and it provides centralized data processing.
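
As a rough sketch, a Logstash pipeline with all three parts might look like the configuration below. The log path, grok pattern and index name are assumptions for a typical application log and would need to be adapted.

    input {
      file {
        path => "/var/log/myapp/app.log"    # assumed location of the application log
        start_position => "beginning"
      }
    }

    filter {
      grok {
        # assumes lines such as: 2021-06-01 10:15:00 ERROR Connection timed out
        match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} %{GREEDYDATA:msg}" }
      }
    }

    output {
      elasticsearch {
        hosts => ["http://localhost:9200"]
        index => "myapp-logs"               # assumed index name
      }
    }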

Elasticsearch

Elasticsearch is a search and analytics engine built on the Apache Lucene library. It stores huge volumes of data for searching and analysis. Elasticsearch exposes a RESTful API and acts as a NoSQL store: data is kept as JSON documents rather than in tables. We can browse through the data as well as delete documents. Working with data in Elasticsearch involves three steps.

  • Indexing: Elasticsearch saves data in indices and uses these indices to store and retrieve documents.
  • Mapping: the process of defining the schema of an index.
  • Searching: data can be queried using filters and aggregations, as shown in the example requests after this list.
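
As an illustrative sketch, the three steps can be exercised with curl from a Unix-style shell against a local node (adjust the quoting on Windows); the index name app-logs and the fields used here are assumptions, not something prescribed by ELK.

    # Mapping: create the index with (part of) its schema
    curl -X PUT "http://localhost:9200/app-logs" -H "Content-Type: application/json" -d '{"mappings": {"properties": {"level": {"type": "keyword"}, "message": {"type": "text"}}}}'

    # Indexing: store a document in the app-logs index
    curl -X POST "http://localhost:9200/app-logs/_doc" -H "Content-Type: application/json" -d '{"level": "ERROR", "message": "connection timed out"}'

    # Searching: query the index with a filter
    curl -X GET "http://localhost:9200/app-logs/_search" -H "Content-Type: application/json" -d '{"query": {"match": {"level": "ERROR"}}}'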

Kibana:

Kibana is a data visualization and alerting tool that takes its input from Elasticsearch. It is used to visualize data through various charts and interactive dashboards.

Beats:

Beats are lightweight agents installed on servers to send operational data to Logstash. If we want to collect logs from many different servers, installing Logstash on each server is not a good solution because Logstash consumes a lot of resources. Instead, we can install Filebeat on each server to read the log files and forward the data to Logstash.

Setting Up the ELK Stack

Elasticsearch:

  1. Download and install the latest version of Elasticsearch
  2. Run elasticsearch.bat from the command prompt and access Elasticsearch at localhost:9200
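
To verify the node is running, we can query the root endpoint and the cluster health API; the version details in the response will vary with the installation.

    curl "http://localhost:9200"
    curl "http://localhost:9200/_cluster/health?pretty"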

Kibana:

  1. Download and install the latest version of Kibana
  2. Open the kibana.yml file and uncomment the line elasticsearch.url: "http://localhost:9200"
  3. Access Kibana at localhost:5601
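
For reference, the relevant kibana.yml settings look roughly like this; note that Kibana 7.x and later use elasticsearch.hosts instead of elasticsearch.url.

    server.port: 5601
    elasticsearch.url: "http://localhost:9200"
    # on Kibana 7.x and later:
    # elasticsearch.hosts: ["http://localhost:9200"]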

Logstash:

  1. Download and install the latest version of Logstash
  2. Start Logstash with the command logstash -f logstash.conf
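
A sketch of a logstash.conf for this setup could listen for events from Filebeat (configured in the next step) and forward them to Elasticsearch; port 5044 and the index name are typical choices here, not requirements.

    input {
      beats {
        port => 5044                        # port Filebeat sends events to
      }
    }

    output {
      elasticsearch {
        hosts => ["http://localhost:9200"]
        index => "app-logs"                 # assumed index name
      }
    }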

Filebeat:

  1. Download Filebeat and unzip it
  2. Open filebeat.yml and add the log file path under the paths setting
  3. Start Filebeat with the command filebeat.exe -c filebeat.yml
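
A minimal filebeat.yml for this setup could look like the following; the log path is an assumption and should point at the application's actual log files.

    filebeat.inputs:
      - type: log
        enabled: true
        paths:
          - C:\logs\myapp\*.log        # assumed log file location

    output.logstash:
      hosts: ["localhost:5044"]        # the Beats input opened in logstash.conf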

Monitoring Feature:

Elasticsearch runs on port 9200. To check the indices, we can use the following URL:
http://localhost:9200/_cat/indices
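
For example, adding the ?v flag makes the cat API print column headers along with each index's health, document count and size.

    curl "http://localhost:9200/_cat/indices?v"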

Kibana runs on port 5601.
In the dashboard, we can search for an error message in the search bar. The results show the matching errors together with the path of the log file in which each error appears, so the developer can check the error and resolve it.
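
For example, a Kibana Query Language (KQL) query such as the one below returns only error entries; the field names are assumptions that depend on how the logs were parsed, and recent Filebeat versions record the originating file in the log.file.path field.

    message : "error" and log.file.path : *myapp*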

Alerting and Actions:

In the Kibana server, some tasks run in the background to check conditions. These tasks are the alerts; when a condition is met, the alert performs some actions. An alert has three parts:

  • Condition: here we use “WHEN Max OF system.process.cpu.total.pct IS ABOVE 0.5”, so the alert checks whether the maximum total CPU usage of a process rises above 0.5 (50%).
  • Schedule: how often the condition is evaluated; here the alert is scheduled to check every 1 second.
  • Actions: what notifies us about a triggered alert. To add an action, we first select the action type; Kibana offers Slack, email and webhook as options. We used the email action with Gmail.

Note: Kibana alert actions such as email, Slack and webhook are not included in the Basic license. They require at least a Gold subscription.

Each action must first specify a connector, which can be created by clicking Add new in the connectors section.

Below are screenshots of editing alerts and connectors in Kibana.

Conclusion

Manually monitoring and analyzing large logs for multiple applications is not feasible. ELK provides a way to centralize log storage and to analyze a large number of logs efficiently using search criteria and the Kibana Query Language. With the ELK stack we can perform a wide range of searches and analyses on any data we want.

Contact for further details

Vasim Mujawar
Senior Specialist – Digital Enterprise Architecture
vasimm.in@mouritech.com
MOURI Tech
