Sunday, January 24, 2016

[ELK] An introduction to ELK Stack - Elasticsearch, Logstash, Kibana


... I had more than five servers running continuously on my computer when I was in college. My interest in PHP led me to install Apache, MySQL, and a mail server so that I could host multiple sites from my machine. Apart from these, there were FTP, VNC, etc. Whenever there was a problem with a server, I would go and check that server's logs. Even until two years ago, I was doing the same for other servers; by then the list had grown to include app servers for Liferay.

I used to check logs for each server individually, until I got to know about Splunk, a tool my client was using to see all of the logs in one place. It provided much more than just viewing: I could search logs within a defined time range, for example. It was collecting logs not from just one machine but from almost 20, and from multiple servers on some of them. Troubleshooting was easy this way. I was not logging in to 20 different machines to check logs and look for possible problems. Honestly, I had no access to most of them. But I had access to Splunk, so I could query the already-indexed logs and diagnose which server had problems.

But Splunk (with its full feature set) is not free.

Then what are the other options? Let me give you a hint in the image below..

Google (GOD) gave a hint : Look at ELK, once!

Out of curiosity, I jumped straight into learning about the ELK Stack rather than checking any other search results. Instantly — Google did say "once". :P

ELK = Elasticsearch, Logstash, Kibana.

The image below shows how this stack works.

The above is a very, very simplified representation of the ELK stack.

Elasticsearch - Indexes the data sent to it. The core of Elasticsearch is Lucene.

Logstash - It's a data pipeline that can read data from a number of sources. There are more than 200 plugins available for Logstash, classified into four categories: Input, Output, Filter, and Codec plugins.
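To make the four plugin categories concrete, here is a minimal sketch of a Logstash pipeline config with one plugin from each category (the file path and the grok pattern are just illustrative assumptions, not part of any setup described here):

```
input {
  file {                          # Input plugin: tail lines from a log file
    path  => "/var/log/syslog"    # hypothetical path, for illustration only
    codec => "plain"              # Codec plugin: decode each line as plain text
  }
}

filter {
  grok {                          # Filter plugin: parse raw text into structured fields
    match => { "message" => "%{SYSLOGLINE}" }
  }
}

output {
  elasticsearch {                 # Output plugin: index the events into Elasticsearch
    hosts => ["localhost:9200"]
  }
}
```

Every pipeline follows this same input → filter → output shape; the codec rides along inside an input or output.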

Kibana - Whatever Elasticsearch has indexed, Kibana gives you the opportunity to visualize in different forms. The data can be queried, listed, or drawn as charts.

This was just a very basic introduction to these components; they offer a lot more than this. We will learn more about them in future posts.

Until next time :)



Saturday, January 23, 2016

[ELK] Reading, Indexing and Visualizing Windows logs with ELK Stack

Hello Friends,

A system administrator knows how much system logs can help in troubleshooting critical problems. And what if those logs were indexed in one place and could be visualized as charts? Fun, isn't it?

Let's see what can help us set it up..

1. Nothing special needs to be done for Elasticsearch; just run it by executing elasticsearch.bat from the bin directory.

2. Prepare a config so that Logstash can read Windows logs. We need to add an input plugin to the config.

The eventlog plugin lets Logstash read Windows event logs. Windows logs are stored in a binary format and can be accessed only through the Win32 API. This plugin takes several configuration options, all of them optional: codec, add_field, logfile, interval, tags, and type. Of these, we are using logfile and type.

a) type has no default value and is used for filter activation. The given type is stored as part of each event, and we can search for events in Kibana using it.
b) logfile is an array of strings, with Application, Security, and System as its default values. In our config we use only System.
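Putting those two options together, the input section might look like the following sketch (the type label "Win32-EventLog" is an arbitrary choice of ours, not a requirement):

```
input {
  eventlog {
    # 'type' has no default; this label is stored on each event
    # and lets us filter/search for these events later in Kibana
    type    => "Win32-EventLog"
    # 'logfile' defaults to ["Application", "Security", "System"];
    # here we read only the System log
    logfile => ["System"]
  }
}
```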

3. Save this config to a custom file named logstash-windowslogs.conf in Logstash's conf directory, and run Logstash with that file.

where complete config file will be as -
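As a sketch (assuming a Logstash 2.x-era Windows install; the exact paths and script name depend on your setup), the run command could be:

```
bin\logstash.bat -f conf\logstash-windowslogs.conf
```

and the complete config file, input plus output, could look like:

```
input {
  eventlog {
    type    => "Win32-EventLog"   # our label, stored on each event
    logfile => ["System"]         # read only the System event log
  }
}

output {
  elasticsearch {
    # assumes Elasticsearch is running locally on its default port
    hosts => ["localhost:9200"]
  }
}
```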

4. We're done with the configuration of Logstash. Next is to read this data in Kibana. Here is our config to connect Kibana to Elasticsearch so it can read the indices.
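Kibana already points at a local Elasticsearch by default, so a sketch of the relevant kibana.yml settings (Kibana 4.x-style keys assumed) would be:

```
# config/kibana.yml (sketch; these values are the Kibana 4.x defaults)
server.port: 5601
elasticsearch.url: "http://localhost:9200"
```

If Elasticsearch runs on another host or port, only elasticsearch.url needs to change.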


5. As soon as we start Logstash, it will start collecting all the events and indexing them into Elasticsearch. Now let's run Kibana by executing kibana.bat in the bin directory.

6. Once Kibana has started, we can open its interface in a browser at http://localhost:5601/. We need to configure an index pattern. By default, it shows logstash-* in the "Index name or pattern" field and @timestamp in the "Time-field name" field. Hit Create, keeping the defaults.

7. Now click Discover in the top menu to see the results. It will show a histogram of event counts.

All the results are also listed below the histogram, but for security reasons, I have not shown them here..

That's it. Pretty clean, what do you say?