The ELK stack is a quite capable solution for event, log and data aggregation and parsing. It offers a very shiny yet highly flexible web frontend, and you can extend it to whatever limits you can think of. It is a perfect open source tool for end-to-end data analytics.

So what is ELK exactly? ELK is composed of three independent components.

Logstash (collector): A very comprehensive event collector and parser which works very well for syslog, SNMP and anything else that crawls in computers.

Elasticsearch (index data): A search engine based on Lucene. It provides a distributed, multitenant-capable full-text search engine with an HTTP web interface and schema-free JSON documents.

Kibana (visualize): An open source data visualization plugin for Elasticsearch. It provides visualization capabilities on top of the content indexed in an Elasticsearch cluster. Users can create bar, line and scatter plots, or pie charts and maps, on top of large volumes of data.

In case you do not have Java 8, please download and install it. I have installed Java 8 (jre-8u131-linux-x64.tar.gz) in /opt/sw/java for ELK; a quick sanity check is sketched right after the extraction step below.

Download ELK components
Logstash:
wget -c "https://artifacts.elastic.co/downloads/logstash/logstash-5.4.3.tar.gz"
Elasticsearch:
wget -c "https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-5.4.3.tar.gz"
Kibana:
wget -c "https://artifacts.elastic.co/downloads/kibana/kibana-5.4.3-linux-x86_64.tar.gz"
Explode archives
tar xvzf logstash-5.4.3.tar.gz
tar xvzf elasticsearch-5.4.3.tar.gz
tar xvzf kibana-5.4.3-linux-x86_64.tar.gz
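Before going further, it is worth confirming that the Java 8 install mentioned at the top is usable from /opt/sw/java. A minimal sanity check, assuming the JRE tarball was unpacked so that bin/java sits directly under /opt/sw/java (adjust the path if your extraction left a versioned subdirectory):
export JAVA_HOME=/opt/sw/java
$JAVA_HOME/bin/java -version   # should report a 1.8.0_131 (or later 8uXX) runtime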
Copy the logstash, elasticsearch and kibana software to /opt/sw/ and create symlinks as follows
ln -s /opt/sw/elasticsearch-5.4.3 /opt/sw/elasticsearch
ln -s /opt/sw/kibana-5.4.3-linux-x86_64/ /opt/sw/kibana
ln -s /opt/sw/logstash-5.4.3/ /opt/sw/logstash
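A quick check that the symlinks point where they should (the version strings are the ones downloaded above):
ls -ld /opt/sw/elasticsearch /opt/sw/kibana /opt/sw/logstash   # each should resolve to its versioned directory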
Add following users
adduser --home /opt/sw/elasticsearch elasticsearch
adduser --home /opt/sw/kibana kibana
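On Debian/Ubuntu, adduser will prompt for a password and GECOS details; since these are pure service accounts, a non-interactive variant (assuming Debian's adduser) works as well:
adduser --home /opt/sw/elasticsearch --disabled-password --gecos "" elasticsearch
adduser --home /opt/sw/kibana --disabled-password --gecos "" kibana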
Change ownership of /opt/sw/kibana and /opt/sw/elasticsearch
chown -R kibana:kibana /opt/sw/kibana-5.4.3-linux-x86_64/ /opt/sw/kibana
chown -R elasticsearch:elasticsearch /opt/sw/elasticsearch-5.4.3/ /opt/sw/elasticsearch
Create log directories
mkdir /var/log/{elasticsearch,kibana}
Change ownership of log directories
chown -R kibana:kibana /var/log/kibana/
chown -R elasticsearch:elasticsearch /var/log/elasticsearch/
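One small gap worth closing here: the rc.local lines further down point Logstash at /var/log/logstash with -l, but only the Elasticsearch and Kibana log directories are created above. Logstash is started as root in this setup, so simply creating the directory is enough:
mkdir /var/log/logstash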
Config Elasticsearch: add the following lines in /opt/sw/elasticsearch/config/elasticsearch.yml
cluster.name: some_name
path.logs: /var/log/elasticsearch
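Kibana's own config is left at its defaults here. Keep in mind that Kibana 5.x listens on localhost only by default, so if you intend to open http://your_ip:5601 from another machine as described at the end, you may also need to set server.host in /opt/sw/kibana/config/kibana.yml (the file referenced from rc.local below), for example:
server.host: "0.0.0.0"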
Config logstash:
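With a tarball install there is no /etc/logstash directory yet, so create the conf.d directory that the configs below (and the -f option in rc.local) refer to:
mkdir -p /etc/logstash/conf.d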
root@mka:~# cat /etc/logstash/conf.d/logstash-snmp.conf
input {
  snmptrap {
    type => "snmptrap"
    host => "0.0.0.0"
    port => 162
    yamlmibdir => "/opt/sw/logstash/vendor/bundle/jruby/1.9/gems/snmp-1.2.0/data/ruby/snmp/mibs"
  }
}
output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
  }
  stdout { codec => rubydebug }
}
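To check the trap listener end to end once Logstash is running, a test trap can be sent from the same host. This assumes the net-snmp command line tools are installed; the OIDs are just the standard NET-SNMP example heartbeat notification, and "public" matches the snmptrap input's default community:
snmptrap -v 2c -c public 127.0.0.1 '' 1.3.6.1.4.1.8072.2.3.0.1 1.3.6.1.4.1.8072.2.3.2.1 i 123456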
root@mka:~# cat /etc/logstash/conf.d/logstash-syslog.conf
input {
  tcp {
    port => 514
    type => syslog
  }
  udp {
    port => 514
    type => syslog
  }
}
filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program} %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    date {
      match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}
output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}
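Once Logstash is running, the syslog pipeline can be exercised with a test message aimed at the listener; this assumes the util-linux version of logger, which can send to a remote host and port:
logger -n 127.0.0.1 -P 514 -d "ELK syslog pipeline test"   # -d sends over UDP, -T would hit the TCP input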
Add the following lines to /etc/rc.local to start the ELK components at system startup
# ELK startup
export JAVA_HOME=/opt/sw/java/
/opt/sw/logstash/bin/logstash -f /etc/logstash/conf.d/ -l /var/log/logstash &
su - elasticsearch -c "export JAVA_HOME=/opt/sw/java/; /opt/sw/elasticsearch/bin/elasticsearch -v -Epath.conf=/opt/sw/elasticsearch/config/ &"
su - kibana -c "export JAVA_HOME=/opt/sw/java/; /opt/sw/kibana/bin/kibana -c /opt/sw/kibana/config/kibana.yml -l /var/log/kibana/kibana.log &"
# end
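On many distributions /etc/rc.local is only executed at boot if it is executable (and these lines must come before a trailing exit 0 if the file has one), so it may be worth a quick check:
chmod +x /etc/rc.local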
Once the services are started you will see ports 514 (tcp/udp) and 162 (udp) open on the system to accept remote syslog messages and SNMP traps
root@mka:~# netstat -nap | grep 514
tcp6 0 0 :::514 :::* LISTEN 22572/java
udp6 0 0 :::514 :::* 22572/java
root@mka:~# netstat -nap | grep 162
udp6 0 0 :::162 :::* 22572/java
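Elasticsearch (port 9200) and Kibana (port 5601) can be verified the same way, or more directly with curl against the Elasticsearch HTTP port used in the Logstash outputs above; the second command lists indices, where logstash-* entries should appear once events start flowing:
curl -s http://127.0.0.1:9200/                    # basic node/cluster info, should show your cluster.name
curl -s 'http://127.0.0.1:9200/_cat/indices?v'    # look for logstash-YYYY.MM.DD indices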
Now, point your web browser to http://your_ip:5601 and navigate to Settings. Here, select the default logstash-* index pattern and the time field “@timestamp”, then create the index. Now navigate to the Discover tab to watch events in Kibana. And after all the hard work, you get the following on a platter: