Logstash + Elasticsearch + Kibana presentation at the Startit tech meetup

Logstash + Elasticsearch + Kibana
Centralized Log Server (as a Splunk replacement)
Marko Ojleski, DevOps Engineer


Post on 26-Jan-2015





TRANSCRIPT

Logstash + Elasticsearch + Kibana Centralized Log server

(as Splunk replacement)

Marko Ojleski DevOps Engineer

$plunk

Business as usual, until…

#Outage @03:00AM

Check logs….?!?

10 network devices
40 servers
100 logs

Massive RAGE

tail cat

grep sed awk sort uniq

and looots of |

tail -10000 access_log | awk '{print $1}' | sort | uniq -c | sort -n

it’s just too much

1. collect data 2. parse/filter 3. send data
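These three stages map one-to-one onto the three sections of a Logstash config file. A minimal sketch (the file path is illustrative, not from the talk):

```conf
# Minimal pipeline: tail a file, apply no filters, print events.
input {
  file { path => "/var/log/app/app.log" }  # hypothetical path
}
filter {
  # parse/filter plugins go here
}
output {
  stdout { codec => rubydebug }  # pretty-print events while experimenting
}
```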

Logstash
Written in JRuby
Author: Jordan Sissel

input → parse/filter → output

1. collect data

30+ inputs

Logstash input

1. collect data

file syslog tcp udp zmq

redis log4j

Log shippers

Logstash Beaver (Python) Lumberjack (Go)

Woodchuck (Ruby) Nxlog (C)

Sample conf

input {
  tcp {
    type => "server1"
    host => "192.168.1.1"
    port => 5555
  }
}

2. parse/filter

40+ filters

Logstash filters

2. parse/filter

grok

grep

json xml

csv

geoip

mutate key/value
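A couple of the filters listed above in context, as a hedged sketch (field names are illustrative; `geoip` assumes an IP address already parsed into a `client` field):

```conf
filter {
  mutate {
    rename => { "host" => "source_host" }  # illustrative rename
    lowercase => [ "source_host" ]
  }
  geoip {
    source => "client"  # enriches the event with location data for this IP field
  }
}
```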

Grok filter

REGEX pattern collection

Grok filter

(?<![0-9])(?:(?:25[0-5]|2[0-4][0-9]|[0-1]?[0-9]{1,2})[.](?:25[0-5]|2[0-4][0-9]|[0-1]?[0-9]{1,2})[.](?:25[0-5]|2[0-4][0-9]|[0-1]?[0-9]{1,2})[.](?:25[0-5]|2[0-4][0-9]|[0-1]?[0-9]{1,2}))(?![0-9])

IP

`$=`;$_=\%!;($_)=/(.)/;$==++$|;($.,$/,$,,$\,$",$;,$^,$#,$~,$*,$:,@%)=( $!=~/(.)(.).(.)(.)(.)(.)..(.)(.)(.)..(.)......(.)/,$"),$=++;$.++;$.++;

$_++;$_++;($_,$\,$,)=($~.$"."$;$/$%[$?]$_$\$,$:$%[$?]",$"&$~,$#,);$,++ ;$,++;$^|=$";`$_$\$,$/$:$;$~$*$%[$?]$.$~$*${#}$%[$?]$;$\$"$^$~$*.>&$=`


Just another Perl hacker.

Grok filter

120+ regex patterns

USERNAME IP

HOSTNAME SYSLOGTIMESTAMP

LOGLEVEL etc…
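Several of these bundled patterns can be combined in a single grok match. A sketch for a syslog-style line (field names after the colons are my own, not from the talk):

```conf
filter {
  grok {
    match => { "message" => "%{SYSLOGTIMESTAMP:ts} %{HOSTNAME:host} %{LOGLEVEL:level} %{GREEDYDATA:msg}" }
  }
}
```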

Grok filter

2.10.146.54 - 2013-12-01T13:37:57Z - some really boring message


%{IP:client} - %{TIMESTAMP_ISO8601:time} - %{GREEDYDATA:message}

Grok filter

client => 2.10.146.54
time => 2013-12-01T13:37:57Z
message => some really boring message

Grok filter

input {
  tcp {
    type => "server1"
    host => "192.168.1.1"
    port => 5555
  }
}

filter {
  if [type] == "server1" {
    grok {
      match => { "message" => "%{IP:client} - %{TIMESTAMP_ISO8601:time} - %{GREEDYDATA:message}" }
    }
  }
}

3. send data

50+ outputs

Logstash output

3. send data

stdout

elastic redis mongo zmq tcp

statsd
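Outputs can be combined freely; a sketch using three of the plugins listed above (the statsd metric name is illustrative, and the syntax matches the Logstash 1.x era of this talk):

```conf
output {
  elasticsearch {}  # defaults to a local Elasticsearch node
  statsd {
    increment => "apache.response.%{response}"  # hypothetical counter per response code
  }
  stdout { codec => rubydebug }  # keep for debugging, drop in production
}
```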

1. RESTful API
2. JSON-oriented
3. Horizontal scale
4. HA (high availability)
5. Full-text search
6. Based on Lucene

Elasticsearch: distributed RESTful search server

Logstash => elasticsearch

input {
  tcp {
    type => "server1"
    host => "192.168.1.1"
    port => 5555
  }
}

filter {
  if [type] == "server1" {
    grok {
      match => { "message" => "%{IP:client} - %{TIMESTAMP_ISO8601:time} - %{GREEDYDATA:message}" }
    }
  }
}

output {
  elasticsearch {}
}

1. Clean and simple UI
2. Fully customizable
3. Bootstrap based
4. Old version running on Ruby
5. Milestone 3 fully rewritten in HTML/Angular.js

Kibana: Elasticsearch web frontend to search and graph

Real Life Scenarios

Scenario 1

L2 switch

Cisco ASA

L3 switch

Syslog broker (lightweight shipper)

Logstash (main log server)

Elasticsearch

Kibana

UDP

UDP

UDP
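The receiving side of Scenario 1 could look like this sketch: UDP listeners on the central Logstash for the syslog traffic from the switches and the ASA (ports and type names are illustrative):

```conf
input {
  udp {
    port => 5514
    type => "cisco-asa"  # firewall syslog over UDP
  }
  syslog {
    port => 5515
    type => "network"    # L2/L3 switch syslog, parsed as RFC3164
  }
}
```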

Scenario 2

Apache (lightweight shipper)

IIS (lightweight shipper)

Jboss (lightweight shipper)

Logstash (main log server)

Elasticsearch

Kibana

TCP

TCP

TCP
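For Scenario 2, the central Logstash would listen on TCP ports fed by the lightweight shippers on the Apache, IIS and JBoss hosts. A sketch (ports and type names are assumptions):

```conf
input {
  tcp {
    port => 5000
    type => "apache"  # events forwarded by the shipper on the web servers
  }
  tcp {
    port => 5001
    type => "jboss"   # application server logs
  }
}
```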