Log Analysis - Logstash 123 (Part 1)
In Part 1 of the "Logstash 123" series, let's talk about the INPUT plugin. For example:
input {
  file {
    type => "syslog"
    path => ["/var/log/syslog-la.log"]
  }
}
To meet the requirements of the scalable data collection architecture of Log Analysis, you will need to configure the INPUT plugin on both the Logstash Receiver and the Logstash Sender.
If you are not familiar with the Logstash Receiver, please refer here.
If you are not familiar with the Logstash Sender, please refer here.
In a nutshell, for Logstash Receiver, you will be (depending on your setup) receiving data from Log File Agent (LFA), syslog, or Filebeat.
As for Logstash Sender, you will be getting data from Kafka.
(1) INPUT section for Logstash Receiver
(a) Receiving data from Log File Agent (LFA)
## TCP input plugin for IBM Log File Agent (LFA)
tcp {
  port => 18989
  type => "lfa"
}
In the example above, Logstash will be listening on port 18989 for data sent by the LFA.
https://www.elastic.co/guide/en/logstash/2.2/plugins-inputs-tcp.html#plugins-inputs-tcp-port
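If you receive from more than one LFA source, a common pattern is to run a second tcp input on its own port so that each stream gets its own type for later routing. A minimal sketch; the port and type values here are hypothetical examples, not part of the reference setup:

```conf
input {
  ## Hypothetical second LFA stream on its own port
  tcp {
    host => "0.0.0.0"   # bind address (0.0.0.0 is also the default)
    port => 18990       # example value; any free port the LFA is pointed at
    type => "lfa-was"   # hypothetical type, useful in the filter/output sections
  }
}
```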
(b) Receiving data from a local syslog
## File input plugin for syslog data
file {
  type => "syslog"
  path => ["/var/log/syslog-la.log"]
}
In the example above, Logstash will read syslog data from the file at "/var/log/syslog-la.log".
https://www.elastic.co/guide/en/logstash/2.2/plugins-inputs-file.html
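Note that by default the file input tails the file, i.e. it starts reading at the end and only picks up new lines. If you also want the lines already present when Logstash starts, you can set start_position. A sketch; the sincedb_path value is just an example location:

```conf
input {
  file {
    type => "syslog"
    path => ["/var/log/syslog-la.log"]
    start_position => "beginning"                        # read existing content on first run
    sincedb_path => "/var/lib/logstash/sincedb-syslog"   # example path; stores read offsets
  }
}
```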
(c) Receiving data from Filebeat
## Beats input plugin for Filebeat data
beats {
  port => 18979
  #type => "filebeat-ip"
}
In the example above, Logstash will be listening on port 18979 for data sent by Filebeat.
https://www.elastic.co/guide/en/logstash/2.2/plugins-inputs-beats.html
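On the Filebeat side, the output must point at the Logstash Receiver's Beats port. A minimal filebeat.yml sketch, assuming Filebeat 1.x (contemporary with Logstash 2.2); the log paths and the IP address are placeholders:

```yaml
filebeat:
  prospectors:
    -
      paths:
        - /var/log/*.log                     # placeholder; list the files to ship
      input_type: log
output:
  logstash:
    hosts: ["LOGSTASH_RECEIVER_IP:18979"]    # must match the beats input port above
```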
(2) INPUT section for Logstash Sender
Since Logstash Sender can only pull data from Kafka, there is only one setup:
input {
  ## Kafka input plugin
  ## Create a copy for each topic in Kafka (data source in LA)
  kafka {
    ## Zookeeper host and port
    zk_connect => "IP_ADDRESS:17981"
    group_id => "PUNE_WAS_SystemOut"
    topic_id => "PUNE_WAS_SystemOut"
    consumer_threads => 4
    consumer_restart_on_error => true
    consumer_restart_sleep_ms => 100
    fetch_message_max_bytes => 500000
    queue_size => 2000
    auto_offset_reset => "smallest"
  }
}
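For completeness: the Sender can only consume what has been published to the topic, so the Logstash Receiver needs a matching Kafka output writing to the same topic name. A sketch using the Logstash 2.2 kafka output plugin; the broker address and port are placeholders (note the output talks to the Kafka broker directly, not to Zookeeper):

```conf
output {
  ## Hypothetical matching producer on the Logstash Receiver
  kafka {
    bootstrap_servers => "IP_ADDRESS:17911"   # Kafka broker host:port (example value)
    topic_id => "PUNE_WAS_SystemOut"          # must match the topic_id the Sender consumes
  }
}
```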
For more information about the individual parameters used, you can refer here.
If you are not familiar with Kafka, you can refer to the official documentation here.
That's all for now. Stay tuned for Part 2 of the "Logstash 123" series!
Subscribe and follow us for all the latest information directly on your social feeds:
Check out all our other posts and updates:
Academy Blogs: https://goo.gl/eZjStB
Academy Videos: https://goo.gl/kJeFZE
Academy Google+: https://goo.gl/HnTs0w
Academy Twitter: https://goo.gl/DiJbvD