1, Data collection engine: Logstash (listens on a port for incoming data and forwards it to the destination)
- a, start the listening port
- b, run the pipeline job
- c, forward the received data to the Elasticsearch store
cmd > cd logstash-7.12.0
cmd > logstash -e 'input { stdin { } } output { stdout {} }'
logstash -f first-pipeline.conf --config.reload.automatic
first-pipeline.conf is configured as follows:
input {
  beats {
    port => "5044"
  }
}
output {
  stdout {
    codec => rubydebug
  }
}
[2021-04-02T15:53:52,668][INFO ][logstash.javapipeline ][main] Pipeline Java execution initialization time {"seconds"=>0.51}
[2021-04-02T15:53:52,684][INFO ][logstash.inputs.beats ][main] Starting input listener {:address=>"0.0.0.0:5044"}
[2021-04-02T15:53:52,699][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2021-04-02T15:53:52,790][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2021-04-02T15:53:52,807][INFO ][org.logstash.beats.Server][main][d8a9d29cbbead0a1f85f978a5c1bdc16ef566a6df07141bc471f85a33ec1ddce] Starting server on port: 5044
[2021-04-02T15:53:52,966][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
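The `Starting input listener {:address=>"0.0.0.0:5044"}` line means Logstash is now accepting TCP connections from shippers on port 5044. As a rough illustration of that bind/listen/accept step only (a hypothetical sketch — the real Beats protocol is framed and batched, not a plain text stream):

```python
import socket
import threading

def start_listener(port: int) -> socket.socket:
    """Bind a TCP listener on all interfaces, like the beats input's 0.0.0.0:5044."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("0.0.0.0", port))
    srv.listen()
    return srv

def handle_one(srv: socket.socket, received: list) -> None:
    """Accept a single connection and collect everything it sends."""
    conn, _addr = srv.accept()
    with conn:
        data = b""
        while chunk := conn.recv(1024):
            data += chunk
        received.append(data.decode())

# Demo on an ephemeral port so the sketch runs anywhere (5044 may be in use).
srv = start_listener(0)
port = srv.getsockname()[1]
received = []
t = threading.Thread(target=handle_one, args=(srv, received))
t.start()

# Play the role of the shipper (Filebeat, in the real setup).
cli = socket.create_connection(("127.0.0.1", port))
cli.sendall(b"a log line from a shipper")
cli.close()
t.join()
srv.close()
print(received[0])
```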
2, Data source: Filebeat (ships log data to the listening port)
====1, D:/download/elk-stack/filebeat-7.11.2-windows-x86_64/README.md
./filebeat -c filebeat.yml -e
This will start Filebeat and send the data to your Elasticsearch
instance. To load the dashboards for Filebeat into Kibana, run:
./filebeat setup -e
====2, conf/filebeat.yml is configured as follows:
filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: true
  reload.period: 10s
filebeat.inputs:
- type: log
  paths:
    - D:\download\elk-stack\logstash-7.11.2\logstash-tutorial-dataset
output.logstash:
  hosts: ["localhost:5044"]
====3, After stopping Filebeat, delete the read-position registry so the whole file is re-read from the beginning:
rm data/registry
./filebeat -e -c filebeat.yml -d "publish"
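Filebeat records how far it has read each file in its registry (data/registry); deleting the registry resets the stored offsets, so the whole file is shipped again on the next run. A hypothetical sketch of that offset-tracking idea (the real registry is a set of JSON state files, not a dict, and Filebeat reads real files, not strings):

```python
import io

# filename -> byte offset already read, playing the role of data/registry
registry = {}

def tail_new_lines(name: str, content: str) -> list:
    """Read only the lines added since the recorded offset for this file."""
    f = io.StringIO(content)          # stand-in for opening the log file on disk
    f.seek(registry.get(name, 0))     # resume from the remembered position
    lines = f.readlines()
    registry[name] = f.tell()         # remember how far we got, like the registry
    return [l.rstrip("\n") for l in lines]

log = "line 1\nline 2\n"
print(tail_new_lines("app.log", log))   # first run: everything is new
log += "line 3\n"
print(tail_new_lines("app.log", log))   # second run: only the appended line

registry.clear()                        # == rm data/registry
print(tail_new_lines("app.log", log))   # whole file is read again from the start
```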
3, Data destination: Elasticsearch (indexes the received data)
- filebeat --> logstash --> elasticsearch
...
"clientip" => "86.1.76.62",
"input" => {
"type" => "log"
},
...
"clientip" => "86.1.76.62",
"input" => {
"type" => "log"
},
"httpversion" => "1.1",
"geoip" => {
"city_name" => "Burnley",
"timezone" => "Europe/London",
"longitude" => -2.2342,
"country_code3" => "GB",
"region_code" => "LAN",
logstash -f first-pipeline.conf --config.reload.automatic
first-pipeline.conf, updated with filters and an elasticsearch output:
input {
  beats {
    port => "5044"
  }
}
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  geoip {
    source => "clientip"
  }
}
output {
  elasticsearch {
    hosts => [ "localhost:9200" ]
    user => "elastic"
    password => "123456"
  }
}
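The grok filter's %{COMBINEDAPACHELOG} pattern splits each Apache combined-format line into named fields (clientip, response, httpversion, ...), which is where the clientip that geoip looks up comes from. A much-simplified Python regex stand-in for a few of those fields (the real grok pattern also captures ident, auth, referrer, and agent, and validates each piece more strictly):

```python
import re

# Hypothetical, simplified equivalent of %{COMBINEDAPACHELOG}
COMBINED = re.compile(
    r'(?P<clientip>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<verb>\S+) (?P<request>\S+) HTTP/(?P<httpversion>\S+)" '
    r'(?P<response>\d{3}) (?P<bytes>\d+|-)'
)

# A sample line in Apache combined log format
line = ('86.1.76.62 - - [04/Jan/2015:05:30:37 +0000] '
        '"GET /style2.css HTTP/1.1" 200 4877 '
        '"http://www.semicomplete.com/" "Mozilla/5.0"')

# groupdict() yields the named fields, like the fields grok adds to the event
event = COMBINED.match(line).groupdict()
print(event["clientip"], event["response"], event["httpversion"])
```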
[2021-04-02T16:49:02,269][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2021-04-02T16:49:02,347][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {
  :manage_template=>{
    "index_patterns"=>"logstash-*", "version"=>60001,
    "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1, "index.lifecycle.name"=>"logstash-policy", "index.lifecycle.rollover_alias"=>"logstash"},
    "mappings"=>{
      "dynamic_templates"=>[
        {"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}},
        {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}
      ],
      "properties"=>{
        "@timestamp"=>{"type"=>"date"},
        "@version"=>{"type"=>"keyword"},
        "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}
      }
    }
  }
}
[2021-04-02T16:49:02,375][INFO ][logstash.outputs.elasticsearch][main] Installing elasticsearch template to _template/logstash
[2021-04-02T16:49:02,519][INFO ][logstash.outputs.elasticsearch][main] Creating rollover alias <logstash-{now/d}-000001>
curl -u elastic:123456 "http://localhost:9200/_cat/indices?pretty&v=true"
curl -u elastic:123456 -XGET 'localhost:9200/logstash-2021.04.02-000001/_search?pretty&q=response=200'
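The same requests can be built programmatically; actually sending them requires the running cluster, so this hypothetical sketch stops at constructing the URL and the Authorization header value that `curl -u` derives:

```python
import base64
from urllib.parse import urlencode

ES = "http://localhost:9200"  # the cluster address used throughout these notes

def search_url(index: str, query: str) -> str:
    """Build the same _search URL the curl command uses (q= is a Lucene query string)."""
    qs = urlencode({"pretty": "", "q": query})
    return f"{ES}/{index}/_search?{qs}"

def basic_auth(user: str, password: str) -> str:
    """Value for the Authorization header that curl -u user:password sends."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return f"Basic {token}"

url = search_url("logstash-2021.04.02-000001", "response=200")
print(url)
print(basic_auth("elastic", "123456"))
```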