Setting up ELK

Things at the new company haven't been too busy, though I haven't had much spare time either. This week was supposed to be five working days but ended up being only three, which feels a bit like college again: one of the days off was a Friday, so together with the weekend that made three days in a row. I got a pile of movies from a friend; I haven't been to the cinema in ages and have no idea what's showing or already gone. I also downloaded Friends. English is a weak point of mine, and it's about time I studied it properly.

It's up and running now, but... hahaha, I still don't really know how to use it.

1: Overview

  • The three packages are all installed with default settings
  • Everything uses the default ports
  • The goal is just to get it running and see the result
  • The basic idea: Logstash reads the logs, stores them in ES, and Kibana reads them back out (sketched below)
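
Putting the pieces from the sections below together, the data path in this setup looks roughly like this (a rough sketch of the flow described in this post, not an official diagram):

    *.log files
        |   Filebeat tails the files and ships events to port 5044
        v
    Logstash  (grok / geoip filters)
        |   writes to port 9200
        v
    Elasticsearch
        |
        v
    Kibana  (web UI for searching and viewing the logs)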

2: Installing and Running Elasticsearch

  1. Download the tar.gz package and extract it (version 6.3)
  2. Run bin/elasticsearch (or bin\elasticsearch.bat on Windows)
  3. Open http://localhost:9200/ in a browser
  4. If output like the following appears, startup is complete
    {
      "name" : "8V2Xmlw",
      "cluster_name" : "elasticsearch",
      "cluster_uuid" : "fnWvBRwlRoSDJY-BOpYzkg",
      "version" : {
        "number" : "6.3.0",
        "build_flavor" : "default",
        "build_type" : "zip",
        "build_hash" : "424e937",
        "build_date" : "2018-06-11T23:38:03.357887Z",
        "build_snapshot" : false,
        "lucene_version" : "7.3.1",
        "minimum_wire_compatibility_version" : "5.6.0",
        "minimum_index_compatibility_version" : "5.0.0"
      },
      "tagline" : "You Know, for Search"
    }
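
The same check can also be done from a terminal. A minimal sketch using curl against the default port, with the cluster health endpoint as an extra sanity check:

    # Same check as the browser: should return the JSON shown above
    curl 'http://localhost:9200/'
    # Cluster health: a fresh single node should report "green"
    # (or "yellow" once indices with replicas have been created)
    curl 'http://localhost:9200/_cluster/health?pretty'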

3: Installing and Running Logstash

  1. Download the tar.gz package and extract it (version 6.2.2; version 6.3 wouldn't start for me)
  2. cd logstash-6.2.2
  3. bin/logstash (optional: just to check that it starts up normally)
  4. Create logstash-simple.conf in the root of the Logstash directory:

    # Input: receive events from Beats on port 5044
    input {
      beats {
        port => "5044"
      }
    }

    # Filter the data: parse the message with the Apache combined-log
    # grok pattern and add GeoIP information from the client IP
    filter {
      grok {
        match => { "message" => "%{COMBINEDAPACHELOG}" }
      }
      geoip {
        source => "clientip"
      }
    }

    # Output to the local ES instance
    output {
      elasticsearch {
        hosts => [ "localhost:9200" ]
      }
    }
  5. Start it with bin/logstash -f logstash-simple.conf --config.reload.automatic (it takes a little while before anything shows up)

  6. Output like this means it is basically working
    [2018-06-16T22:40:40,039][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/home/anthony/下载/logstash-6.2.2/modules/fb_apache/configuration"}
    [2018-06-16T22:40:40,106][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/home/anthony/下载/logstash-6.2.2/modules/netflow/configuration"}
    [2018-06-16T22:40:41,926][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
    [2018-06-16T22:40:43,839][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.2.2"}
    [2018-06-16T22:40:45,772][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
    [2018-06-16T22:40:52,182][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
    [2018-06-16T22:40:53,492][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
    [2018-06-16T22:40:53,511][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
    [2018-06-16T22:40:54,259][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
    [2018-06-16T22:40:54,442][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>nil}
    [2018-06-16T22:40:54,450][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
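
The log above also shows the Logstash API endpoint starting on port 9600, so you can confirm from another terminal that the pipeline is alive. A quick sketch, assuming the default port:

    # Basic node info from the Logstash monitoring API (port 9600, see the log above)
    curl 'http://localhost:9600/?pretty'
    # Node stats: the event counters here should rise once Filebeat starts sending data
    curl 'http://localhost:9600/_node/stats?pretty'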

4: Installing and Running Filebeat

Filebeat collects the target log files and forwards them to Logstash for further filtering; Logstash then outputs them to ES for storage.

  1. Download from https://www.elastic.co/downloads/beats/filebeat (version 6.3)
  2. Edit filebeat.yml
    Find the sections below and edit them to match. paths is the list of log file paths to watch. There are two output sections: comment out the Elasticsearch output and uncomment the Logstash output.

    - type: log
      # Change to true to enable this input configuration.
      enabled: true

      # Paths that should be crawled and fetched. Glob based paths.
      paths:
        - /home/anthony/桌面/mylog/*.log


    #-------------------------- Elasticsearch output ------------------------------
    # Comment this whole section out
    #output.elasticsearch:
      # Array of hosts to connect to.
      #hosts: ["localhost:9200"]

      # Optional protocol and basic auth credentials.
      #protocol: "https"
      #username: "elastic"
      #password: "changeme"

    #----------------------------- Logstash output --------------------------------
    # Uncomment this section
    output.logstash:
      # The Logstash hosts
      hosts: ["localhost:5044"]
  3. Run it:

    # Filebeat needs to be started as root, so change the owner of the config file first
    sudo chown root filebeat.yml
    sudo ./filebeat -e -c filebeat.yml -d "publish"
  4. A stream of output gets printed; if the last part ends with complete, it started successfully
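
To check the whole chain end to end, you can append a line in Apache combined log format (the format the %{COMBINEDAPACHELOG} grok pattern expects) to a file under the path configured in filebeat.yml above. The log line and the test.log file name are just made-up examples:

    # A fake Apache "combined" access-log entry for the grok filter to parse,
    # written into the directory Filebeat is watching
    echo '127.0.0.1 - - [16/Jun/2018:22:45:00 +0800] "GET /index.html HTTP/1.1" 200 1024 "-" "curl/7.58.0"' \
        >> /home/anthony/桌面/mylog/test.log

    # A few seconds later, an index created by Logstash should show up in ES
    curl 'http://localhost:9200/_cat/indices?v'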

5: Installing and Running Kibana

  1. Download and extract it (version 6.3)
  2. Run bin/kibana
  3. Open Kibana in a browser (http://localhost:5601 by default), choose Discover in the left navigation bar, and create an index pattern
     (in the second step of the wizard, select @timestamp as the time filter field)

  4. Click Discover again

  5. You can now browse the logs
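
A note on the index pattern in step 3: with the elasticsearch output in logstash-simple.conf left at its defaults, Logstash writes to daily indices named logstash-YYYY.MM.dd, so a pattern such as logstash-* should match them. You can confirm the exact index names first:

    # List the Logstash-created indices; the Kibana index pattern just needs to match these
    curl 'http://localhost:9200/_cat/indices/logstash-*?v'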