FileBeat: Resolving the Timestamp Conflict

Version: FileBeat 6.3


Problem

The project is configured to write its logs in JSON format so they can easily be imported into ELK. The generated format looks like this:

{
    "@timestamp":"2018-06-29T16:24:27.555+08:00",
    "severity":"INFO",
    "service":"osg-sender",
    "trace":"",
    "span":"",
    "parent":"",
    "exportable":"",
    "pid":"4620",
    "thread":"restartedMain",
    "class":"o.s.c.a.AnnotationConfigApplicationContext",
    "rest":"Refreshing org.springframework.context.annotation.AnnotationConfigApplicationContext@7a7df24c: startup date [Fri Jun 29 16:24:27 CST 2018]; root of context hierarchy"
}

Each line of the JSON file is then shipped straight into Elasticsearch by FileBeat. The problem is that FileBeat automatically generates its own @timestamp field, representing the time of import, and this clashes with the field of the same name already present in the log.

Solution

There are two approaches:

  1. Rename the @timestamp field in the log file to something else, such as logDate, so it no longer clashes with FileBeat. In that case a field definition for logDate, with type date, has to be added to FileBeat's fields.yml; otherwise the field will not be offered as a Time Filter when creating the index pattern in Kibana (a sketch of this addition follows the list).
  2. Leave the log content unchanged and modify FileBeat's fields.yml instead: rename its original @timestamp field and add a new @timestamp definition for the log to use.
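For the first approach, the fields.yml addition would look roughly like the sketch below. The name logDate and the date format are just the ones used in this example; on the log side, the commented-out <fieldName> element in the logback configuration further down is where the field would be renamed.

    - name: logDate
      type: date
      format: "yyyy-MM-dd'T'HH:mm:ss.SSSZZ"
      description: >
        Timestamp taken from the application's JSON log line.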

I went with the second approach and modified fields.yml:

- key: beat
  title: Beat
  description: >
    Contains common beat fields available in all event types.
  fields:

    - name: beat.name
      description: >
        The name of the Beat sending the log messages. If the Beat name is
        set in the configuration file, then that value is used. If it is not
        set, the hostname is used. To set the Beat name, use the `name`
        option in the configuration file.
    - name: beat.hostname
      description: >
        The hostname as returned by the operating system on which the Beat is
        running.
    - name: beat.timezone
      description: >
        The timezone as returned by the operating system on which the Beat is
        running.
    - name: beat.version
      description: >
        The version of the beat that generated this event.

    - name: "@timestamp-beat"
      type: date
      required: true
      format: date
      example: August 26th 2016, 12:35:53.332
      description: >
        The timestamp when the event log record was generated.

    - name: "@timestamp"
      type: date
      format: "yyyy-MM-dd'T'HH:mm:ss.SSSZZ"

The original @timestamp was renamed to @timestamp-beat, and a new @timestamp field was added with the date format spelled out.

With this in place, two time filter fields show up when creating the index pattern: one for the time the log record was generated and one for the time FileBeat imported it.
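One caveat: fields.yml only feeds the index template that FileBeat loads into Elasticsearch, so the change does not reach a template (or index) that already exists. A minimal filebeat.yml sketch to push the modified template on the next start, assuming the default template settings (existing indices keep their old mapping, so the new fields only appear in newly created indices):

# Overwrite the existing filebeat template in Elasticsearch on startup
setup.template.overwrite: true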



Configuration

Here are the project's logging configuration and the FileBeat configuration.
logback-spring.xml

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <include resource="org/springframework/boot/logging/logback/defaults.xml"/>

  <springProperty scope="context" name="springAppName" source="spring.application.name"/>
  <!-- Example for logging into the build folder of your project -->
  <property name="LOG_FILE" value="${BUILD_FOLDER:-log}/${springAppName}"/>?

  <!-- You can override this to have a custom pattern -->
  <property name="CONSOLE_LOG_PATTERN" value="%clr(%d{yyyy-MM-dd HH:mm:ss.SSS}){faint} %clr(${LOG_LEVEL_PATTERN:-%5p}) %clr(${PID:- }){magenta} %clr(---){faint} %clr([%15.15t]){faint} %clr(%-40.40logger{39}){cyan} %clr(:){faint} %m%n${LOG_EXCEPTION_CONVERSION_WORD:-%wEx}"/>

  <!-- Appender to log to console -->
  <appender name="console" class="ch.qos.logback.core.ConsoleAppender">
    <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
      <!-- Minimum logging level to be presented in the console logs-->
      <level>DEBUG</level>
    </filter>
    <encoder>
      <pattern>${CONSOLE_LOG_PATTERN}</pattern>
      <charset>utf8</charset>
    </encoder>
  </appender>

  <!-- Appender to log to file -->
  <appender name="flatfile" class="ch.qos.logback.core.rolling.RollingFileAppender">
    <file>${LOG_FILE}</file>
    <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
      <fileNamePattern>${LOG_FILE}.%d{yyyy-MM-dd}.gz</fileNamePattern>
      <maxHistory>90</maxHistory>
    </rollingPolicy>
    <encoder>
      <pattern>${CONSOLE_LOG_PATTERN}</pattern>
      <charset>utf8</charset>
    </encoder>
  </appender>

  <!-- Appender to log to file in a JSON format -->
  <appender name="logstash" class="ch.qos.logback.core.rolling.RollingFileAppender">
    <file>${LOG_FILE}.json</file>
    <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
      <fileNamePattern>${LOG_FILE}.json.%d{yyyy-MM-dd}</fileNamePattern>
      <maxHistory>90</maxHistory>
    </rollingPolicy>
    <encoder class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
      <providers>
        <timestamp>
          <!--<fieldName>logDate</fieldName>-->
          <!--<pattern>yyyy-MM-dd HH:mm:ss.SSS</pattern>-->
        </timestamp>
        <pattern>
          <pattern>
            {
            "severity": "%level",
            "service": "${springAppName:-}",
            "trace": "%X{X-B3-TraceId:-}",
            "span": "%X{X-B3-SpanId:-}",
            "parent": "%X{X-B3-ParentSpanId:-}",
            "exportable": "%X{X-Span-Export:-}",
            "pid": "${PID:-}",
            "thread": "%thread",
            "class": "%logger{40}",
            "rest": "%message"
            }
          </pattern>
        </pattern>
        <stackTrace>
          <throwableConverter class="net.logstash.logback.stacktrace.ShortenedThrowableConverter">
            <maxDepthPerThrowable>30</maxDepthPerThrowable>
            <maxLength>2048</maxLength>
            <shortenedClassNameLength>20</shortenedClassNameLength>
            <exclude>^sun\.reflect\..*\.invoke</exclude>
            <exclude>^net\.sf\.cglib\.proxy\.MethodProxy\.invoke</exclude>
            <rootCauseFirst>true</rootCauseFirst>
          </throwableConverter>
        </stackTrace>
      </providers>
    </encoder>
  </appender>

  <root level="INFO">
    <appender-ref ref="console"/>
    <!-- uncomment this to have also JSON logs -->
    <appender-ref ref="logstash"/>
    <appender-ref ref="flatfile"/>
  </root>
</configuration>

filebeat.yml

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /root/ogter/build/*.json.*
    - /root/ogter/log/*.json
  exclude_files: ['.gz$']
  # Decode each line as a JSON object and put the decoded keys at the top level of the event
  json.keys_under_root: true
  # On a key conflict (such as @timestamp), prefer the value decoded from the log line
  json.overwrite_keys: true
  
output.elasticsearch:
  hosts: ["192.168.1.17:9200"]
