1. Background
Once Sleuth is enabled we get trace logs for every request, but those logs are stored on the file system of each individual microservice, which makes querying and analysis inconvenient. ELK solves this by collecting the logs into one central, searchable store.
2. ELK Components
- ElasticSearch: a distributed search engine for fast search and retrieval
- Logstash: collects and filters logs, then writes them to a unified store
- Kibana: provides a web UI for querying
3. Logging Architecture
Every microservice pushes the logs it produces into a queue; Logstash consumes the queue, filters the entries, and outputs them to ElasticSearch; finally, Kibana displays them.
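Conceptually the pipeline can be sketched as follows (a minimal Python illustration that uses an in-memory list as the queue and another list as the index; the helper names and the filter rule are hypothetical, not part of the real stack):

```python
import json

queue = []  # stands in for the Redis list the microservices push to
index = []  # stands in for the Elasticsearch index

def app_log(source, level, message):
    """Microservice side: the logback appender pushes a JSON entry onto the queue."""
    queue.append(json.dumps({"source": source, "level": level, "message": message}))

def logstash_cycle():
    """Logstash side: pop entries, decode, filter, and ship them to the index."""
    while queue:
        entry = json.loads(queue.pop(0))
        if entry["level"] != "DEBUG":  # an example filter step
            index.append(entry)

app_log("boot-sleuth", "INFO", "request handled")
logstash_cycle()
print(index[0]["message"])  # -> request handled
```

Kibana then corresponds to querying `index`; the real components add persistence, batching, and a query UI on top of this same flow.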
4. Setup
4.1 Unified log output
Route the logs produced by every microservice into a single queue so that Logstash can consume them later. Redis is used as the queue here, since a Redis input is a standard option for Logstash.
4.1.1 pom dependency
<dependency>
    <groupId>com.cwbase</groupId>
    <artifactId>logback-redis-appender</artifactId>
    <version>1.1.5</version>
</dependency>
4.1.2 logback.xml configuration
Standardize the log output format and point the logs at the storage queue; Redis is used here.
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
        <Target>System.out</Target>
        <encoder>
            <pattern>[%d{HH:mm:ss}][%t][%p][%c]-%m%n</pattern>
        </encoder>
        <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
            <level>info</level>
        </filter>
    </appender>
    <appender name="LOGSTASH" class="com.cwbase.logback.RedisAppender">
        <source>boot-sleuth</source>
        <type>dev</type>
        <host>127.0.0.1</host>
        <key>logstash:redis</key>
        <tags>dev</tags>
        <mdc>true</mdc>
        <location>true</location>
        <callerStackIndex>0</callerStackIndex>
    </appender>
    <root level="info">
        <appender-ref ref="CONSOLE"/>
        <appender-ref ref="LOGSTASH"/>
    </root>
</configuration>
4.1.3 Viewing the logs in Redis
127.0.0.1:6379> keys *
1) "logstash:redis"
127.0.0.1:6379> type logstash:redis
list
127.0.0.1:6379> LRANGE logstash:redis 0 1
1) "{\"source\":\"boot-sleuth\",\"host\":\"PC-201808271158\",\"path\":null,\"type\":\"dev\",\"tags\":[\"dev\"],\"message\":\"Refreshing org.springframework.context.annotation.AnnotationConfigApplicationContext@7d20d0b: startup date [Sat Apr 11 23:38:26 CST 2020]; root of context hierarchy\",\"@timestamp\":\"2020-04-11T23:38:26.802+0800\",\"logger\":\"org.springframework.context.annotation.AnnotationConfigApplicationContext\",\"level\":\"INFO\",\"thread\":\"main\",\"level\":\"INFO\",\"location\":{\"class\":\"org.springframework.context.support.AbstractApplicationContext\",\"method\":\"prepareRefresh\",\"file\":\"AbstractApplicationContext.java\",\"line\":\"588\"}}"
2) "{\"source\":\"boot-sleuth\",\"host\":\"PC-201808271158\",\"path\":null,\"type\":\"dev\",\"tags\":[\"dev\"],\"message\":\"JSR-330 'javax.inject.Inject' annotation found and supported for autowiring\",\"@timestamp\":\"2020-04-11T23:38:27.580+0800\",\"logger\":\"org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor\",\"level\":\"INFO\",\"thread\":\"main\",\"level\":\"INFO\",\"location\":{\"class\":\"org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor\",\"method\":\"<init>\",\"file\":\"AutowiredAnnotationBeanPostProcessor.java\",\"line\":\"153\"}}"
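Each list entry is a self-describing JSON document. As a quick sketch, the second entry above can be decoded with Python's standard json module (the string is trimmed to the fields of interest):

```python
import json

# One entry popped from the "logstash:redis" list (trimmed to key fields)
raw = '''{"source":"boot-sleuth","host":"PC-201808271158","type":"dev","tags":["dev"],"message":"JSR-330 'javax.inject.Inject' annotation found and supported for autowiring","@timestamp":"2020-04-11T23:38:27.580+0800","level":"INFO","thread":"main"}'''

entry = json.loads(raw)
print(entry["source"], entry["level"])  # -> boot-sleuth INFO
print(entry["@timestamp"])             # -> 2020-04-11T23:38:27.580+0800
```

These fields (source, tags, @timestamp, level, location, and any MDC entries) are exactly what Logstash will forward to Elasticsearch, so they become the searchable fields in Kibana.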
4.2 Installing ElasticSearch
4.2.1 Download path
4.2.2 Configure the environment variable
D:\elasticsearch-7.6.2\bin
4.2.3 Start
elasticsearch.bat
4.3 Installing Kibana
4.3.1 Download path
4.3.2 Configure the environment variable
D:\kibana-7.6.2\bin
4.3.3 Start
kibana.bat
4.4 Installing Logstash
4.4.1 Download path
4.4.2 Configure the environment variable
D:\logstash-7.6.2\bin
4.4.3 Edit the configuration
Create a new file named logstash.conf under D:\logstash-7.6.2\config
input {
    redis {
        data_type => "list"     # Redis storage type
        type => "redis-input"
        key => "logstash:redis" # must match the <key> configured in the Spring Boot logback.xml
        host => "127.0.0.1"
        port => 6379
        threads => 5            # number of consumer threads
        codec => "json"
    }
}
output {
    elasticsearch {
        hosts => ["localhost:9200"]
        index => "springboot-elk" # name of the Elasticsearch index the filtered logs are stored under
    }
    stdout { codec => rubydebug } # also print events to the console
}
4.4.4 Start
logstash -f D:\logstash-7.6.2\config\logstash.conf
4.5 Configuring the Kibana UI
- Open http://localhost:5601
- On the Management tab, add an Index Pattern matching the index name (springboot-elk)
- On the Discover tab, browse the log entries
- From there you can search for the logs of a particular TraceId or SpanId and follow the whole call chain
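Because the appender is configured with mdc=true, Sleuth's MDC entries become top-level JSON fields, so trace lookup reduces to a simple field filter. Below is a hypothetical sketch of what such a TraceId search does over a batch of decoded entries (the traceId/spanId field names and sample values are assumptions; verify the actual field names against your own documents in Discover):

```python
# Decoded log entries as they might look in the index; the traceId/spanId
# field names and values here are illustrative assumptions.
entries = [
    {"message": "order received",  "traceId": "a1b2", "spanId": "a1b2"},
    {"message": "calling payment", "traceId": "a1b2", "spanId": "c3d4"},
    {"message": "unrelated job",   "traceId": "e5f6", "spanId": "e5f6"},
]

def spans_of(trace_id, docs):
    """Return every entry that belongs to one request's call chain."""
    return [d for d in docs if d["traceId"] == trace_id]

chain = spans_of("a1b2", entries)
print([d["message"] for d in chain])  # -> ['order received', 'calling payment']
```

In Kibana the equivalent is a query such as `traceId:"a1b2"` in the Discover search bar, which returns every span of that request across all microservices.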