It has been a while since my last post, so today I'm putting out a quick hands-on article to share some lessons learned; more in-depth pieces will follow.
Spring Boot + Logstash + Elasticsearch + Kibana
Versions
- Elasticsearch 7.4.2
- Logstash 7.4.2
- Spring Boot 2.1.10
Downloads
Pick the product and version you need from the past-releases page and download it:
https://www.elastic.co/cn/downloads/past-releases
Deployment
Starting Elasticsearch
- Edit config/elasticsearch.yml:
```yaml
cluster.name: my-application
node.name: node-1
path.data: /cxt/software/maces/7.4.2/elasticsearch-7.4.2/data
path.logs: /cxt/software/maces/7.4.2/elasticsearch-7.4.2/logs
network.host: 0.0.0.0
http.port: 9200
discovery.seed_hosts: ["127.0.0.1"]
cluster.initial_master_nodes: ["node-1"]
```
- Start it:
```shell
bin/elasticsearch
```
Starting Kibana
- Start it straight from the bin directory; everything runs on the local machine, so no configuration changes are needed:
```shell
bin/kibana
```
Starting Logstash
- Create springboot-log.conf under the config folder. This pipeline listens on port 9600 of this machine, so the Spring Boot application can send logs directly to 9600; input defines the log source and output ships the logs to Elasticsearch.
```conf
input {
  # Listen on TCP port 9600 for newline-delimited JSON events
  tcp {
    mode => "server"
    host => "0.0.0.0"
    port => 9600
    codec => json_lines
  }
}
output {
  elasticsearch {
    hosts => ["192.168.123.166:9200"]
    index => "springboot-logstash-%{+YYYY.MM.dd}"
  }
  # Uncomment to also print events to the console
  # stdout {
  #   codec => rubydebug
  # }
}
```
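For context, the json_lines codec simply expects one JSON object per line over the TCP connection, which is the framing logstash-logback-encoder produces. As an illustration only (this is not the real encoder, and the field names here are made up), a minimal Java sketch of such a client:

```java
import java.io.OutputStream;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

public class JsonLinesSketch {

    // One event = one JSON object terminated by a newline (the json_lines framing).
    static String buildEvent(String level, String message) {
        return "{\"level\":\"" + level + "\",\"message\":\"" + message + "\"}\n";
    }

    public static void main(String[] args) throws Exception {
        String event = buildEvent("INFO", "hello from a raw TCP client");
        System.out.print(event);
        // Pass the Logstash host (e.g. 192.168.123.166) as an argument to actually send it.
        if (args.length > 0) {
            try (Socket socket = new Socket(args[0], 9600)) {
                OutputStream out = socket.getOutputStream();
                out.write(event.getBytes(StandardCharsets.UTF_8));
                out.flush();
            }
        }
    }
}
```

Anything that writes lines in this shape to port 9600 will show up in Elasticsearch, which is handy for smoke-testing the pipeline before wiring up the application.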
- Start it:
```shell
bin/logstash -f config/springboot-log.conf
```
Starting the Spring Boot application
- Add the encoder dependency to pom.xml:
```xml
<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
    <version>7.0</version>
</dependency>
```
- A TestController with a test method:
```java
@RestController
public class TestController {

    public static final Logger log = LoggerFactory.getLogger(TestController.class);

    @RequestMapping("/test")
    public String test() {
        log.info("this is a log from springboot");
        log.trace("this is a trace log");
        return "success";
    }
}
```
- In the main class, add a task that generates log messages automatically:
```java
@SpringBootApplication
public class ElkApplication {

    public static final Logger log = LoggerFactory.getLogger(ElkApplication.class);

    Random random = new Random(10000);

    public static void main(String[] args) {
        SpringApplication.run(ElkApplication.class, args);
        new ElkApplication().initTask();
    }

    // Emit a log line every 100 ms so there is a steady stream of events
    private void initTask() {
        Executors.newSingleThreadScheduledExecutor().scheduleAtFixedRate(new Runnable() {
            @Override
            public void run() {
                log.info("seed info msg :" + random.nextInt(999999));
            }
        }, 100, 100, TimeUnit.MILLISECONDS);
    }
}
```
- Create logback-spring.xml under resources:
```xml
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <include resource="org/springframework/boot/logging/logback/base.xml"/>
    <appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
        <!-- Logstash server address -->
        <destination>192.168.123.166:9600</destination>
        <!-- Log output encoding -->
        <encoder charset="UTF-8" class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
            <providers>
                <timestamp>
                    <timeZone>UTC</timeZone>
                </timestamp>
                <pattern>
                    <pattern>
                        {
                        "logLevel": "%level",
                        "serviceName": "${springAppName:-}",
                        "pid": "${PID:-}",
                        "thread": "%thread",
                        "class": "%logger{40}",
                        "detail": "%message"
                        }
                    </pattern>
                </pattern>
            </providers>
        </encoder>
    </appender>
    <root level="INFO">
        <appender-ref ref="LOGSTASH"/>
        <appender-ref ref="CONSOLE"/>
    </root>
</configuration>
```
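The pattern provider above flattens each event into one JSON document with exactly these six fields, which is what you will later search on in Kibana. A hedged sketch of the resulting shape (all values illustrative, and the JSON assembly here is a toy, not what the encoder actually runs):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class LogEventShape {

    // Join key/value pairs into a flat JSON object, preserving insertion order.
    static String toJson(Map<String, String> fields) {
        StringBuilder sb = new StringBuilder("{");
        for (Map.Entry<String, String> e : fields.entrySet()) {
            if (sb.length() > 1) sb.append(",");
            sb.append("\"").append(e.getKey()).append("\":\"").append(e.getValue()).append("\"");
        }
        return sb.append("}").toString();
    }

    public static void main(String[] args) {
        Map<String, String> fields = new LinkedHashMap<>();
        fields.put("logLevel", "INFO");                        // %level
        fields.put("serviceName", "springboot-demo");          // ${springAppName:-} (illustrative)
        fields.put("pid", "12345");                            // ${PID:-} (illustrative)
        fields.put("thread", "http-nio-8080-exec-1");          // %thread
        fields.put("class", "com.example.TestController");     // %logger{40} (illustrative)
        fields.put("detail", "this is a log from springboot"); // %message
        System.out.println(toJson(fields));
    }
}
```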
Verification
- After starting everything in order, open the es-head plugin to view the index; it contains a record for the interface call as well as the application startup messages.
- Displaying the data in Kibana
- Set up an index pattern; after entering it, pick the timestamp field to match.
- To browse the data, go to Discover.
Adding Filebeat
- On the Logstash side, create filebeat-logstash-log.conf (it reuses port 9600, so run it instead of springboot-log.conf, not alongside it):
```conf
input {
  beats {
    host => "192.168.123.166"
    port => 9600
  }
}
output {
  elasticsearch {
    hosts => ["192.168.123.166:9200"]
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}
```
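Both pipelines suffix the index name with %{+YYYY.MM.dd}, so Logstash writes to a fresh index every day. Assuming the beat metadata resolves to filebeat and 7.4.2 (illustrative for this setup), the naming can be sketched as:

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

public class DailyIndexName {

    // Mirror Logstash's %{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd} naming.
    static String indexFor(String beat, String version, LocalDate date) {
        return beat + "-" + version + "-" + date.format(DateTimeFormatter.ofPattern("yyyy.MM.dd"));
    }

    public static void main(String[] args) {
        System.out.println(indexFor("filebeat", "7.4.2", LocalDate.of(2022, 6, 4)));
        // filebeat-7.4.2-2022.06.04
    }
}
```

Per-day indices are what make the wildcard index patterns in Kibana (e.g. filebeat-*) match a growing series of daily indices.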
- Start it:
```shell
bin/logstash -f filebeat-logstash-log.conf
```
- Edit the Filebeat configuration; just change the places shown below, mainly the log file to watch and the address of the Logstash server to ship to:
```yaml
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /cxt/codework/java/springboot-demo/logs/springboot-elk/2022-06-04/info.2022-06-04.0.log

setup.kibana:
  host: "192.168.123.166:5601"

#----------------------------- Logstash output --------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["192.168.123.166:9600"]
```
- Configure where the Spring Boot application writes its log files; create logback-spring-file.xml under resources:
```xml
<?xml version="1.0" encoding="UTF-8"?>
<configuration debug="false" scan="false">
    <!-- Log file path -->
    <property name="log.path" value="logs/springboot-elk"/>

    <!-- Console log output -->
    <appender name="console" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <pattern>%d{MM-dd HH:mm:ss.SSS} %-5level [%logger{50}] - %msg%n</pattern>
        </encoder>
    </appender>

    <!-- Log file info output -->
    <appender name="fileRolling_info" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
            <fileNamePattern>${log.path}/%d{yyyy-MM-dd}/info.%d{yyyy-MM-dd}.%i.log</fileNamePattern>
            <timeBasedFileNamingAndTriggeringPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedFNATP">
                <maxFileSize>50MB</maxFileSize>
            </timeBasedFileNamingAndTriggeringPolicy>
        </rollingPolicy>
        <encoder>
            <pattern>%date [%thread] %-5level [%logger{50}] %file:%line - %msg%n</pattern>
        </encoder>
        <!--<filter class="ch.qos.logback.classic.filter.LevelFilter">
            <level>ERROR</level>
            <onMatch>DENY</onMatch>
            <onMismatch>NEUTRAL</onMismatch>
        </filter>-->
    </appender>

    <!-- Log file error output -->
    <appender name="fileRolling_error" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
            <fileNamePattern>${log.path}/%d{yyyy-MM-dd}/error.%d{yyyy-MM-dd}.%i.log</fileNamePattern>
            <timeBasedFileNamingAndTriggeringPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedFNATP">
                <maxFileSize>50MB</maxFileSize>
            </timeBasedFileNamingAndTriggeringPolicy>
        </rollingPolicy>
        <encoder>
            <pattern>%date [%thread] %-5level [%logger{50}] %file:%line - %msg%n</pattern>
        </encoder>
        <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
            <level>ERROR</level>
        </filter>
    </appender>

    <!-- Level: FATAL 0  ERROR 3  WARN 4  INFO 6  DEBUG 7 -->
    <root level="info">
        <!--{dev.start}-->
        <appender-ref ref="console"/>
        <!--{dev.end}-->
        <!--{alpha.start}
        <appender-ref ref="fileRolling_info"/>
        {alpha.end}-->
        <!--{release.start}-->
        <appender-ref ref="fileRolling_info"/>
        <!--{release.end}-->
        <appender-ref ref="fileRolling_error"/>
    </root>

    <!-- Framework level setting -->
    <!-- <include resource="config/logger-core.xml"/> -->
    <!-- Project level setting -->
    <!-- <logger name="your.package" level="DEBUG"/> -->
    <logger name="org.springframework" level="INFO"/>
    <logger name="org.mybatis" level="INFO"/>
</configuration>
```
- Point logging.config at logback-spring-file.xml in application.yml:
```yaml
logging:
  # The default logback-spring.xml ships logs to ES via Logstash;
  # switch to logback-spring-file.xml to write logs to rolling files watched by Filebeat
  config: classpath:logback-spring-file.xml
```
That's the whole flow. We set up the ELK stack end to end and, along the way, also used Filebeat to watch log files. That's roughly it; it's been a while since I last wrote, but more in-depth, practice-oriented articles are on the way. Follow the WeChat public account 《醉鱼 JAVA》 to learn together.