Real-time log monitoring with Flume + Kafka

一、Prerequisites

1、ZooKeeper

2、Kafka

3、Flume

4、Any application that writes log output

二、Requirements

Use Flume's exec source to monitor a program's log output in real time and forward it to Kafka as it is written.

三、Steps

1、Create kafka-logger.conf under the conf directory

agent.sources = s1
agent.channels = c1
agent.sinks = k1

agent.sources.s1.type=exec
agent.sources.s1.command=tail -F /home/xx/xx/log/DataToCache.log

agent.channels.c1.type=memory
agent.channels.c1.capacity=10000
agent.channels.c1.transactionCapacity=100

# Kafka sink
agent.sinks.k1.type=org.apache.flume.sink.kafka.KafkaSink
# Kafka broker addresses and ports
agent.sinks.k1.brokerList=192.168.230.21:6667,192.168.230.22:6667,192.168.230.23:6667
# Kafka topic
agent.sinks.k1.topic=flume_kafka
# Serializer
agent.sinks.k1.serializer.class=kafka.serializer.StringEncoder

agent.sources.s1.channels=c1
agent.sinks.k1.channel=c1
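
The brokerList, topic, and serializer.class properties used above are the legacy names accepted by older Flume releases. If you run Flume 1.7 or later, the Kafka sink expects the kafka.* property names instead; a minimal equivalent sketch, reusing the same brokers and topic:

agent.sinks.k1.type=org.apache.flume.sink.kafka.KafkaSink
agent.sinks.k1.kafka.bootstrap.servers=192.168.230.21:6667,192.168.230.22:6667,192.168.230.23:6667
agent.sinks.k1.kafka.topic=flume_kafka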

2、Start the Flume agent

[root@master bin]# ./flume-ng agent --conf conf --conf-file ../conf/kafka-logger.conf --name agent -Dflume.root.logger=INFO,console
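
Before starting a consumer, make sure the flume_kafka topic exists (Kafka can also auto-create it if auto.create.topics.enable is on for the brokers). A hedged example using Kafka's own CLI; the partition and replication counts are placeholders to adjust for your cluster:

[root@master bin]# ./kafka-topics.sh --create --zookeeper master:2181 --replication-factor 1 --partitions 3 --topic flume_kafka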

3、Start a Kafka console consumer

[root@master bin]# ./kafka-console-consumer.sh --zookeeper master:2181,slaves1:2181,slaves2:2181 --topic flume_kafka
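
This uses the old ZooKeeper-based consumer; as the deprecation warning in step 5 shows, newer Kafka clients prefer connecting to the brokers directly, for example:

[root@master bin]# ./kafka-console-consumer.sh --bootstrap-server 192.168.230.21:6667 --topic flume_kafka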

4、Start the application that produces the log, and watch its log output in real time

[root@master log]# tail -f DataToCache.log
2020-04-21 16:03:13.267 [main                 ] INFO  com.ndsc.datatocache.DatatocacheApplication - Starting DatatocacheApplication v0.0.1-SNAPSHOT on master with PID 11018 (/home/whiteNameList/dataToCache/DataToCache-1.0.0.jar started by root in /home/whiteNameList/dataToCache)
2020-04-21 16:03:13.329 [main                 ] INFO  com.ndsc.datatocache.DatatocacheApplication - No active profile set, falling back to default profiles: default
2020-04-21 16:03:13.570 [main                 ] INFO  org.springframework.boot.web.servlet.context.AnnotationConfigServletWebServerApplicationContext - Refreshing org.springframework.boot.web.servlet.context.AnnotationConfigServletWebServerApplicationContext@574caa3f: startup date [Tue Apr 21 16:03:13 CST 2020]; root of context hierarchy
2020-04-21 16:03:15.554 [main                 ] INFO  org.springframework.context.support.PostProcessorRegistrationDelegate$BeanPostProcessorChecker - Bean 'org.springframework.kafka.annotation.KafkaBootstrapConfiguration' of type [org.springframework.kafka.annotation.KafkaBootstrapConfiguration$$EnhancerBySpringCGLIB$$b461dd77] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
2020-04-21 16:03:16.429 [main                 ] INFO  org.springframework.boot.web.embedded.tomcat.TomcatWebServer - Tomcat initialized with port(s): 8099 (http)
2020-04-21 16:03:16.452 [main                 ] INFO  org.apache.coyote.http11.Http11NioProtocol - Initializing ProtocolHandler ["http-nio-8099"]
2020-04-21 16:03:16.465 [main                 ] INFO  org.apache.catalina.core.StandardService - Starting service [Tomcat]
2020-04-21 16:03:16.465 [main                 ] INFO  org.apache.catalina.core.StandardEngine - Starting Servlet Engine: Apache Tomcat/8.5.32
2020-04-21 16:03:16.541 [localhost-startStop-1] INFO  org.apache.catalina.core.AprLifecycleListener - The APR based Apache Tomcat Native library which allows optimal performance in production environments was not found on the java.library.path: [/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib]
2020-04-21 16:03:16.725 [localhost-startStop-1] INFO  org.apache.catalina.core.ContainerBase.[Tomcat].[localhost].[/] - Initializing Spring embedded WebApplicationContext
2020-04-21 16:03:16.726 [localhost-startStop-1] INFO  org.springframework.web.context.ContextLoader - Root WebApplicationContext: initialization completed in 3157 ms
2020-04-21 16:03:16.840 [localhost-startStop-1] INFO  org.springframework.boot.web.servlet.ServletRegistrationBean - Servlet dispatcherServlet mapped to [/]
2020-04-21 16:03:16.854 [localhost-startStop-1] INFO  org.springframework.boot.web.servlet.FilterRegistrationBean - Mapping filter: 'characterEncodingFilter' to: [/*]
2020-04-21 16:03:16.855 [localhost-startStop-1] INFO  org.springframework.boot.web.servlet.FilterRegistrationBean - Mapping filter: 'hiddenHttpMethodFilter' to: [/*]
2020-04-21 16:03:16.855 [localhost-startStop-1] INFO  org.springframework.boot.web.servlet.FilterRegistrationBean - Mapping filter: 'httpPutFormContentFilter' to: [/*]
2020-04-21 16:03:16.855 [localhost-startStop-1] INFO  org.springframework.boot.web.servlet.FilterRegistrationBean - Mapping filter: 'requestContextFilter' to: [/*]
2020-04-21 16:03:17.152 [main                 ] INFO  org.springframework.web.servlet.handler.SimpleUrlHandlerMapping - Mapped URL path [/**/favicon.ico] onto handler of type [class org.springframework.web.servlet.resource.ResourceHttpRequestHandler]
2020-04-21 16:03:17.388 [main                 ] INFO  org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter - Looking for @ControllerAdvice: org.springframework.boot.web.servlet.context.AnnotationConfigServletWebServerApplicationContext@574caa3f: startup date [Tue Apr 21 16:03:13 CST 2020]; root of context hierarchy
2020-04-21 16:03:17.504 [main                 ] INFO  org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerMapping - Mapped "{[/getMap]}" onto public java.util.Map<java.lang.String, java.lang.String> com.ndsc.datatocache.Controller.GetMapController.getMap()
2020-04-21 16:03:17.507 [main                 ] INFO  org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerMapping - Mapped "{[/error],produces=[text/html]}" onto public org.springframework.web.servlet.ModelAndView org.springframework.boot.autoconfigure.web.servlet.error.BasicErrorController.errorHtml(javax.servlet.http.HttpServletRequest,javax.servlet.http.HttpServletResponse)
2020-04-21 16:03:17.508 [main                 ] INFO  org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerMapping - Mapped "{[/error]}" onto public org.springframework.http.ResponseEntity<java.util.Map<java.lang.String, java.lang.Object>> org.springframework.boot.autoconfigure.web.servlet.error.BasicErrorController.error(javax.servlet.http.HttpServletRequest)
2020-04-21 16:03:17.582 [main                 ] INFO  org.springframework.web.servlet.handler.SimpleUrlHandlerMapping - Mapped URL path [/webjars/**] onto handler of type [class org.springframework.web.servlet.resource.ResourceHttpRequestHandler]
2020-04-21 16:03:17.582 [main                 ] INFO  org.springframework.web.servlet.handler.SimpleUrlHandlerMapping - Mapped URL path [/**] onto handler of type [class org.springframework.web.servlet.resource.ResourceHttpRequestHandler]
2020-04-21 16:03:17.940 [main                 ] INFO  org.springframework.jmx.export.annotation.AnnotationMBeanExporter - Registering beans for JMX exposure on startup
2020-04-21 16:03:17.970 [main                 ] INFO  org.springframework.context.support.DefaultLifecycleProcessor - Starting beans in phase 2147483547
2020-04-21 16:03:18.007 [main                 ] INFO  org.apache.kafka.clients.consumer.ConsumerConfig - ConsumerConfig values: 
	auto.commit.interval.ms = 5000
	auto.offset.reset = earliest
	bootstrap.servers = [192.168.230.21:6667, 192.168.230.22:6667, 192.168.230.23:6667]
	check.crcs = true
	client.id = 
	connections.max.idle.ms = 540000
	enable.auto.commit = true
	exclude.internal.topics = true
	fetch.max.bytes = 52428800
	fetch.max.wait.ms = 500
	fetch.min.bytes = 1
	group.id = consumeGroup14
	heartbeat.interval.ms = 3000
	interceptor.classes = null
	internal.leave.group.on.close = true
	isolation.level = read_uncommitted
	key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer
	max.partition.fetch.bytes = 1048576
	max.poll.interval.ms = 300000
	max.poll.records = 500
	metadata.max.age.ms = 300000
	metric.reporters = []
	metrics.num.samples = 2
	metrics.recording.level = INFO
	metrics.sample.window.ms = 30000
	partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
	receive.buffer.bytes = 65536
	reconnect.backoff.max.ms = 1000
	reconnect.backoff.ms = 50
	request.timeout.ms = 305000
	retry.backoff.ms = 100
	sasl.jaas.config = null
	sasl.kerberos.kinit.cmd = /usr/bin/kinit
	sasl.kerberos.min.time.before.relogin = 60000
	sasl.kerberos.service.name = null
	sasl.kerberos.ticket.renew.jitter = 0.05
	sasl.kerberos.ticket.renew.window.factor = 0.8
	sasl.mechanism = GSSAPI
	security.protocol = PLAINTEXT
	send.buffer.bytes = 131072
	session.timeout.ms = 10000
	ssl.cipher.suites = null
	ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
	ssl.endpoint.identification.algorithm = null
	ssl.key.password = null
	ssl.keymanager.algorithm = SunX509
	ssl.keystore.location = null
	ssl.keystore.password = null
	ssl.keystore.type = JKS
	ssl.protocol = TLS
	ssl.provider = null
	ssl.secure.random.implementation = null
	ssl.trustmanager.algorithm = PKIX
	ssl.truststore.location = null
	ssl.truststore.password = null
	ssl.truststore.type = JKS
	value.deserializer = class org.apache.kafka.common.serialization.StringDeserializer

2020-04-21 16:03:18.095 [main                 ] INFO  org.apache.kafka.common.utils.AppInfoParser - Kafka version : 1.0.2
2020-04-21 16:03:18.095 [main                 ] INFO  org.apache.kafka.common.utils.AppInfoParser - Kafka commitId : 2a121f7b1d402825

...

5、Check the Kafka consumer output — it should mirror the application log from step 4

[root@master bin]# ./kafka-console-consumer.sh --zookeeper master:2181,slaves1:2181,slaves2:2181 --topic flume_kafka
Using the ConsoleConsumer with old consumer is deprecated and will be removed in a future major release. Consider using the new consumer by passing [bootstrap-server] instead of [zookeeper].
2020-04-21 16:03:13.267 [main                 ] INFO  com.ndsc.datatocache.DatatocacheApplication - Starting DatatocacheApplication v0.0.1-SNAPSHOT on master with PID 11018 (/home/whiteNameList/dataToCache/DataToCache-1.0.0.jar started by root in /home/whiteNameList/dataToCache)
2020-04-21 16:03:13.329 [main                 ] INFO  com.ndsc.datatocache.DatatocacheApplication - No active profile set, falling back to default profiles: default
2020-04-21 16:03:13.570 [main                 ] INFO  org.springframework.boot.web.servlet.context.AnnotationConfigServletWebServerApplicationContext - Refreshing org.springframework.boot.web.servlet.context.AnnotationConfigServletWebServerApplicationContext@574caa3f: startup date [Tue Apr 21 16:03:13 CST 2020]; root of context hierarchy
2020-04-21 16:03:15.554 [main                 ] INFO  org.springframework.context.support.PostProcessorRegistrationDelegate$BeanPostProcessorChecker - Bean 'org.springframework.kafka.annotation.KafkaBootstrapConfiguration' of type [org.springframework.kafka.annotation.KafkaBootstrapConfiguration$$EnhancerBySpringCGLIB$$b461dd77] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
2020-04-21 16:03:16.429 [main                 ] INFO  org.springframework.boot.web.embedded.tomcat.TomcatWebServer - Tomcat initialized with port(s): 8099 (http)
2020-04-21 16:03:16.452 [main                 ] INFO  org.apache.coyote.http11.Http11NioProtocol - Initializing ProtocolHandler ["http-nio-8099"]
2020-04-21 16:03:16.465 [main                 ] INFO  org.apache.catalina.core.StandardService - Starting service [Tomcat]
2020-04-21 16:03:16.465 [main                 ] INFO  org.apache.catalina.core.StandardEngine - Starting Servlet Engine: Apache Tomcat/8.5.32
2020-04-21 16:03:16.541 [localhost-startStop-1] INFO  org.apache.catalina.core.AprLifecycleListener - The APR based Apache Tomcat Native library which allows optimal performance in production environments was not found on the java.library.path: [/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib]
2020-04-21 16:03:16.725 [localhost-startStop-1] INFO  org.apache.catalina.core.ContainerBase.[Tomcat].[localhost].[/] - Initializing Spring embedded WebApplicationContext
2020-04-21 16:03:16.726 [localhost-startStop-1] INFO  org.springframework.web.context.ContextLoader - Root WebApplicationContext: initialization completed in 3157 ms
2020-04-21 16:03:16.840 [localhost-startStop-1] INFO  org.springframework.boot.web.servlet.ServletRegistrationBean - Servlet dispatcherServlet mapped to [/]
2020-04-21 16:03:16.854 [localhost-startStop-1] INFO  org.springframework.boot.web.servlet.FilterRegistrationBean - Mapping filter: 'characterEncodingFilter' to: [/*]
2020-04-21 16:03:16.855 [localhost-startStop-1] INFO  org.springframework.boot.web.servlet.FilterRegistrationBean - Mapping filter: 'hiddenHttpMethodFilter' to: [/*]
2020-04-21 16:03:16.855 [localhost-startStop-1] INFO  org.springframework.boot.web.servlet.FilterRegistrationBean - Mapping filter: 'httpPutFormContentFilter' to: [/*]
2020-04-21 16:03:16.855 [localhost-startStop-1] INFO  org.springframework.boot.web.servlet.FilterRegistrationBean - Mapping filter: 'requestContextFilter' to: [/*]
2020-04-21 16:03:17.152 [main                 ] INFO  org.springframework.web.servlet.handler.SimpleUrlHandlerMapping - Mapped URL path [/**/favicon.ico] onto handler of type [class org.springframework.web.servlet.resource.ResourceHttpRequestHandler]
2020-04-21 16:03:17.388 [main                 ] INFO  org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter - Looking for @ControllerAdvice: org.springframework.boot.web.servlet.context.AnnotationConfigServletWebServerApplicationContext@574caa3f: startup date [Tue Apr 21 16:03:13 CST 2020]; root of context hierarchy
2020-04-21 16:03:17.504 [main                 ] INFO  org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerMapping - Mapped "{[/getMap]}" onto public java.util.Map<java.lang.String, java.lang.String> com.ndsc.datatocache.Controller.GetMapController.getMap()
2020-04-21 16:03:17.507 [main                 ] INFO  org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerMapping - Mapped "{[/error],produces=[text/html]}" onto public org.springframework.web.servlet.ModelAndView org.springframework.boot.autoconfigure.web.servlet.error.BasicErrorController.errorHtml(javax.servlet.http.HttpServletRequest,javax.servlet.http.HttpServletResponse)
2020-04-21 16:03:17.508 [main                 ] INFO  org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerMapping - Mapped "{[/error]}" onto public org.springframework.http.ResponseEntity<java.util.Map<java.lang.String, java.lang.Object>> org.springframework.boot.autoconfigure.web.servlet.error.BasicErrorController.error(javax.servlet.http.HttpServletRequest)
2020-04-21 16:03:17.582 [main                 ] INFO  org.springframework.web.servlet.handler.SimpleUrlHandlerMapping - Mapped URL path [/webjars/**] onto handler of type [class org.springframework.web.servlet.resource.ResourceHttpRequestHandler]
2020-04-21 16:03:17.582 [main                 ] INFO  org.springframework.web.servlet.handler.SimpleUrlHandlerMapping - Mapped URL path [/**] onto handler of type [class org.springframework.web.servlet.resource.ResourceHttpRequestHandler]
2020-04-21 16:03:17.940 [main                 ] INFO  org.springframework.jmx.export.annotation.AnnotationMBeanExporter - Registering beans for JMX exposure on startup
2020-04-21 16:03:17.970 [main                 ] INFO  org.springframework.context.support.DefaultLifecycleProcessor - Starting beans in phase 2147483547
2020-04-21 16:03:18.007 [main                 ] INFO  org.apache.kafka.clients.consumer.ConsumerConfig - ConsumerConfig values: 
	auto.commit.interval.ms = 5000
	auto.offset.reset = earliest
	bootstrap.servers = [192.168.230.21:6667, 192.168.230.22:6667, 192.168.230.23:6667]
	check.crcs = true
	client.id = 
	connections.max.idle.ms = 540000
	enable.auto.commit = true
	exclude.internal.topics = true
	fetch.max.bytes = 52428800
	fetch.max.wait.ms = 500
	fetch.min.bytes = 1
	group.id = consumeGroup14
	heartbeat.interval.ms = 3000
	interceptor.classes = null
	internal.leave.group.on.close = true
	isolation.level = read_uncommitted
	key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer
	max.partition.fetch.bytes = 1048576
	max.poll.interval.ms = 300000
	max.poll.records = 500
	metadata.max.age.ms = 300000
	metric.reporters = []
	metrics.num.samples = 2
	metrics.recording.level = INFO
	metrics.sample.window.ms = 30000
	partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
	receive.buffer.bytes = 65536
	reconnect.backoff.max.ms = 1000
	reconnect.backoff.ms = 50
	request.timeout.ms = 305000
	retry.backoff.ms = 100
	sasl.jaas.config = null
	sasl.kerberos.kinit.cmd = /usr/bin/kinit
	sasl.kerberos.min.time.before.relogin = 60000
	sasl.kerberos.service.name = null
	sasl.kerberos.ticket.renew.jitter = 0.05
	sasl.kerberos.ticket.renew.window.factor = 0.8
	sasl.mechanism = GSSAPI
	security.protocol = PLAINTEXT
	send.buffer.bytes = 131072
	session.timeout.ms = 10000
	ssl.cipher.suites = null
	ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
	ssl.endpoint.identification.algorithm = null
	ssl.key.password = null
	ssl.keymanager.algorithm = SunX509
	ssl.keystore.location = null
	ssl.keystore.password = null
	ssl.keystore.type = JKS
	ssl.protocol = TLS
	ssl.provider = null
	ssl.secure.random.implementation = null
	ssl.trustmanager.algorithm = PKIX
	ssl.truststore.location = null
	ssl.truststore.password = null
	ssl.truststore.type = JKS
	value.deserializer = class org.apache.kafka.common.serialization.StringDeserializer

2020-04-21 16:03:18.095 [main                 ] INFO  org.apache.kafka.common.utils.AppInfoParser - Kafka version : 1.0.2
2020-04-21 16:03:18.095 [main                 ] INFO  org.apache.kafka.common.utils.AppInfoParser - Kafka commitId : 2a121f7b1d402825

...

四、Notes

1、With the Kafka sink configured, the Flume console does not print the contents of the .log file in real time; the events are delivered to Kafka rather than to the console;

2、A spooldir source + Kafka or a taildir source + Kafka can also be used; the setup is very similar to the above (see the TAILDIR sketch below);
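
For reference, a minimal TAILDIR source sketch (available from Flume 1.7); unlike exec with tail -F, it records read offsets in a position file and can resume after an agent restart. The positionFile path here is only an example and can be any writable location:

agent.sources.s1.type=TAILDIR
agent.sources.s1.positionFile=/home/xx/xx/flume/taildir_position.json
agent.sources.s1.filegroups=f1
agent.sources.s1.filegroups.f1=/home/xx/xx/log/DataToCache.log
agent.sources.s1.channels=c1

The channel and Kafka sink configuration stay the same as in kafka-logger.conf above.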
