DBILITY
producer filebeat test
Since kafka serves as the data bus and writes to hdfs can be handled by the HDFSSinkConnector, the producer does not have to be a flume agent; depending on the case, filebeat can take its place.
The log data for this test is generated with flume and written to the ./logdata directory through a file_roll sink.
filebeat then reads those logs and ships them to a kafka topic through its kafka output.
Creating the kafka topic
[kafka@big-slave4 ~]$ kafka-topics.sh \
--zookeeper big-master:2181,big-slave1:2181,big-slave2:2181/kafka-cluster \
--topic filebeat-topic --partitions 1 --replication-factor 1 --create
Created topic "filebeat-topic".
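The creation can be double-checked with --describe against the same zookeeper chroot used above:

```shell
# Describe the topic to confirm the partition count and replication factor
kafka-topics.sh \
  --zookeeper big-master:2181,big-slave1:2181,big-slave2:2181/kafka-cluster \
  --topic filebeat-topic --describe
```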
flume properties for generating the log data, and running the agent
#Source
agent.sources = randomGen
agent.sources.randomGen.type = com.dbility.bigdata.flume.source.RandomDataGenerator
agent.sources.randomGen.batchSize = 1
agent.sources.randomGen.channels = memoryChannel
#Channel
agent.channels = memoryChannel
agent.channels.memoryChannel.type = memory
agent.channels.memoryChannel.capacity = 10000
agent.channels.memoryChannel.transactionCapacity = 10000
agent.channels.memoryChannel.byteCapacityBufferPercentage = 20
agent.channels.memoryChannel.byteCapacity = 800000
#Sink
agent.sinks = fileRollSink
agent.sinks.fileRollSink.channel = memoryChannel
agent.sinks.fileRollSink.type = file_roll
agent.sinks.fileRollSink.sink.directory = ./logdata
agent.sinks.fileRollSink.sink.pathManager.extension=log
agent.sinks.fileRollSink.sink.pathManager.prefix=flume2filebeat_
agent.sinks.fileRollSink.sink.rollInterval=300
agent.sinks.fileRollSink.sink.serializer=TEXT
agent.sinks.fileRollSink.batchSize=100
#running the agent
E:\apache-flume-1.8.0-bin>bin\flume-ng agent -n agent -c conf -f conf\randomGen3.properties
E:\apache-flume-1.8.0-bin>set JAVA_HOME=E:\apache-flume-1.8.0-bin\jdk180\jre
E:\apache-flume-1.8.0-bin>set JAVA_OPTS=" -Xmx256M"
E:\apache-flume-1.8.0-bin>powershell.exe -NoProfile -InputFormat none -ExecutionPolicy unrestricted -File E:\apache-flume-1.8.0-bin\bin\flume-ng.ps1 agent -n agent -c conf -f conf\randomGen3.properties
WARN: Config directory not set. Defaulting to E:\apache-flume-1.8.0-bin\conf
Sourcing environment configuration script E:\apache-flume-1.8.0-bin\conf\flume-env.ps1
WARN: Did not find E:\apache-flume-1.8.0-bin\conf\flume-env.ps1
WARN: HADOOP_PREFIX or HADOOP_HOME not found
WARN: HADOOP_PREFIX not set. Unable to include Hadoop's classpath & java.library.path
WARN: HBASE_HOME not found
WARN: HIVE_HOME not found
Running FLUME agent :
class: org.apache.flume.node.Application
arguments: -n agent -f "E:\apache-flume-1.8.0-bin\conf\randomGen3.properties"
2018-05-12 10:46:10,359 (SinkRunner-PollingRunner-DefaultSinkProcessor) [DEBUG - org.apache.flume.sink.RollingFileSink.process(RollingFileSink.java:173)] Opening output stream for file .\logdata\flume2filebeat_1526089570279-1.log
2018-05-12 10:46:10,437 (PollableSourceRunner-RandomDataGenerator-randomGen) [INFO - com.dbility.bigdata.flume.source.RandomDataGenerator.doProcess(RandomDataGenerator.java:90)] processEvent batchSize : 1
2018-05-12 10:46:11,438 (PollableSourceRunner-RandomDataGenerator-randomGen) [INFO - com.dbility.bigdata.flume.source.RandomDataGenerator.doProcess(RandomDataGenerator.java:90)] processEvent batchSize : 1
2018-05-12 10:46:13,442 (PollableSourceRunner-RandomDataGenerator-randomGen) [INFO - com.dbility.bigdata.flume.source.RandomDataGenerator.doProcess(RandomDataGenerator.java:90)] processEvent batchSize : 1
2018-05-12 10:46:16,445 (PollableSourceRunner-RandomDataGenerator-randomGen) [INFO - com.dbility.bigdata.flume.source.RandomDataGenerator.doProcess(RandomDataGenerator.java:90)] processEvent batchSize : 1
2018-05-12 10:46:20,445 (PollableSourceRunner-RandomDataGenerator-randomGen) [INFO - com.dbility.bigdata.flume.source.RandomDataGenerator.doProcess(RandomDataGenerator.java:90)] processEvent batchSize : 1
2018-05-12 10:46:25,447 (PollableSourceRunner-RandomDataGenerator-randomGen) [INFO - com.dbility.bigdata.flume.source.RandomDataGenerator.doProcess(RandomDataGenerator.java:90)] processEvent batchSize : 1
2018-05-12 10:46:30,450 (PollableSourceRunner-RandomDataGenerator-randomGen) [INFO - com.dbility.bigdata.flume.source.RandomDataGenerator.doProcess(RandomDataGenerator.java:90)] processEvent batchSize : 1
2018-05-12 10:46:35,452 (PollableSourceRunner-RandomDataGenerator-randomGen) [INFO - com.dbility.bigdata.flume.source.RandomDataGenerator.doProcess(RandomDataGenerator.java:90)] processEvent batchSize : 1
2018-05-12 10:46:40,356 (conf-file-poller-0) [DEBUG - org.apache.flume.node.PollingPropertiesFileConfigurationProvider$FileWatcherRunnable.run(PollingPropertiesFileConfigurationProvider.java:127)] Checking file:E:\apache-flume-1.8.0-bin\conf\randomGen3.properties for changes
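While the agent is running, the rolled files can be checked on the Windows host; the file name below is the one reported in the sink log above.

```shell
E:\apache-flume-1.8.0-bin>dir logdata
E:\apache-flume-1.8.0-bin>type logdata\flume2filebeat_1526089570279-1.log
```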
filebeat.yml configuration and startup
filebeat.prospectors:
- type: log
  paths:
    - e:\apache-flume-1.8.0-bin\logdata\*.log
  multiline.pattern: ^\[
  multiline.negate: true
  multiline.match: after

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: true

setup.template.settings:
  index.number_of_shards: 3

output.kafka:
  hosts: ["big-slave2:9092","big-slave3:9092","big-slave4:9092"]
  topic: "filebeat-topic"
  partition.round_robin:
    reachable_only: false
  required_acks: 1
  compression: none
  max_message_bytes: 1000000

logging.level: debug
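Before the full run, the configuration and the kafka output connectivity can be sanity-checked with filebeat's `test` subcommands (available in the 6.x releases):

```shell
E:\filebeat-6.2.4-windows-x86_64>filebeat test config -c filebeat.yml
E:\filebeat-6.2.4-windows-x86_64>filebeat test output -c filebeat.yml
```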
E:\filebeat-6.2.4-windows-x86_64>filebeat run -v -c filebeat.yml
Checking data arrival with a kafka consumer
[kafka@big-slave4 ~]$ kafka-console-consumer.sh \
--bootstrap-server big-slave2:9092,big-slave3:9092,big-slave4:9092 \
--topic filebeat-topic \
--group filebeat-group-consumers \
--from-beginning
{"@timestamp":"2018-05-12T01:59:44.053Z","@metadata":{"beat":"filebeat","type":"doc","version":"6.2.4","topic":"filebeat-topic"},"prospector":{"type":"log"},"beat":{"name":"ROOKIE-PC","hostname":"ROOKIE-PC","version":"6.2.4"},"source":"e:\\apache-flume-1.8.0-bin\\logdata\\flume2filebeat_1526090377475-1.log","offset":213,"message":"0,randomGen-192.168.100.18,79h9-3gdpgml0-cpe1-vglh,2018-05-12 10:59:37\n1,randomGen-192.168.100.18,m2b7-uvg7y9w7-xjyb-spon,2018-05-12 10:59:38\n2,randomGen-192.168.100.18,8bwt-1gml0wnx-vhao-y3ee,2018-05-12 10:59:40"}
{"@timestamp":"2018-05-12T01:59:51.073Z","@metadata":{"beat":"filebeat","type":"doc","version":"6.2.4","topic":"filebeat-topic"},"offset":355,"message":"3,randomGen-192.168.100.18,1br3-xy95tnwe-7iuj-fuce,2018-05-12 10:59:43\n4,randomGen-192.168.100.18,gu06-dcvs0u84-zxtv-3463,2018-05-12 10:59:47","source":"e:\\apache-flume-1.8.0-bin\\logdata\\flume2filebeat_1526090377475-1.log","prospector":{"type":"log"},"beat":{"name":"ROOKIE-PC","hostname":"ROOKIE-PC","version":"6.2.4"}}
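As the output shows, the multiline settings cause filebeat to pack several flume records into one kafka message, newline-separated inside the JSON "message" field. A quick stdlib-python sketch (record abbreviated from the output above) splits them back into individual lines:

```shell
# A consumed record carries several log lines joined with "\n" in its
# "message" field; python3's json module decodes and splits them apart.
RECORD='{"message":"0,randomGen-192.168.100.18,2018-05-12 10:59:37\n1,randomGen-192.168.100.18,2018-05-12 10:59:38"}'
printf '%s' "$RECORD" | python3 -c 'import json,sys; print("\n".join(json.load(sys.stdin)["message"].splitlines()))'
```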