Installing Flume on Linux and Integrating with Kafka

1.1 Prerequisite: a Kafka + ZooKeeper cluster environment is already installed

1.2 Download Flume

This article uses Flume 1.7.

Download: http://flume.apache.org/download.html

2. Configure Flume

2.1 Upload Flume

  # Upload the downloaded package to /opt/software
  cd /opt/software
  rz apache-flume-1.7.0-bin.tar.gz
  # Extract
  tar -zxvf apache-flume-1.7.0-bin.tar.gz
  # Copy apache-flume-1.7.0-bin to /usr/local as flume
  cp -r apache-flume-1.7.0-bin /usr/local/flume

2.2 Configure environment variables

  vi /etc/profile
  export FLUME_HOME=/usr/local/flume
  export PATH=$PATH:$FLUME_HOME/bin
  # Save and exit, then reload profile
  source /etc/profile

2.3 Configure flume-env

  # Add the JDK path
  cd /usr/local/flume/conf
  cp flume-env.sh.template flume-env.sh
  vi flume-env.sh
  export JAVA_HOME=/usr/local/jdk

3. Test Flume + Kafka

Kafka receives the data that Flume collects from the monitored file.

3.1 Configure flume-conf

cp -r flume-conf.properties.template flume-conf.properties

vi flume-conf.properties

  # Agent components
  agent.sources = s1
  agent.channels = c1
  agent.sinks = k1
  # Exec source: tail the monitored log file
  agent.sources.s1.type=exec
  agent.sources.s1.command=tail -F /opt/software/abc.log
  agent.sources.s1.channels=c1
  # Memory channel
  agent.channels.c1.type=memory
  agent.channels.c1.capacity=10000
  agent.channels.c1.transactionCapacity=100
  # Kafka sink
  agent.sinks.k1.type=org.apache.flume.sink.kafka.KafkaSink
  # Kafka broker addresses and ports
  agent.sinks.k1.brokerList=192.168.32.128:9092,192.168.32.131:9092,192.168.32.132:9092
  # Kafka topic
  agent.sinks.k1.topic=flumeTest
  # Serialization
  agent.sinks.k1.serializer.class=kafka.serializer.StringEncoder
  agent.sinks.k1.channel=c1
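A side note: in Flume 1.7 the Kafka sink's `brokerList` and `topic` properties are legacy names kept for backward compatibility; the user guide documents `kafka.bootstrap.servers` and `kafka.topic` as the current ones. If I have that right, an equivalent sink section with the newer names (same brokers and topic as above) would look roughly like:

```properties
# Kafka sink using the Flume 1.7-era property names (sketch)
agent.sinks.k1.type=org.apache.flume.sink.kafka.KafkaSink
agent.sinks.k1.kafka.bootstrap.servers=192.168.32.128:9092,192.168.32.131:9092,192.168.32.132:9092
agent.sinks.k1.kafka.topic=flumeTest
agent.sinks.k1.channel=c1
```

Either form should work on 1.7, but the deprecated names log a warning at startup.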

3.2 Write a simple shell script abc.sh and make it executable

  # Go to the working directory
  cd /opt/software
  # Create abc.sh
  vi abc.sh
  for ((i = 0; i <= 50000; i++)); do
      echo "test-$i" >> abc.log
  done

chmod 755 abc.sh
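The loop above can be sanity-checked on its own, without Flume or Kafka. The sketch below writes to a temporary file instead of /opt/software/abc.log and uses 100 iterations instead of 50000 (both are my substitutions, purely so it runs anywhere quickly); note the plain ASCII quotes and `"test-$i"` interpolation — shell has no `+` string concatenation.

```shell
# Minimal check of the abc.sh generator loop, writing to a temp file
LOG=$(mktemp /tmp/abc.XXXXXX.log)
for ((i = 0; i <= 100; i++)); do
    echo "test-$i" >> "$LOG"
done
wc -l < "$LOG"   # prints 101 (i runs 0..100 inclusive)
```

Each line of abc.log becomes one Flume event, and thus one Kafka message on flumeTest.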

3.3 Start ZooKeeper and Kafka

For details, see: http://blog.csdn.net/a123demi/article/details/70279296

3.4 Start a Kafka consumer listening on the flumeTest topic

bin/kafka-console-consumer.sh --zookeeper 192.168.32.128:2181,192.168.32.131:2181,192.168.32.132:2181 --topic flumeTest --from-beginning

3.5 Start Flume

./bin/flume-ng agent -n agent -c conf -f conf/flume-conf.properties -Dflume.root.logger=INFO,console

Note: the agent name passed to `-n` must match the name used in flume-conf.properties (here, `agent`).

3.6 Run abc.sh

./abc.sh

3.7 Output

The Kafka consumer receives the data.

    Original author: 小白中的搬运工
    Original article: https://zhuanlan.zhihu.com/p/43339856