[Repost] Spark-OpenTSDB Setup

https://libraries.io/github/SeelozInc/opentsdb-spark

opentsdb-spark

Module for accessing OpenTSDB data through Spark.
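
The library's own API is not documented in this post, so purely as orientation, here is a minimal plain-Spark sketch of what reading OpenTSDB data out of its backing HBase store can look like, using Spark's generic newAPIHadoopRDD with HBase's TableInputFormat rather than the opentsdb-spark API itself. The ZooKeeper quorum value and the default "tsdb" table name are assumptions you would adapt to your cluster.

import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.Result
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.spark.{SparkConf, SparkContext}

object TsdbScanSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("tsdb-scan"))

    // HBase connection settings; "localhost" is a placeholder for your ZooKeeper quorum.
    val hbaseConf = HBaseConfiguration.create()
    hbaseConf.set("hbase.zookeeper.quorum", "localhost")
    // OpenTSDB keeps its data points in the "tsdb" HBase table by default.
    hbaseConf.set(TableInputFormat.INPUT_TABLE, "tsdb")

    // Scan the table as an RDD of (row key, Result) pairs.
    val rows = sc.newAPIHadoopRDD(
      hbaseConf,
      classOf[TableInputFormat],
      classOf[ImmutableBytesWritable],
      classOf[Result])

    println(s"Rows scanned from tsdb: ${rows.count()}")
    sc.stop()
  }
}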

Installation.

On your client (SparkDriver)

Execute the following in your terminal

wget https://github.com/achak1987/opentsdb-spark/archive/master.zip

unzip master.zip

cd opentsdb-spark-master

sbt eclipse (if you don't have sbt, it can be installed from www.scala-sbt.org)

The project can now be imported into Eclipse.
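
If sbt reports that the eclipse command is unknown, the sbteclipse plugin is probably not installed; a minimal sketch of a global plugin file (~/.sbt/0.13/plugins/plugins.sbt for the sbt 0.13 line used with Scala 2.10; the plugin version shown is an assumption, pick one that matches your sbt release):

// ~/.sbt/0.13/plugins/plugins.sbt: enables the `sbt eclipse` command globally.
// The plugin version is an assumption; use one compatible with your sbt release.
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "4.0.0")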

On your cluster.

On each node in the cluster, add the following to your Spark configuration file:

nano $SPARK_HOME/conf/spark-env.sh

Copy the following to the end of the file:

export HBASE_HOME=/home/mahmoud/hbase-1.1.3

for dir in $HBASE_HOME/lib/*.jar
do
  if [ "$dir" = "$HBASE_HOME/lib/netty-3.2.4.Final.jar" ]; then
    continue
  fi
  if [ "$dir" = "$HBASE_HOME/lib/netty-all-4.0.23.Final.jar" ]; then
    continue
  fi
  SPARK_CLASSPATH="$SPARK_CLASSPATH:$dir"
done
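
To confirm that the HBase jars actually reach both the driver and the executors after restarting Spark, a quick sanity check you can paste into spark-shell (the class name below is a standard HBase 1.x class, not part of opentsdb-spark):

// Driver-side check: throws ClassNotFoundException if the HBase jars are missing.
Class.forName("org.apache.hadoop.hbase.HBaseConfiguration")

// Executor-side check: run the same lookup inside the worker JVMs.
sc.parallelize(1 to sc.defaultParallelism).map { _ =>
  Class.forName("org.apache.hadoop.hbase.HBaseConfiguration").getName
}.distinct().collect()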

System Info

Apache Hadoop 2.6.0, Apache HBase 1.1.2, Apache Spark 1.6.0, Scala 2.10.5
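
For reference, a hypothetical build.sbt fragment pinning the versions listed above; the actual opentsdb-spark build file may differ, and spark-core is marked "provided" on the assumption that the cluster supplies it at runtime:

scalaVersion := "2.10.5"

libraryDependencies ++= Seq(
  // Spark is provided by the cluster at runtime.
  "org.apache.spark" %% "spark-core"   % "1.6.0" % "provided",
  // HBase client libraries matching the cluster version listed above.
  "org.apache.hbase" %  "hbase-client" % "1.1.2",
  "org.apache.hbase" %  "hbase-common" % "1.1.2",
  "org.apache.hbase" %  "hbase-server" % "1.1.2"
)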

    Original author: 阿甘run
    Original article: https://www.jianshu.com/p/917d5dcd267f