apache-hive-1.2.1 Installation

Hive only needs to be installed on the NameNode.

Configure environment variables

Add the following to ~/.bash_profile:

export HIVE_HOME=/root/software/apache-hive-1.2.1-bin
export PATH=$PATH:$HIVE_HOME/bin
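The two exports above can be applied to the current shell and sanity-checked with a short snippet (install path assumed from this guide):

```shell
# Apply the Hive environment variables to the current shell and verify
# that the bin directory landed on PATH (install path from this guide)
export HIVE_HOME=/root/software/apache-hive-1.2.1-bin
export PATH=$PATH:$HIVE_HOME/bin
echo "$PATH" | grep -q "$HIVE_HOME/bin" && echo "PATH ok"
```

After editing ~/.bash_profile itself, run `source ~/.bash_profile` so the change takes effect in the current session.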

Copy the MySQL JDBC driver jar into hive/lib

cp mysql-connector-java-5.1.40-bin.jar  $HIVE_HOME/lib

Hive configuration

Configure hive-env.sh

Copy hive-env.sh.template to hive-env.sh and add the following:

export HADOOP_HOME=/usr/local/hadoop-2.6.5
export HIVE_CONF_DIR=/root/software/apache-hive-1.2.1-bin/conf

Configure hive-site.xml

Copy hive-default.xml.template to hive-site.xml, then change the ConnectionURL, ConnectionUserName, and ConnectionPassword properties in hive-site.xml to the following:

<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://192.168.2.180:3306/hive?createDatabaseIfNotExist=true</value>
    <description>JDBC connect string for a JDBC metastore</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>root</value>
    <description>username to use against metastore database</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hunter</value>
    <description>password to use against metastore database</description>
  </property>
</configuration>
  • Replace warehouse.dir with /hive/warehouse
  • Replace local.scratchdir with $HIVE_HOME/exec
  • Replace downloaded.resources.dir with $HIVE_HOME/downloadedresources
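In hive-site.xml these correspond to the full property names below. Note that the $HIVE_HOME shorthand is not expanded literally inside hive-site.xml, so literal paths (assumed from the install location used in this guide) are shown; the warehouse dir is an HDFS path:

```xml
<property>
  <name>hive.metastore.warehouse.dir</name>
  <value>/hive/warehouse</value>
</property>
<property>
  <name>hive.exec.local.scratchdir</name>
  <value>/root/software/apache-hive-1.2.1-bin/exec</value>
</property>
<property>
  <name>hive.downloaded.resources.dir</name>
  <value>/root/software/apache-hive-1.2.1-bin/downloadedresources</value>
</property>
```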

Note: this guide assumes MySQL is already installed.

Create the directories referenced by the configuration

Directories in HDFS

hdfs dfs -mkdir -p /hive/warehouse
hdfs dfs -mkdir -p /hive/log
hdfs dfs -mkdir -p /hive/tmp

Directories on the local filesystem

cd $HIVE_HOME
mkdir log exec downloadedresources
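The two steps above can be made idempotent with mkdir -p. This sketch falls back to a temporary directory so it can run anywhere; in a real install HIVE_HOME is the path used throughout this guide:

```shell
# Create the local Hive working directories; mkdir -p is safe to re-run.
# HIVE_HOME falls back to a temp dir here so the sketch runs anywhere;
# in a real install it is /root/software/apache-hive-1.2.1-bin.
HIVE_HOME=${HIVE_HOME:-$(mktemp -d)}
mkdir -p "$HIVE_HOME/log" "$HIVE_HOME/exec" "$HIVE_HOME/downloadedresources"
ls "$HIVE_HOME"
```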

Starting and stopping the Hive metastore

Start the Hive metastore:

hive --service metastore &

Stop the Hive metastore:

ps -ef | grep hive | grep -v grep | awk '{print "kill -9", $2}' | sh
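The stop one-liner works by turning ps output into kill commands: the first grep keeps hive processes, `grep -v grep` drops the grep process itself, and awk prints column 2 (the PID) into a kill command that the final `| sh` executes. A minimal demo on a canned ps -ef line (without the trailing `| sh`, so nothing is actually killed):

```shell
# Build (but do not run) the kill command from one canned ps -ef line
cmd=$(printf 'root 1234 1 0 10:00 ? 00:00:05 java org.apache.hadoop.hive.metastore.HiveMetaStore\n' |
  grep hive | grep -v grep | awk '{print "kill -9", $2}')
echo "$cmd"
```

Note that kill -9 gives the process no chance to clean up; sending a plain SIGTERM first is gentler.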

Enter the interactive Hive CLI

hive
hive> create table t2(id int, name string);
OK
Time taken: 0.104 seconds
hive> show tables;
OK
t2
Time taken: 0.048 seconds, Fetched: 1 row(s)
hive> insert into t2 values(1,'ZhangSan');

Troubleshooting

Error 1: Hive fails with "/spark//lib/spark-assembly-*.jar: No such file or directory"
Cause: newer Spark releases no longer ship lib/spark-assembly-*.jar, and this version of Hive has not been updated to account for that.

Fix:

vim $HIVE_HOME/bin/hive 

# Locate this line in the script; the first line below is the original, the second is the replacement
#sparkAssemblyPath=`ls ${SPARK_HOME}/lib/spark-assembly-*.jar`
sparkAssemblyPath=`ls ${SPARK_HOME}/jars/*.jar`
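The same edit can be scripted with sed instead of vim. This sketch runs on a temporary copy so it is safe to try; for the real file, set hive_script to "$HIVE_HOME/bin/hive" (and keep a backup first):

```shell
# Demonstrate the bin/hive patch on a temp copy containing the original line
hive_script=$(mktemp)
echo 'sparkAssemblyPath=`ls ${SPARK_HOME}/lib/spark-assembly-*.jar`' > "$hive_script"
# Rewrite lib/spark-assembly-*.jar -> jars/*.jar (GNU sed in-place edit)
sed -i 's|lib/spark-assembly-\*\.jar|jars/*.jar|' "$hive_script"
cat "$hive_script"
```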

Error 2: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:For direct MetaStore DB connections, we don't support retries at the client level.

Cause: the metastore database has the wrong character set; it should be latin1.
Fix: recreate the database.

mysql> show create database hive\G
*************************** 1. row ***************************
       Database: hive
Create Database: CREATE DATABASE `hive` /*!40100 DEFAULT CHARACTER SET utf8 */
1 row in set (0.00 sec)

mysql> drop database hive;
Query OK, 22 rows affected (0.16 sec)

mysql> create database hive;
Query OK, 1 row affected (0.00 sec)

mysql> show create database hive\G
*************************** 1. row ***************************
       Database: hive
Create Database: CREATE DATABASE `hive` /*!40100 DEFAULT CHARACTER SET latin1 */
1 row in set (0.00 sec)
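The recreate step above relies on the MySQL server's default character set being latin1. To make the result independent of server defaults, the charset can be pinned explicitly when recreating (a sketch; note that dropping the database destroys any existing metastore data):

```sql
-- Recreate the metastore database with an explicit latin1 charset,
-- independent of the server's default (drops existing metastore data)
DROP DATABASE IF EXISTS hive;
CREATE DATABASE hive DEFAULT CHARACTER SET latin1;
```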

References

    Original author: georgeguo
    Original article: https://www.jianshu.com/p/0bde2e932e74
    This article is reposted from the web to share knowledge; if there is any infringement, please contact the blog owner for removal.