Installing Hive 3.1.2

1. Download

wget https://mirrors.tuna.tsinghua.edu.cn/apache/hive/hive-3.1.2/apache-hive-3.1.2-bin.tar.gz

2. Extract

tar -zxvf apache-hive-3.1.2-bin.tar.gz

3. Rename

mv apache-hive-3.1.2-bin hive

Configuring Hive

1. Edit hive-site.xml

# Create a new hive-site.xml file
vim /usr/share/hive/conf/hive-site.xml
# Add the following content (MySQL is configured here to store the metastore data)
<configuration>
  
    <property>
      <name>hive.exec.scratchdir</name>
      <value>/home/hadoop/scratchdir</value>
    </property>

    <property>
      <name>hive.metastore.warehouse.dir</name>
      <value>/home/hadoop/warehouse</value>
    </property>

    <property>
      <name>hive.metastore.uris</name>
      <value>thrift://hadoop:9083</value>
    </property>

    <property>
      <name>javax.jdo.option.ConnectionDriverName</name>
      <value>com.mysql.cj.jdbc.Driver</value>
    </property>

    <property>
      <name>javax.jdo.option.ConnectionURL</name>
      <value>jdbc:mysql://hadoop:3306/hive?createDatabaseIfNotExist=true&amp;useSSL=false</value>
    </property>

    <property>
      <name>javax.jdo.option.ConnectionUserName</name>
      <value>hive</value>
    </property>

    <property>
      <name>javax.jdo.option.ConnectionPassword</name>
      <value>hive</value>
    </property>

</configuration>
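Before starting Hive, the two HDFS paths referenced in the configuration should exist and be writable. A minimal sketch, assuming HDFS is already running and the paths match the values of hive.exec.scratchdir and hive.metastore.warehouse.dir above:

```shell
# Create the Hive scratch and warehouse directories in HDFS
# (paths assumed to match the hive-site.xml values above)
hdfs dfs -mkdir -p /home/hadoop/scratchdir /home/hadoop/warehouse
# Hive needs write access to both
hdfs dfs -chmod g+w /home/hadoop/scratchdir /home/hadoop/warehouse
```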

Configuring MySQL

Why MySQL? Hive keeps its metadata (table definitions, partition info, and so on) in a relational database. The embedded Derby database that Hive ships with only allows a single session at a time, so a standalone MySQL instance is the usual choice for the metastore.
1. Download the MySQL JDBC driver, extract it, copy the jar into hive/lib, and adjust its permissions


wget https://downloads.mysql.com/archives/get/p/3/file/mysql-connector-java-8.0.11.tar.gz
tar -zxvf mysql-connector-java-8.0.11.tar.gz
cd mysql-connector-java-8.0.11
chmod 777 mysql-connector-java-8.0.11.jar
cp mysql-connector-java-8.0.11.jar /usr/share/hive/lib/
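A quick check that the driver jar landed in the directory Hive loads its libraries from:

```shell
# Confirm the driver jar is on Hive's library path
ls -l /usr/share/hive/lib/mysql-connector-java-8.0.11.jar
```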

2. Create the user and database in MySQL

CREATE USER 'hive'@'%' IDENTIFIED BY 'hive';
GRANT ALL PRIVILEGES ON *.* TO 'hive'@'%';
DELETE FROM mysql.user WHERE user='';
FLUSH PRIVILEGES;
CREATE DATABASE hive CHARSET=utf8;
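Before initializing the schema, it is worth confirming the new account can log in and see its database. A sketch, reusing the hive/hive credentials created above (host or socket options may need adjusting for your setup):

```shell
# Log in with the new account and check that the hive database is visible
mysql -u hive -phive -e 'SHOW DATABASES;' | grep -w hive
```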

Adding Environment Variables

vim /etc/bashrc
# Append the following at the end
export HIVE_HOME=/usr/share/hive
export PATH=$PATH:/usr/share/miniconda3/bin:$HADOOP_HOME/bin:$HIVE_HOME/bin:$HBASE_HOME/bin
# Save and exit, then run:
source /etc/bashrc
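A quick way to confirm the new PATH takes effect after sourcing the file:

```shell
# The hive launcher should now resolve from PATH
which hive
hive --version
```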

Starting Hive

1. Initialize the Schema

schematool -dbType mysql -initSchema
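If initialization succeeded, the hive database in MySQL now contains the metastore tables (VERSION, TBLS, DBS, and so on). A sketch of a quick check, reusing the hive/hive credentials from above:

```shell
# The metastore schema tables should now exist in the hive database
mysql -u hive -phive -e 'USE hive; SHOW TABLES;' | grep -i -w VERSION
```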

2. Start the Metastore Service

hive --service metastore &
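The metastore should now be listening on the thrift port from hive.metastore.uris (9083 in this setup). A quick check, assuming ss is available (netstat -lnt works as well):

```shell
# Confirm the metastore thrift port is listening
ss -lnt | grep 9083
```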

3. Enter the Hive Shell

hive
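Once the shell opens, a small smoke test confirms the metastore and warehouse are wired up correctly. The table name smoke_test is arbitrary; this is a sketch, not part of the official install steps:

```shell
# Create a throwaway table, write a row, read it back, and clean up
hive -e "CREATE TABLE IF NOT EXISTS smoke_test (id INT);
         INSERT INTO smoke_test VALUES (1);
         SELECT * FROM smoke_test;
         DROP TABLE smoke_test;"
```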

Troubleshooting

1.

(base) [root@Centos bin]# schematool -dbType mysql -initSchema
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/share/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/share/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
    at org.apache.hadoop.conf.Configuration.set(Configuration.java:1357)
    at org.apache.hadoop.conf.Configuration.set(Configuration.java:1338)
    at org.apache.hadoop.mapred.JobConf.setJar(JobConf.java:536)
    at org.apache.hadoop.mapred.JobConf.setJarByClass(JobConf.java:554)
    at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:448)
    at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:5141)
    at org.apache.hadoop.hive.conf.HiveConf.<init>(HiveConf.java:5104)
    at org.apache.hive.beeline.HiveSchemaTool.<init>(HiveSchemaTool.java:96)
    at org.apache.hive.beeline.HiveSchemaTool.main(HiveSchemaTool.java:1473)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:323)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:236)

1.1 The SLF4J warning appears because Hadoop's slf4j binding jar conflicts with Hive's; removing either one resolves it.

rm -rf /usr/share/hive/lib/slf4j-log4j12-1.7.25.jar

1.2 The NoSuchMethodError happens because the guava jar that Hive depends on and the one bundled with Hadoop are different versions. Check which of the two is older and replace it with the newer one (in my case, Hadoop's was newer).

cd /usr/share/hadoop/share/hadoop/common/lib/
cp guava-27.0-jre.jar /usr/share/hive/lib/
rm /usr/share/hive/lib/guava-<older-version>.jar  # substitute the actual filename of the older guava jar
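To identify which of the two guava jars is older, listing both and sorting by version is enough (sort -V orders version strings numerically):

```shell
# Sort both guava jars by version; the last line is the newer one,
# and the other copy is the one to delete
ls /usr/share/hadoop/share/hadoop/common/lib/guava-*.jar \
   /usr/share/hive/lib/guava-*.jar | sort -V
```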

2.

/usr/share/hadoop/libexec/hadoop-functions.sh: line 2366: HADOOP_ORG.APACHE.HADOOP.HBASE.UTIL.GETJAVAPROPERTY_USER: bad substitution
/usr/share/hadoop/libexec/hadoop-functions.sh: line 2461: HADOOP_ORG.APACHE.HADOOP.HBASE.UTIL.GETJAVAPROPERTY_OPTS: bad substitution

The HBase version is too new. You can either switch to an older release (I moved to HBase 1.6 and the error went away), or simply ignore it, since it is only a warning.

