
Linux: Installing Hive 3.1.2 on CentOS 7

Hive 3.1.2 Installation

1. Download

wget https://mirrors.tuna.tsinghua.edu.cn/apache/hive/hive-3.1.2/apache-hive-3.1.2-bin.tar.gz

2. Extract

tar -zxvf apache-hive-3.1.2-bin.tar.gz

3. Rename

mv apache-hive-3.1.2-bin hive

Configuring Hive

1. Edit hive-site.xml

# create a new hive-site.xml file
vim /usr/share/hive/conf/hive-site.xml
# add the following content (MySQL is configured here to store the metastore metadata)
<configuration>
  
    <property>
      <name>hive.exec.scratchdir</name>
      <value>/home/hadoop/scratchdir</value>
    </property>

    <property>
      <name>hive.metastore.warehouse.dir</name>
      <value>/home/hadoop/warehouse</value>
    </property>

    <property>
      <name>hive.metastore.uris</name>
      <value>thrift://hadoop:9083</value>
    </property>

    <property>
      <name>javax.jdo.option.ConnectionDriverName</name>
      <value>com.mysql.cj.jdbc.Driver</value>
    </property>

    <property>
      <name>javax.jdo.option.ConnectionURL</name>
      <value>jdbc:mysql://hadoop:3306/hive?createDatabaseIfNotExist=true&amp;useSSL=false</value>
    </property>

    <property>
      <name>javax.jdo.option.ConnectionUserName</name>
      <value>hive</value>
    </property>

    <property>
      <name>javax.jdo.option.ConnectionPassword</name>
      <value>hive</value>
    </property>

</configuration>
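Since hive-site.xml is edited by hand, a quick sanity check that every `<property>` tag is closed can catch copy-paste mistakes before schematool fails with a cryptic parse error. A minimal sketch (the path is the one used in this install):

```shell
# report whether <property> open/close tags are balanced in a config file;
# a mismatch usually means a copy-paste error
check_tags() {
  opens=$(grep -c '<property>' "$1")
  closes=$(grep -c '</property>' "$1")
  if [ "$opens" -eq "$closes" ]; then
    echo "balanced ($opens properties)"
  else
    echo "mismatch: $opens opening vs $closes closing"
  fi
}

conf=/usr/share/hive/conf/hive-site.xml
if [ -f "$conf" ]; then check_tags "$conf"; fi
```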

MySQL Setup

Why configure MySQL at all when we are using Hive? Hive stores its metastore data (databases, table schemas, partitions) in a relational database; the embedded Derby default only supports a single concurrent session, so a standalone MySQL instance is the usual choice.
1. Download the MySQL JDBC driver (Connector/J), extract it, copy the jar into hive/lib, and adjust its permissions

wget https://downloads.mysql.com/archives/get/p/3/file/mysql-connector-java-8.0.11.tar.gz
tar -zxvf mysql-connector-java-8.0.11.tar.gz
cd mysql-connector-java-8.0.11
chmod 777 mysql-connector-java-8.0.11.jar
cp mysql-connector-java-8.0.11.jar /usr/share/hive/lib/

2. Create the corresponding user and database in MySQL

CREATE USER 'hive'@'%' IDENTIFIED BY 'hive';
GRANT ALL PRIVILEGES ON *.* TO 'hive'@'%';
DELETE FROM mysql.user WHERE user='';
flush privileges;
CREATE DATABASE hive charset=utf8;

Add environment variables

vim /etc/bashrc
# append the following at the end
export HIVE_HOME=/usr/share/hive
export PATH=$PATH:/usr/share/miniconda3/bin:$HADOOP_HOME/bin:$HIVE_HOME/bin:$HBASE_HOME/bin
# save and exit, then run: source /etc/bashrc
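After sourcing /etc/bashrc, it is worth confirming that the Hive binary directory actually landed on PATH before running schematool. A minimal check, assuming the paths configured above:

```shell
# verify that $HIVE_HOME/bin is present as a PATH component
HIVE_HOME=/usr/share/hive        # as exported in /etc/bashrc above
PATH="$PATH:$HIVE_HOME/bin"
case ":$PATH:" in
  *":$HIVE_HOME/bin:"*) echo "hive is on PATH" ;;
  *)                    echo "hive is NOT on PATH" ;;
esac
```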

Starting Hive

1. Initialize the schema

schematool -dbType mysql -initSchema

2. Start the metastore service

hive --service metastore &
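Because the metastore is started in the background, the Hive CLI can come up before the Thrift service is actually listening on port 9083 (the host and port come from hive.metastore.uris above). A small helper that polls until the port accepts connections is a sketch only, and uses bash's /dev/tcp, so run it under bash:

```shell
# poll a TCP port until it accepts connections or we give up
wait_for_port() {
  host=$1; port=$2; tries=${3:-30}
  i=0
  while [ "$i" -lt "$tries" ]; do
    # bash opens /dev/tcp/<host>/<port> as a TCP connection
    if (exec 3<>"/dev/tcp/$host/$port") 2>/dev/null; then
      echo "$host:$port is up"
      return 0
    fi
    i=$((i + 1))
    sleep 1
  done
  echo "$host:$port did not come up after $tries tries" >&2
  return 1
}

# usage (after starting the metastore):
#   wait_for_port hadoop 9083 && hive
```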

3. Enter the Hive CLI

hive

Troubleshooting

1.

(base) [root@Centos bin]# schematool -dbType mysql -initSchema
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/share/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/share/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
    at org.apache.hadoop.conf.Configuration.set(Configuration.java:1357)
    at org.apache.hadoop.conf.Configuration.set(Configuration.java:1338)
    at org.apache.hadoop.mapred.JobConf.setJar(JobConf.java:536)
    at org.apache.hadoop.mapred.JobConf.setJarByClass(JobConf.java:554)
    at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:448)
    at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:5141)
    at org.apache.hadoop.hive.conf.HiveConf.<init>(HiveConf.java:5104)
    at org.apache.hive.beeline.HiveSchemaTool.<init>(HiveSchemaTool.java:96)
    at org.apache.hive.beeline.HiveSchemaTool.main(HiveSchemaTool.java:1473)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:323)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:236)

1.1 The SLF4J warning appears because the slf4j binding jar shipped with Hadoop conflicts with the one shipped with Hive; removing either one resolves it.

rm -rf /usr/share/hive/lib/slf4j-log4j12-1.7.25.jar

1.2 The NoSuchMethodError is caused by a version mismatch between the guava jar bundled with Hive and the one bundled with Hadoop. Compare the two versions and replace the lower one with the higher (in my case Hadoop's was higher).

cd /usr/share/hadoop/share/hadoop/common/lib/
cp guava-27.0-jre.jar /usr/share/hive/lib/
rm /usr/share/hive/lib/<the lower-version guava jar>
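When comparing the two guava jars, `sort -V` gives a quick version-aware answer about which one is newer. The jar names below are assumptions based on this install (Hive 3.1.2 bundles guava-19.0.jar; the Hadoop jar is the one copied above); adjust to whatever `ls` actually shows:

```shell
# version-aware comparison: sort -V orders guava-19.0 before guava-27.0-jre
hive_guava=guava-19.0.jar        # assumed: the jar in /usr/share/hive/lib/
hadoop_guava=guava-27.0-jre.jar  # the jar in hadoop/common/lib/ used above
newer=$(printf '%s\n' "$hive_guava" "$hadoop_guava" | sort -V | tail -n 1)
echo "keep $newer, delete the other"   # → keep guava-27.0-jre.jar, delete the other
```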

2.

/usr/share/hadoop/libexec/hadoop-functions.sh: line 2366: HADOOP_ORG.APACHE.HADOOP.HBASE.UTIL.GETJAVAPROPERTY_USER: bad substitution
/usr/share/hadoop/libexec/hadoop-functions.sh: line 2461: HADOOP_ORG.APACHE.HADOOP.HBASE.UTIL.GETJAVAPROPERTY_OPTS: bad substitution

The HBase version is too high. You can switch to a lower version (I switched to HBase 1.6 and the error disappeared), or simply ignore the warning.
