First, determine the version of Hive you are running.
For example, mine is currently:
hive-common-1.1.0-cdh5.16.2.jar
Connecting a Spring Boot project to Hive via JDBC
Note: the hive-jdbc dependency you add must match your Hive version.
Accordingly, the hive-jdbc version that matches hive-common-1.1.0 is 1.1.0. (The log4j-related exclusions below are there to fix the problem where adding the Hive dependency stops the project's own logs from being output.)
<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-jdbc</artifactId>
    <version>1.1.0</version>
    <exclusions>
        <exclusion>
            <groupId>org.eclipse.jetty.aggregate</groupId>
            <artifactId>*</artifactId>
        </exclusion>
        <exclusion>
            <groupId>org.apache.logging.log4j</groupId>
            <artifactId>log4j-slf4j-impl</artifactId>
        </exclusion>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
        </exclusion>
        <exclusion>
            <groupId>org.apache.log4j</groupId>
            <artifactId>org.apache.hive.org.apache.log4j</artifactId>
        </exclusion>
        <exclusion>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
        </exclusion>
    </exclusions>
</dependency>
If you only add hive-jdbc 1.1.0 and then try to connect, you will still get an error.
The error message is as follows:
java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
    at org.apache.hive.jdbc.HiveConnection.createBinaryTransport(HiveConnection.java:393)
    at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:187)
    at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:163)
    at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
    at java.sql.DriverManager.getConnection(DriverManager.java:664)
    at java.sql.DriverManager.getConnection(DriverManager.java:247)
    at com.hundsun.duqa.admin.api.service.DuqaHiveJDBCTest.test2(DuqaHiveJDBCTest.java:42)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
    at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    at org.springframework.test.context.junit4.statements.RunBeforeTestMethodCallbacks.evaluate(RunBeforeTestMethodCallbacks.java:75)
    at org.springframework.test.context.junit4.statements.RunAfterTestMethodCallbacks.evaluate(RunAfterTestMethodCallbacks.java:86)
    at org.springframework.test.context.junit4.statements.SpringRepeat.evaluate(SpringRepeat.java:84)
    at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
    at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:252)
    at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:94)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
    at org.springframework.test.context.junit4.statements.RunBeforeTestClassCallbacks.evaluate(RunBeforeTestClassCallbacks.java:61)
    at org.springframework.test.context.junit4.statements.RunAfterTestClassCallbacks.evaluate(RunAfterTestClassCallbacks.java:70)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
    at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.run(SpringJUnit4ClassRunner.java:191)
    at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
    at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:68)
    at com.intellij.rt.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:33)
    at com.intellij.rt.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:230)
    at com.intellij.rt.junit.JUnitStarter.main(JUnitStarter.java:58)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration
    at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:355)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
    ... 35 more
The error occurs because we also need to add the hadoop-common dependency; here I chose version 2.6.5.
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.6.5</version>
    <exclusions>
        <exclusion>
            <artifactId>servlet-api</artifactId>
            <groupId>javax.servlet</groupId>
        </exclusion>
        <exclusion>
            <groupId>org.apache.logging.log4j</groupId>
            <artifactId>log4j-slf4j-impl</artifactId>
        </exclusion>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
        </exclusion>
        <exclusion>
            <groupId>org.apache.log4j</groupId>
            <artifactId>org.apache.hive.org.apache.log4j</artifactId>
        </exclusion>
        <exclusion>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
        </exclusion>
    </exclusions>
</dependency>
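Before retrying the JDBC connection, a quick sanity check can confirm that the class missing in the NoClassDefFoundError above is now on the classpath. This is only an illustrative sketch (the class name HadoopClasspathCheck is made up for this example), not part of the original demo:

public class HadoopClasspathCheck {
    public static void main(String[] args) {
        try {
            // The class that was missing in the NoClassDefFoundError above
            Class.forName("org.apache.hadoop.conf.Configuration");
            // The Hive JDBC driver itself
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            System.out.println("hadoop-common and hive-jdbc are both on the classpath.");
        } catch (ClassNotFoundException e) {
            System.out.println("Still missing: " + e.getMessage());
        }
    }
}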
Run the demo again, and this time it succeeds!
The demo code is as follows:
public void test2() throws Exception {
    // Register the database driver; we use Hive's JDBC driver, and the driver class name is fixed
    Class.forName("org.apache.hive.jdbc.HiveDriver");
    // For a HiveServer2 service, use jdbc:hive2 followed by the Hive server's IP and port; the default port is 10000
    Connection conn = DriverManager.getConnection("jdbc:hive2://<server IP>:10000/<database>", "<username>", "<password>");
    Statement stat = conn.createStatement();
    ResultSet rs = stat.executeQuery("select api_id from idc.ods_duas_data_services_mng where thedate = '20220424' and api_id regexp 'companyInfo'");
    while (rs.next()) {
        String api_id = rs.getString("api_id");
        System.out.println(api_id);
    }
    stat.close();
    conn.close();
}
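One small note on the demo: the Statement and Connection are closed manually, so they would leak if the query throws. A minimal try-with-resources variant of the same query (a sketch, reusing the same placeholders for server IP, database, username and password) closes everything automatically:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveQueryDemo {
    public static void main(String[] args) throws Exception {
        // Same driver and URL as in the demo above
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        String url = "jdbc:hive2://<server IP>:10000/<database>";
        // try-with-resources closes the connection, statement and result set
        // even if the query throws an exception
        try (Connection conn = DriverManager.getConnection(url, "<username>", "<password>");
             Statement stat = conn.createStatement();
             ResultSet rs = stat.executeQuery(
                     "select api_id from idc.ods_duas_data_services_mng "
                     + "where thedate = '20220424' and api_id regexp 'companyInfo'")) {
            while (rs.next()) {
                System.out.println(rs.getString("api_id"));
            }
        }
    }
}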
That fully solves the problem of connecting Spring Boot to Hive via JDBC!
If this article helped you, don't forget the triple combo: like, share, and comment. See you next time.
Bookmarking alone is freeloading; a like shows real love.