java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration

2015-07-16 09:19:57   Author: MangoCool   Source: MangoCool

The Spark application was packaged with sbt, but not all of its dependencies were bundled into the jar. When the application was deployed to the cluster and run, the following exception occurred:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
        at SparkHbase$.main(SparkHbase.scala:34)
        at SparkHbase.main(SparkHbase.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        ... 11 more
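
Before touching the cluster classpath, it is worth confirming that the HBase classes really are absent from the assembled jar. A quick check (the jar path below is hypothetical; substitute your own sbt artifact):

jar tf target/scala-2.10/sparkhbase_2.10-1.0.jar | grep HBaseConfiguration
# no output means the class was not bundled, so it must come from the runtime classpath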

The exception occurs because the Spark application is missing the HBase dependencies at runtime. My approach here is to add the following line to spark/conf/spark-env.sh on the cluster:

export SPARK_CLASSPATH=/home/hadoop/SW/hbase/lib/hbase-client-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/hbase-server-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/hbase-common-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/hbase-protocol-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/htrace-core-2.04.jar:/home/hadoop/SW/hbase/lib/hbase-hadoop2-compat-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/hbase-it-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/guava-12.0.1.jar
Be careful to separate each jar path with a colon! Then run:
source spark-env.sh

and restart the Spark service; that fixes it.
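
If you would rather not spell out every jar by hand, the same export can be built from globs. A minimal sketch, assuming all the required jars live under /home/hadoop/SW/hbase/lib and no path contains spaces (note the hbase-*.jar glob matches every hbase jar in the directory, which is slightly broader than the hand-picked list above):

# build a colon-separated classpath from globs instead of listing each jar
HBASE_LIB=/home/hadoop/SW/hbase/lib
export SPARK_CLASSPATH=$(echo $HBASE_LIB/hbase-*.jar $HBASE_LIB/htrace-core-*.jar $HBASE_LIB/guava-*.jar | tr ' ' ':')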

There is also another way: pass the --driver-class-path option when submitting the application to set the driver's classpath:

./spark-submit --driver-class-path /home/hadoop/SW/hbase/lib/hbase-client-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/hbase-server-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/hbase-common-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/hbase-protocol-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/htrace-core-2.04.jar:/home/hadoop/SW/hbase/lib/hbase-hadoop2-compat-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/hbase-it-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/guava-12.0.1.jar  --class com.dtxy.data.SqlTest ../lib/bigdata-1.0-SNAPSHOT.jar
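
Keep in mind that --driver-class-path only affects the driver JVM. If the executors need the HBase classes too (for example, when tasks read from HBase directly), the --jars option distributes the jars to both driver and executors; note that it takes a comma-separated list rather than colons. A sketch along the same lines:

# ship the HBase jars to driver and executors alike (comma-separated for --jars)
./spark-submit --jars $(echo /home/hadoop/SW/hbase/lib/hbase-*.jar /home/hadoop/SW/hbase/lib/htrace-core-2.04.jar /home/hadoop/SW/hbase/lib/guava-12.0.1.jar | tr ' ' ',') --class com.dtxy.data.SqlTest ../lib/bigdata-1.0-SNAPSHOT.jar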

Note: you cannot both set SPARK_CLASSPATH in spark/conf/spark-env.sh and pass --driver-class-path when submitting the job; doing so raises this exception:

15/08/14 09:22:23 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Found both spark.driver.extraClassPath and SPARK_CLASSPATH. Use only the former.
        at org.apache.spark.SparkConf$$anonfun$validateSettings$6$$anonfun$apply$8.apply(SparkConf.scala:444)
        at org.apache.spark.SparkConf$$anonfun$validateSettings$6$$anonfun$apply$8.apply(SparkConf.scala:442)
        at scala.collection.immutable.List.foreach(List.scala:318)
        at org.apache.spark.SparkConf$$anonfun$validateSettings$6.apply(SparkConf.scala:442)
        at org.apache.spark.SparkConf$$anonfun$validateSettings$6.apply(SparkConf.scala:430)
        at scala.Option.foreach(Option.scala:236)
        at org.apache.spark.SparkConf.validateSettings(SparkConf.scala:430)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:365)
        at com.dtxy.data.SqlTest$.main(SqlTest.scala:27)
        at com.dtxy.data.SqlTest.main(SqlTest.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/08/14 09:22:23 INFO SparkContext: Successfully stopped SparkContext
Exception in thread "main" org.apache.spark.SparkException: Found both spark.driver.extraClassPath and SPARK_CLASSPATH. Use only the former.
        at org.apache.spark.SparkConf$$anonfun$validateSettings$6$$anonfun$apply$8.apply(SparkConf.scala:444)
        at org.apache.spark.SparkConf$$anonfun$validateSettings$6$$anonfun$apply$8.apply(SparkConf.scala:442)
        at scala.collection.immutable.List.foreach(List.scala:318)
        at org.apache.spark.SparkConf$$anonfun$validateSettings$6.apply(SparkConf.scala:442)
        at org.apache.spark.SparkConf$$anonfun$validateSettings$6.apply(SparkConf.scala:430)
        at scala.Option.foreach(Option.scala:236)
        at org.apache.spark.SparkConf.validateSettings(SparkConf.scala:430)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:365)
        at com.dtxy.data.SqlTest$.main(SqlTest.scala:27)
        at com.dtxy.data.SqlTest.main(SqlTest.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/08/14 09:22:23 INFO Utils: Shutdown hook called
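
As the error message itself hints, spark.driver.extraClassPath is the preferred replacement for the deprecated SPARK_CLASSPATH, and --driver-class-path is effectively shorthand for it. It can also be set explicitly with --conf, or once for all jobs in conf/spark-defaults.conf. A sketch reusing the glob trick from above:

# same effect as --driver-class-path, expressed as a Spark configuration property
./spark-submit --conf spark.driver.extraClassPath=$(echo /home/hadoop/SW/hbase/lib/hbase-*.jar | tr ' ' ':') --class com.dtxy.data.SqlTest ../lib/bigdata-1.0-SNAPSHOT.jar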

And with that, the problem is solved!


Reference: http://www.abcn.net/2014/07/lighting-spark-with-hbase-full-edition.html

Tags: SBT Scala exception Spark
