
bad symbolic reference. A signature in package.class refers to type compileTimeOnly in package scala.annotation which is not available.

2015-07-22 10:09:24   Author: MangoCool   Source: MangoCool

Running a Spark application on Windows 7, and not really knowing how to use sbt yet, I found that the Scala dependency it loaded was never the version I specified: it was always 2.11.2, which looked like the default. So there was no way to use the 2.10.4 version I wanted; even forcing it caused lots of errors. I had no choice but to build with 2.11.2, but when I packaged the job and ran it on the cluster, it still failed:

Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;
        at SparkHbase$.main(SparkHbase.scala:95)
        at SparkHbase.main(SparkHbase.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
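
This NoSuchMethodError is the textbook symptom of a Scala binary mismatch: scala.Predef.$conforms only exists from Scala 2.11 onward, so a jar compiled against 2.11.2 cannot find it when the cluster's scala-library is 2.10.4. A minimal sketch (a quick check, not part of the original SparkHbase code) that prints the Scala version the driver actually runs against:

object VersionCheck {
  def main(args: Array[String]): Unit = {
    // Prints e.g. "version 2.10.4"; compare it with scalaVersion in build.sbt.
    println(scala.util.Properties.versionString)
  }
}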

Here are the contents of my build.sbt:

name := "sbtTest"

version := "1.0"

scalaVersion := "2.11.2"

libraryDependencies ++= Seq("org.apache.hadoop" % "hadoop-main" % "2.7.0",
  "org.apache.hbase" % "hbase-server" % "0.98.12.1-hadoop2",
  "org.apache.hbase" % "hbase-client" % "0.98.12.1-hadoop2",
  "org.apache.hbase" % "hbase-common" % "0.98.12.1-hadoop2",
  "org.apache.spark" % "spark-core_2.11" % "1.3.1"
)
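
As an aside: sbt's %% operator appends the project's Scala binary suffix to the artifact name automatically, so the Spark dependency can be declared without hard-coding _2.11, keeping it in lockstep with scalaVersion:

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1"

With the plain % form above, the _2.11 suffix is fixed by hand and can silently disagree with the scalaVersion setting.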

Naturally, I guessed that the error was most likely caused by the mismatch between the version I compiled with (Scala 2.11.2) and the version the cluster runs (Scala 2.10.4). For the record, I was using Spark 1.3.1.

So I was back at the original problem: finding a way to make the local build load the Scala 2.10.4 dependency.

The following errors, I believe, were also caused by the wrong version:

[error] bad symbolic reference. A signature in package.class refers to type compileTimeOnly
[error] in package scala.annotation which is not available.
[error] It may be completely missing from the current classpath, or the version on
[error] the classpath might be incompatible with the version used when compiling package.class.
[error] D:\WorkSpace-datang\trunk\sbtTest\src\main\scala\SparkHbase.scala:55: Reference to method foreach in class Range should not have survived past type checking,
[error] it should have been processed and eliminated during expansion of an enclosing macro.
[error]     for (j <- 0 until count.toInt)
[error]                 ^
[error] D:\WorkSpace-datang\trunk\sbtTest\src\main\scala\SparkHbase.scala:55: Reference to method intWrapper in class LowPriorityImplicits should not have survived past type checking,
[error] it should have been processed and eliminated during expansion of an enclosing macro.
[error]     for (j <- 0 until count.toInt)
[error]               ^
[error] D:\WorkSpace-datang\trunk\sbtTest\src\main\scala\SparkHbase.scala:58: Reference to method raw in class Result should not have survived past type checking,
[error] it should have been processed and eliminated during expansion of an enclosing macro.
[error]       val kvs = rs.raw
[error]                    ^
[error] D:\WorkSpace-datang\trunk\sbtTest\src\main\scala\SparkHbase.scala:61: Reference to method getRow in class KeyValue should not have survived past type checking,
[error] it should have been processed and eliminated during expansion of an enclosing macro.
[error]         val rowKey = new String(kv.getRow())
[error]                                    ^
[error] D:\WorkSpace-datang\trunk\sbtTest\src\main\scala\SparkHbase.scala:62: Reference to method getFamily in class KeyValue should not have survived past type checking,
[error] it should have been processed and eliminated during expansion of an enclosing macro.
[error]         val family = new String(kv.getFamily())
[error]                                    ^
[error] D:\WorkSpace-datang\trunk\sbtTest\src\main\scala\SparkHbase.scala:63: Reference to method getQualifier in class KeyValue should not have survived past type checking,
[error] it should have been processed and eliminated during expansion of an enclosing macro.
[error]         val column = new String(kv.getQualifier())
[error]                                    ^
[error] D:\WorkSpace-datang\trunk\sbtTest\src\main\scala\SparkHbase.scala:64: Reference to method getValue in class KeyValue should not have survived past type checking,
[error] it should have been processed and eliminated during expansion of an enclosing macro.
[error]         val value = new String(kv.getValue())
[error]                                   ^
[error] D:\WorkSpace-datang\trunk\sbtTest\src\main\scala\SparkHbase.scala:68: java.lang.NumberFormatException
[error]           map.put(rowKey, v+Integer2int(Integer.valueOf(value)))
[error]                                                 ^
[error] D:\WorkSpace-datang\trunk\sbtTest\src\main\scala\SparkHbase.scala:71: java.lang.NumberFormatException
[error]           map.put(rowKey, Integer2int(Integer.valueOf(value)))
[error]                                               ^
[warn] 45 warnings found
[error] 10 errors found
[error] (compile:compile) Compilation failed
[error] Total time: 73 s, completed 2015-7-22 9:34:11

And these warnings appeared during the SBT project import:

[warn] Binary version (2.11) for dependency org.scala-lang#scala-library;2.11.2 
[warn] in sbttest#sbttest$sources_javadoc_2.10;1.0 differs from Scala binary version in project (2.10). 
[warn] Binary version (2.11) for dependency org.scala-lang#scalap;2.11.0 
[warn] in sbttest#sbttest$sources_javadoc_2.10;1.0 differs from Scala binary version in project (2.10). 
[warn] Binary version (2.11) for dependency org.scala-lang#scala-compiler;2.11.0 
[warn] in sbttest#sbttest$sources_javadoc_2.10;1.0 differs from Scala binary version in project (2.10). 
[warn] Binary version (2.11) for dependency org.scala-lang#scala-reflect;2.11.2 
[warn] in sbttest#sbttest$sources_javadoc_2.10;1.0 differs from Scala binary version in project (2.10).

Googling these errors and warnings confirmed that they all point to a Scala version problem, but nothing I found solved it directly, so I had no choice but to go back and reread the official sbt documentation.

When I reached this chapter: http://www.scala-sbt.org/0.12.4/docs/Howto/scala.html

I noticed a section explaining how to disable the automatic loading of the Scala dependency:

autoScalaLibrary := false

Seeing this, I finally understood why my sbt project kept downloading the 2.11.2 Scala dependency. So I removed the 2.11.2 Scala dependency, added the setting above to build.sbt, and then declared the Scala version I actually wanted to download. The resulting build.sbt:

name := "sbtTest"

version := "1.0"
// Specify the Scala version to compile against
scalaVersion := "2.10.4"
// Disable automatic loading of the Scala dependency
autoScalaLibrary := false

libraryDependencies ++= Seq("org.apache.hadoop" % "hadoop-main" % "2.7.0",
  "org.scala-lang" % "scala-library" % "2.10.4",
  "org.scala-lang" % "scala-reflect" % "2.10.4",
  "org.scala-lang" % "scala-compiler" % "2.10.4",
  "org.scala-lang" % "scalap" % "2.10.4",
  "org.apache.hbase" % "hbase-server" % "0.98.12.1-hadoop2",
  "org.apache.hbase" % "hbase-client" % "0.98.12.1-hadoop2",
  "org.apache.hbase" % "hbase-common" % "0.98.12.1-hadoop2",
  "org.apache.spark" % "spark-core_2.10" % "1.3.1"
)

Then I refreshed the sbt project and checked my dependency libraries: sbt had finally downloaded the Scala version I needed. I ran the sbt package command, and the build packaged successfully!
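
For reference, the same check can be done from the sbt shell (assuming the hyphenated key names of the sbt 0.12 line linked above):

reload
show scala-version
update
package

show scala-version should now report 2.10.4, and update should resolve the _2.10 artifacts before package builds the jar.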

That solved the problem. Corrections for anything wrong or incomplete are very welcome, and thanks for your understanding!

Tags: SBT package spark exception Scala
