java – spark-version-info.properties not found in Jenkins

I am developing a plugin that uses the spark-core lib. It works fine when I run it as a plain
Java application, but when I run the plugin inside Jenkins it throws the following error:

    java.lang.ExceptionInInitializerError
    at org.apache.spark.package$.<init>(package.scala:91)
    at org.apache.spark.package$.<clinit>(package.scala)
    at org.apache.spark.SparkContext$$anonfun$3.apply(SparkContext.scala:185)
    at org.apache.spark.SparkContext$$anonfun$3.apply(SparkContext.scala:185)
    at org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54)
    at org.apache.spark.SparkContext.logInfo(SparkContext.scala:74)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:185)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2275)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:831)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:823)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:823)
    at com.plugin.goettingen_plugin.HelloWorldBuilder.perform(HelloWorldBuilder.java:88)
    at hudson.tasks.BuildStepCompatibilityLayer.perform(BuildStepCompatibilityLayer.java:75)
    at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
    at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:785)
    at hudson.model.Build$BuildExecution.build(Build.java:205)
    at hudson.model.Build$BuildExecution.doRun(Build.java:162)
    at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:537)
    at hudson.model.Run.execute(Run.java:1741)
    at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
    at hudson.model.ResourceController.execute(ResourceController.java:98)
    at hudson.model.Executor.run(Executor.java:408)
Caused by: org.apache.spark.SparkException: Error while locating file spark-version-info.properties
    at org.apache.spark.package$SparkBuildInfo$.liftedTree1$1(package.scala:75)
    at org.apache.spark.package$SparkBuildInfo$.<init>(package.scala:61)
    at org.apache.spark.package$SparkBuildInfo$.<clinit>(package.scala)
    ... 23 more
Caused by: java.lang.NullPointerException
    at java.util.Properties$LineReader.readLine(Properties.java:434)
    at java.util.Properties.load0(Properties.java:353)
    at java.util.Properties.load(Properties.java:341)
    at org.apache.spark.package$SparkBuildInfo$.liftedTree1$1(package.scala:64)
    ... 25 more

I start the Spark session with the following code:

SparkSession sparkSession = SparkSession.builder().appName("DP-App").master("local[2]").getOrCreate();

The spark-core lib looks up the file in a class named package.java using the following code, which returns null:

InputStream resourceStream = Thread.currentThread().getContextClassLoader().getResourceAsStream("spark-version-info.properties");
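
To narrow down whether the resource is merely invisible to Jenkins' context classloader (rather than missing from the classpath entirely), the same lookup can be repeated against spark-core's own classloader. A small diagnostic sketch (illustrative only, not part of the plugin):

    import java.io.InputStream;
    import org.apache.spark.SparkContext;

    // Diagnostic sketch: compare the two classloaders. Under Jenkins the context
    // classloader is the plugin/Jenkins classloader, which may not see resources
    // bundled inside the spark-core jar.
    InputStream viaContext = Thread.currentThread().getContextClassLoader()
            .getResourceAsStream("spark-version-info.properties");
    InputStream viaSpark = SparkContext.class.getClassLoader()
            .getResourceAsStream("spark-version-info.properties");
    System.out.println("context classloader sees it: " + (viaContext != null));
    System.out.println("spark-core classloader sees it: " + (viaSpark != null));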

Since spark-version-info.properties exists inside the spark-core lib, I tried moving the file into WEB-INF, but it still fails to load.
Is there any alternative way to load the file and bypass the code above in the library?
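
One possible workaround along those lines (a sketch only, not verified in this plugin): temporarily point the thread's context classloader at the classloader that loaded spark-core before calling getOrCreate(), so Spark's lookup above can find the file, and restore it afterwards:

    import org.apache.spark.sql.SparkSession;

    // Sketch of a possible workaround (assumption, untested here): swap the context
    // classloader for the duration of the SparkSession bootstrap, then restore it.
    ClassLoader original = Thread.currentThread().getContextClassLoader();
    try {
        Thread.currentThread().setContextClassLoader(SparkSession.class.getClassLoader());
        SparkSession sparkSession = SparkSession.builder()
                .appName("DP-App")
                .master("local[2]")
                .getOrCreate();
        // ... use sparkSession here ...
    } finally {
        Thread.currentThread().setContextClassLoader(original);
    }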

My dependencies are:

  <dependencies>
    <dependency>
      <groupId>org.jenkins-ci.plugins</groupId>
      <artifactId>credentials</artifactId>
      <version>1.9.4</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-mllib_2.11</artifactId>
      <version>2.0.1</version>
    </dependency>
  </dependencies>

Best answer: You are missing the spark-version-info.properties file.

So just create one under ./core/target/extra-resources:

λ ~/workspace/big_data/spark/ master* ./build/spark-build-info ./core/target/extra-resources 2.1.1
λ ~/workspace/big_data/spark/ master* cat ./core/target/extra-resources/spark-version-info.properties
version=2.1.1
user=chanhle
revision=dec9aa3b37c01454065a4d8899859991f43d4c66
branch=master
date=2017-06-07T15:12:48Z
url=https://github.com/apache/spark

I ran into the same problem when debugging Spark in IntelliJ.
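
If Spark is consumed as a Maven dependency rather than built from source (as in the question's pom), an analogous workaround might be to ship a spark-version-info.properties on the plugin's own classpath, e.g. under src/main/resources, with the same keys as above (the values below are placeholders, not real build metadata):

    version=2.0.1
    user=unknown
    revision=unknown
    branch=unknown
    date=unknown
    url=https://github.com/apache/spark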
