scala – sbt – object apache is not a member of package org

I want to deploy and submit a Spark program using sbt, but it throws an error.

Code:

package in.goai.spark

import org.apache.spark.{SparkContext, SparkConf}

object SparkMeApp {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("First Spark")
    val sc = new SparkContext(conf)
    val fileName = args(0)
    val lines = sc.textFile(fileName).cache
    val c = lines.count
    println(s"There are $c lines in $fileName")
  }
}

build.sbt

    name := "First Spark"

    version := "1.0"

    organization := "in.goai"

    scalaVersion := "2.11.8"

    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"

    resolvers += Resolver.mavenLocal
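For context, the %% operator appends the Scala binary version to the artifact name, so with scalaVersion := "2.11.8" the dependency above resolves to spark-core_2.11. The explicit single-% equivalent would be:

    // Equivalent to the %% form above, with the Scala binary version written out
    libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.6.1"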

Under the first/project directory:

build.properties

sbt.version=0.13.9

When I try to run sbt package, it throws the error shown below.

[root@hadoop first]# sbt package
Loading project definition from /home/training/workspace_spark/first/project
Set current project to First Spark (in build file:/home/training/workspace_spark/first/)
Compiling 1 Scala source to /home/training/workspace_spark/first/target/scala-2.11/classes...
[error] /home/training/workspace_spark/first/src/main/scala/LineCount.scala:3: object apache is not a member of package org
[error] import org.apache.spark.{SparkContext, SparkConf}
[error]        ^
[error] /home/training/workspace_spark/first/src/main/scala/LineCount.scala:9: not found: type SparkConf
[error]     val conf = new SparkConf().setAppName("First Spark")
[error]                    ^
[error] /home/training/workspace_spark/first/src/main/scala/LineCount.scala:11: not found: type SparkContext
[error]     val sc = new SparkContext(conf)
[error]              ^
[error] three errors found
[error] (compile:compile) Compilation failed
[error] Total time: 4 s, completed May 10, 2018 4:05:10 PM

I have also tried extending App, but nothing changed; the sketch below shows what I mean.
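For reference, a minimal sketch of that extends App variant (equivalent logic, with args supplied by the App trait instead of a main method):

    package in.goai.spark

    import org.apache.spark.{SparkContext, SparkConf}

    // Same program, written with extends App instead of an explicit main method
    object SparkMeApp extends App {
      val conf = new SparkConf().setAppName("First Spark")
      val sc = new SparkContext(conf)
      val fileName = args(0)
      val lines = sc.textFile(fileName).cache
      val c = lines.count
      println(s"There are $c lines in $fileName")
    }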

Best answer: Remove resolvers += Resolver.mavenLocal from build.sbt. Since spark-core is available on Maven Central, the local resolver is not needed.

After that, you can try sbt clean package.
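With that change applied, the whole build.sbt would look like this (same coordinates as in the question, only the resolver line removed):

    name := "First Spark"

    version := "1.0"

    organization := "in.goai"

    scalaVersion := "2.11.8"

    // Fetched from Maven Central; %% resolves this to spark-core_2.11
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"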
