Spark source compilation error

Today I downloaded the source for Spark 2.3.1 and tried to compile it, but the build failed with the following error:

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-enforcer-plugin:3.0.0-M1:enforce (enforce-versions) on project spark-parent_2.11: Some Enforcer rules have failed. Look above for specific messages explaining why the rule failed. -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException

The message tells us that the maven-enforcer-plugin is enforcing some rules. Looking in the root pom.xml, we can find the rules it enforces:

<rules>
    <requireMavenVersion>
      <version>${maven.version}</version>
    </requireMavenVersion>
    <requireJavaVersion>
      <version>${java.version}</version>
    </requireJavaVersion>
    <bannedDependencies>
      <excludes>
        <!--
          Akka depends on io.netty:netty, which puts classes under the org.jboss.netty
          package. This conflicts with the classes in org.jboss.netty:netty
          artifact, so we have to ban that artifact here. In Netty 4.x, the classes
          are under the io.netty package, so it's fine for us to depend on both
          io.netty:netty and io.netty:netty-all.
        -->
        <exclude>org.jboss.netty</exclude>
        <exclude>org.codehaus.groovy</exclude>
        <exclude>*:*_2.10</exclude>
      </excludes>
      <searchTransitive>true</searchTransitive>
    </bannedDependencies>
</rules>
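What `requireMavenVersion` and `requireJavaVersion` do is essentially a numeric, part-by-part comparison of the installed version against the required one. A minimal sketch of that kind of check (illustrative Python, not the plugin's actual code; the 3.3.9 minimum matches what Spark 2.3.x's build documentation requires for Maven):

```python
# Sketch of the version check that maven-enforcer-plugin's
# requireMavenVersion rule performs: compare the installed version
# against a required minimum, numerically per dotted component.

def parse_version(v):
    """Split a dotted version like '3.3.9' into a tuple of ints: (3, 3, 9)."""
    return tuple(int(p) for p in v.split("."))

def satisfies(actual, required_min):
    """True if actual >= required_min under part-wise numeric comparison."""
    return parse_version(actual) >= parse_version(required_min)

# Spark 2.3.x requires Maven 3.3.9; an older local Maven fails the rule.
print(satisfies("3.0.5", "3.3.9"))  # False -> enforcer aborts the build
print(satisfies("3.5.4", "3.3.9"))  # True  -> rule passes
```

The real plugin handles qualifiers and version ranges as well, but the failure in the log above boils down to exactly this kind of mismatch.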

The key constraints here are on the Maven and Java versions. Checking the Maven and Java versions on my machine, sure enough they did not match what the pom requires. There are two ways to fix this:
1. Change the required Maven or Java version in the pom file.
2. Upgrade Maven and/or Java on the machine.
Option 2 is the better choice: with option 1, if the Maven or Java on your machine is simply too old, some of the code may still fail to compile against it anyway.
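For reference, option 1 would mean editing the `<properties>` section of Spark's root pom.xml, where the `${maven.version}` and `${java.version}` placeholders used by the enforcer rules are defined. The values below are what Spark 2.3.x's build documentation states as minimums (Java 8, Maven 3.3.9); verify them against your copy of the pom before changing anything:

```xml
<properties>
  <java.version>1.8</java.version>
  <maven.version>3.3.9</maven.version>
  <!-- ... other properties ... -->
</properties>
```

Again, lowering these to match an old local toolchain can just push the failure further into the build, which is why upgrading the machine (option 2) is preferable.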

    Original author: 程序猿null
    Original article: https://www.jianshu.com/p/aabb62753111
    Reposted from the web to share knowledge; if this infringes your rights, please contact the blog owner for removal.