Hadoop and Java Version Compatibility

Hadoop and Java versions must be matched; otherwise you may run into unsupported APIs at runtime. The compatibility information below comes from the official documentation; a quick runtime check is sketched after the list.

Supported Java Versions

  • Apache Hadoop 3.x releases now support only Java 8.
  • Apache Hadoop releases from 2.7.x through the rest of the 2.x line support both Java 7 and Java 8.
  • The latest Apache Hadoop 2.7 release requires Java 7. It is built and tested on both OpenJDK and Oracle (HotSpot) JDK/JRE; earlier releases (2.6 and earlier) support Java 6.
  • Java 11 support is currently partial:
    • trunk (3.3.0-SNAPSHOT) supports the Java 11 runtime: HADOOP-15338 - Java 11 runtime support (RESOLVED)
    • Compiling Hadoop with Java 11 is not yet supported: HADOOP-16795 - Java 11 compile support (OPEN)
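
If you are unsure which Java runtime a node will actually use, you can ask the JVM itself before comparing against the matrix above. The class below is a minimal, self-contained sketch (the class name and the printed note are illustrative, not part of Hadoop):

```java
// JavaVersionCheck.java - print the Java version the current JVM reports,
// so it can be compared against the Hadoop support matrix above.
public class JavaVersionCheck {
    public static void main(String[] args) {
        // e.g. "1.8.0_242" on Java 8, "11.0.9" on Java 11
        String version = System.getProperty("java.version");
        // e.g. "1.8" on Java 8, "11" on Java 11
        String spec = System.getProperty("java.specification.version");
        System.out.println("java.version               = " + version);
        System.out.println("java.specification.version = " + spec);
        if (!"1.8".equals(spec)) {
            System.out.println("Note: Hadoop 3.x is listed above as supporting only Java 8.");
        }
    }
}
```

Run it with the same java binary that JAVA_HOME in hadoop-env.sh points to, so the check reflects what the Hadoop daemons will actually run on.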

Supported JDKs/JVMs

Tested JDK Versions

The following table lists JDK versions that are known to be in use or have been tested:

| Version | Status | Reported by |
| --- | --- | --- |
| oracle 1.7.0_15 | Good | Cloudera |
| oracle 1.7.0_21 | Good (4) | Hortonworks |
| oracle 1.7.0_45 | Good | Pivotal |
| openjdk 1.7.0_09-icedtea | Good (5) | Hortonworks |
| oracle 1.6.0_16 | Avoid (1) | Cloudera |
| oracle 1.6.0_18 | Avoid | Many |
| oracle 1.6.0_19 | Avoid | Many |
| oracle 1.6.0_20 | Good (2) | LinkedIn, Cloudera |
| oracle 1.6.0_21 | Good (2) | Yahoo!, Cloudera |
| oracle 1.6.0_24 | Good | Cloudera |
| oracle 1.6.0_26 | Good (2) | Hortonworks, Cloudera |
| oracle 1.6.0_28 | Good | LinkedIn |
| oracle 1.6.0_31 | Good (3, 4) | Cloudera, Hortonworks |

For a new project with no legacy constraints, choosing Hadoop 3.x or later is recommended. For an existing project that you plan to upgrade, be careful: some changes across Java versions can break existing code.

The following incompatibilities caused by JDK 1.8 updates are taken from the official documentation.

Incompatible Java Changes

The following table is intended for users who are upgrading the Java version of their Hadoop clusters. It documents Java changes that affect Apache Hadoop.

Java 8

| Version | Incompatible change | JDK bug ticket | Related JIRAs |
| --- | --- | --- | --- |
| 1.8.0_242 | The visibility of sun.nio.ch.SocketAdaptor is changed from public to package-private. TestIPC#testRTEDuringConnectionSetup is affected. | JDK-8237177 | HADOOP-15787 - [JDK11] TestIPC.testRTEDuringConnectionSetup fails (RESOLVED) |
| 1.8.0_242 | Kerberos Java client will fail with "Message stream modified (41)" when the client requests a renewable ticket and the KDC returns a non-renewable ticket. If your principal is not allowed to obtain a renewable ticket, you must remove the "renew_lifetime" setting from your krb5.conf. | JDK-8131051 | |
| 1.8.0_171 | In Apache Hadoop 2.7.0 to 2.7.6, 2.8.0 to 2.8.4, 2.9.0 to 2.9.1, 3.0.0 to 3.0.2, and 3.1.0, KMS fails with java.security.UnrecoverableKeyException due to Enhanced KeyStore Mechanisms. You need to set the system property "jceks.key.serialFilter" to the following value to avoid this error: java.lang.Enum;java.security.KeyRep;java.security.KeyRep$Type;javax.crypto.spec.SecretKeySpec;org.apache.hadoop.crypto.key.JavaKeyStoreProvider$KeyMetadata;!* | | HADOOP-15473 - Configure serialFilter in KeyProvider to avoid UnrecoverableKeyException caused by JDK-8189997 (RESOLVED) |
| 1.8.0_191 | All DES cipher suites were disabled. If you are explicitly using DES cipher suites, you need to change the cipher suite to a strong one. | | HADOOP-16016 - TestSSLFactory#testServerWeakCiphers sporadically fails in precommit builds (RESOLVED) |
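
For the 1.8.0_171 KMS issue above, the workaround is a JVM system property. In practice it is normally passed on the JVM command line (for example as -Djceks.key.serialFilter=... in the options of the affected service); the snippet below is only a minimal sketch, assuming the property is set before the JCEKS keystore is first read, and the class name is illustrative:

```java
// SerialFilterWorkaround.java - set the jceks.key.serialFilter system
// property programmatically; the filter value is the one given in the
// table above (HADOOP-15473).
public class SerialFilterWorkaround {
    public static void main(String[] args) {
        String filter = "java.lang.Enum;java.security.KeyRep;"
                + "java.security.KeyRep$Type;javax.crypto.spec.SecretKeySpec;"
                + "org.apache.hadoop.crypto.key.JavaKeyStoreProvider$KeyMetadata;!*";
        // Must be in effect before any JCEKS keystore keys are deserialized.
        System.setProperty("jceks.key.serialFilter", filter);
        System.out.println("jceks.key.serialFilter = "
                + System.getProperty("jceks.key.serialFilter"));
    }
}
```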

    Original author: 普通网友
    Original source: https://blog.csdn.net/m0_67393619/article/details/123933614