apache-storm – Exception in thread "main" java.lang.IllegalAccessError in a Storm topology

When I run my Storm topology, which reads from a Kafka topic and writes the data to a Hive table, I get an IllegalAccessError. The error details are below:

Exception in thread "main" java.lang.IllegalAccessError: tried to access method org.apache.logging.log4j.core.lookup.MapLookup.newMap(I)Ljava/util/HashMap; from class org.apache.logging.log4j.core.lookup.MainMapLookup
        at org.apache.logging.log4j.core.lookup.MainMapLookup.<clinit>(MainMapLookup.java:37)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at org.apache.logging.log4j.core.util.ReflectionUtil.instantiate(ReflectionUtil.java:185)
        at org.apache.logging.log4j.core.lookup.Interpolator.<init>(Interpolator.java:65)
        at org.apache.logging.log4j.core.config.AbstractConfiguration.doConfigure(AbstractConfiguration.java:346)
        at org.apache.logging.log4j.core.config.AbstractConfiguration.start(AbstractConfiguration.java:161)
        at org.apache.logging.log4j.core.LoggerContext.setConfiguration(LoggerContext.java:359)
        at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:420)
        at org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:138)
        at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:147)
        at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:41)
        at org.apache.logging.log4j.LogManager.getContext(LogManager.java:175)
        at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:102)
        at org.apache.logging.slf4j.Log4jLoggerFactory.getContext(Log4jLoggerFactory.java:43)
        at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:42)
        at org.apache.logging.slf4j.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:29)
        at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:277)
        at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:288)
        at org.apache.storm.hive.bolt.mapper.DelimitedRecordHiveMapper.<clinit>(DelimitedRecordHiveMapper.java:39)
        at com.macys.smt.apm.storm.topology.HiveORCTopology.run(HiveORCTopology.java:79)
        at com.macys.smt.apm.storm.topology.HiveORCTopology.main(HiveORCTopology.java:46)

My Storm topology code:

public class HiveORCTopology {
    public static final String KAFKA_SPOUT_ID = "kafka-spout";
    public static final String HIVE_PROCESS_BOLT_ID = "hive-process-bolt";
    public static final String HIVE_BOLT_ID = "hive-bolt";


    public static void main(String[] args) {
        HiveORCTopology hot = new HiveORCTopology();
        hot.run();
    }

    public void run(){

        String kafkaTopic = "APMPathZip";
        BrokerHosts brokerHosts = new ZkHosts("sandbox.hortonworks.com:2181");
        String kafkaConsumerGroup = "APM-CALLTRACE-KAFKA-SPOUT";

        SpoutConfig spoutConfig = new SpoutConfig(brokerHosts, kafkaTopic, "/kafkastorm", kafkaConsumerGroup);


        //SpoutConfig spoutConfig = new SpoutConfig(new ZkHosts("localhost:2181"),
        //      kafkaTopic, "/kafka_storm", "StormSpout");

        //spoutConfig.useStartOffsetTimeIfOffsetOutOfRange = true;
        //spoutConfig.startOffsetTime = System.currentTimeMillis();

        //KafkaSpout kafkaSpout = new KafkaSpout(spoutConfig);

        // Hive connection configuration
        //String metaStoreURI = "thrift://one.hdp:9083";
        String metaStoreURI = "thrift://sandbox.hortonworks.com:9083";
        //String metaStoreURI = "http://localhost:8080/";
        String dbName = "default";
        String tblName = "apmpathorc";
        // Fields for possible partition
        //  String[] partNames = {"name"};
        // Fields for possible column data
        String[] colNames = {"id", "endtime", "host", "starttime", "appservername", 
                "appname", "class", "method", "eventdate", "executiontime", "threadid"};
        // Record Writer configuration

        DelimitedRecordHiveMapper mapper = new DelimitedRecordHiveMapper()
                .withColumnFields(new Fields(colNames));
        //        .withPartitionFields(new Fields(partNames));

        HiveOptions hiveOptions = new HiveOptions(metaStoreURI, dbName, tblName, mapper)
                .withTxnsPerBatch(2)
                .withBatchSize(100)
                .withIdleTimeout(10)
                .withCallTimeout(10000000);

        TopologyBuilder builder = new TopologyBuilder();

        builder.setSpout(KAFKA_SPOUT_ID, new KafkaSpout(spoutConfig));
        builder.setBolt(HIVE_PROCESS_BOLT_ID, new HiveDataBolt()).shuffleGrouping(KAFKA_SPOUT_ID);
        builder.setBolt(HIVE_BOLT_ID, new HiveBolt(hiveOptions)).shuffleGrouping(HIVE_PROCESS_BOLT_ID);

        String topologyName = "StormHiveStreamingTopology";
        Config config = new Config();
        config.setNumWorkers(1);
        config.setMessageTimeoutSecs(60);
        try {
            StormSubmitter.submitTopology(topologyName, config, builder.createTopology());
        } catch (AlreadyAliveException | InvalidTopologyException | AuthorizationException ex) {
            Logger.getLogger(HiveORCTopology.class.getName()).log(Level.SEVERE, null, ex);
        }
    }

}

And my pom.xml:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.macys.smt</groupId>
    <artifactId>APM-Storm2</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>APM-Storm2</name>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    </properties>

    <repositories>
        <repository>
            <id>clojars.org</id>
            <url>http://clojars.org/repo</url>
        </repository>
    </repositories>
    <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>org.apache.zookeeper</groupId>
                <artifactId>zookeeper</artifactId>
                <version>3.4.5</version>
                <scope>provided</scope>
                <exclusions>
                    <exclusion>
                        <groupId>org.slf4j</groupId>
                        <artifactId>slf4j-log4j12</artifactId>
                    </exclusion>
                    <exclusion>
                        <groupId>log4j</groupId>
                        <artifactId>log4j</artifactId>
                    </exclusion>
                </exclusions>
            </dependency>
        </dependencies>
    </dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>jdk.tools</groupId>
            <artifactId>jdk.tools</artifactId>
            <scope>system</scope>
            <version>1.7.0</version>
            <systemPath>C:\\Program Files\\Java\\jdk1.7.0_76\\lib\\tools.jar</systemPath>
        </dependency>
        <dependency>
            <groupId>com.macys.smt</groupId>
            <artifactId>APM-Common</artifactId>
            <version>0.0.1-SNAPSHOT</version>
        </dependency>
        <dependency>
            <groupId>com.macys.smt</groupId>
            <artifactId>HBaseDAO</artifactId>
            <version>0.0.1-SNAPSHOT</version>
        </dependency>
        <dependency>
            <groupId>org.apache.storm</groupId>
            <artifactId>storm-core</artifactId>
            <version>0.10.0</version>
            <scope>provided</scope>
            <exclusions>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>slf4j-log4j12</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>log4j</groupId>
                    <artifactId>log4j</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>org.apache.storm</groupId>
            <artifactId>storm-kafka</artifactId>
            <version>0.10.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka_2.10</artifactId>
            <version>0.8.2.2</version>
        </dependency>
        <dependency>
            <groupId>org.apache.storm</groupId>
            <artifactId>storm-hive</artifactId>
            <version>0.10.0</version>
        </dependency>

        <!-- <dependency> <groupId>org.apache.storm</groupId> <artifactId>storm-hive</artifactId> 
            <version>0.9.3.2</version> </dependency> -->
        <dependency>
            <groupId>org.apache.hive.hcatalog</groupId>
            <artifactId>hive-hcatalog-streaming</artifactId>
            <version>1.2.0</version>
            <exclusions>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>slf4j-log4j12</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.calcite</groupId>
                    <artifactId>calcite-core</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.calcite</groupId>
                    <artifactId>calcite-avatica</artifactId>
                </exclusion>
            </exclusions>
        </dependency>

        <dependency>
            <groupId>org.apache.hive.hcatalog</groupId>
            <artifactId>hive-hcatalog-core</artifactId>
            <version>2.0.0</version>
            <exclusions>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>slf4j-log4j12</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.calcite</groupId>
                    <artifactId>calcite-avatica</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.calcite</groupId>
                    <artifactId>calcite-core</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>org.apache.hive</groupId>
            <artifactId>hive-cli</artifactId>
            <version>1.2.0</version>
            <exclusions>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>slf4j-log4j12</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.calcite</groupId>
                    <artifactId>calcite-core</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.calcite</groupId>
                    <artifactId>calcite-avatica</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <!-- <dependency> <groupId>org.apache.calcite</groupId> <artifactId>calcite-core</artifactId> 
            <version>0.9.2-incubating</version> <exclusions> <exclusion> <groupId>org.slf4j</groupId> 
            <artifactId>slf4j-log4j12</artifactId> </exclusion> </exclusions> </dependency> -->

        <dependency>
            <groupId>org.apache.logging.log4j</groupId>
            <artifactId>log4j-core</artifactId>
            <version>2.5</version>
        </dependency>


    </dependencies>
    <build>
        <plugins>
            <plugin>
                <artifactId>maven-compiler-plugin</artifactId>
                <configuration>
                    <source>1.7</source>
                    <target>1.7</target>
                </configuration>
            </plugin>

            <!-- Bind the maven-assembly-plugin to the package phase this will create 
                a jar file without the storm dependencies suitable for deployment to a cluster. -->
            <plugin>
                <artifactId>maven-assembly-plugin</artifactId>
                <configuration>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                    <archive>
                        <manifest>
                            <mainClass></mainClass>
                        </manifest>
                    </archive>
                </configuration>
                <executions>
                    <execution>
                        <id>make-assembly</id>
                        <phase>package</phase>
                        <goals>
                            <goal>single</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>

Please help me.
Thanks.

Best answer: I hit the same error when I tried to deploy a topology to my Storm cluster. Check and re-check that the jar you submit does not contain duplicates of anything Storm may already provide on its classpath.

In my case, once I made sure the slf4j jar was not inside the submitted jar, I tried again and it worked fine.

As I can see in your pom.xml, you should change the log4j scope to provided:

<dependency>
    <groupId>org.apache.logging.log4j</groupId>
    <artifactId>log4j-core</artifactId>
    <version>2.5</version>
    <scope>provided</scope>
</dependency>

In any case, you should also check which jars Storm already ships and remove them from your dependencies.
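One way to see which copy of a class actually wins at runtime is to ask the JVM where it loaded that class from. This is a hypothetical diagnostic helper, not part of the original post; the class and method names are my own invention:

```java
import java.security.CodeSource;

// Hypothetical helper to spot duplicate jars: prints the jar (or directory)
// a class was actually loaded from, e.g. to check whether log4j-core classes
// come from your shaded jar or from Storm's own lib/ directory.
public class WhichJar {

    // Returns the location a class was loaded from, or "bootstrap classpath"
    // for JDK core classes, whose CodeSource is null.
    public static String jarLocation(Class<?> cls) {
        CodeSource src = cls.getProtectionDomain().getCodeSource();
        return src != null ? src.getLocation().toString() : "bootstrap classpath";
    }

    public static void main(String[] args) {
        // Example usage: on a Storm worker you could call this with
        // org.apache.logging.log4j.core.lookup.MainMapLookup.class to see
        // which log4j-core jar is winning the classpath collision.
        System.out.println(jarLocation(String.class));
    }
}
```

Calling `jarLocation` with one of the log4j classes from the stack trace (for example `MainMapLookup`) inside a simple bolt or a test main would tell you whether Storm's `lib/log4j-core-*.jar` or the copy inside your jar-with-dependencies is being used.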

When I run my topology on Storm, Storm makes this call:

    Running: /usr/local/java/jdk1.8.0_91/bin/java -client -Ddaemon.name= -Dstorm.options= -Dstorm.home=/home/ubuntu/apache-storm-1.0.1 -Dstorm.log.dir=/home/ubuntu/apache-storm-1.0.1/logs -Djava.library.path= -Dstorm.conf.file= 
    -cp /home/ubuntu/apache-storm-1.0.1/lib/disruptor-3.3.2.jar:
/home/ubuntu/apache-storm-1.0.1/lib/kryo-3.0.3.jar:
/home/ubuntu/apache-storm-1.0.1/lib/asm-5.0.3.jar:
/home/ubuntu/apache-storm-1.0.1/lib/storm-rename-hack-1.0.1.jar:
/home/ubuntu/apache-storm-1.0.1/lib/servlet-api-2.5.jar:
/home/ubuntu/apache-storm-1.0.1/lib/slf4j-api-1.7.7.jar:
/home/ubuntu/apache-storm-1.0.1/lib/reflectasm-1.10.1.jar:
/home/ubuntu/apache-storm-1.0.1/lib/log4j-api-2.1.jar:
/home/ubuntu/apache-storm-1.0.1/lib/storm-core-1.0.1.jar:
/home/ubuntu/apache-storm-1.0.1/lib/log4j-core-2.1.jar:
/home/ubuntu/apache-storm-1.0.1/lib/objenesis-2.1.jar:
/home/ubuntu/apache-storm-1.0.1/lib/log4j-slf4j-impl-2.1.jar:
/home/ubuntu/apache-storm-1.0.1/lib/log4j-over-slf4j-1.6.6.jar:
/home/ubuntu/apache-storm-1.0.1/lib/minlog-1.3.0.jar:
/home/ubuntu/apache-storm-1.0.1/lib/clojure-1.7.0.jar:
storm-1.0.0-jar-with-dependencies.jar:
/home/ubuntu/apache-storm-1.0.1/conf:
/home/ubuntu/apache-storm-1.0.1/bin -Dstorm.jar=<your-jar-with-dependencies> <your main class>

Here you can see that log4j-core-2.1.jar is already provided by Storm, producing a collision that causes the exception you see. See this other related question.
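As a sketch of the corresponding pom.xml change (in addition to marking log4j-core as provided): since Storm already ships log4j on its classpath, you could also exclude the log4j artifacts from transitive dependencies such as storm-hive. The exact set of exclusions is an assumption here; verify it against what `mvn dependency:tree` reports for your build and what your cluster's lib/ directory actually contains:

```xml
<dependency>
    <groupId>org.apache.storm</groupId>
    <artifactId>storm-hive</artifactId>
    <version>0.10.0</version>
    <exclusions>
        <!-- Storm already provides these on the worker classpath -->
        <exclusion>
            <groupId>org.apache.logging.log4j</groupId>
            <artifactId>log4j-core</artifactId>
        </exclusion>
        <exclusion>
            <groupId>org.apache.logging.log4j</groupId>
            <artifactId>log4j-api</artifactId>
        </exclusion>
    </exclusions>
</dependency>
```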
