Exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused

1. Although this is not a serious error, it is still worth posting. It appeared when I ran the test case `run-example streaming.NetworkWordCount localhost 9999`, and my first thought was that it was caused by Spark not having been started. (Note that `slaver1:9000` in the message is the HDFS NameNode RPC address, so HDFS has to be up as well.)

18/04/23 03:21:58 ERROR SparkContext: Error initializing SparkContext.
java.net.ConnectException: Call From slaver1/192.168.19.131 to slaver1:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:783)
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:730)
    at org.apache.hadoop.ipc.Client.call(Client.java:1414)
    at org.apache.hadoop.ipc.Client.call(Client.java:1363)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
    at com.sun.proxy.$Proxy12.getFileInfo(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:190)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
    at com.sun.proxy.$Proxy12.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:699)
    at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1762)
    at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1124)
    at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1120)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1120)
    at org.apache.spark.scheduler.EventLoggingListener.start(EventLoggingListener.scala:100)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:541)
    at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:864)
    at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
    at org.apache.spark.examples.streaming.NetworkWordCount$.main(NetworkWordCount.scala:47)
    at org.apache.spark.examples.streaming.NetworkWordCount.main(NetworkWordCount.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:493)
    at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:604)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:699)
    at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:367)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1462)
    at org.apache.hadoop.ipc.Client.call(Client.java:1381)
    ... 31 more
18/04/23 03:21:59 INFO SparkUI: Stopped Spark web UI at http://192.168.19.131:4040
18/04/23 03:21:59 INFO DAGScheduler: Stopping DAGScheduler
18/04/23 03:21:59 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
18/04/23 03:21:59 INFO MemoryStore: MemoryStore cleared
18/04/23 03:21:59 INFO BlockManager: BlockManager stopped
18/04/23 03:21:59 INFO BlockManagerMaster: BlockManagerMaster stopped
18/04/23 03:21:59 INFO SparkContext: Successfully stopped SparkContext
Exception in thread "main" java.net.ConnectException: Call From slaver1/192.168.19.131 to slaver1:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:783)
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:730)
    at org.apache.hadoop.ipc.Client.call(Client.java:1414)
    at org.apache.hadoop.ipc.Client.call(Client.java:1363)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
    at com.sun.proxy.$Proxy12.getFileInfo(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:190)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
    at com.sun.proxy.$Proxy12.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:699)
    at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1762)
    at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1124)
    at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1120)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1120)
    at org.apache.spark.scheduler.EventLoggingListener.start(EventLoggingListener.scala:100)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:541)
    at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:864)
    at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
    at org.apache.spark.examples.streaming.NetworkWordCount$.main(NetworkWordCount.scala:47)
    at org.apache.spark.examples.streaming.NetworkWordCount.main(NetworkWordCount.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:493)
    at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:604)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:699)
    at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:367)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1462)
    at org.apache.hadoop.ipc.Client.call(Client.java:1381)
    ... 31 more
18/04/23 03:21:59 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
18/04/23 03:21:59 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
18/04/23 03:21:59 INFO ShutdownHookManager: Shutdown hook called
18/04/23 03:21:59 INFO ShutdownHookManager: Deleting directory /tmp/spark-7ef5c2da-0b57-4553-a9f9-6e215885c7ba
18/04/23 03:21:59 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
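Why does a local streaming example touch port 9000 at all? The `EventLoggingListener.start` and `DistributedFileSystem.getFileStatus` frames show that Spark tries to reach HDFS while initializing the SparkContext, before any streaming work begins, because event logging is pointed at an HDFS directory. A configuration like the following would produce exactly this behavior (a sketch only; the property values here are my assumption, not taken from the original post):

```properties
# spark-defaults.conf -- illustrative sketch, values assumed
spark.eventLog.enabled  true
# Writing the event log to HDFS means the NameNode at slaver1:9000 must be up
spark.eventLog.dir      hdfs://slaver1:9000/spark-logs
```

With such a setting, any job fails at SparkContext initialization while HDFS is down, which matches the trace above.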

2. The script commands used to start Spark:

[hadoop@slaver1 spark-1.5.1-bin-hadoop2.4]$ sbin/start-all.sh

[hadoop@slaver2 ~]$ run-example streaming.NetworkWordCount localhost 9999
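After starting the daemons, it can be worth confirming that something is actually listening on the port named in the error before rerunning the example. This small helper is my own sketch, not part of the original post; it assumes bash's `/dev/tcp` pseudo-device is available:

```shell
# probe_port: prints "open" if <host>:<port> accepts a TCP connection,
# "closed" otherwise. Uses bash's built-in /dev/tcp redirection, so no
# extra tools like nc or telnet are required.
probe_port() {
  local host=$1 port=$2
  # The subshell opens fd 3 on the TCP connection and closes it on exit;
  # a refused connection makes the exec fail.
  if (exec 3<>"/dev/tcp/${host}/${port}") 2>/dev/null; then
    echo "open"
  else
    echo "closed"
  fi
}

probe_port localhost 9000   # on the cluster, probe slaver1 9000 instead
```

If this still prints `closed` after `sbin/start-all.sh`, look at the daemon logs on `slaver1` rather than at the Spark client.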

    Original author: 别先生
    Original URL: https://www.cnblogs.com/biehongli/p/8919475.html
    This article was reposted from the web to share knowledge; if there is any infringement, please contact the blogger to have it removed.