Hadoop: java.net.ConnectException: to 0.0.0.0:10020 failed on connection

 
This error usually shows up on Hadoop 2.x. The nodes running MapReduce tasks need to reach the JobHistory Server on the master; if mapreduce.jobhistory.address has not been configured, it defaults to 0.0.0.0:10020 and the connection fails. Fix it by setting the actual master address in mapred-site.xml:

<property>
    <name>mapreduce.jobhistory.address</name>
    <!-- set the actual master hostname and port; itcast01 is the master in this cluster -->
    <value>itcast01:10020</value>
</property>

<property>
    <name>mapreduce.jobhistory.webapp.address</name>
    <!-- set the actual master hostname and port; itcast01 is the master in this cluster -->
    <value>itcast01:19888</value>
</property>

Finally, don't forget to start the JobHistory Server:

 
$HADOOP_HOME/sbin/mr-jobhistory-daemon.sh start historyserver
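If the ConnectException persists after restarting the JobHistory Server, it helps to confirm that port 10020 on the master is actually reachable from the worker nodes. Below is a minimal, self-contained sketch of such a check in plain Java; the host name itcast01 follows the cluster in this post, so adjust it for yours:

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortCheck {

    // Returns true if a TCP connection to host:port succeeds within timeoutMs.
    static boolean isReachable(String host, int port, int timeoutMs) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // "itcast01" is this post's master host; replace it with your own.
        String host = args.length > 0 ? args[0] : "itcast01";
        System.out.println(host + ":10020 "
                + (isReachable(host, 10020, 2000) ? "reachable" : "unreachable"));
    }
}
```

If the port is unreachable even though the JobHistory Server process is running, re-check the bind address in mapred-site.xml and any firewall rules on the master.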
 
 
********************************************************************
a. C:\words.txt contents:
hello alamp s
hello qq
 
hello xx
 
hello aa
 
hello swk
 
hello zbj
 
hello blm
 
blm   xixi
zbj  hehe
swk   haha
b. Java code that uploads the file to HDFS (fs is a previously configured FileSystem instance):

@Test
public void testUpload() throws IllegalArgumentException, IOException {
    FSDataOutputStream out = fs.create(new Path("hdfs://itcast01:9000/upload2"));
    FileInputStream in = new FileInputStream(new File("c:/words.txt"));
    IOUtils.copyBytes(in, out, 2048, true);
}
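For reference, IOUtils.copyBytes(in, out, 2048, true) copies the input stream to the output stream in fixed-size chunks and then closes both streams. A rough stdlib-only sketch of that behavior (not Hadoop's actual implementation):

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class CopyBytesSketch {

    // Roughly what IOUtils.copyBytes(in, out, bufSize, true) does:
    // copy in bufSize-byte chunks, then close both streams
    // (here via try-with-resources).
    static void copyBytes(InputStream in, OutputStream out, int bufSize)
            throws IOException {
        try (InputStream i = in; OutputStream o = out) {
            byte[] buf = new byte[bufSize];
            int n;
            while ((n = i.read(buf)) > 0) {
                o.write(buf, 0, n);
            }
        }
    }
}
```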
package cn.itcast.hadoop.mr;
 
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
 
 
public class WordCount {
 
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.setInt("mapreduce.client.submit.file.replication", 20);
        Job job = Job.getInstance(conf);
 
        //notice
        job.setJarByClass(WordCount.class);
 
        //set mapper`s property
        job.setMapperClass(WCMapper.class);
        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(LongWritable.class);
        FileInputFormat.setInputPaths(job, new Path("/upload2/"));
 
        //set reducer`s property
        job.setReducerClass(WCReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(LongWritable.class);
        FileOutputFormat.setOutputPath(job, new Path("/usr/mapperReduce"));
 
        //submit
        job.waitForCompletion(true);
    }
 
}
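The WCMapper and WCReducer classes referenced by this driver are not shown in the post. Their assumed logic - the mapper tokenizes each line and emits (word, 1), the reducer sums the counts per word - can be sketched with plain Java collections (this mirrors the assumed behavior, not the actual WCMapper/WCReducer source):

```java
import java.util.Map;
import java.util.TreeMap;

public class WordCountSketch {

    // Mapper step: split each line into whitespace-separated tokens.
    // Reducer step: sum the occurrences of each token.
    static Map<String, Long> count(String text) {
        Map<String, Long> counts = new TreeMap<>(); // sorted, like the MR output
        for (String line : text.split("\n")) {
            for (String w : line.trim().split("\\s+")) {
                if (!w.isEmpty()) {
                    counts.merge(w, 1L, Long::sum);
                }
            }
        }
        return counts;
    }
}
```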
c. Package the code into a jar and run it:
[root@itcast01 usr]# hadoop jar mr.jar
17/05/20 16:44:09 INFO client.RMProxy: Connecting to ResourceManager at itcast01/192.168.233.128:8032
17/05/20 16:44:10 WARN mapreduce.JobSubmitter: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
17/05/20 16:44:11 INFO input.FileInputFormat: Total input paths to process : 1
17/05/20 16:44:11 INFO mapreduce.JobSubmitter: number of splits:1
17/05/20 16:44:11 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1495288787912_0006
17/05/20 16:44:12 INFO impl.YarnClientImpl: Submitted application application_1495288787912_0006
17/05/20 16:44:12 INFO mapreduce.Job: The url to track the job: http://itcast01:8088/proxy/application_1495288787912_0006/
17/05/20 16:44:12 INFO mapreduce.Job: Running job: job_1495288787912_0006
17/05/20 16:44:55 INFO mapreduce.Job: Job job_1495288787912_0006 running in uber mode : false
17/05/20 16:44:55 INFO mapreduce.Job:  map 0% reduce 0%
17/05/20 16:46:24 INFO mapreduce.Job:  map 100% reduce 0%
17/05/20 16:47:02 INFO mapreduce.Job:  map 100% reduce 100%
17/05/20 16:47:03 INFO mapreduce.Job: Job job_1495288787912_0006 completed successfully
17/05/20 16:47:06 INFO mapreduce.Job: Counters: 49
        File System Counters
                FILE: Number of bytes read=435
                FILE: Number of bytes written=186471
                FILE: Number of read operations=0
                FILE: Number of large read operations=0
                FILE: Number of write operations=0
                HDFS: Number of bytes read=210
                HDFS: Number of bytes written=78
                HDFS: Number of read operations=6
                HDFS: Number of large read operations=0
                HDFS: Number of write operations=2
        Job Counters
                Launched map tasks=1
                Launched reduce tasks=1
                Data-local map tasks=1
                Total time spent by all maps in occupied slots (ms)=65092
                Total time spent by all reduces in occupied slots (ms)=32649
                Total time spent by all map tasks (ms)=65092
                Total time spent by all reduce tasks (ms)=32649
                Total vcore-seconds taken by all map tasks=65092
                Total vcore-seconds taken by all reduce tasks=32649
                Total megabyte-seconds taken by all map tasks=66654208
                Total megabyte-seconds taken by all reduce tasks=33432576
        Map-Reduce Framework
                Map input records=17
                Map output records=32
                Map output bytes=365
                Map output materialized bytes=435
                Input split bytes=93
                Combine input records=0
                Combine output records=0
                Reduce input groups=13
                Reduce shuffle bytes=435
                Reduce input records=32
                Reduce output records=13
                Spilled Records=64
                Shuffled Maps =1
                Failed Shuffles=0
                Merged Map outputs=1
                GC time elapsed (ms)=290
                CPU time spent (ms)=5530
                Physical memory (bytes) snapshot=284258304
                Virtual memory (bytes) snapshot=1685770240
                Total committed heap usage (bytes)=136515584
        Shuffle Errors
                BAD_ID=0
                CONNECTION=0
                IO_ERROR=0
                WRONG_LENGTH=0
                WRONG_MAP=0
                WRONG_REDUCE=0
        File Input Format Counters
                Bytes Read=117
        File Output Format Counters
                Bytes Written=78
[root@itcast01 usr]# hadoop fs -ls /
Found 9 items
-rw-r--r--   1 root supergroup  153512879 2017-05-17 05:34 /jdk
-rw-r--r--   1 root supergroup         62 2017-05-20 13:54 /score_in
drwx------   - root supergroup          0 2017-05-17 06:15 /tmp
-rw-r--r--   3 root supergroup         32 2017-05-17 14:47 /upload
-rw-r--r--   3 root supergroup        117 2017-05-20 16:35 /upload2
drwxr-xr-x   - root supergroup          0 2017-05-20 16:44 /usr
drwxr-xr-x   - root supergroup          0 2017-05-17 06:18 /wcout
-rw-r--r--   1 root supergroup         70 2017-05-17 06:12 /words
drwxr-xr-x   - root supergroup          0 2017-05-20 15:23 /xx
[root@itcast01 usr]# hadoop fs -cat /usr/upload2/part-r-00000
cat: `/usr/upload2/part-r-00000': No such file or directory
[root@itcast01 usr]# hadoop fs -ls /usr/
Found 4 items
drwxr-xr-x   - root supergroup          0 2017-05-17 14:54 /usr/local
drwxr-xr-x   - root supergroup          0 2017-05-20 16:47 /usr/mapperReduce
drwxr-xr-x   - root supergroup          0 2017-05-20 16:08 /usr/swk
drwxr-xr-x   - root supergroup          0 2017-05-17 11:25 /usr/test
[root@itcast01 usr]# hadoop fs -ls /usr/mapperReduce
Found 2 items
-rw-r--r--   1 root supergroup          0 2017-05-20 16:47 /usr/mapperReduce/_SUCCESS
-rw-r--r--   1 root supergroup         78 2017-05-20 16:46 /usr/mapperReduce/part-r-00000
[root@itcast01 usr]# hadoop fs -cat /usr/mapperReduce/part-r-00000
        11
aa      1
alamp   1
blm     2
haha    1
hehe    1
hello   7
qq      1
s       1
swk     2
xixi    1
xx      1
zbj     2
[root@itcast01 usr]#
