Running the WordCount example on Hadoop 2.6.5


Create two text files under /tmp and write a couple of words in them.

cd /tmp/
mkdir file
cd file/
echo "Hello world" > file1.txt
cp file1.txt file2.txt
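Before uploading, it is worth confirming what HDFS will report later: each file is "Hello world" plus a trailing newline, i.e. 12 bytes, the size shown in the -ls listing further down. A quick local check:

```shell
# Recreate the two sample files and confirm their size locally
mkdir -p /tmp/file
cd /tmp/file
echo "Hello world" > file1.txt
cp file1.txt file2.txt
wc -c file1.txt file2.txt   # each should be 12 bytes: "Hello world" + newline
```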

Create an /input directory on HDFS; note that this is the HDFS root, not the local system's / directory.

cd /usr/local/hadoop-2.6.5
./bin/hadoop fs -mkdir /input
./bin/hadoop fs -ls /

drwxr-xr-x  - root supergroup          0 2018-01-04 09:32 /input

Running ./bin/hadoop fs -mkdir /input may fail with the following error:

Error:
mkdir: Cannot create directory /input. Name node is in safe mode.
# Fix: force the NameNode to leave safe mode
# (in Hadoop 2.x, "hadoop dfsadmin" is deprecated in favor of "hdfs dfsadmin")
./bin/hdfs dfsadmin -safemode leave
______________________________________________________
leave - force the NameNode out of safe mode
enter - put the NameNode into safe mode
get   - report whether safe mode is currently on
wait  - block until safe mode ends
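Forcing the NameNode out of safe mode works, but on a freshly started cluster it usually leaves safe mode on its own once enough block reports arrive, so polling is often the safer option. A minimal sketch, assuming HADOOP_HOME points at the Hadoop 2.6.5 install; the function name wait_for_safemode_off and its parameters are illustrative, not part of Hadoop:

```shell
# Poll the safe-mode status instead of forcing the NameNode out of it.
# The first argument is the command that prints the status, e.g.
# "$HADOOP_HOME/bin/hdfs dfsadmin -safemode get", which outputs
# "Safe mode is ON" or "Safe mode is OFF".
wait_for_safemode_off() {
  local check_cmd="$1"
  local tries="${2:-30}"
  for i in $(seq "$tries"); do
    if $check_cmd | grep -q "Safe mode is OFF"; then
      return 0          # NameNode has left safe mode
    fi
    sleep 2             # wait a bit before checking again
  done
  return 1              # still in safe mode after all attempts
}

# On a real cluster:
# wait_for_safemode_off "$HADOOP_HOME/bin/hdfs dfsadmin -safemode get"
```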

Upload the files created above (the whole /tmp/file directory matches the glob) into /input on HDFS; they end up under /input/file.

./bin/hadoop fs -put /tmp/file* /input 
./bin/hadoop fs -ls /input/file

Found 2 items
-rw-r--r--  1 root supergroup        12 2018-01-04 09:32 /input/file/file1.txt
-rw-r--r--  1 root supergroup        12 2018-01-04 09:32 /input/file/file2.txt

Then run the WordCount example that ships with Hadoop:

./bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.5.jar wordcount /input/file/ /output/wordcount1
_________________________________________
18/01/04 09:54:09 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
18/01/04 09:54:10 INFO input.FileInputFormat: Total input paths to process : 2
18/01/04 09:54:10 INFO mapreduce.JobSubmitter: number of splits:2
18/01/04 09:54:10 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1515028606802_0002
18/01/04 09:54:10 INFO impl.YarnClientImpl: Submitted application application_1515028606802_0002
18/01/04 09:54:10 INFO mapreduce.Job: The url to track the job: http://iz2ze31g42iypc75mm363gz:8088/proxy/application_1515028606802_0002/
18/01/04 09:54:10 INFO mapreduce.Job: Running job: job_1515028606802_0002
18/01/04 09:54:18 INFO mapreduce.Job: Job job_1515028606802_0002 running in uber mode : false
18/01/04 09:54:18 INFO mapreduce.Job:  map 0% reduce 0%
18/01/04 09:54:27 INFO mapreduce.Job:  map 100% reduce 0%
18/01/04 09:54:33 INFO mapreduce.Job:  map 100% reduce 100%
18/01/04 09:54:33 INFO mapreduce.Job: Job job_1515028606802_0002 completed successfully
18/01/04 09:54:33 INFO mapreduce.Job: Counters: 49
    File System Counters
        FILE: Number of bytes read=54
        FILE: Number of bytes written=322109
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
        HDFS: Number of bytes read=238
        HDFS: Number of bytes written=16
        HDFS: Number of read operations=9
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=2
    Job Counters 
        Launched map tasks=2
        Launched reduce tasks=1
        Data-local map tasks=2
        Total time spent by all maps in occupied slots (ms)=12913
        Total time spent by all reduces in occupied slots (ms)=3521
        Total time spent by all map tasks (ms)=12913
        Total time spent by all reduce tasks (ms)=3521
        Total vcore-milliseconds taken by all map tasks=12913
        Total vcore-milliseconds taken by all reduce tasks=3521
        Total megabyte-milliseconds taken by all map tasks=13222912
        Total megabyte-milliseconds taken by all reduce tasks=3605504
    Map-Reduce Framework
        Map input records=2
        Map output records=4
        Map output bytes=40
        Map output materialized bytes=60
        Input split bytes=214
        Combine input records=4
        Combine output records=4
        Reduce input groups=2
        Reduce shuffle bytes=60
        Reduce input records=4
        Reduce output records=2
        Spilled Records=8
        Shuffled Maps =2
        Failed Shuffles=0
        Merged Map outputs=2
        GC time elapsed (ms)=363
        CPU time spent (ms)=1360
        Physical memory (bytes) snapshot=499748864
        Virtual memory (bytes) snapshot=6301630464
        Total committed heap usage (bytes)=301146112
    Shuffle Errors
        BAD_ID=0
        CONNECTION=0
        IO_ERROR=0
        WRONG_LENGTH=0
        WRONG_MAP=0
        WRONG_REDUCE=0
    File Input Format Counters 
        Bytes Read=24
    File Output Format Counters 
        Bytes Written=16
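These counters line up with the input: two one-line files give Map input records=2, four words in total give Map output records=4, and two distinct words give Reduce output records=2. A local cross-check, recreating the same two files:

```shell
# Cross-check three of the job counters against the raw input
mkdir -p /tmp/file
cd /tmp/file
echo "Hello world" > file1.txt
cp file1.txt file2.txt

cat file1.txt file2.txt | wc -l                             # 2 -> Map input records
cat file1.txt file2.txt | tr -s ' ' '\n' | wc -l            # 4 -> Map output records
cat file1.txt file2.txt | tr -s ' ' '\n' | sort -u | wc -l  # 2 -> Reduce output records
```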

View the result:

./bin/hdfs dfs -cat /output/wordcount1/*  
Hello    2
world    2
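For input this small, the same counts can be reproduced with plain shell tools, which makes a handy sanity check of the job output (a sketch, recreating the two local files):

```shell
# A pure-shell equivalent of WordCount over the same two files
mkdir -p /tmp/file
cd /tmp/file
echo "Hello world" > file1.txt
cp file1.txt file2.txt

# Split on spaces, sort, count duplicates, then print word<TAB>count
cat file1.txt file2.txt | tr -s ' ' '\n' | sort | uniq -c | awk '{print $2"\t"$1}'
# Hello   2
# world   2
```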
    Original author: _YaoQi
    Original post: https://www.jianshu.com/p/ac3eaa2d2d7d
    This article is reposted from the web to share knowledge; if it infringes any rights, please contact the blogger for removal.