Hadoop MapReduce word count

On the Linux desktop, create a WordCount1.java source file, then upload the word-count input data to the HDFS file system. The steps:

  • Create the HDFS input directory
  • Upload the word-count input data
  • Compile the .class files
  • Build the jar package
  • Run the jar to count word frequencies
[root@master Desktop]# hadoop fs -mkdir -p /user/root/wordcount_in
[root@master Desktop]# hadoop fs -put wordcount /user/root/wordcount_in
[root@master Desktop]# /opt/java/jdk1.7.0_76/bin/javac WordCount1.java
[root@master Desktop]# /opt/java/jdk1.7.0_76/bin/jar cvf wc1.jar WordCount1*.class
[root@master Desktop]# hadoop jar wc1.jar WordCount1 hdfs://192.168.137.121:9000/user/root/wordcount_in/wordcount hdfs://192.168.137.121:9000/user/root/wordcount_res
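
Once the job finishes, the counts can be read straight from HDFS; assuming the default reducer output file name part-r-00000:

[root@master Desktop]# hadoop fs -cat /user/root/wordcount_res/part-r-00000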

The contents of the wordcount input file are shown below. Note: non-ASCII comments in the Java source can cause javac encoding errors, so keep any comments ASCII-only (or omit them).

[root@master Desktop]# cat wordcount 
ha ha ha
he he he
i am marvis
i need you now
where you are
love you more than i can say
i'm waitting for you for so much long time
o  o o o o o  o o  o
哈 哈 哈 
你 是 谁

WordCount1.java

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;


public class WordCount1
{

    // Mapper: tokenizes each input line and emits (word, 1) for every token
    public static class WordCountMapper
    extends Mapper<Object,Text,Text,IntWritable>
    {

        // Reused Writable objects: the constant count 1 and the current word
        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        public void map(Object key,Text value,Context context)
                throws IOException, InterruptedException {

            // Split on runs of whitespace so consecutive spaces do not produce empty tokens
            String[] words = value.toString().split("\\s+");

            for (String str: words)
            {
                word.set(str);
                context.write(word,one);

            }

        }
    }

    // Reducer: sums all the counts received for a given word
    public static class WordCountReducer
    extends Reducer<Text,IntWritable,Text,IntWritable> {
        public void reduce(Text key,Iterable<IntWritable> values,Context context)
                throws IOException, InterruptedException {

            // Sum the values rather than counting them, so the result stays
            // correct even when this class is also used as a combiner
            int total = 0;
            for (IntWritable val : values){
                total += val.get();
            }
            context.write(key, new IntWritable(total));
        }

    }

    public static void main (String[] args) throws Exception{
        Configuration conf = new Configuration();

        // Ship wc1.jar to the cluster (setJarByClass below also covers this when launched via `hadoop jar`)
        conf.set("mapreduce.job.jar","wc1.jar");

        Job job = Job.getInstance(conf, "word count");
        job.setJarByClass(WordCount1.class);
        job.setMapperClass(WordCountMapper.class);
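        // Optional: reuse the reducer as a combiner to cut shuffle traffic;
        // this is only safe because the reducer sums val.get() instead of counting values
        job.setCombinerClass(WordCountReducer.class);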
        job.setReducerClass(WordCountReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        // args[0] = HDFS input path, args[1] = HDFS output path (must not already exist)
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }

}

[Screenshots: original input file; javac compilation; jar build; job-success indicator; final output]

Error handling

[Screenshot: output path already exists error]

This error means the output path hdfs://192.168.137.121:9000/user/root/wordcount_res already exists in HDFS. MapReduce refuses to overwrite an existing output directory, so it must be deleted first, or a new output path must be specified.
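
For example, remove the old output directory before rerunning the job:

[root@master Desktop]# hadoop fs -rm -r /user/root/wordcount_res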

javac compile error

[Screenshot: javac compile error]

This happens because javac cannot find the Hadoop classes: the Hadoop jars are not on the compile-time classpath. Print the required classpath with hadoop classpath:

[root@master Desktop]# /usr/hadoop-2.6.4/bin/hadoop classpath
/usr/hadoop-2.6.4/etc/hadoop:/usr/hadoop-2.6.4/share/hadoop/common/lib/*:/usr/hadoop-2.6.4/share/hadoop/common/*:/usr/hadoop-2.6.4/share/hadoop/hdfs:/usr/hadoop-2.6.4/share/hadoop/hdfs/lib/*:/usr/hadoop-2.6.4/share/hadoop/hdfs/*:/usr/hadoop-2.6.4/share/hadoop/yarn/lib/*:/usr/hadoop-2.6.4/share/hadoop/yarn/*:/usr/hadoop-2.6.4/share/hadoop/mapreduce/lib/*:/usr/hadoop-2.6.4/share/hadoop/mapreduce/*:/usr/hadoop-2.6.4/contrib/capacity-scheduler/*.jar

Append the Hadoop classpath above to ~/.bash_profile so that javac can resolve the Hadoop classes, as sketched below.
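
A minimal sketch of that addition, assuming the Hadoop 2.6.4 install path shown above (the screenshot below shows the author's actual edit); javac picks the jars up through the CLASSPATH environment variable once the profile is re-sourced:

# appended to ~/.bash_profile
export CLASSPATH=$CLASSPATH:$(/usr/hadoop-2.6.4/bin/hadoop classpath)

[root@master Desktop]# source ~/.bash_profile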

[Screenshot: editing ~/.bash_profile]
[Screenshot: problem caused by the firewall not being stopped]

Stop the firewall so the cluster nodes can reach one another:

[root@slaver2 ~]# service iptables stop
iptables: Setting chains to policy ACCEPT: filter [  OK  ]
iptables: Flushing firewall rules: [  OK  ]
iptables: Unloading modules: [  OK  ]
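
To keep iptables from coming back after a reboot (assuming a CentOS 6-style init system, as the service command above suggests):

[root@slaver2 ~]# chkconfig iptables off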
    Original author: 天堂宝宝_V
    Original article: https://www.jianshu.com/p/180ceb429085