Hadoop MapReduce Java Example (WordCount)

WordCount workflow

input -> split -> map -> shuffle -> reduce -> output

The same pipeline can be exercised with the example jar that ships with Hadoop:

hadoop jar /usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.3.jar wordcount 10803060234.txt /output
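As a concrete illustration, here is how a hypothetical two-line input (our own sample, not from the original post) moves through those stages:

input:   "hello hadoop"  and  "hello world"
split:   each input split is fed to a map task as (byte offset, line) records
map:     (hello,1) (hadoop,1)  and  (hello,1) (world,1)
shuffle: values are grouped and sorted by key: hadoop -> [1], hello -> [1,1], world -> [1]
reduce:  (hadoop,1) (hello,2) (world,1)
output:  one "word<TAB>count" line per key, e.g. in /output/part-r-00000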

package wordcount;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class Test {

    public static void main(String[] args) throws Exception {

        Configuration conf = new Configuration();
        // Point the client at the HDFS NameNode and the YARN ResourceManager.
        conf.set("fs.defaultFS", "hdfs://172.26.19.40:9000");
        // Jar that will be shipped to the cluster; must match what the build produces.
        conf.set("mapreduce.job.jar", "target/wc.jar");
        conf.set("mapreduce.framework.name", "yarn");
        conf.set("yarn.resourcemanager.hostname", "hmaster");
        // Needed when submitting from a Windows client to a Linux cluster.
        conf.set("mapreduce.app-submission.cross-platform", "true");

        Job job = Job.getInstance(conf);
        job.setMapperClass(WordMapper.class);
        job.setReducerClass(WordReducer.class);

        // The mapper emits (Text, IntWritable) while the reducer emits
        // (Text, LongWritable), so the map output types must be declared
        // separately from the final job output types.
        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(IntWritable.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(LongWritable.class);

        // Fill in the HDFS input and output paths before running;
        // the output directory must not exist yet.
        FileInputFormat.setInputPaths(job, "");
        FileOutputFormat.setOutputPath(job, new Path(""));

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }

}
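Because the driver hard-codes mapreduce.framework.name=yarn and the path of the job jar, it is intended to be launched from the development machine once the build has produced target/wc.jar. A minimal sketch of one way to run it (the exact classpath handling is an assumption; adjust to your environment):

mvn clean package
java -cp "target/wc.jar:$(hadoop classpath)" wordcount.Test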

package wordcount;

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class WordMapper extends Mapper<LongWritable, Text, Text, IntWritable> {

    // Reuse Writable instances across map() calls instead of allocating per record.
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // key is the byte offset of the line in the input split; value is the line.
        // Split on runs of whitespace (the original split on a single space,
        // which emits empty tokens for repeated spaces).
        String[] words = value.toString().split("\\s+");
        for (String w : words) {
            word.set(w);
            context.write(word, ONE);
        }
    }

}

package wordcount;

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

public class WordReducer extends Reducer<Text, IntWritable, Text, LongWritable> {

    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        // Sum the per-word counts emitted by the mappers.
        // Use a primitive long to avoid autoboxing on every addition.
        long count = 0L;
        for (IntWritable value : values) {
            count += value.get();
        }
        context.write(key, new LongWritable(count));
    }

}
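A common optimization that the original driver does not use is a map-side combiner, which pre-aggregates counts before the shuffle. Because this reducer changes the value type (IntWritable in, LongWritable out), it cannot double as the combiner; a combiner's input and output types must both match the map output types. A minimal sketch, with the class name WordCombiner being our own addition:

package wordcount;

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

// Hypothetical combiner: sums counts on the map side to shrink shuffle traffic.
public class WordCombiner extends Reducer<Text, IntWritable, Text, IntWritable> {

    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        int sum = 0;
        for (IntWritable value : values) {
            sum += value.get();
        }
        context.write(key, new IntWritable(sum));
    }

}

If used, register it in the driver with job.setCombinerClass(WordCombiner.class).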

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.skcc</groupId>
<artifactId>wordcount</artifactId>
<version>0.0.1-SNAPSHOT</version>
<name>wordcount</name>
<description>count the word</description>

<properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <hadoop.version>2.7.3</hadoop.version>
</properties>
<dependencies>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>4.12</version>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-hdfs</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
</dependencies>

</project>
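Note that this pom produces target/wordcount-0.0.1-SNAPSHOT.jar by default, while the driver expects target/wc.jar. One way to reconcile the two (our assumption, not part of the original post) is to fix the artifact name in the build section:

<build>
    <finalName>wc</finalName>
</build>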
