
Configuring and Deploying Kafka from the CDH Parcel (Detailed, Verified Working)

1. Download

http://archive.cloudera.com/kafka/parcels/2.2.0/

wget http://archive.cloudera.com/kafka/parcels/2.2.0/KAFKA-2.2.0-1.2.2.0.p0.68-el6.parcel
wget http://archive.cloudera.com/kafka/parcels/2.2.0/KAFKA-2.2.0-1.2.2.0.p0.68-el6.parcel.sha1

2. Verify the checksum

[hadoop@hadoop003 softwares]$ sha1sum KAFKA-2.2.0-1.2.2.0.p0.68-el6.parcel
359509e028ae91a2a082adfad5f64596b63ea750  KAFKA-2.2.0-1.2.2.0.p0.68-el6.parcel
[hadoop@hadoop003 softwares]$ cat KAFKA-2.2.0-1.2.2.0.p0.68-el6.parcel.sha1
359509e028ae91a2a082adfad5f64596b63ea750

The two checksums match, so the file was not corrupted during download and can be used as-is.
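If the download is scripted, the comparison can also be done automatically. A minimal sketch, assuming (as shown above) that the .sha1 file contains only the bare hash, which is why sha1sum -c cannot be used directly:

[hadoop@hadoop003 softwares]$ [ "$(sha1sum KAFKA-2.2.0-1.2.2.0.p0.68-el6.parcel | awk '{print $1}')" \
    = "$(cat KAFKA-2.2.0-1.2.2.0.p0.68-el6.parcel.sha1)" ] \
    && echo "checksum OK" || echo "checksum MISMATCH"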

3. Extract and create a symlink

[hadoop@hadoop003 softwares]$ tar -zxf  KAFKA-2.2.0-1.2.2.0.p0.68-el6.parcel -C ~/app
[hadoop@hadoop003 app]$ ln -s /home/hadoop/app/KAFKA-2.2.0-1.2.2.0.p0.68/ /home/hadoop/app/kafka

4. Key directories

[hadoop@hadoop003 kafka]$ pwd
/home/hadoop/app/kafka
[hadoop@hadoop003 kafka]$ ll
total 20
drwxr-xr-x 2 hadoop hadoop 4096 Jul 7  2017 bin
drwxr-xr-x 5 hadoop hadoop 4096 Jul 7  2017 etc
drwxr-xr-x 3 hadoop hadoop 4096 Jul 7  2017 lib
drwxr-xr-x 2 hadoop hadoop 4096 Jul 7  2017 meta
###     Kafka configuration directory; this is where we edit the config files
[hadoop@hadoop003 kafka]$ ll etc/kafka/conf.dist/
total 48
-rw-r--r-- 1 hadoop hadoop  906 Jul 7  2017 connect-console-sink.properties
-rw-r--r-- 1 hadoop hadoop  909 Jul 7  2017 connect-console-source.properties
-rw-r--r-- 1 hadoop hadoop 2760 Jul 7  2017 connect-distributed.properties
-rw-r--r-- 1 hadoop hadoop  883 Jul 7  2017 connect-file-sink.properties
-rw-r--r-- 1 hadoop hadoop  881 Jul 7  2017 connect-file-source.properties
-rw-r--r-- 1 hadoop hadoop 1074 Jul 7  2017 connect-log4j.properties
-rw-r--r-- 1 hadoop hadoop 2061 Jul 7  2017 connect-standalone.properties
-rw-r--r-- 1 hadoop hadoop 4369 Jul 7  2017 log4j.properties
-rw-r--r-- 1 hadoop hadoop 5679 Jun  1 01:24 server.properties
-rw-r--r-- 1 hadoop hadoop 1032 Jul 7  2017 tools-log4j.properties

###     Kafka program directory
[hadoop@hadoop003 kafka]$ ll lib/kafka/
total 112
drwxr-xr-x 2 hadoop hadoop  4096 Jul 7  2017 bin
drwxr-xr-x 2 hadoop hadoop  4096 Jul 7  2017 cloudera
lrwxrwxrwx 1 hadoop hadoop    43 Jun  1 02:11 config -> /etc/kafka/conf  # note: this link is displayed in red (broken)
-rw-rw-r-- 1 hadoop hadoop 48428 Jun  1 02:17 KAFKA-2.2.0-1.2.2.0.p0.68-el6.parcel
drwxr-xr-x 2 hadoop hadoop 12288 Jul 7  2017 libs
-rwxr-xr-x 1 hadoop hadoop 28824 Jul 7  2017 LICENSE
drwxrwxr-x 2 hadoop hadoop  4096 Jun  1 01:39 logs
-rwxr-xr-x 1 hadoop hadoop   336 Jul 7  2017 NOTICE
drwxr-xr-x 2 hadoop hadoop  4096 Jul 7  2017 site-docs
### By default, the config symlink points to the gateway (Cloudera Manager client) configuration. Since we are not using CM, /etc/kafka/conf was never generated, so the link is broken and shown blinking red
###     The bin directory contains Kafka's scripts, e.g. the server start/stop scripts and the console consumer/producer scripts
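The scripts referenced later in this walkthrough (kafka-server-start.sh, kafka-topics.sh, and the console producer/consumer) all live under lib/kafka/bin/. A quick sanity check, just a sketch that greps the listing:

[hadoop@hadoop003 kafka]$ ls lib/kafka/bin/ | egrep 'server-(start|stop)|topics|console-(producer|consumer)'
# expect to see kafka-server-start.sh, kafka-server-stop.sh, kafka-topics.sh,
# kafka-console-producer.sh and kafka-console-consumer.sh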

5. Edit the configuration file

# Step 1:
[hadoop@hadoop003 kafka]$ cd etc/kafka/conf.dist

# Step 2:

vim server.properties

# Step 3: edit the following key parameters (a non-interactive sketch to apply them follows the list)

broker.id=0  # unique identifier for this broker

log.dirs=/home/hadoop/app/kafka/logs  # where Kafka stores its data (log segments)

log.retention.hours=168  # data retention period (168 hours = 7 days)

zookeeper.connect=hadoop001:2181,hadoop002:2181,hadoop003:2181/kafka
# ZooKeeper quorum and chroot where Kafka keeps its metadata
delete.topic.enable=true  # allow created topics to be deleted
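The same edits can be applied non-interactively, which is handy when preparing several brokers. This is a sketch only, assuming the stock conf.dist/server.properties still carries the upstream defaults for these keys; verify the result with the grep at the end:

[hadoop@hadoop003 conf.dist]$ sed -i \
    -e 's|^broker.id=.*|broker.id=0|' \
    -e 's|^#\?log.dirs=.*|log.dirs=/home/hadoop/app/kafka/logs|' \
    -e 's|^log.retention.hours=.*|log.retention.hours=168|' \
    -e 's|^zookeeper.connect=.*|zookeeper.connect=hadoop001:2181,hadoop002:2181,hadoop003:2181/kafka|' \
    server.properties
[hadoop@hadoop003 conf.dist]$ grep -q '^delete.topic.enable=' server.properties \
    || echo 'delete.topic.enable=true' >> server.properties
[hadoop@hadoop003 conf.dist]$ grep -E '^(broker.id|log.dirs|log.retention.hours|zookeeper.connect|delete.topic.enable)=' server.properties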

6. Start Kafka

[hadoop@hadoop003 kafka]$ lib/kafka/bin/kafka-server-start.sh /home/hadoop/app/kafka/etc/kafka/conf.dist/server.properties 
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hadoop/app/KAFKA-2.2.0-1.2.2.0.p0.68/lib/kafka/libs/slf4j-log4j12-1.7.21.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/app/KAFKA-2.2.0-1.2.2.0.p0.68/lib/kafka/libs/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
log4j:ERROR Could not read configuration file from URL [file:lib/kafka/bin/../config/log4j.properties].
java.io.FileNotFoundException: lib/kafka/bin/../config/log4j.properties (No such file or directory)
    at java.io.FileInputStream.open0(Native Method)
    at java.io.FileInputStream.open(FileInputStream.java:195)
    at java.io.FileInputStream.<init>(FileInputStream.java:138)
    at java.io.FileInputStream.<init>(FileInputStream.java:93)
    at sun.net.www.protocol.file.FileURLConnection.connect(FileURLConnection.java:90)
    at sun.net.www.protocol.file.FileURLConnection.getInputStream(FileURLConnection.java:188)
    at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:557)
    at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
    at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
    at org.slf4j.impl.Log4jLoggerFactory.<init>(Log4jLoggerFactory.java:66)
    at org.slf4j.impl.StaticLoggerBinder.<init>(StaticLoggerBinder.java:72)
    at org.slf4j.impl.StaticLoggerBinder.<clinit>(StaticLoggerBinder.java:45)
    at org.slf4j.LoggerFactory.bind(LoggerFactory.java:150)
    at org.slf4j.LoggerFactory.performInitialization(LoggerFactory.java:124)
    at org.slf4j.LoggerFactory.getILoggerFactory(LoggerFactory.java:412)
    at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:357)
    at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:383)
    at org.apache.kafka.common.utils.Utils.<clinit>(Utils.java:59)
    at kafka.Kafka$.getPropsFromArgs(Kafka.scala:41)
    at com.cloudera.kafka.wrap.Kafka$.main(Kafka.scala:72)
    at com.cloudera.kafka.wrap.Kafka.main(Kafka.scala)
log4j:ERROR Ignoring configuration file [file:lib/kafka/bin/../config/log4j.properties].
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
log4j:WARN No appenders could be found for logger (kafka.server.KafkaConfig).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.

The startup hits a bug: the log4j configuration file cannot be found.

java.io.FileNotFoundException: lib/kafka/bin/../config/log4j.properties

Because the config -> /etc/kafka/conf symlink shown earlier (lrwxrwxrwx 1 hadoop hadoop 43 Jun 1 02:11) points to a path that does not exist, it has to be repointed to etc/kafka/conf.dist/ instead:

[hadoop@hadoop003 kafka]$ rm lib/kafka/config
[hadoop@hadoop003 kafka]$ ln -s  /home/hadoop/app/kafka/etc/kafka/conf.dist/ /home/hadoop/app/kafka/lib/kafka/config
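A quick check that the new link resolves and that the file the startup script was looking for is now reachable through it:

[hadoop@hadoop003 kafka]$ ls -l lib/kafka/config    # should now point to /home/hadoop/app/kafka/etc/kafka/conf.dist/
[hadoop@hadoop003 kafka]$ ls lib/kafka/config/log4j.properties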


Restart

[hadoop@hadoop003 kafka]$ mkdir -p /home/hadoop/app/kafka/server-logs   # directory for the redirected log below
[hadoop@hadoop003 kafka]$ nohup lib/kafka/bin/kafka-server-start.sh /home/hadoop/app/kafka/etc/kafka/conf.dist/server.properties > /home/hadoop/app/kafka/server-logs/kafka-server.log 2>&1 &

No more error messages this time.
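To confirm the broker is actually serving (and not just silent), a small smoke test; the topic name smoke_test is arbitrary, and the --zookeeper value must match the zookeeper.connect set in step 5:

[hadoop@hadoop003 kafka]$ tail -n 50 /home/hadoop/app/kafka/server-logs/kafka-server.log   # look for "started (kafka.server.KafkaServer)"
[hadoop@hadoop003 kafka]$ lib/kafka/bin/kafka-topics.sh --create --zookeeper hadoop001:2181,hadoop002:2181,hadoop003:2181/kafka \
    --replication-factor 1 --partitions 1 --topic smoke_test
[hadoop@hadoop003 kafka]$ lib/kafka/bin/kafka-topics.sh --list --zookeeper hadoop001:2181,hadoop002:2181,hadoop003:2181/kafka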
