Introduction to Big Data: Concepts
This is the concepts portion, recommended for readers with no prior big-data background.
Big Data Overview
What Is Big Data
Big data refers to data sets that cannot be captured, managed, or processed with conventional software tools within a reasonable time frame. It is a massive, fast-growing, and diverse information asset that requires new processing models to deliver stronger decision-making power, insight discovery, and process optimization.
A rough reading of that definition:
- Conventional software tools: e.g. Java EE or MySQL (which handles roughly 5–10 million rows comfortably). Even a MySQL cluster cannot grow its node count without limit.
- Massive, fast-growing: the base volume is already huge, and large amounts of new data arrive every day.
- Diverse: besides text, there are also images, videos, and other data types.
Big data mainly tackles two problems: storing massive data, and analyzing and computing over it.
Characteristics of Big Data
- Volume: according to IDC's "Digital Universe" report, global data was projected to reach 35.2 ZB by 2020
- Velocity: facing massive data, the speed at which a company can process it is its lifeline
Variety
- Structured data: mainly databases and text
- Unstructured data: web logs, audio, video, images, geolocation information
- Low value density: value density is inversely proportional to total data volume, so valuable data must be "refined" out quickly
Big Data Application Scenarios
- Logistics and warehousing: big-data analytics helps merchants run refined operations, boost sales, and cut costs
- Retail: analyzing users' consumption habits makes purchases more convenient and lifts sales
- Classic overseas case: diapers + beer. Mining past supermarket orders showed the two often appeared together, so stores shelved them side by side to stimulate more purchases
- Tourism: combining big-data capability with industry needs to build smart management, smart services, and smart marketing for the tourism sector
- Product and ad recommendation: recommending products a user is likely to want ("a thousand faces for a thousand people")
- Insurance, finance, real estate: data mining and risk prediction
Big Data Outlook
18th National Congress: implement the national big-data development strategy
19th National Congress: promote deep integration of the internet, big data, artificial intelligence, and the real economy
Hadoop
Overview
Hadoop is a distributed-system infrastructure developed under the Apache Foundation, mainly solving the storage and the analysis/computation of massive data — the same problems big data studies. Broadly speaking, "Hadoop" usually means the whole Hadoop ecosystem; much like "Spring Cloud" for Java developers, it is not a single technology but a set of components that together tackle a complex problem.
Founder Doug Cutting drew on three Google papers:
GFS ---> HDFS, which solves data storage
Map-Reduce ---> MR, which solves data analysis and computation
BigTable ---> HBase, a NoSQL database
Three Major Distributions
Apache
The original, most basic version; good for getting started.
Cloudera
The CDH distribution. Founded in 2008; Doug Cutting joined in 2009. Mainly used in large internet companies: free to use, paid support.
Hortonworks
Founded in 2011; good documentation, smaller market share.
Strengths
- High reliability: maintains multiple replicas of the data at the bottom layer (3 by default)
- High scalability: distributes tasks and data across the cluster and scales out easily to thousands of nodes
- High efficiency: under the MapReduce model, Hadoop works in parallel
- High fault tolerance: automatically reassigns failed tasks
Composition
Hadoop 2.x decoupled computation/analysis from resource scheduling: YARN was split out to manage CPU, memory, disk, and network resources.
HDFS Architecture Overview
- NameNode (nn): stores file metadata — file name, directory structure, file attributes (creation time, replica count, permissions), plus each file's block list and the DataNodes holding those blocks
- DataNode (dn): stores file block data, and block checksums, on the local file system
- Secondary NameNode (2nn): an auxiliary daemon that monitors HDFS state and takes a snapshot of HDFS metadata at intervals
YARN Architecture Overview
MapReduce Architecture Overview
MapReduce splits computation into two phases: Map and Reduce
- The Map phase processes data in parallel
- The Reduce phase aggregates the Map results
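The two phases can be mimicked with an ordinary shell pipeline — a toy sketch, not Hadoop itself: the "map" step emits one word per line, and the "reduce" step groups identical words and counts them.

```shell
# Toy MapReduce: count words in two lines of input.
# Map: split each line into one word per line.
# Shuffle/sort: bring identical words together.
# Reduce: aggregate a count per word.
printf 'hello world\nhello hadoop\n' \
  | tr -s ' ' '\n' \
  | sort \
  | uniq -c \
  | awk '{print $2, $1}'
# prints:
# hadoop 1
# hello 2
# world 1
```

This is exactly what the WordCount example later in this document does, except Hadoop runs the map and reduce steps across many machines.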
Big Data Technology Ecosystem
Installation
It is recommended to install in a fresh virtual machine, which makes later cloning and cluster setup easier.
Before installing Hadoop: set a static IP on the Linux host, install Java and configure its environment variables (required), and consider disabling the firewall (for personal testing only).
Virtual Machine Network Configuration
My own network configuration: vim /etc/sysconfig/network-scripts/ifcfg-ens33
The network uses NAT mode; in NAT mode the virtual adapter is usually vmnet8.
The host's vmnet8 virtual adapter needs a static IPv4 address, and its first three octets must match both the vmnet8 subnet IP in the virtual network editor and IPADDR, GATEWAY, and DNS1 in the config below; the fourth octet must differ. As shown:
- Host virtual adapter configuration
- Virtual network editor
- VM network configuration file
TYPE="Ethernet"
PROXY_METHOD="none"
BROWSER_ONLY="no"
BOOTPROTO="static"
DEFROUTE="yes"
IPV4_FAILURE_FATAL="no"
IPV6INIT="yes"
IPV6_AUTOCONF="yes"
IPV6_DEFROUTE="yes"
IPV6_FAILURE_FATAL="no"
IPV6_ADDR_GEN_MODE="stable-privacy"
NAME="ens33"
UUID="5f66ee29-f43f-4761-abec-bd0656e25e09"
DEVICE="ens33"
ONBOOT="yes"
IPV6_PRIVACY="no"
IPADDR="192.168.100.104"
GATEWAY="192.168.100.2"
DNS1="192.168.100.2"
The version I installed is 2.10.1.
Download the tarball to the Linux server and unpack it under /opt/module.
Configure environment variables:
export HADOOP_HOME=/opt/module/hadoop
export PATH=$PATH:$HADOOP_HOME/bin
export PATH=$PATH:$HADOOP_HOME/sbin
Reload the profile:
source /etc/profile
Verify the installation:
# hadoop
Usage: hadoop [--config confdir] [COMMAND | CLASSNAME]
  CLASSNAME            run the class named CLASSNAME
 or
  where COMMAND is one of:
  fs                   run a generic filesystem user client
  version              print the version
  jar <jar>            run a jar file
                       note: please use "yarn jar" to launch
                             YARN applications, not this command.
  checknative [-a|-h]  check native hadoop and compression libraries availability
  distcp <srcurl> <desturl> copy file or directories recursively
  archive -archiveName NAME -p <parent path> <src>* <dest> create a hadoop archive
  classpath            prints the class path needed to get the
                       Hadoop jar and the required libraries
  credential           interact with credential providers
  daemonlog            get/set the log level for each daemon
  trace                view and modify Hadoop tracing settings

Most commands print help when invoked w/o parameters.
Run Modes
- Local mode: a single-node Java process, generally used for debugging
- Pseudo-distributed mode: suitable for machines that are not very powerful (under 16 GB RAM)
- Fully distributed mode
Local Mode
If your Hadoop package is the official release, it is configured for local mode by default.
Official Grep Example
http://hadoop.apache.org/docs/r2.10.1/hadoop-project-dist/hadoop-common/SingleCluster.html
$ mkdir input
$ cp etc/hadoop/*.xml input
$ bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.10.1.jar grep input output 'dfs[a-z.]+'
$ cat output/*
If the following output appears, local mode ran successfully:
1 dfsadmin
That is, the regex 'dfs[a-z.]+' matched "dfsadmin" once in the input.
Official WordCount Example
Create an input (source) directory wcinput and a text file wc.input:
# mkdir wcinput
# cd wcinput/
# touch wc.input
# vim wc.input
lvbanqihao ake libai libai
hanxin wuya hanxin zhangsan
direnjie guanyu guanyu zhangfei
chengjisihan jing
Run the wordcount example:
# hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.10.1.jar wordcount wcinput/ wcoutput
Check the result:
# cat wcoutput/*
ake 1
chengjisihan 1
direnjie 1
guanyu 2
hanxin 2
jing 1
libai 2
lvbanqihao 1
wuya 1
zhangfei 1
zhangsan 1
Pseudo-Distributed Mode
Modify the configuration files under ${HADOOP_HOME}/etc/hadoop. The meaning of each setting is given in the comments.
HDFS Configuration and Operation
Configuration
core-site.xml
<!-- Address of the HDFS NameNode -->
<property>
    <name>fs.defaultFS</name>
    <value>hdfs://linux101:9000</value>
</property>
<!-- Directory for files Hadoop generates at runtime -->
<property>
    <name>hadoop.tmp.dir</name>
    <value>/opt/module/hadoop/data/tmp</value>
</property>
hdfs-site.xml
<!-- Number of HDFS replicas -->
<property>
    <name>dfs.replication</name>
    <value>1</value>
</property>
hadoop-env.sh
Change JAVA_HOME to the JDK path:
- Run
echo $JAVA_HOME
to print the configured JAVA_HOME path, and copy it
- Open hadoop-env.sh with vim and set it
Start the Cluster
- Format the NameNode
Only needed on the first startup. (Formatting the NameNode generates a new cluster ID; if it no longer matches the DataNodes' cluster ID, the cluster cannot find its existing data. So before reformatting, always delete the data directory and the logs first, then format.)
bin/hdfs namenode -format
- Start the NameNode
sbin/hadoop-daemon.sh start namenode
- Start the DataNode
sbin/hadoop-daemon.sh start datanode
Check the Cluster
- First use the JDK's jps command to confirm the NameNode and DataNode processes are running
- Then visit port 50070 on the server (make sure the firewall is off and the host and VM can reach each other)
- Logs are under ${HADOOP_HOME}/logs
Operate the Cluster
Create an input directory on HDFS:
bin/hdfs dfs -mkdir -p /user/keats/input
Upload the test file to HDFS:
bin/hdfs dfs -put wcinput/wc.input /user/keats/input/
Check that the upload succeeded:
bin/hdfs dfs -ls /user/keats/input/
bin/hdfs dfs -cat /user/keats/input/wc.input
Run the MapReduce job:
bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.10.1.jar wordcount /user/keats/input/ /user/keats/output
View the output:
bin/hdfs dfs -cat /user/keats/output/*
Download the output to the local file system:
hdfs dfs -get /user/keats/output/part-r-00000 ./wcoutput/
Delete the output:
hdfs dfs -rm -r /user/keats/output
YARN Configuration and Operation
Configuration
Set JAVA_HOME in these files under ${HADOOP_HOME}/etc/hadoop/:
- yarn-env.sh
- mapred-env.sh
Configure yarn-site.xml:
<!-- How Reducers fetch data -->
<property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
</property>
<!-- Hostname of the YARN ResourceManager -->
<property>
    <name>yarn.resourcemanager.hostname</name>
    <value>linux101</value>
</property>
Configure mapred-site.xml (rename mapred-site.xml.template first):
mv mapred-site.xml.template mapred-site.xml
<!-- Run MR on YARN -->
<property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
</property>
Start the Cluster
- The NameNode and DataNode must already be running
Start the ResourceManager:
sbin/yarn-daemon.sh start resourcemanager
Start the NodeManager:
sbin/yarn-daemon.sh start nodemanager
Configure the History Server
To review past job runs, configure the history server as follows.
- Configure mapred-site.xml:
<!-- History server address -->
<property>
    <name>mapreduce.jobhistory.address</name>
    <value>linux101:10020</value>
</property>
<!-- History server web UI address -->
<property>
    <name>mapreduce.jobhistory.webapp.address</name>
    <value>linux101:19888</value>
</property>
- Start the history server:
sbin/mr-jobhistory-daemon.sh start historyserver
- Use jps to confirm it started
- Open the JobHistory UI: http://linux101:19888/jobhistory
Configure Log Aggregation
Log aggregation: after an application finishes, its run logs are uploaded to HDFS.
Benefit: job run details are easy to inspect, which helps development and debugging.
Note: enabling log aggregation requires restarting the NodeManager, ResourceManager, and history server.
Configure yarn-site.xml:
<!-- Enable log aggregation -->
<property>
    <name>yarn.log-aggregation-enable</name>
    <value>true</value>
</property>
<!-- Retain logs for 7 days -->
<property>
    <name>yarn.log-aggregation.retain-seconds</name>
    <value>604800</value>
</property>
Restart the NodeManager, ResourceManager, and history server.
There is no restart command; stop then start, or write a small script.
Stop:
sbin/yarn-daemon.sh stop resourcemanager
sbin/yarn-daemon.sh stop nodemanager
sbin/mr-jobhistory-daemon.sh stop historyserver
After stopping, verify with jps.
Start:
sbin/yarn-daemon.sh start resourcemanager
sbin/yarn-daemon.sh start nodemanager
sbin/mr-jobhistory-daemon.sh start historyserver
Test
Delete any existing output on HDFS:
bin/hdfs dfs -rm -R /user/keats/output
Run wordcount:
hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.10.1.jar wordcount /user/keats/input /user/keats/output
Check the logs at http://linux101:19888/jobhistory
Configuration File Notes
Hadoop configuration files come in two kinds: default files and user-defined (site) files. You only edit a site file when you want to override a default value.
Default configuration files:

| Default file | Location inside the Hadoop jars |
| --- | --- |
| core-default.xml | hadoop-common-2.7.2.jar/core-default.xml |
| hdfs-default.xml | hadoop-hdfs-2.7.2.jar/hdfs-default.xml |
| yarn-default.xml | hadoop-yarn-common-2.7.2.jar/yarn-default.xml |
| mapred-default.xml | hadoop-mapreduce-client-core-2.7.2.jar/mapred-default.xml |

User-defined configuration files:
core-site.xml, hdfs-site.xml, yarn-site.xml, and mapred-site.xml live under $HADOOP_HOME/etc/hadoop; modify them as your project requires.
Fully Distributed Mode
Clone the Virtual Machines
Multiple physical machines are usually simulated by cloning VMs (unless money is no object).
- First make sure the source VM has a static IP, the JDK installed, the firewall disabled, and the hosts file configured
- Clone 3 VMs
- Their IPs end in 102, 103, 104
- Change each hostname; on CentOS 7 use:
hostnamectl set-hostname linux103
- Also, with VMware 15 + CentOS 7 the MAC address changes automatically after the IP change and reboot, so there is no need to edit it by hand (in the video, on CentOS 6, the teacher changed it manually)
Once done, the host can reach any VM by hostname via MobaXterm (the host's hosts file must be configured too).
Install the JDK (learning scp)
scp (secure copy)
With the cloning approach above, the JDK is already installed and configured; in the video the teacher cloned empty VMs, mainly, I think, to teach the scp (secure copy) command.
scp -r [username@hostname1:]/x/xxx [username@hostname2:]/x/xxx
This securely and recursively copies files from hostname1 to hostname2.
The [username@hostname1:] part can be omitted, in which case the path refers to the local machine; username is the remote user, hostname the remote host.
rsync remote synchronization tool
It differs from scp in two ways:
First, it only transfers files that differ; identical files are left alone.
Second, it can only sync between the local machine and one other machine, not between two remote servers.
Note: if the local path is a file but the remote path is a nonexistent directory, the file's contents will be copied into a file named after that directory.
rsync -rvl /x/xxx [username@hostname2:]/x/xxx
| Option | Function |
| --- | --- |
| -r | recursive |
| -v | show the copy progress |
| -l | copy symlinks |
xsync cluster distribution script
With rsync you can sync files between two servers, but in a production environment with dozens or hundreds of servers, typing the command by hand for each one is unrealistic, so we write a distribution script.
The script takes its first argument as the file or directory to distribute.
In the current user's home directory (for root, under /root), create a bin directory, then a file xsync with the following content:
#!/bin/bash
# 1. Get the number of arguments; exit if there are none
pcount=$#
if ((pcount==0)); then
  echo no args;
  exit;
fi

# 2. Get the file name
p1=$1
fname=`basename $p1`
echo fname=$fname

# 3. Get the parent directory as an absolute path
pdir=`cd -P $(dirname $p1); pwd`
echo pdir=$pdir

# 4. Get the current user
user=`whoami`

# 5. Loop over the target hosts
for ((host=103; host<105; host++)); do
  echo ------------------- linux$host --------------
  rsync -rvl $pdir/$fname $user@linux$host:$pdir
done
Make the script executable: chmod +x xsync
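The script's path handling (steps 2 and 3) can be checked in isolation. The sketch below reproduces it on a throwaway directory created with mktemp, so it makes no assumptions about /opt existing:

```shell
# Reproduce xsync's argument handling on a temp directory.
tmp=$(mktemp -d)
mkdir -p "$tmp/etc/hadoop"

p1=$tmp/etc/hadoop                        # what you would pass to xsync
fname=$(basename "$p1")                   # last path component
pdir=$(cd -P "$(dirname "$p1")" && pwd)   # absolute parent dir, symlinks resolved

echo "fname=$fname"   # fname=hadoop
echo "pdir=$pdir"
```

The -P flag matters: it resolves symlinks so the path sent to rsync is the same physical path on every node.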
Cluster Configuration
Deployment plan
- The NameNode and SecondaryNameNode use comparable, and substantial, amounts of memory, so they should run on different nodes
- The ResourceManager is also memory-hungry

|  | linux102 | linux103 | linux104 |
| --- | --- | --- | --- |
| HDFS | NameNode, DataNode | DataNode | SecondaryNameNode, DataNode |
| YARN | NodeManager | ResourceManager, NodeManager | NodeManager |
Configure the cluster
Core file, core-site.xml:
<!-- Address of the HDFS NameNode -->
<property>
    <name>fs.defaultFS</name>
    <value>hdfs://linux102:9000</value>
</property>
<!-- Directory for files Hadoop generates at runtime -->
<property>
    <name>hadoop.tmp.dir</name>
    <value>/opt/module/hadoop/data/tmp</value>
</property>
HDFS configuration files
Configure hadoop-env.sh:
vi hadoop-env.sh
export JAVA_HOME=/opt/module/jdk1.8
Configure hdfs-site.xml:
<property>
    <name>dfs.replication</name>
    <value>3</value>
</property>
<!-- Host for the Hadoop secondary name node -->
<property>
    <name>dfs.namenode.secondary.http-address</name>
    <value>linux104:50090</value>
</property>
YARN configuration files
Configure yarn-env.sh:
vi yarn-env.sh
export JAVA_HOME=/opt/module/jdk1.8
Configure yarn-site.xml:
<!-- How Reducers fetch data -->
<property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
</property>
<!-- Hostname of the YARN ResourceManager -->
<property>
    <name>yarn.resourcemanager.hostname</name>
    <value>linux103</value>
</property>
MapReduce configuration files
Configure mapred-env.sh:
vi mapred-env.sh
export JAVA_HOME=/opt/module/jdk1.8
Configure mapred-site.xml, adding:
<!-- Run MR on YARN -->
<property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
</property>
Distribute the configured Hadoop files across the cluster:
xsync /opt/module/hadoop/
Verify the distribution:
cat /opt/module/hadoop/etc/hadoop/core-site.xml
Start Nodes Individually
If this is the cluster's first start, the NameNode must be formatted. If formatting hits problems, diagnose them first, then reformat.
Because my VMs were cloned after pseudo-distributed testing, leftover data/ and logs/ directories remained; delete them on all three VMs (do the same if you followed my steps) before formatting:
hadoop namenode -format
Then start the NameNode on node 102 and the DataNodes on all three nodes, and open linux102:50070 to check that everything is up.
Configure Passwordless SSH
Generate a key pair in the .ssh directory under the current user's home (I use root):
ssh-keygen -t rsa
Copy the public key to every machine you want to log in to without a password:
ssh-copy-id linux102
ssh-copy-id linux103
ssh-copy-id linux104
Note that sshing from a machine (e.g. linux102) to itself also asks for a password; for passwordless self-login, append your own public key to your own authorized_keys as well.
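Appending the key amounts to one cat redirect plus the permissions sshd insists on. A sketch in a scratch directory rather than the real ~/.ssh (the key string is a placeholder, not a real key):

```shell
# Simulate passwordless self-login setup in a scratch dir (not the real ~/.ssh).
sshdir=$(mktemp -d)
echo 'ssh-rsa AAAAB3...placeholder root@linux102' > "$sshdir/id_rsa.pub"

cat "$sshdir/id_rsa.pub" >> "$sshdir/authorized_keys"   # append own public key
chmod 600 "$sshdir/authorized_keys"                     # sshd rejects looser modes

grep -c 'root@linux102' "$sshdir/authorized_keys"       # prints: 1
```

On the real machine the same effect is achieved with `ssh-copy-id linux102` run on linux102 itself.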
Files under the ssh directory (~/.ssh):

| File | Purpose |
| --- | --- |
| known_hosts | public keys of hosts this machine has sshed to |
| id_rsa | the generated private key |
| id_rsa.pub | the generated public key |
| authorized_keys | public keys authorized for passwordless login to this server |
Start the Whole Cluster
Configure slaves:
cd /opt/module/hadoop/etc/hadoop/
vim slaves
List every DataNode host, one per line; no extra spaces are allowed:
linux102
linux103
linux104
Distribute it to the other nodes:
xsync slaves
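A stray space or blank line in slaves is a classic source of startup failures. A quick sanity check, sketched here against a sample file rather than the real one:

```shell
# Write a sample slaves file, then verify it has no blank lines or whitespace.
slaves=$(mktemp)
printf 'linux102\nlinux103\nlinux104\n' > "$slaves"

if grep -qE '[[:space:]]|^$' "$slaves"; then
  echo "slaves file has stray whitespace or blank lines"
else
  echo "slaves file OK"
fi
```

Run the same grep against /opt/module/hadoop/etc/hadoop/slaves before starting the cluster.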
Start the cluster:
sbin/start-dfs.sh
Stop the cluster:
sbin/stop-dfs.sh
Start YARN
The video says YARN must be started from node 103 (where the ResourceManager runs); in my test with 2.10.1 it could also be started from 102 — perhaps a script update, or a side effect of using the root account.
sbin/start-yarn.sh
Test the SecondaryNameNode web UI
Open http://linux104:50090/status.html in a browser
Test the Cluster
Upload a small file:
hdfs dfs -put wcinput/wc.input /
Upload a large file:
hdfs dfs -put /opt/software/hadoop-2.10.1.tar.gz /
A large file occupies multiple blocks; on download the blocks are merged back together.
The actual files live under ${HADOOP_HOME}/data/tmp, several directories deep; curious readers can take a look:
/opt/module/hadoop/data/tmp/dfs/data/current/BP-1473062949-192.168.100.102-1608214270058/current/finalized/subdir0/subdir0
Cluster Time Synchronization
Configure the time server
Check whether ntp is installed:
rpm -qa | grep ntp
Since the machines are clones, checking 102 is enough:
ntp-4.2.6p5-10.el6.centos.x86_64
fontpackages-filesystem-1.41-1.1.el6.noarch
ntpdate-4.2.6p5-10.el6.centos.x86_64
Edit the ntp configuration:
vim /etc/ntp.conf
- Authorize all machines on the LAN segment to query and sync time from this server
- When this node loses its network connection, it can still serve its local time to the other cluster nodes
# Uncomment this line; change the third octet to match your VM's LAN
restrict 192.168.100.0 mask 255.255.255.0 nomodify notrap
# Add the following: if this node loses connectivity, it keeps serving
# local time to the rest of the cluster
server 127.127.1.0
fudge 127.127.1.0 stratum 10
Edit /etc/sysconfig/ntpd so the hardware clock syncs along with the system clock:
vim /etc/sysconfig/ntpd
Add:
SYNC_HWCLOCK=yes
- Restart the ntpd service:
service ntpd status    # check status
service ntpd start     # start
- Enable ntpd at boot:
chkconfig ntpd on
Configure the other servers
Add a cron job that syncs from linux102 (here once per day):
0 0 * * 1-7 /usr/sbin/ntpdate linux102
Change a server's time to test:
date -s "2017-9-11 11:11:11"
Then wait ten minutes and check whether the time has synced (for testing, it is recommended to tighten the cron expression so the interval is shorter).
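Whether the sync worked comes down to comparing epoch seconds between machines. A local sketch of the drift check — on a real cluster the second timestamp would come from something like `ssh linux103 date +%s`, which is assumed here, not demonstrated:

```shell
# Compute clock drift in seconds between a reference and a client timestamp.
t_server=$(date +%s)
t_client=$((t_server + 3))      # pretend the client runs 3 s ahead
drift=$((t_client - t_server))
echo "drift=${drift}s"          # prints: drift=3s
```

After ntpdate runs, the drift should drop to 0 or 1 seconds.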