Hadoop 3 Installation


First install JDK 8, then download the Hadoop 3.3.4 archive:

wget https://dlcdn.apache.org/hadoop/common/hadoop-3.3.4/hadoop-3.3.4.tar.gz

Create a hadoop account:

useradd -m hadoop

Set a password for the hadoop user:

passwd hadoop
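
The environment variables configured later assume the archive ends up at /home/hadoop/hadoop-3.3.4, so unpack it into the new user's home directory while still root (adjust the tarball path to wherever wget saved it):

tar -xzf hadoop-3.3.4.tar.gz -C /home/hadoop
chown -R hadoop:hadoop /home/hadoop/hadoop-3.3.4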

Switch to the hadoop user (su - hadoop) and configure passwordless SSH login, pressing Enter at each prompt:

ssh-keygen -t rsa
cd ~/.ssh
cp id_rsa.pub authorized_keys
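
If key-based login later still prompts for a password, SSH is usually objecting to file permissions; tightening them and testing the login from the same shell rules that out:

chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys
ssh localhost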

Set up the environment variables:

vi /etc/profile

Add the following configuration:

export JAVA_HOME=/usr/lib/jvm/java-openjdk
export JRE_HOME=${JAVA_HOME}/jre
export CLASSPATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib
export PATH=${JAVA_HOME}/bin:$PATH


export HADOOP_HOME=/home/hadoop/hadoop-3.3.4
export HADOOP_COMMON_LIB_NATIVE_DIR=${HADOOP_HOME}/lib/native
export PATH=${PATH}:${HADOOP_HOME}/bin:${HADOOP_HOME}/sbin
export JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native/
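
Reload the profile so the new variables take effect in the current shell, and confirm Hadoop is found on the PATH:

source /etc/profile
hadoop version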

Edit core-site.xml (this and the configuration files below all live in $HADOOP_HOME/etc/hadoop):

<configuration>
    <!-- Default file system URI, i.e. the NameNode address; a hostname can be used instead of the IP -->
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://192.168.56.100:9820</value>
    </property>
    <!-- Base directory for files Hadoop generates at runtime (adjust to your own layout) -->
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/usr/local/hadoop/tmp</value>
    </property>
</configuration>
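
hadoop.tmp.dir points at a directory that does not exist yet; create it as root and hand the tree over to the hadoop user (the path matches the value above):

mkdir -p /usr/local/hadoop/tmp
chown -R hadoop:hadoop /usr/local/hadoop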

Edit the configuration file hdfs-site.xml:

<configuration>
    <!-- Number of HDFS block replicas; 1 is enough for a single-node setup -->
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
    <!-- Local directories for NameNode metadata and DataNode blocks -->
    <property>
        <name>dfs.namenode.name.dir</name>
        <value>/usr/local/hadoop/hdfs/name</value>
    </property>
    <property>
        <name>dfs.datanode.data.dir</name>
        <value>/usr/local/hadoop/hdfs/data</value>
    </property>
    <!-- NameNode web UI address -->
    <property>
        <name>dfs.namenode.http-address</name>
        <value>192.168.56.100:9870</value>
    </property>
</configuration>
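
Since /usr/local/hadoop now belongs to the hadoop user, the NameNode and DataNode directories referenced above can be created directly by that user:

mkdir -p /usr/local/hadoop/hdfs/name /usr/local/hadoop/hdfs/data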

Edit the configuration file mapred-site.xml:

<configuration>
    <!-- Run MapReduce jobs on YARN -->
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
</configuration>
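
On Hadoop 3, YARN containers do not inherit HADOOP_HOME from the shell, so the example job at the end can fail complaining about a missing MRAppMaster class. If that happens, a common fix is to add these properties to mapred-site.xml as well (the value assumes the HADOOP_HOME set in /etc/profile above):

    <property>
        <name>yarn.app.mapreduce.am.env</name>
        <value>HADOOP_MAPRED_HOME=/home/hadoop/hadoop-3.3.4</value>
    </property>
    <property>
        <name>mapreduce.map.env</name>
        <value>HADOOP_MAPRED_HOME=/home/hadoop/hadoop-3.3.4</value>
    </property>
    <property>
        <name>mapreduce.reduce.env</name>
        <value>HADOOP_MAPRED_HOME=/home/hadoop/hadoop-3.3.4</value>
    </property>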

Edit the configuration file yarn-site.xml:

<configuration>
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
</configuration>
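
Optionally, if the VM has several network interfaces and the YARN web UI is not reachable from the host later on, pinning the ResourceManager to the address used elsewhere in this guide can help (the property is standard; the IP is this guide's example address):

    <property>
        <name>yarn.resourcemanager.hostname</name>
        <value>192.168.56.100</value>
    </property>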

Edit hadoop-env.sh (daemons started over SSH do not read /etc/profile, so JAVA_HOME must be set here as well):

export JAVA_HOME=/usr/lib/jvm/java-openjdk

Format the NameNode (run this and the following commands as the hadoop user):

hdfs namenode -format

Start HDFS:

start-dfs.sh

Start YARN:

start-yarn.sh
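
A quick way to confirm that all daemons came up is jps:

jps

On a single-node setup it should list NameNode, DataNode, SecondaryNameNode, ResourceManager and NodeManager (plus Jps itself).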

Open the web UIs in a browser (substitute your own addresses):

http://192.168.56.100:8088/ (YARN ResourceManager)

http://192.168.56.100:9870/ (HDFS NameNode)

Run an example MapReduce job:

hadoop jar $HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-examples-3.3.4.jar pi 10 50
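
The job should finish by printing the estimated value of Pi. As a further sanity check that HDFS is writable, the standard dfs commands work at this point; the paths below are only illustrative:

hdfs dfs -mkdir -p /user/hadoop
hdfs dfs -put $HADOOP_HOME/etc/hadoop/core-site.xml /user/hadoop/
hdfs dfs -ls /user/hadoop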
