1. Pre-Upgrade Preparation
Back up the NameNode metadata directory (the dfs.namenode.name.dir path), the HDFS data directories, and all configuration files (core-site.xml, hdfs-site.xml, yarn-site.xml, etc.) so that no data is lost if the upgrade goes wrong.

2. Upgrade Steps
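The backup step can be sketched as a small shell helper. Every path below is a placeholder, not a real cluster path; in an actual upgrade you would call `backup_dir` on your configured dfs.namenode.name.dir, each dfs.datanode.data.dir, and the etc/hadoop config directory:

```shell
set -e
# Hedged sketch of the pre-upgrade backup: archive a directory into a
# backup folder. All paths are placeholders for your real cluster paths.
backup_dir() {            # $1 = directory to archive, $2 = backup destination
  mkdir -p "$2"
  tar -czf "$2/$(basename "$1").tar.gz" -C "$(dirname "$1")" "$(basename "$1")"
}

# Demo on a scratch directory standing in for the NameNode metadata dir:
SCRATCH=$(mktemp -d)
mkdir -p "$SCRATCH/name/current"
echo demo > "$SCRATCH/name/current/VERSION"
backup_dir "$SCRATCH/name" "$SCRATCH/backup"
ls "$SCRATCH/backup"      # the archive name.tar.gz should be listed here
```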
Stop the cluster: shut down HDFS (./sbin/stop-dfs.sh) and YARN (./sbin/stop-yarn.sh), or stop both at once with ./sbin/stop-all.sh.
Download the new release tarball (hadoop-3.3.6.tar.gz) and extract it to the target directory (e.g. /usr/local/):
wget https://dlcdn.apache.org/hadoop/common/hadoop-3.3.6/hadoop-3.3.6.tar.gz
tar -zxvf hadoop-3.3.6.tar.gz -C /usr/local/
Rename the old installation directory (to hadoop-old) as a backup, then rename the new version's directory to hadoop:
mv /usr/local/hadoop /usr/local/hadoop-old
mv /usr/local/hadoop-3.3.6 /usr/local/hadoop
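The extract-and-swap sequence above can be rehearsed safely first; the sketch below mimics it with a dummy tarball in a scratch directory (every path here is a stand-in for the real /usr/local layout):

```shell
set -e
# Rehearse the extract-and-swap in a throwaway area (placeholder paths).
WORK=$(mktemp -d)
mkdir -p "$WORK/hadoop-3.3.6" "$WORK/opt/hadoop"      # fake new release + fake old install
tar -czf "$WORK/hadoop-3.3.6.tar.gz" -C "$WORK" hadoop-3.3.6
tar -zxf "$WORK/hadoop-3.3.6.tar.gz" -C "$WORK/opt"   # extract the new release
mv "$WORK/opt/hadoop" "$WORK/opt/hadoop-old"          # keep the old version as a backup
mv "$WORK/opt/hadoop-3.3.6" "$WORK/opt/hadoop"        # new version takes the canonical name
ls "$WORK/opt"                                        # hadoop and hadoop-old
```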
Copy the old version's configuration files (under etc/hadoop/) into the corresponding directory of the new version, then check and adjust the following key settings:
- core-site.xml: update fs.defaultFS (the HDFS address), hadoop.tmp.dir (the temporary directory), and any other paths;
- hdfs-site.xml: adjust dfs.namenode.name.dir (the NameNode metadata directory) and dfs.datanode.data.dir (the DataNode data directory);
- yarn-site.xml: configure the ResourceManager (yarn.resourcemanager.hostname) and NodeManager parameters.
Edit ~/.bashrc (or /etc/profile) so that HADOOP_HOME points to the new directory, and add $HADOOP_HOME/bin and $HADOOP_HOME/sbin to PATH:
export HADOOP_HOME=/usr/local/hadoop
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
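The hdfs-site.xml adjustments above might look like the minimal fragment below; the directory values are placeholders and should match your actual disks:

```xml
<!-- hdfs-site.xml: placeholder paths; point these at your real storage dirs -->
<configuration>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>/usr/local/hadoop/tmp/dfs/name</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>/usr/local/hadoop/tmp/dfs/data</value>
  </property>
</configuration>
```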
Apply the changes with source ~/.bashrc.
Start HDFS with the upgrade flag: ./sbin/start-dfs.sh -upgrade (the script lives in sbin/, not bin/; on first startup the NameNode converts the old metadata). To check upgrade status, use the NameNode web UI; the Hadoop 1.x command hadoop dfsadmin -upgradeProgress status no longer exists in Hadoop 2 and later. Start YARN with ./sbin/start-yarn.sh, then verify the cluster:
- hdfs dfs -ls / (the root directory lists normally);
- yarn application -list (shows whether any applications are running);
- jps (the NameNode, DataNode, ResourceManager, and NodeManager processes are all up).

3. Post-Upgrade Operations
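The jps check can be wrapped in a small script. The function below is a sketch, not part of Hadoop: it scans jps-style output for the expected daemon names (the demo uses canned output so it runs without a cluster; in real use pass "$(jps)"):

```shell
# Sketch: confirm the expected daemons appear in `jps` output.
check_daemons() {   # $1 = output of `jps`
  for d in NameNode DataNode ResourceManager NodeManager; do
    printf '%s\n' "$1" | grep -qw "$d" || { echo "MISSING: $d"; return 1; }
  done
  echo "all daemons up"
}

# Real use: check_daemons "$(jps)"
# Demo with canned jps output:
check_daemons "2101 NameNode
2202 DataNode
2303 ResourceManager
2404 NodeManager"   # prints "all daemons up"
```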
Once the cluster has run stably for a while, finalize the upgrade to delete the pre-upgrade backup and reclaim storage (in Hadoop 3.x use hdfs dfsadmin; the hadoop dfsadmin form is deprecated): ./bin/hdfs dfsadmin -finalizeUpgrade
If problems appear, roll back to the old version with the rollback option (possible only before finalizeUpgrade): ./sbin/start-dfs.sh -rollback
Note: after finalizeUpgrade the old version's backup data is deleted, and rollback is no longer possible.