A programmer's job is to turn everything done by hand into something the computer does, so that you can slack off a little.

Below is a very crude hive folder sync script. For clusters with more than 100 or 1,000 nodes, you can add a loop.

#!/bin/sh

#================ hive package sync ==================#
# This script syncs the hive directory from the name  #
# node to the data nodes. Whenever the hive package   #
# changes, the data nodes must be re-synced;          #
# otherwise, when oozie invokes hive via a shell      #
# action, the assigned node's out-of-date hive        #
# package causes errors.                              #
#=====================================================#

# 1. Remove the old hive directory
ssh -t hadoop@dwprod-dataslave1 rm -r /opt/local/hive
ssh -t hadoop@dwprod-dataslave2 rm -r /opt/local/hive
ssh -t hadoop@dwprod-dataslave3 rm -r /opt/local/hive
ssh -t hadoop@dwprod-dataslave4 rm -r /opt/local/hive
ssh -t hadoop@dwprod-dataslave5 rm -r /opt/local/hive
ssh -t hadoop@dwprod-dataslave6 rm -r /opt/local/hive
ssh -t hadoop@dwprod-dataslave7 rm -r /opt/local/hive
ssh -t hadoop@dwprod-dataslave8 rm -r /opt/local/hive
ssh -t hadoop@dwprod-dataslave9 rm -r /opt/local/hive
ssh -t hadoop@dwprod-dataslave10 rm -r /opt/local/hive

# 2. Copy over the new hive directory
scp -r -q /opt/local/hive hadoop@dwprod-dataslave1:/opt/local/
scp -r -q /opt/local/hive hadoop@dwprod-dataslave2:/opt/local/
scp -r -q /opt/local/hive hadoop@dwprod-dataslave3:/opt/local/
scp -r -q /opt/local/hive hadoop@dwprod-dataslave4:/opt/local/
scp -r -q /opt/local/hive hadoop@dwprod-dataslave5:/opt/local/
scp -r -q /opt/local/hive hadoop@dwprod-dataslave6:/opt/local/
scp -r -q /opt/local/hive hadoop@dwprod-dataslave7:/opt/local/
scp -r -q /opt/local/hive hadoop@dwprod-dataslave8:/opt/local/
scp -r -q /opt/local/hive hadoop@dwprod-dataslave9:/opt/local/
scp -r -q /opt/local/hive hadoop@dwprod-dataslave10:/opt/local/
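The loop version mentioned above could be sketched as follows. It assumes the data nodes keep the `dwprod-dataslave1..N` naming scheme; the node count and a dry-run flag are parameterized so the commands can be previewed before touching any host (DRY_RUN defaults to 1 here for safety; set DRY_RUN=0 to actually execute).

```shell
#!/bin/sh
# Loop version of the hive sync script (a sketch, assuming the data
# nodes are named dwprod-dataslave1..N).
# With DRY_RUN=1 (the default) commands are printed, not executed.

# Print the command in dry-run mode, execute it otherwise.
run() {
    if [ "${DRY_RUN:-1}" = "1" ]; then echo "$@"; else "$@"; fi
}

# Sync hive to the first $1 data nodes.
sync_hive() {
    count=$1
    i=1
    while [ "$i" -le "$count" ]; do
        host="dwprod-dataslave$i"
        # 1. Remove the old hive directory on the data node
        run ssh -t "hadoop@$host" rm -r /opt/local/hive
        # 2. Copy over the new hive directory
        run scp -r -q /opt/local/hive "hadoop@$host:/opt/local/"
        i=$((i + 1))
    done
}

sync_hive "${NODE_COUNT:-10}"
```

Run it as `DRY_RUN=1 NODE_COUNT=10 sh sync_hive.sh` to check the generated commands first; the hard-coded version above and this loop do the same work for N=10.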
