Linux: Shell scripts for distributing files to and running commands on multiple machines

A very practical set of tools I found online.

First, set up passwordless SSH between the machines.
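
A minimal sketch of that setup, run on the machine that will push the files and commands (the hadoop user and the cdhnode host names are taken from the example config below and are only assumptions about your environment):

# Generate a key pair once (accept the defaults, empty passphrase)
ssh-keygen -t rsa
# Copy the public key to every target host
for host in cdhnode1 cdhnode2 cdhnode3 cdhnode4 cdhnode5
do
  ssh-copy-id hadoop@$host
done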

There are three files:
1. deploy.conf
2. deploy.sh
3. runRemoteCmd.sh

Note:
All three files live in the /home/hadoop/tools directory.
In deploy.conf, the first column of each line is a server's host name, and the columns that follow are the tags assigned to that server.

deploy.conf is the configuration file:

#### NOTES
# There is a crontab job that uses this config file to compact log files and remove old log files.
# Please be careful when modifying this file unless you know exactly what the crontab job does.
#hdp
cdhnode1,all,master,zk,
cdhnode2,all,slave,zk,
cdhnode3,all,slave,zk,jn,kfk,
cdhnode4,all,slave,jn,kfk,
cdhnode5,all,slave,jn,kfk,
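
For reference, the scripts below select hosts by keeping every non-comment line that contains ",<tag>," and printing its first column; the trailing comma on each line is what lets the last tag match as well. Against the file above, the tag zk resolves like this:

grep -v '^#' /home/hadoop/tools/deploy.conf | grep ',zk,' | awk -F',' '{print $1}'
# prints:
# cdhnode1
# cdhnode2
# cdhnode3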

deploy.sh is the multi-machine file distribution script:

#!/bin/bash
#set -x

if [ $# -lt 3 ]
then
  echo "Usage: ./deploy.sh srcFile(or Dir) destFile(or Dir) MachineTag"
  echo "Usage: ./deploy.sh srcFile(or Dir) destFile(or Dir) MachineTag confFile"
  exit 1
fi

src=$1
dest=$2
tag=$3
# Fall back to the default config file when no fourth argument is given
if [ -z "$4" ]
then
  confFile=/home/hadoop/tools/deploy.conf
else
  confFile=$4
fi

if [ -f "$confFile" ]
then
  if [ -f "$src" ]
  then
    # Copy a single file to every server whose config line contains ",$tag,"
    for server in $(grep -v '^#' "$confFile" | grep ",$tag," | awk -F',' '{print $1}')
    do
       scp "$src" "$server:$dest"
    done
  elif [ -d "$src" ]
  then
    # Copy a directory recursively to the same set of servers
    for server in $(grep -v '^#' "$confFile" | grep ",$tag," | awk -F',' '{print $1}')
    do
       scp -r "$src" "$server:$dest"
    done
  else
    echo "Error: source file does not exist"
  fi
else
  echo "Error: please pass a config file or run deploy.sh with deploy.conf in the same directory"
fi
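
The optional fourth argument points the script at a different host list; a sketch, where the alternate file name is hypothetical:

./deploy.sh hadoop /home/hadoop/app/ slave /home/hadoop/tools/deploy-test.conf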

runRemoteCmd.sh is the multi-machine command execution script:

#!/bin/bash
#set -x

if [ $# -lt 2 ]
then
  echo "Usage: ./runRemoteCmd.sh Command MachineTag"
  echo "Usage: ./runRemoteCmd.sh Command MachineTag confFile"
  exit 1
fi

cmd=$1
tag=$2
# Fall back to the default config file when no third argument is given
if [ -z "$3" ]
then
  confFile=/home/hadoop/tools/deploy.conf
else
  confFile=$3
fi

if [ -f "$confFile" ]
then
    # Run the command over ssh on every server whose config line contains ",$tag,"
    for server in $(grep -v '^#' "$confFile" | grep ",$tag," | awk -F',' '{print $1}')
    do
       echo "*******************$server***************************"
       ssh "$server" "source ~/.bash_profile; $cmd"
    done
else
  echo "Error: please pass a config file or run runRemoteCmd.sh with deploy.conf in the same directory"
fi
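
Before using the scripts, make them executable; optionally add the tools directory to PATH so they can be called from anywhere, as the examples below assume:

chmod u+x /home/hadoop/tools/deploy.sh /home/hadoop/tools/runRemoteCmd.sh
export PATH=/home/hadoop/tools:$PATH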

Usage:

## Distribute the hadoop package to /home/hadoop/app/ on every server tagged slave:
deploy.sh hadoop /home/hadoop/app/ slave

## Run /home/hadoop/app/zookeeper/bin/zkServer.sh start on every server tagged zk:
runRemoteCmd.sh "/home/hadoop/app/zookeeper/bin/zkServer.sh start" zk
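
Two hypothetical follow-ups in the same pattern, to confirm the copy landed and to check ZooKeeper afterwards (the /home/hadoop/app/hadoop path assumes the distribution example above):

## Check that the hadoop directory arrived on every slave server
runRemoteCmd.sh "ls -l /home/hadoop/app/hadoop" slave
## Check the ZooKeeper status on every zk server
runRemoteCmd.sh "/home/hadoop/app/zookeeper/bin/zkServer.sh status" zk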

Reposted from blog.csdn.net/l1028386804/article/details/81148773