Author: kafei | Source: Internet | 2023-09-11 09:51
1. Environment preparation
1) Virtual machine: VM10
2) Linux: CentOS 6.5
3) Hadoop: hadoop-2.6.0-cdh5.7.0
4) JDK: jdk-7u80-linux-x64.tar.gz
5) Maven: apache-maven-3.3.9-bin.zip
6) MySQL: mysql-5.6.23-linux-glibc2.5-x86_64.tar.gz
[All of the above must be installed in advance.]
MySQL is already deployed under the mysqladmin user, with home directory /usr/local/mysql, username root, and password 123456.
For the detailed steps, see the earlier article "MySQL Deployment".
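Before going further, it can help to confirm that the required tools are actually reachable on PATH. A minimal sketch (command names only; it does not check the exact versions listed above):

```shell
# Check that the build prerequisites are on PATH.
missing=0
for cmd in java mvn mysql; do
  if command -v "$cmd" >/dev/null 2>&1; then
    echo "$cmd: found"
  else
    echo "$cmd: NOT found"
    missing=$((missing + 1))
  fi
done
echo "missing tools: $missing"
```

If anything is reported missing, install it (or fix PATH) before attempting the compile.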
2. Compiling Hive-1.1.0-cdh5.7.0
1) Download
Download hive-1.1.0-cdh5.7.0-src.tar.gz
Download URL: http://archive.cloudera.com/cdh5/cdh/5/hive-1.1.0-cdh5.7.0-src.tar.gz
2) Upload (rz)
Upload hive-1.1.0-cdh5.7.0-src.tar.gz to /home/hadoop/source
3) Extract
[hadoop@hadoop001 source]$ tar -zxvf hive-1.1.0-cdh5.7.0-src.tar.gz
After extracting, check that the owner and group of the resulting directory are correct; if not, fix them with chown -R user:group dir.
4) Compile
[hadoop@hadoop001 ~]$ cd source/hive-1.1.0-cdh5.7.0
[hadoop@hadoop001 hive-1.1.0-cdh5.7.0]$ pwd
/home/hadoop/source/hive-1.1.0-cdh5.7.0
[hadoop@hadoop001 hive-1.1.0-cdh5.7.0]$ mvn clean package -DskipTests -Phadoop-2 -Pdist
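After a successful build, the binary tarball is typically produced under packaging/target inside the source tree (an assumption about the Hive build layout, not stated in this post). The sketch below simulates that layout in a temp directory so the glob pattern can be checked without running the full build:

```shell
# Simulate the build output layout and locate the binary tarball.
workdir=$(mktemp -d)
mkdir -p "$workdir/packaging/target"
touch "$workdir/packaging/target/apache-hive-1.1.0-cdh5.7.0-bin.tar.gz"
# The same glob, run against the real source tree, finds the real artifact.
built=$(ls "$workdir"/packaging/target/apache-hive-*-bin.tar.gz)
echo "built tarball: ${built##*/}"
rm -rf "$workdir"
```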
3. Hive deployment
1) Extract the compiled Hive package
Move the compiled apache-hive-1.1.0-cdh5.7.0-bin.tar.gz to /home/hadoop/software
[You can also download a prebuilt Hive package directly: http://archive.cloudera.com/cdh5/cdh/5/hive-1.1.0-cdh5.7.0.tar.gz]
[hadoop@hadoop001 software]$ tar -zxvf apache-hive-1.1.0-cdh5.7.0-bin.tar.gz -C ~/app/
2) Configure environment variables
[hadoop@hadoop001 ~]$ vi .bash_profile
# .bash_profile
# Get the aliases and functions
if [ -f ~/.bashrc ]; then
. ~/.bashrc
fi
# User specific environment and startup programs
PATH=$PATH:$HOME/bin
export PATH
export JAVA_HOME=/usr/java/jdk1.7.0_80
export MVN_HOME=/home/hadoop/app/apache-maven-3.3.9
export FINDBUGS_HOME=/home/hadoop/app/findbugs-1.3.9
export PROTOC_HOME=/usr/local/protobuf
export HADOOP_HOME=/home/hadoop/app/hadoop-2.6.0
export HIVE_HOME=/home/hadoop/app/hive-1.1.0-cdh5.7.0
export PATH=$HIVE_HOME/bin:$HADOOP_HOME/bin:$PROTOC_HOME/bin:$FINDBUGS_HOME/bin:$MVN_HOME/bin:$JAVA_HOME/bin:$PATH
[hadoop@hadoop001 ~]$ source .bash_profile
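The order of the final PATH export matters: the shell searches PATH left to right, so prepending $HIVE_HOME/bin ensures the freshly installed binaries win over any older copies. A quick sketch using the same values as the .bash_profile above:

```shell
# Mirror the .bash_profile values and show which directory is searched first.
HIVE_HOME=/home/hadoop/app/hive-1.1.0-cdh5.7.0
HADOOP_HOME=/home/hadoop/app/hadoop-2.6.0
PATH=$HIVE_HOME/bin:$HADOOP_HOME/bin:$PATH
first=$(echo "$PATH" | cut -d: -f1)
echo "first PATH entry: $first"
```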
3) Edit the configuration files
[hadoop@hadoop001 ~]$ cd /home/hadoop/app/hive-1.1.0-cdh5.7.0/conf
[hadoop@hadoop001 conf]$ cp hive-env.sh.template hive-env.sh
[hadoop@hadoop001 conf]$ vi hive-env.sh
HADOOP_HOME=/home/hadoop/app/hadoop-2.6.0
[hadoop@hadoop001 conf]$ vi hive-site.xml
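The post does not show the contents of hive-site.xml, so here is a minimal sketch that points the metastore at the local MySQL, using the credentials (root/123456) and database name (ruozedata_basic03) mentioned elsewhere in this post; the createDatabaseIfNotExist flag is what makes the database appear in MySQL automatically on first start:

```xml
<?xml version="1.0"?>
<configuration>
  <!-- Metastore in the local MySQL; createDatabaseIfNotExist lets Hive
       create ruozedata_basic03 on first startup -->
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost:3306/ruozedata_basic03?createDatabaseIfNotExist=true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>root</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>123456</value>
  </property>
</configuration>
```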
4) Copy the JDBC driver
cp mysql-connector-java-5.1.27.jar $HIVE_HOME/lib
Driver download site: http://search.maven.org
Search for mysql-connector-java
Choose version 5.1.27 and download it
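Instead of searching by hand, the jar can be fetched directly from Maven Central; the URL below follows the standard group/artifact/version repository layout:

```shell
# Build the Maven Central URL for the Connector/J jar.
version=5.1.27
jar=mysql-connector-java-$version.jar
url=https://repo1.maven.org/maven2/mysql/mysql-connector-java/$version/$jar
echo "$url"
# wget "$url" -P "$HIVE_HOME/lib"   # uncomment to actually download
```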
4. Start HDFS and MySQL
[hadoop@hadoop001 ~]$ cd $HADOOP_HOME/sbin
[hadoop@hadoop001 sbin]$ ./start-dfs.sh
[hadoop@hadoop001 ~]$ jps
4066 DataNode
5510 Jps
4201 SecondaryNameNode
3976 NameNode
[root@hadoop001 ~]# su - mysqladmin
[mysqladmin@hadoop001 ~]$ service mysql start
Starting MySQL.............. [ OK ]
5. Start Hive
[hadoop@hadoop001 ~]$ cd /home/hadoop/app/hive-1.1.0-cdh5.7.0/bin
[hadoop@hadoop001 bin]$ ./hive
hive> show tables;
OK
Time taken: 0.171 seconds
The database ruozedata_basic03 configured in hive-site.xml is now visible in MySQL:
mysql> show databases;