• Installing MySQL

Install MySQL with the following commands:

sudo apt-get install mysql-server

sudo apt install mysql-client

sudo apt install libmysqlclient-dev

Verify the installation with:

sudo netstat -tap | grep mysql

If the output lists a mysql process listening on a socket, the installation succeeded.

Start MySQL:

sudo service mysql start

Log in to MySQL:

mysql -u root -p

A successful login drops you into the mysql> prompt.

If the login fails, you can reset the password from safe mode (the statements below target MySQL 5.7; the PASSWORD() function was removed in MySQL 8.0):

sudo /etc/init.d/mysql stop

sudo /usr/bin/mysqld_safe --skip-grant-tables --skip-networking &

With grant checks skipped, connect without a password (mysql -u root) and run:

> use mysql;

> update user set authentication_string=PASSWORD("<your new password>") where User='root'; # change the root password

> update user set plugin="mysql_native_password" where User='root'; # without this line you may still hit an authentication plugin error

> flush privileges; # reload the privilege tables

> quit;

After the change succeeds, stop the safe-mode instance (sudo /etc/init.d/mysql stop), start the MySQL service normally again, and log in with the new password.

 

  • Installing Hive

Extract the Hive tarball into /usr/local/hadoop and rename the directory to hive:

sudo tar -zxvf apache-hive-3.1.1-bin.tar.gz -C /usr/local/hadoop

cd /usr/local/hadoop

sudo mv apache-hive-3.1.1-bin hive

Edit the environment variables in your shell profile:

gedit $HOME/.profile

Append at the end of the file:

export HIVE_HOME=/usr/local/hadoop/hive

export PATH=$PATH:$HIVE_HOME/bin

export HADOOP_HOME=/usr/local/hadoop

Apply the changes:

source $HOME/.profile

Download the MySQL JDBC driver package: mysql-connector-java-5.1.47.tar.gz

After extracting it, put mysql-connector-java-5.1.47-bin.jar into the $HIVE_HOME/lib directory.
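The extract-and-copy step above can be sketched as a short shell sequence. The block below simulates it with a stand-in tarball built on the spot so it runs anywhere; with the real download, only the final tar/cp pair applies, run from the download directory with HIVE_HOME=/usr/local/hadoop/hive.

```shell
# Simulated extract-and-copy of the JDBC driver jar. The tarball here is a
# stand-in; on a real machine run the tar -zxf and cp lines against the
# downloaded mysql-connector-java-5.1.47.tar.gz.
mkdir -p /tmp/demo/mysql-connector-java-5.1.47 /tmp/demo/hive/lib
touch /tmp/demo/mysql-connector-java-5.1.47/mysql-connector-java-5.1.47-bin.jar
cd /tmp/demo
tar -zcf mysql-connector-java-5.1.47.tar.gz mysql-connector-java-5.1.47

# The actual two steps from the text:
tar -zxf mysql-connector-java-5.1.47.tar.gz
cp mysql-connector-java-5.1.47/mysql-connector-java-5.1.47-bin.jar hive/lib/
ls hive/lib
```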

Start Hadoop's DFS and YARN:

start-dfs.sh

start-yarn.sh

Create the HDFS directories and grant permissions:

cd /usr/local/hadoop/bin

hdfs dfs -mkdir -p /usr/hive/warehouse

hdfs dfs -mkdir -p /usr/hive/tmp

hdfs dfs -mkdir -p /usr/hive/log

hdfs dfs -chmod 777 /usr/hive/warehouse

hdfs dfs -chmod 777 /usr/hive/tmp

hdfs dfs -chmod 777 /usr/hive/log

Modify the Hive configuration:

cd /usr/local/hadoop/hive/conf

cp hive-env.sh.template hive-env.sh

cp hive-default.xml.template hive-site.xml

sudo gedit hive-env.sh

Append at the end:

export JAVA_HOME=/usr/lib/jvm/jdk1.8.0_191

export HADOOP_HOME=/usr/local/hadoop

export HIVE_HOME=/usr/local/hadoop/hive

Open hive-site.xml:

sudo gedit hive-site.xml

 

Enter the following content:

 

<?xml version="1.0" encoding="UTF-8" standalone="no"?>

<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<configuration>

 

 <property>

    <name>datanucleus.schema.autoCreateAll</name>

    <value>true</value>

  </property>

   <property>

        <name>hive.metastore.warehouse.dir</name>

        <value>/usr/hive/warehouse</value>

        <description>location of default database for the warehouse</description>

    </property>

 

    <property>

       <name>javax.jdo.option.ConnectionURL</name>

       <value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true</value>

        <description>JDBC connect string for a JDBC metastore</description>

    </property>

    <property>

        <name>javax.jdo.option.ConnectionDriverName</name>

        <value>com.mysql.jdbc.Driver</value>

        <description>Driver class name for a JDBC metastore</description>

    </property>

    <property>

       <name>javax.jdo.option.ConnectionPassword</name>

       <value>hadoop</value>

    </property>

    <property>

        <name>javax.jdo.option.ConnectionUserName</name>

        <value>hadoop</value>

        <description>Username to use against metastore database</description>

     </property>

</configuration>

Also in hive-site.xml, change every field whose value references system:java.io.tmpdir to /usr/hive/tmp.
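That substitution can be done in one pass with sed. A sketch on a throwaway fragment is below; on a real install, run the same sed against /usr/local/hadoop/hive/conf/hive-site.xml (back it up first).

```shell
# Demonstrate the tmpdir substitution on a minimal stand-in fragment.
cat > /tmp/hive-site-fragment.xml <<'EOF'
<property>
  <name>hive.exec.local.scratchdir</name>
  <value>${system:java.io.tmpdir}/${system:user.name}</value>
</property>
EOF

# Replace every ${system:java.io.tmpdir} reference with /usr/hive/tmp.
sed -i 's|\${system:java\.io\.tmpdir}|/usr/hive/tmp|g' /tmp/hive-site-fragment.xml
cat /tmp/hive-site-fragment.xml
```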

After starting Hadoop, the first attempt to start Hive failed with: exception in thread "main" java.lang.RuntimeException: java.lang.IllegalArgumentException: java.net. The fix was to download the MySQL driver and place mysql-connector-java-5.1.23-bin.jar under hive/lib.

 

 

 

  • Running

Prepare a file of student grades, students.txt.

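The original post shows the sample file only as a screenshot, so its exact contents are not recoverable. A hypothetical space-delimited file matching the schema used below (id, name, age, score) could look like this:

```shell
# Hypothetical sample data; the names and scores are illustrative only.
# Columns: id name age score, space-delimited, one student per line.
cat > /tmp/students.txt <<'EOF'
1 Alice 20 85
2 Bob 21 90
3 Carol 20 78
EOF
cat /tmp/students.txt
```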

Upload students.txt to HDFS:

cd /usr/local/hadoop/bin

hdfs dfs -put /usr/local/hadoop/input/students.txt /usr/hive/tmp

Create a table in Hive:

hive> create table st4(id int, name string, age int, score int) row format delimited
fields terminated by ' ' lines terminated by '\n';


Load the data from HDFS into the st4 table:

hive> load data inpath '/usr/hive/tmp/students.txt' into table st4;


Sort by score in descending order:

hive> select * from st4 order by score desc;


Compute the class average score:

hive> select avg(score) from st4;
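As a local sanity check, the descending sort and the average can be reproduced with sort and awk on a hypothetical sample file (score in the 4th space-delimited field); the data here is illustrative, not the original screenshot's.

```shell
# Cross-check the two Hive queries locally on hypothetical sample data.
printf '1 Alice 20 85\n2 Bob 21 90\n3 Carol 20 78\n' > /tmp/st4_sample.txt

# Descending sort by score (4th field), like ORDER BY score DESC:
sort -k4,4nr /tmp/st4_sample.txt

# Class average, like AVG(score):
awk '{ sum += $4; n++ } END { printf "%.2f\n", sum / n }' /tmp/st4_sample.txt
# prints 84.33 for this sample
```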

