Installing the Hive Database

Published: 2021-08-20 22:26:00 · Source: 亿速云 · Author: chen · Category: Big Data

This article introduces how to install the Hive database. Many people run into questions when setting up Hive, so the steps below walk through a simple, working installation procedure from start to finish.

======1. Installing the Hive Database======
1. A working Hadoop environment must already be installed (a quick sanity check is shown below).
2. Install MySQL to store Hive's metadata. By default the metastore uses embedded Derby, which supports only a single connection and is suitable only for testing; use MySQL in a real deployment.
3. The environment used here is CentOS 6.5 with IP 192.168.0.12.
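
Before continuing, it can help to confirm that the existing Hadoop installation is usable. The commands below are only a quick sanity check and assume Hadoop 2.7.4 is installed under /data/hadoop/hadoop-2.7.4 and runs as the hadoop user, as elsewhere in this article.

su - hadoop
# check that Java and Hadoop resolve on the PATH
java -version
hadoop version
# list the running Hadoop daemons (NameNode, DataNode, etc. should appear)
jps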

======2. Installing MySQL to Store the Hive Metadata======

yum install mysql-server
service mysqld start

mysql -uroot -p
-- inside the mysql client:
create database hive;
update mysql.user set password=PASSWORD('root') where User='root';
flush privileges;
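
The steps above let Hive connect to MySQL as root. As an optional alternative (the hive account name and password below are placeholders, not part of the original setup), a dedicated MySQL user can be created for Hive; if you use it, the ConnectionUserName and ConnectionPassword values set in hive-site.xml later must be changed to match.

mysql -uroot -proot
-- inside the mysql client (placeholder account):
create user 'hive'@'localhost' identified by 'hive';
grant all privileges on hive.* to 'hive'@'localhost';
flush privileges;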



======3. Installing Hive======
Hive requires a Java environment, which was already configured as part of the Hadoop setup above.

cd /data/hadoop
wget -c http://114.242.101.2:808/hive/apache-hive-2.3.2-bin.tar.gz
tar xf apache-hive-2.3.2-bin.tar.gz
mv apache-hive-2.3.2-bin hive
chown -R hadoop:hadoop hive



Set the Hive environment variables (the Hadoop variables were already set earlier):
vim /etc/profile
#hive
export HIVE_HOME=/data/hadoop/hive
export PATH=$HIVE_HOME/bin:$PATH

source /etc/profile
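
To confirm the new environment variables took effect, check that the hive command resolves to this installation; the version shown is based on the 2.3.2 tarball downloaded above.

which hive          # should print /data/hadoop/hive/bin/hive
hive --version      # should report Hive 2.3.2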




======4. Editing the Hive Configuration File======

su - hadoop
cd /data/hadoop/hive/conf
mv hive-default.xml.template hive-site.xml
Clear everything between <configuration> and </configuration> in the file and add the following:
<property>
        <name>javax.jdo.option.ConnectionURL</name>
        <value>jdbc:mysql://127.0.0.1:3306/hive?characterEncoding=UTF-8</value>
</property>
<property>
        <name>javax.jdo.option.ConnectionDriverName</name>
        <value>com.mysql.jdbc.Driver</value>
</property>
<property>
        <name>javax.jdo.option.ConnectionUserName</name>
        <value>root</value>
</property>
<property>
        <name>javax.jdo.option.ConnectionPassword</name>
        <value>root</value>
</property>
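
The template file is several thousand lines long, so instead of trimming it by hand, one alternative sketch is to write a minimal hive-site.xml directly with a here-document; it contains exactly the same four properties listed above.

cd /data/hadoop/hive/conf
cat > hive-site.xml <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
        <property>
                <name>javax.jdo.option.ConnectionURL</name>
                <value>jdbc:mysql://127.0.0.1:3306/hive?characterEncoding=UTF-8</value>
        </property>
        <property>
                <name>javax.jdo.option.ConnectionDriverName</name>
                <value>com.mysql.jdbc.Driver</value>
        </property>
        <property>
                <name>javax.jdo.option.ConnectionUserName</name>
                <value>root</value>
        </property>
        <property>
                <name>javax.jdo.option.ConnectionPassword</name>
                <value>root</value>
        </property>
</configuration>
EOF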

Place the MySQL JDBC driver in Hive's lib directory:
cd /data/hadoop/hive/lib/
wget -c http://114.242.101.2:808/hive/mysql-connector-java-5.1.44-bin.jar
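
A quick check that the driver jar is in place (the filename matches the 5.1.44 connector downloaded above):

ls -l /data/hadoop/hive/lib/mysql-connector-java-5.1.44-bin.jar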

======5. Hive's Default Storage Paths on HDFS======
From the official documentation:
Hive uses Hadoop, so:

    you must have Hadoop in your path OR
    export HADOOP_HOME=<hadoop-install-dir>

In addition, you must use below HDFS commands to create
 /tmp and /user/hive/warehouse (aka hive.metastore.warehouse.dir)
and set them chmod g+w before you can create a table in Hive.

su - hadoop
cd /data/hadoop/hadoop-2.7.4
./bin/hadoop fs -mkdir       /tmp
./bin/hadoop fs -mkdir  -p   /user/hive/warehouse
./bin/hadoop fs -chmod g+w   /tmp
./bin/hadoop fs -chmod g+w   /user/hive/warehouse
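
To verify that both directories exist and are group-writable, list them directly; the expected permissions are noted as a comment.

./bin/hadoop fs -ls -d /tmp /user/hive/warehouse
# both entries should show group write permission, e.g. drwxrwxr-x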

======6. Running Hive======
Output like the following indicates that Hive started successfully:
[hadoop@localhost hadoop]$ hive
which: no hbase in (/data/hadoop/hadoop-2.7.4/bin:/data/hadoop/hive/bin:/usr/lib64/qt-3.3/bin:/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/sbin:/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.151-1.b12.el6_9.x86_64/bin:/home/hadoop/bin)
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/data/hadoop/hive/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/data/hadoop/hadoop-2.7.4/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]

Logging initialized using configuration in jar:file:/data/hadoop/hive/lib/hive-common-2.3.2.jar!/hive-log4j2.properties Async: true
Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.
hive>



======7. Initializing the Hive Metastore Schema======
su - hadoop
schematool -initSchema -dbType mysql
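
If initialization succeeded, the metastore tables now exist in the MySQL hive database. Two ways to confirm, assuming the root/root credentials configured earlier:

schematool -dbType mysql -info
mysql -uroot -proot -e 'use hive; show tables;'
# expect metastore tables such as DBS, TBLS and COLUMNS_V2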


======8. Running Hive Commands======
hive> CREATE TABLE pokes (foo INT, bar STRING);
hive> CREATE TABLE invites (foo INT, bar STRING) PARTITIONED BY (ds STRING);

hive> SHOW TABLES;
hive> SHOW TABLES '.*s';
hive> DESCRIBE invites;

hive> ALTER TABLE events RENAME TO 3koobecaf;
hive> ALTER TABLE pokes ADD COLUMNS (new_col INT);
hive> ALTER TABLE invites ADD COLUMNS (new_col2 INT COMMENT 'a comment');
hive> ALTER TABLE invites REPLACE COLUMNS (foo INT, bar STRING, baz INT COMMENT 'baz replaces new_col2');
hive> ALTER TABLE invites REPLACE COLUMNS (foo INT COMMENT 'only keep the first column');

hive> DROP TABLE pokes;
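
The same statements can also be run non-interactively from the shell, which is handy for scripting; the /tmp/demo.sql path below is just an example.

hive -e 'show tables;'
echo 'describe invites;' > /tmp/demo.sql
hive -f /tmp/demo.sql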

This concludes the walkthrough of installing the Hive database. Pairing the theory above with hands-on practice is the best way to learn, so try these steps yourself.


