1. Install MySQL via RPM
Start MySQL: /etc/init.d/mysql start
Add it to system startup: /sbin/chkconfig --add mysql
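For reference, a minimal install-and-start sequence might look like the following (the RPM file names are only illustrative; use the packages you actually downloaded):
rpm -ivh MySQL-server-*.rpm MySQL-client-*.rpm   # install server and client
/etc/init.d/mysql start                          # start the server
/sbin/chkconfig --add mysql                      # register it as a system service
chkconfig --list mysql                           # confirm which runlevels it starts in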
create database hive;
grant all on hive.* to hive@'%' identified by 'hive';
GRANT ALL PRIVILEGES ON hive.* TO 'hive'@'localhost' IDENTIFIED BY 'hive' WITH GRANT OPTION;
Note: pay attention to whether localhost or the machine's actual hostname is needed here (see error 1 below).
flush privileges;
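A quick sanity check that the user and its grants actually exist (a minimal sketch; assumes the mysql client is on the PATH):
mysql -uroot -p -e "SELECT user, host FROM mysql.user WHERE user='hive';"
mysql -uhive -phive -e "SHOW GRANTS;"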
Hive configuration (only needs to be done on the master host)
Settings in /etc/profile (the HIVE_HOME line and the HIVE_HOME/bin entry in PATH are the parts added for Hive):
JAVA_HOME=/usr/java/jdk1.6.0_30
HADOOP_HOME=/opt/hadoop13/hadoop-1.0.3
HIVE_HOME=/opt/hadoop13/hive-0.9.0
PATH=$JAVA_HOME/bin:$HADOOP_HOME/bin:$PATH:$HIVE_HOME/bin
CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
export JAVA_HOME HADOOP_HOME HIVE_HOME
export HBASE_HEAPSIZE=128
export HBASE_MANAGES_ZK=false
export PATH
export CLASSPATH
source /etc/profile
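To confirm the variables took effect in the current shell (a quick check, nothing more):
echo $JAVA_HOME $HADOOP_HOME $HIVE_HOME
which hive        # should resolve to $HIVE_HOME/bin/hive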
Migrating the metastore to MySQL
Install MySQL on the master host first; it will store the Hive metadata.
MySQL driver
Download the JDBC driver jar mysql-connector-java-5.1.18-bin.jar and put it into Hive's lib directory.
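Assuming the jar was downloaded to the current directory and HIVE_HOME is set as above, something like:
cp mysql-connector-java-5.1.18-bin.jar $HIVE_HOME/lib/
ls $HIVE_HOME/lib/ | grep mysql-connector      # verify the driver is in place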
Create the user
In MySQL, create a user named hive with password hive.
In MySQL, create the metastore database: hive
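These are the same steps as the SQL shown in section 1; to confirm that the hive user can actually see the metastore database, a quick check:
mysql -uhive -phive -e "SHOW DATABASES;"       # the hive database should be listed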
Modify the Hive database connection
In Hive's conf directory, create a hive-site.xml configuration file (adjust the database connection details to your own environment):
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
<property>
<name>hive.metastore.local</name>
<value>true</value>
</property>
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true</value>
</property>
<property>
<name>javax.jdo.option.ConnectionDriverName</name>
<value>com.mysql.jdbc.Driver</value>
</property>
<property>
<name>javax.jdo.option.ConnectionUserName</name>
<value>hive</value>
</property>
<property>
<name>javax.jdo.option.ConnectionPassword</name>
<value>hive</value>
</property>
<property>
<name>datanucleus.fixedDatastore</name>
<value>false</value>
</property>
</configuration>
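Before starting Hive, it is worth testing the same connection parameters with the plain mysql client (a sketch using the host, port, user, and password from the file above; adjust if yours differ):
mysql -h localhost -P 3306 -uhive -phive hive -e "SELECT 1;"
If this fails, fix the MySQL side first; Hive will not get any further than this.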
Starting and running Hive
cd $HIVE_HOME/bin
./hive
List the tables:
show tables;
OK, the configuration is done!
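A simple smoke test to confirm the metadata really lands in MySQL (smoke_test is just a throwaway table name; TBLS is the table Hive creates in the metastore database when the schema is auto-created):
hive -e "CREATE TABLE IF NOT EXISTS smoke_test (id INT); SHOW TABLES;"
mysql -uhive -phive hive -e "SELECT TBL_NAME FROM TBLS;"   # smoke_test should appear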
Possible errors
1. FAILED: Error in metadata: javax.jdo.JDOFatalDataStoreException: Access denied for user 'root'@'master' (using password: YES)
NestedThrowables:
java.sql.SQLException: Access denied for user 'root'@'master' (using password: YES)
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
This one was puzzling: running mysql -uroot -p from the terminal logs in just fine.
Looking at hive-site.xml, the MySQL username and password are configured as:
<property>
<name>javax.jdo.option.ConnectionUserName</name>
<value>hive</value>
<description>username to use against metastore database</description>
</property>
<property>
<name>javax.jdo.option.ConnectionPassword</name>
<value>hive</value>
<description>password to use against metastore database</description>
</property>
That is correct: my username is hive and the password is hive.
So the next thing to check is whether the privileges were granted properly.
Log in to MySQL: mysql -uhive -p
Enter the password.
Then, reviewing the grants, I found the problem: the privilege had been granted as
GRANT ALL PRIVILEGES ON hive.* TO 'hive'@'localhost' IDENTIFIED BY 'hive' WITH GRANT OPTION;
but the local hostname is master, so change it to
GRANT ALL PRIVILEGES ON hive.* TO 'hive'@'master' IDENTIFIED BY 'hive' WITH GRANT OPTION;
Test again and it passes. OK.
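When this comes up, it helps to compare the machine's hostname with the host values MySQL actually has for the user (a quick check, assuming root access to MySQL):
hostname                                                        # e.g. master
mysql -uroot -p -e "SELECT user, host FROM mysql.user WHERE user='hive';"
If the hostname is not listed, add a grant for it as shown above and run FLUSH PRIVILEGES.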
2. FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Error creating transactional connection factory
NestedThrowables:
java.lang.reflect.InvocationTargetException
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
The cause of this error is that Hive's bundled lib directory does not contain the MySQL JDBC jar. I used mysql-connector-java-5.1.18-bin.jar and copied it into HIVE_HOME/lib.
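A quick way to confirm the driver is in place before retrying (assumes HIVE_HOME is set as above):
ls $HIVE_HOME/lib/ | grep mysql-connector
Then restart the hive CLI so the new jar is picked up.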
3. FAILED: Error in metadata: javax.jdo.JDOFatalDataStoreException: The connection property 'useUnicode' only accepts values of the form: 'true', 'false', 'yes' or 'no'. The value 'true;characterEncoding=UTF-8;createDatabaseIfNotExist=true' is not in this set.
NestedThrowables:
java.sql.SQLException: The connection property 'useUnicode' only accepts values of the form: 'true', 'false', 'yes' or 'no'. The value 'true;characterEncoding=UTF-8;createDatabaseIfNotExist=true' is not in this set.
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
The message is clear enough: in a JDBC URL the parameters must be separated with '&', not ';' (and inside an XML file '&' has to be written as '&amp;'). Find this in hive-site.xml:
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://localhost:3306/hive?useUnicode=true;characterEncoding=UTF-8;createDatabaseIfNotExist=true</value>
<description>JDBC connect string for a JDBC metastore</description>
</property>
and change it to the following (here the extra parameters are simply dropped):
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://210.51.7.31:3306/hive?createDatabaseIfNotExist=true</value>
<description>JDBC connect string for a JDBC metastore</description>
</property>
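To be sure the host in the corrected URL is reachable with the same credentials (values taken from the URL above; adjust to your own host):
mysql -h 210.51.7.31 -P 3306 -uhive -phive hive -e "SELECT 1;"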
4. During the installation, the first attempt to start Hive failed. The cause (found online) and the fix are as follows.
The error was:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hive/conf/HiveConf
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:247)
at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
at java.lang.ClassLoader.loadClass(ClassLoader.java:252)
at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:320)
The fix: when editing conf/hadoop-env.sh in the Hadoop directory during the Hadoop installation, the HADOOP_CLASSPATH line that was added had overwritten the existing value. Change it to the following form:
HADOOP_CLASSPATH=$HADOOP_CLASSPATH:....
The '$HADOOP_CLASSPATH:' prefix is the part that was added. Problem solved.
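In other words, in hadoop-env.sh the variable should be appended to, not replaced (the path here is only an illustration):
# Wrong: this overwrites whatever was already on the classpath
# export HADOOP_CLASSPATH=/path/to/extra/jars
# Right: keep the existing value and append to it
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/path/to/extra/jars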