Installing Hive on CentOS 6.5 and Debugging Common Errors


Prerequisites: Hadoop is already installed. Here I use hadoop-2.6.0 in single-node pseudo-distributed mode, and Hive is installed under the hadoop user.

Part 1: Install and configure MySQL

1. Make sure MySQL is installed on the system. Try starting the MySQL service first (normally as root):
  service mysqld start
  If it prints "mysqld: unrecognized service", MySQL is not installed. Here I install it with yum:
  yum -y install mysql-server mysql mysql-devel 
  This installs mysql-server, mysql, and mysql-devel; when the output ends with "Complete!", the installation is done.
Detailed walkthrough: http://jingyan.baidu.com/article/c74d600079be530f6a595dc3.html
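
  A quick way to double-check the result (a minimal sketch; the package and service names are the CentOS 6 defaults used above):

  # confirm the three packages are present and make mysqld start on boot
  rpm -q mysql-server mysql mysql-devel
  chkconfig mysqld on
  service mysqld status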

2. Log in to MySQL. Make sure mysqld is running first, otherwise you will see:

  ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/lib/mysql/mysql.sock' (2)

  mysql -uroot -p   (the root password is empty by default)

3. Set up the Hive metastore database, create the hive user, and grant it network access. In the MySQL prompt:
  insert into mysql.user(Host,User,Password) values("hadoop","hive",password("hive"));
  create database hive;
  grant all on hive.* to hive@'%'  identified by 'hive';
  grant all on hive.* to hive@'localhost'  identified by 'hive';
  grant all on hive.* to hive@'hadoop'  identified by 'hive';
  flush privileges;
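
  To confirm the account and its privileges took effect, a quick check from the root account can look like this (the host name "hadoop" is assumed to be this machine's hostname, as in the statements above):

  mysql -uroot -p -e "SELECT Host, User FROM mysql.user WHERE User='hive';"
  mysql -uroot -p -e "SHOW GRANTS FOR 'hive'@'localhost';"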


4. Verify by logging in as the new hive user and running a few quick commands:
  mysql -uhive -p
  show databases;
  use hive;
  create table test (name int);
  show tables;
  drop table test;


Part 2: Install and configure Hive
1. Copy the Hive tarball into /home/hadoop and unpack it there:
  tar -zxvf apache-hive-1.2.1-bin.tar.gz 
  mv apache-hive-1.2.1-bin hive
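
  If the unpack and rename worked, the usual Hive layout should now be visible (a quick sanity check, nothing more):

  ls /home/hadoop/hive        # expect bin/ conf/ lib/ among others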

2. Configure environment variables: as the hadoop user, edit .bash_profile and add:
  CLASSPATH=.:$JAVA_HOME/lib:$HADOOP_HOME/lib:/home/hadoop/hive/lib
  HIVE_HOME=/home/hadoop/hive
  PATH=$PATH:$HIVE_HOME/bin
  export CLASSPATH HIVE_HOME PATH

  After editing, reload the file and verify as shown below: if echo $HIVE_HOME prints /home/hadoop/hive, the configuration is correct.
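
  A minimal check, assuming the lines above were added to ~/.bash_profile:

  source ~/.bash_profile
  echo $HIVE_HOME             # expect /home/hadoop/hive
  which hive                  # expect /home/hadoop/hive/bin/hive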

3. Configure hive-site.xml and hive-env.sh. Neither file exists out of the box, so copy both from the bundled templates:
  cd /home/hadoop/hive/conf
  cp hive-default.xml.template hive-site.xml
  cp hive-env.sh.template hive-env.sh


  vim hive-site.xml   (in vim, typing /javax/ jumps straight to the relevant properties; set the four below)
   
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hive</value>
    <description>password to use against metastore database</description>
  </property>
   
   <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true</value>
    <description>JDBC connect string for a JDBC metastore</description>
   </property>

   <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hive</value>
    <description>Username to use against metastore database</description>
   </property>
  
   <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
    <description>Driver class name for a JDBC metastore</description>
  </property>

  vim hive-env.sh   (edit the copy created above, not the .template; Hive only reads hive-env.sh)

  # Set HADOOP_HOME to point to a specific hadoop install directory
  HADOOP_HOME=/home/hadoop/hadoop/hadoop-2.6.0
  
  # Hive Configuration Directory can be controlled by:
  export HIVE_CONF_DIR=/home/hadoop/hive/conf
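
  Before moving on, it is worth grepping the two files to confirm the edits landed (a simple check, assuming the paths configured above):

  grep -A1 "javax.jdo.option.Connection" /home/hadoop/hive/conf/hive-site.xml | grep "<value>"
  grep -E "HADOOP_HOME|HIVE_CONF_DIR" /home/hadoop/hive/conf/hive-env.sh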


4. Copy mysql-connector-java-5.1.22-bin.jar (the MySQL JDBC driver) into /home/hadoop/hive/lib; without it, Hive will throw an error when it tries to reach the metastore.
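
  A minimal sketch of this step, assuming the connector jar sits in the current directory; the schematool call (bundled with Hive 1.2.1) is optional and pre-creates the metastore tables in MySQL instead of letting Hive create them on first use:

  cp mysql-connector-java-5.1.22-bin.jar /home/hadoop/hive/lib/
  /home/hadoop/hive/bin/schematool -dbType mysql -initSchema   # optional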


5. Troubleshooting 1: starting Hive fails with

  Exception in thread "main" java.lang.RuntimeException: java.lang.IllegalArgumentException:
  java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir}/${system:user.name}


  Fix:
  Create a local temp directory under the Hive install (iotmp is used here), then point the following hive-site.xml parameters at it:
  <property>
    <name>hive.querylog.location</name>
    <value>/home/hadoop/hive/iotmp</value>
    <description>Location of Hive run time structured log file</description>
  </property>
  
  <property>
    <name>hive.exec.local.scratchdir</name>
    <value>/home/hadoop/hive/iotmp</value>
    <description>Local scratch space for Hive jobs</description>
  </property>
  
  <property>
    <name>hive.downloaded.resources.dir</name>
    <value>/home/hadoop/hive/iotmp</value>
    <description>Temporary local directory for added resources in the remote file system.</description>
  </property>
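
  Creating the directory the three properties point at (run as the hadoop user; the chmod just makes the permissions explicit):

  mkdir -p /home/hadoop/hive/iotmp
  chmod 755 /home/hadoop/hive/iotmp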



Troubleshooting 2: starting Hive fails with [ERROR] Terminal initialization failed; falling back to unsupported

   java.lang.IncompatibleClassChangeError: Found class jline.Terminal, but interface was expected


   Fix:
   Hadoop 2.6.0 ships an older jline (0.9.94) that shadows the jline 2.12 bundled with Hive, so copy Hive's newer jline jar into Hadoop's YARN lib directory:
   cp /home/hadoop/hive/lib/jline-2.12.jar /home/hadoop/hadoop-2.6.0/share/hadoop/yarn/lib
   Afterwards the directory should look like this (the old jar is renamed out of the way):
    -rw-r--r-- 1 root root   87325 Mar 10 18:10 jline-0.9.94.jar.bak
    -rw-r--r-- 1 root root  213854 Mar 11 22:22 jline-2.12.jar
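
  Consolidating the fix above into commands (paths assume the layout used in this article; adjust the Hadoop directory to your own install):

  cd /home/hadoop/hadoop-2.6.0/share/hadoop/yarn/lib
  mv jline-0.9.94.jar jline-0.9.94.jar.bak          # park the old jline, as in the listing above
  cp /home/hadoop/hive/lib/jline-2.12.jar .

  Another workaround sometimes used is exporting HADOOP_USER_CLASSPATH_FIRST=true before starting Hive, so that Hive's bundled jline is preferred on the classpath.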


6. Start Hadoop. Since I am running pseudo-distributed mode, cd into the hadoop-2.6.0 directory first:
  sbin/start-all.sh
  
  After that, start Hive:
  hive
  or: hive --hiveconf hive.root.logger=DEBUG,console     (print debug output to the console)
  or: hive --hiveconf hive.cli.print.current.db=true     (show the current database in the CLI prompt)
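
  Once the daemons are up, a quick end-to-end smoke test can look like this (the table name is arbitrary):

  jps     # expect NameNode, DataNode, SecondaryNameNode, ResourceManager, NodeManager
  hive -e "show databases; create table smoke_test (id int); show tables; drop table smoke_test;"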
