Install Hive on Fedora


1. Edit /etc/profile

export JAVA_HOME=/home/[email protected]/software/jdk1.8.0_121

PATH=$PATH:$JAVA_HOME/bin
export SCALA_HOME=/usr/share/scala
export CLASSPATH=.:$JAVA_HOME/lib:$JAVA_HOME/lib/tools.jar:$JAVA_HOME/lib/dt.jar
export PATH

export HADOOP_HOME=/home/[email protected]/software/hadoop-2.7.0
export HIVE_HOME=/home/[email protected]/software/hive-0.10.0
export HADOOP_CONF_DIR=${HADOOP_HOME}/etc/hadoop


export PATH=$PATH:$HBASE_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$HIVE_HOME/bin
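Since /etc/profile can be sourced more than once, an append guard keeps PATH from growing on every login. This is an optional sketch; `path_append` is a hypothetical helper name, not part of the guide's setup:

```shell
# Optional sketch: idempotent PATH handling for /etc/profile.
# path_append is a hypothetical helper (not from this guide).
path_append() {
    case ":$PATH:" in
        *":$1:"*) : ;;            # already on PATH: do nothing
        *) PATH="$PATH:$1" ;;
    esac
}
path_append "$JAVA_HOME/bin"
path_append "$HADOOP_HOME/bin"
path_append "$HADOOP_HOME/sbin"
path_append "$HIVE_HOME/bin"
export PATH
```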


2. Copy the configuration templates

cp hive-default.xml.template hive-default.xml

cp hive-default.xml.template hive-site.xml

cp hive-exec-log4j2.properties.template hive-exec-log4j2.properties

cp hive-log4j2.properties.template hive-log4j2.properties

cp beeline-log4j2.properties.template beeline-log4j2.properties


That is: copy each file with the .template suffix to a configuration file without the suffix. Note that hive-default.xml.template must be copied twice: once as hive-default.xml and once as hive-site.xml. hive-site.xml holds user-defined configuration, while hive-default.xml holds the global defaults; when Hive starts, matching entries in hive-site.xml override those in hive-default.xml.
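The copy steps above can be wrapped in a small loop. `copy_hive_templates` is a hypothetical helper (in this guide the argument would be $HIVE_HOME/conf), and `cp -n` avoids clobbering a config you have already edited:

```shell
# Sketch: copy every *.template in a conf directory to its non-template
# name, plus the extra hive-site.xml copy. Hypothetical helper; in this
# guide the argument would be "$HIVE_HOME/conf".
copy_hive_templates() {
    local conf_dir="$1" f
    for f in "$conf_dir"/*.template; do
        [ -e "$f" ] || continue          # no templates: nothing to do
        cp -n "$f" "${f%.template}"      # -n: keep an existing config
    done
    # hive-default.xml.template is copied a second time as hive-site.xml
    if [ -e "$conf_dir/hive-default.xml.template" ]; then
        cp -n "$conf_dir/hive-default.xml.template" "$conf_dir/hive-site.xml"
    fi
}
```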

Below is hive-site.xml:


<configuration>

    <property>
        <name>hive.metastore.local</name>
        <value>true</value>
    </property>
    <property>
        <name>javax.jdo.option.ConnectionURL</name>
        <value>jdbc:mysql://127.0.0.1:3306/hive?characterEncoding=UTF-8</value>
    </property>
    
    <property>
        <name>javax.jdo.option.ConnectionDriverName</name>
        <value>com.mysql.jdbc.Driver</value>
    </property>
    
    <property>
        <name>javax.jdo.option.ConnectionUserName</name>
        <value>root</value>
    </property>
    
    <property>
        <name>javax.jdo.option.ConnectionPassword</name>
        <value>Hrs12345</value>
    </property>
    
   
    <property>
        <name>hive.exec.scratchdir</name>
        <value>/tmp/hive</value>                  <!-- on HDFS -->
    </property>
    
    <property>
        <name>hive.exec.local.scratchdir</name>
        <value>/home/[email protected]/hive/tmp</value>  //in linux fs
    </property>


    <property>
        <name>hive.downloaded.resources.dir</name>
        <value>/home/[email protected]/hive/tmp/${hive.session.id}_resources</value> //in linux
    </property>


    <property>
        <name>hive.metastore.warehouse.dir</name>
        <value>/user/hive/warehouse</value>     <!-- on HDFS -->
    </property>
</configuration>


Note: several parameters in the configuration above point at directories; create these directories in advance:

hive.exec.local.scratchdir
hive.downloaded.resources.dir
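Creating them can be scripted. The local base path below mirrors the values used in this guide's hive-site.xml (adjust it for your own user); the HDFS-side commands are shown as comments because they need a running cluster:

```shell
# Sketch: pre-create the local scratch/resources directory referenced in
# hive-site.xml. HIVE_LOCAL_TMP is an assumed variable; it defaults to a
# path mirroring this guide's layout.
HIVE_LOCAL_TMP="${HIVE_LOCAL_TMP:-$HOME/hive/tmp}"
mkdir -p "$HIVE_LOCAL_TMP"

# The HDFS directories (hive.exec.scratchdir and the warehouse dir) are
# created on the cluster side, e.g.:
#   hadoop fs -mkdir -p /tmp/hive /user/hive/warehouse
#   hadoop fs -chmod g+w /tmp/hive /user/hive/warehouse
```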




3. Test and verify
Copy mysql-connector.jar into Hive's lib directory (under libexec).
First create the hive database in MySQL:

mysql> create database hive;
mysql> alter database hive character set latin1;

a) Create-table test

hive> create table test(id int);
hive> drop table track_log;
create table track_log (
id                         string ,
url                        string ,
referer                    string ,
keyword                    string ,
type                       string ,
guid                       string ,
pageId                     string ,
moduleId                   string ,
linkId                     string ,
attachedInfo               string ,
sessionId                  string ,
trackerU                   string ,
trackerType                string ,
ip                         string ,
trackerSrc                 string ,
cookie                     string ,
orderCode                  string ,
trackTime                  string ,
endUserId                  string ,
firstLink                  string ,
sessionViewNo              string ,
productId                  string ,
curMerchantId              string ,
provinceId                 string ,
cityId                     string )  
PARTITIONED BY (ds string,hour string)  

ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t';

Load the data:

jimmy$ hive -e "LOAD DATA LOCAL INPATH '/Users/jimmy/Downloads/download/pv_UV/2015082818' OVERWRITE INTO TABLE track_log PARTITION (ds='20150828',hour='18');"
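The load command can be parameterized for other partitions. `build_load_sql` is a hypothetical helper that just assembles the same statement for a given path, date, and hour:

```shell
# Sketch: build the LOAD DATA statement for an arbitrary partition.
# build_load_sql is a hypothetical helper, not a Hive feature.
build_load_sql() {
    local path="$1" ds="$2" hour="$3"
    printf "LOAD DATA LOCAL INPATH '%s' OVERWRITE INTO TABLE track_log PARTITION (ds='%s',hour='%s');" \
        "$path" "$ds" "$hour"
}

# Usage:
#   hive -e "$(build_load_sql /Users/jimmy/Downloads/download/pv_UV/2015082818 20150828 18)"
```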




