The installation tutorials I found online were honestly disheartening; most of them failed for me.
Here are the write-ups that actually worked in my own experiments, for reference:
Top recommendation, 给力星: http://www.powerxing.com/install-hadoop/ I followed this post to the letter for Hadoop + Spark. Five-star recommendation.
For Hive, I went through far too many articles and none of them worked; I still don't know whether I did something wrong or the guides themselves were off.
In the end I stumbled on the book *Programming Hive*, and its installation walkthrough is the one that succeeded.
root@iZ254fu6ocuZ:/usr/local/hadoop/hive# hive
hive>
root@iZ254fu6ocuZ:/usr/local/hadoop/hive# jps
8100 ResourceManager
8533 JobHistoryServer
7709 SecondaryNameNode
18406 Jps
7514 DataNode
7410 NameNode
8204 NodeManager
root@iZ254fu6ocuZ:/usr/local/hadoop/hive#
root@iZ254fu6ocuZ:/usr/local/spark# ./bin/spark-shell
scala>
There are plenty of documents out there already, so I'll just share the pitfalls from my first install:
The first thing I have to mention is environment variables; I stumbled over them a lot. For example, the path is clearly /usr/local but I typed /usr/lacol, and the services then failed at startup with baffling errors. Anyone reading this, double-check these; they are nasty.
Also, in many config files posted online some values (the hostname, for instance) must be changed to your own machine's; don't paste them in and leave the parameters unmodified.
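To make the environment-variable pitfall concrete, here is a minimal sketch of the exports implied by the directory layout used throughout this post (/usr/local/hadoop, /usr/local/hadoop/hive, /usr/local/spark). The variable names are the conventional ones, not taken from any specific guide; echoing them back after sourcing ~/.bashrc is a cheap way to catch a typo like /usr/lacol before any service starts:

```shell
# Assumed layout from this post; adjust paths to your own install.
export HADOOP_HOME=/usr/local/hadoop
export HIVE_HOME=/usr/local/hadoop/hive
export SPARK_HOME=/usr/local/spark
export PATH="$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$HIVE_HOME/bin:$SPARK_HOME/bin"

# Visually confirm there is no typo before starting any daemons.
echo "$HADOOP_HOME"
echo "$HIVE_HOME"
echo "$SPARK_HOME"
```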
Let me also share the links to some of the packages I used:
wget http://www.eu.apache.org/dist/hive/hive-1.1.1/apache-hive-1.1.1-bin.tar.gz          # Hive
wget http://dev.mysql.com/get/Downloads/Connector-J/mysql-connector-java-5.1.39.tar.gz   # JDBC driver Hive depends on to connect to MySQL
wget http://archive.apache.org/dist/spark/spark-1.6.0/spark-1.6.0-bin-without-hadoop.tgz # Spark
wget http://mirrors.cnnic.cn/apache/hadoop/common/hadoop-2.7.1/hadoop-2.7.1.tar.gz       # Hadoop
Hive in local mode (tested and working; the Hive package is 1.1.1):
It mainly comes down to one config file. Local mode needs nothing especially complicated; the settings below are enough.
The point is to keep the metastore from being created in a different directory each time you run the hive command.
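Why pin databaseName to an absolute path? A rough sketch of the failure mode, simulated here with plain directories rather than a real Derby database (the /tmp paths are made up for illustration): with the default relative URL jdbc:derby:;databaseName=metastore_db, embedded Derby creates metastore_db under whatever directory hive was launched from, so each working directory silently gets its own empty metastore.

```shell
# Hypothetical demonstration, no real Derby involved: mkdir stands in
# for the metastore_db directory that embedded Derby would create in
# the current working directory of each hive session.
mkdir -p /tmp/metastore-demo/dirA /tmp/metastore-demo/dirB
(cd /tmp/metastore-demo/dirA && mkdir -p metastore_db)   # "hive" started from dirA
(cd /tmp/metastore-demo/dirB && mkdir -p metastore_db)   # "hive" started from dirB
ls -d /tmp/metastore-demo/*/metastore_db                 # two separate, unrelated metastores
```

With the absolute databaseName in the config below, both sessions would instead share /usr/local/hadoop/hive/metastore_db.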
root@iZ254fu6ocuZ:/usr/local/hadoop/hive/conf# cat hive-site.xml
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?><!--
Licensed to the Apache Software Foundation (ASF) under one or more
contributor license agreements. See the NOTICE file distributed with
this work for additional information regarding copyright ownership.
The ASF licenses this file to You under the Apache License, Version 2.0
(the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<configuration>
<property>
<name>hive.metastore.warehouse.dir</name>
<value>/usr/local/hadoop/hive</value>
</property>
<property>
<name>javax.jdo.option.ConnectionURL</name>
<!-- Embedded Derby: create=true auto-creates the database, named metastore_db -->
<value>jdbc:derby:;databaseName=/usr/local/hadoop/hive/metastore_db;create=true</value>
<!-- Client/server Derby: hadoopor is the database name, 192.168.0.3 is the Derby server's IP address, and 4567 is its port -->
<!--<value>jdbc:derby://192.168.0.3:4567/hadoopor;create=true</value>-->
<description>JDBC connect string for a JDBC metastore</description>
</property>
<property>
<name>javax.jdo.option.ConnectionDriverName</name>
<value>org.apache.derby.jdbc.EmbeddedDriver</value>
<!--<value>org.apache.derby.jdbc.ClientDriver</value>-->
<description>Driver class name for a JDBC metastore</description>
</property>
</configuration>
root@iZ254fu6ocuZ:/usr/local/hadoop/hive/conf#
Managing the metastore over JDBC
This requires a server running MySQL; I set up a separate machine for testing.
hadoop@ubuntu:~$ mysql -uroot -pmysql
mysql> CREATE USER 'hive' IDENTIFIED BY 'hive';
mysql> GRANT ALL PRIVILEGES ON *.* TO 'hive'@'%' WITH GRANT OPTION;
mysql> flush privileges;
Managing the metastore over JDBC also requires the JDBC driver; the download link is given above:
mv mysql-connector-java-5.1.39/mysql-connector-java-5.1.39-bin.jar /usr/local/hadoop/hive/lib/
Back up the hive-site.xml above and rewrite it as follows:
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?><!--
Licensed to the Apache Software Foundation (ASF) under one or more
contributor license agreements. See the NOTICE file distributed with
this work for additional information regarding copyright ownership.
The ASF licenses this file to You under the Apache License, Version 2.0
(the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<configuration>
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://120.27.7.76/hive?createDatabaseIfNotExist=true</value>
</property>
<property>
<name>javax.jdo.option.ConnectionDriverName</name>
<value>com.mysql.jdbc.Driver</value>
</property>
<property>
<name>javax.jdo.option.ConnectionUserName</name>
<value>hive</value>
</property>
<property>
<name>javax.jdo.option.ConnectionPassword</name>
<value>hive</value>
</property>
</configuration>
root@iZ254fu6ocuZ:/usr/local/hadoop/hive/conf#
root@iZ254fu6ocuZ:~# hive
hive>