The previous post was a quick introduction to Spark's Hello World. This one records how to get the Spark Java source code, import it into Eclipse, and build it with Maven.
1. Check out the source code from GitHub
$git clone git://github.com/perwendel/spark.git
2. Generate the Eclipse project files
$cd spark
$mvn eclipse:eclipse -Dwtpversion=2.0
Now you can open Eclipse and import it as an existing project; Eclipse's Import Wizard should recognize spark as an Eclipse project.
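A quick way to confirm the previous step worked is to check for the Eclipse metadata files that `mvn eclipse:eclipse` generates. A minimal sketch, meant to be run from inside the spark/ directory:

```shell
# mvn eclipse:eclipse writes .project and .classpath into the project root;
# if they are missing, the Eclipse import wizard will not recognize the project.
check_eclipse_files() {
  dir="$1"
  for f in .project .classpath; do
    if [ -f "$dir/$f" ]; then echo "$f present"; else echo "$f missing"; fi
  done
}
check_eclipse_files .
```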
3. Run the Maven build
$mvn clean install
If this step fails with:
[ERROR] COMPILATION ERROR :
[INFO] -------------------------------------------------------------
[ERROR] Failure executing javac, but could not parse the error:
javac: invalid target release: 1.7
Usage: javac <options> <source files>
use -help for a list of possible options
[INFO] 1 error
[INFO] -------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1.283s
[INFO] Finished at: Mon Apr 15 17:12:45 EDT 2013
[INFO] Final Memory: 5M/81M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:2.3.2:compile (default-compile) on project spark-core: Compilation failure
[ERROR] Failure executing javac, but could not parse the error:
[ERROR] javac: invalid target release: 1.7
[ERROR] Usage: javac <options> <source files>
[ERROR] use -help for a list of possible options
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
The cause is that spark requires JDK 1.7; if you don't have it, you can download it from Oracle's website. Verify your version with:
$java -version
java version "1.7.0_17"
Java(TM) SE Runtime Environment (build 1.7.0_17-b02)
Java HotSpot(TM) 64-Bit Server VM (build 23.7-b01, mixed mode)
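If Maven still picks up an older JDK even after installing 1.7, pointing JAVA_HOME at the new install usually fixes it. A minimal sketch, where the install path is an assumption (adjust it to wherever your JDK 1.7 actually lives), plus a small helper to sanity-check a version string against the 1.7 minimum:

```shell
# The path below is an assumption -- substitute your actual JDK 1.7 location.
export JAVA_HOME=/usr/lib/jvm/java-7-oracle
export PATH="$JAVA_HOME/bin:$PATH"

# Helper: does a Java version string like "1.7.0_17" meet the 1.7 minimum?
meets_min_17() {
  major=$(echo "$1" | cut -d. -f1)
  minor=$(echo "$1" | cut -d. -f2)
  [ "$major" -gt 1 ] || { [ "$major" -eq 1 ] && [ "$minor" -ge 7 ]; }
}
meets_min_17 "1.7.0_17" && echo "JDK ok"
```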
After that, run the build again:
$mvn clean install
This time it should download the dependencies, then compile, test, and package, producing a spark jar.
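`mvn install` also copies the jar into your local Maven repository (under ~/.m2/repository), so other projects on the same machine can depend on the freshly built snapshot. A hedged sketch of the dependency block -- the groupId, artifactId, and version below are assumptions, so check spark's own pom.xml for the real coordinates:

```xml
<!-- Coordinates are assumptions; verify against spark's pom.xml. -->
<dependency>
    <groupId>spark</groupId>
    <artifactId>spark-core</artifactId>
    <version>0.9.9-SNAPSHOT</version>
</dependency>
```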