Setting up a single-node Hadoop development environment with Cygwin + Eclipse on Win7 (64-bit):
(1) Installing Cygwin (64-bit) + configuring ssh
(2) Installing Hadoop
(3) Configuring Hadoop in Eclipse
(4) Importing the Hadoop source + the WordCount program + running it
=========================================================================================
Hadoop is an open-source project, so naturally we want to write our own programs while also studying the Hadoop source code. (Alternatively, you could create a MapReduce project directly in MyEclipse, but that is outside the scope of this post.) Before starting, make sure you have already run start-all.sh for Hadoop in the Cygwin terminal.
1. Create a new Java project named "MyHadoop" in MyEclipse.
2. Select the new project → "Build Path" → "Configure Build Path" → select the "Libraries" tab → "Add External JARs".
(Download Ant from the web yourself if you don't have it.)
Add all of these jars to the project's build path.
3. Switch to the "Source" tab and add Hadoop's main source folders (for example src/core, src/mapred, and src/hdfs).
4. Copy C:\cygwin64\hadoop\src\examples\org\apache\hadoop\examples\WordCount.java into the project and tweak the source until it compiles without errors.
5. Prepare the input files on HDFS (details omitted; use `hadoop fs -put`).
6. Configure WordCount's run parameters (the program arguments are the input path and the output path).
7. Run it. The console output:
14/09/12 13:25:21 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
14/09/12 13:25:22 WARN mapred.JobClient: No job jar file set. User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
14/09/12 13:25:22 INFO input.FileInputFormat: Total input paths to process : 1
14/09/12 13:25:22 INFO mapred.JobClient: Running job: job_local_0001
14/09/12 13:25:22 INFO input.FileInputFormat: Total input paths to process : 1
14/09/12 13:25:22 INFO mapred.MapTask: io.sort.mb = 100
14/09/12 13:25:22 INFO mapred.MapTask: data buffer = 79691776/99614720
14/09/12 13:25:22 INFO mapred.MapTask: record buffer = 262144/327680
14/09/12 13:25:22 INFO mapred.MapTask: Starting flush of map output
14/09/12 13:25:23 INFO mapred.MapTask: Finished spill 0
14/09/12 13:25:23 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
14/09/12 13:25:23 INFO mapred.LocalJobRunner:
14/09/12 13:25:23 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
14/09/12 13:25:23 INFO mapred.LocalJobRunner:
14/09/12 13:25:23 INFO mapred.Merger: Merging 1 sorted segments
14/09/12 13:25:23 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 39 bytes
14/09/12 13:25:23 INFO mapred.LocalJobRunner:
14/09/12 13:25:23 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
14/09/12 13:25:23 INFO mapred.LocalJobRunner:
14/09/12 13:25:23 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
14/09/12 13:25:23 INFO mapred.JobClient: map 100% reduce 0%
14/09/12 13:25:23 INFO output.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to hdfs://localhost:9000/user/pc-20130929egdl/administrator/output
14/09/12 13:25:23 INFO mapred.LocalJobRunner: reduce > reduce
14/09/12 13:25:23 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
14/09/12 13:25:24 INFO mapred.JobClient: map 100% reduce 100%
14/09/12 13:25:24 INFO mapred.JobClient: Job complete: job_local_0001
14/09/12 13:25:24 INFO mapred.JobClient: Counters: 14
14/09/12 13:25:24 INFO mapred.JobClient: FileSystemCounters
14/09/12 13:25:24 INFO mapred.JobClient: FILE_BYTES_READ=34009
14/09/12 13:25:24 INFO mapred.JobClient: HDFS_BYTES_READ=50
14/09/12 13:25:24 INFO mapred.JobClient: FILE_BYTES_WRITTEN=68754
14/09/12 13:25:24 INFO mapred.JobClient: HDFS_BYTES_WRITTEN=25
14/09/12 13:25:24 INFO mapred.JobClient: Map-Reduce Framework
14/09/12 13:25:24 INFO mapred.JobClient: Reduce input groups=3
14/09/12 13:25:24 INFO mapred.JobClient: Combine output records=3
14/09/12 13:25:24 INFO mapred.JobClient: Map input records=2
14/09/12 13:25:24 INFO mapred.JobClient: Reduce shuffle bytes=0
14/09/12 13:25:24 INFO mapred.JobClient: Reduce output records=3
14/09/12 13:25:24 INFO mapred.JobClient: Spilled Records=6
14/09/12 13:25:24 INFO mapred.JobClient: Map output bytes=41
14/09/12 13:25:24 INFO mapred.JobClient: Combine input records=4
14/09/12 13:25:24 INFO mapred.JobClient: Map output records=4
14/09/12 13:25:24 INFO mapred.JobClient: Reduce input records=3
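The framework counters above line up with plain word-count arithmetic: 2 map input records (lines), 4 map output records (words), the combiner collapsing 4 input records into 3, and 3 reduce output records (distinct words). As a dependency-free sketch of that same map/combine/reduce flow (the two input lines here are made up for illustration, not the actual HDFS input):

```java
import java.util.*;

public class WordCountSketch {
    // "map" phase: split each line into (word, 1) pairs
    static List<Map.Entry<String, Integer>> map(List<String> lines) {
        List<Map.Entry<String, Integer>> out = new ArrayList<>();
        for (String line : lines)
            for (String word : line.trim().split("\\s+"))
                out.add(Map.entry(word, 1));
        return out;
    }

    // "combine"/"reduce" phase: sum the counts for each word
    static Map<String, Integer> reduce(List<Map.Entry<String, Integer>> pairs) {
        Map<String, Integer> counts = new TreeMap<>();
        for (Map.Entry<String, Integer> p : pairs)
            counts.merge(p.getKey(), p.getValue(), Integer::sum);
        return counts;
    }

    public static void main(String[] args) {
        // hypothetical input: 2 lines, 4 words, 3 distinct words,
        // mirroring the Map/Combine/Reduce record counters in the log
        List<String> lines = List.of("hello hadoop", "hello world");
        List<Map.Entry<String, Integer>> mapped = map(lines);
        System.out.println("Map output records=" + mapped.size());
        Map<String, Integer> counts = reduce(mapped);
        System.out.println("Reduce output records=" + counts.size());
        System.out.println(counts);
    }
}
```

The real WordCount does the same thing, except the pairs are shuffled through HDFS-backed spill files and the Mapper/Combiner/Reducer run as separate tasks, which is exactly what the LocalJobRunner lines in the log show.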
=========================================================================================
END. Hooray!