CDH: changing the Oozie database type to MySQL
1. Copy the MySQL JDBC driver into the /var/lib/oozie directory
2. Create the oozie database
3. Initialize the database
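The three steps above can be sketched as the commands below. This is a minimal sketch, not a definitive procedure: the driver jar version, MySQL credentials, and the `ooziedb.sh` path are assumptions and will differ by CDH release (Cloudera Manager can also run the schema initialization for you).

```
# 1. Make the MySQL JDBC driver visible to Oozie (jar version is an assumption)
cp mysql-connector-java-5.1.46.jar /var/lib/oozie/

# 2. Create the Oozie database and grant access (user/password are placeholders)
mysql -uroot -p -e "CREATE DATABASE oozie DEFAULT CHARACTER SET utf8;
GRANT ALL PRIVILEGES ON oozie.* TO 'oozie'@'%' IDENTIFIED BY 'oozie';
FLUSH PRIVILEGES;"

# 3. Initialize the Oozie schema (path assumes a package install)
/usr/lib/oozie/bin/ooziedb.sh create -run
```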
Problems encountered when configuring MapReduce in workflow.xml:
Problem 1:
java.lang.RuntimeException: java.lang.ClassNotFoundException: Class com.rytong.mdap.analytics.compute.basic.UserCounterJob.CounterMapper not found
Cause: the mapper was configured with a dot before the inner-class name:
<name>mapreduce.map.class</name>
<value>com.rytong.mdap.analytics.compute.basic.UserCounterJob.CounterMapper</value>
Solution: CounterMapper is an inner class, so the last dot must be a `$`, i.e. UserCounterJob$CounterMapper.
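The `$` is required because the JVM's binary name for a nested class joins the outer and inner names with `$`, and that binary name is what class loading by string (as Hadoop does with these properties) expects. A minimal sketch, with hypothetical stand-in class names:

```java
public class NestedNameDemo {
    // Hypothetical stand-in for UserCounterJob's nested CounterMapper
    static class CounterMapper {}

    public static void main(String[] args) throws Exception {
        // getName() returns the binary name, with '$' before the nested class
        System.out.println(CounterMapper.class.getName());
        // prints NestedNameDemo$CounterMapper

        // Loading "NestedNameDemo.CounterMapper" would throw
        // ClassNotFoundException; the '$' form resolves correctly:
        Class<?> c = Class.forName("NestedNameDemo$CounterMapper");
        System.out.println(c == CounterMapper.class);
        // prints true
    }
}
```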
Problem 2:
java.lang.NoClassDefFoundError: com/maxmind/geoip2/exception/AddressNotFoundException
Cause:
a third-party jar is missing
Solution:
put the jar into the lib directory alongside workflow.xml
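For example, assuming the workflow lives at /user/root/examples/apps/usercounter on HDFS (the application path and the GeoIP2 jar name below are assumptions):

```
# Oozie automatically adds jars under <app-path>/lib to the action classpath
hdfs dfs -mkdir -p /user/root/examples/apps/usercounter/lib
hdfs dfs -put geoip2-2.12.0.jar /user/root/examples/apps/usercounter/lib/
```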
Problem 3:
Fixing ClassNotFoundException in Oozie
Solution: just add oozie.use.system.libpath=true to job.properties
Reference: http://jyd.me/nosql/oozie-classnotfoundexception-solution/
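A minimal job.properties fragment showing the setting in context (the nameNode/jobTracker values are placeholders):

```
# Let actions resolve jars from the Oozie sharelib on HDFS
oozie.use.system.libpath=true
# Hypothetical surrounding values for context
nameNode=hdfs://namenode:8020
jobTracker=resourcemanager:8032
```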
Problem 4:
When executing a shell action: [org.apache.oozie.action.hadoop.ShellMain], main() threw exception, Cannot run program
Cause: the job runs on the cluster, so the script must be shipped to every node.
Solution: add a <file> element:
/user/root/examples/apps/hive/create_mysql_table.sh
/user/root/examples/apps/hive/create_mysql_table.sh#create_mysql_table.sh
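A sketch of a shell action using that <file> element (the action name is an assumption; the part after # is the local symlink name the script is given in each task's working directory):

```xml
<action name="shell-node">
    <shell xmlns="uri:oozie:shell-action:0.2">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <exec>create_mysql_table.sh</exec>
        <!-- ship the script from HDFS to every task's working directory -->
        <file>/user/root/examples/apps/hive/create_mysql_table.sh#create_mysql_table.sh</file>
    </shell>
    <ok to="end"/>
    <error to="fail"/>
</action>
```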
**workflow.xml for running MapReduce**
```xml
<workflow-app xmlns="uri:oozie:workflow:0.2" name="usercounter-job-wf">
    <start to="mr-node"/>
    <action name="mr-node">
        <map-reduce>
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <prepare>
                <!-- remove any previous output so the job can be rerun -->
                <delete path="${nameNode}/zqc/264/tmp/${outputDir}"/>
            </prepare>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
                <!-- use the new MapReduce API so the mapreduce.*.class
                     settings below are honored -->
                <property>
                    <name>mapred.mapper.new-api</name>
                    <value>true</value>
                </property>
                <property>
                    <name>mapred.reducer.new-api</name>
                    <value>true</value>
                </property>
                <property>
                    <name>mapred.job.name</name>
                    <value>UserCounterJob</value>
                </property>
                <property>
                    <name>mapreduce.inputformat.class</name>
                    <value>org.apache.hadoop.mapreduce.lib.input.SequenceFileInputFormat</value>
                </property>
                <!-- inner classes use $ in their binary names (see Problem 1) -->
                <property>
                    <name>mapreduce.map.class</name>
                    <value>com.rytong.mdap.analytics.compute.basic.UserCounterJob$CounterMapper</value>
                </property>
                <property>
                    <name>mapreduce.reduce.class</name>
                    <value>com.rytong.mdap.analytics.compute.basic.UserCounterJob$CounterReducer</value>
                </property>
                <property>
                    <name>mapred.mapoutput.key.class</name>
                    <value>org.apache.hadoop.io.Text</value>
                </property>
                <property>
                    <name>mapred.mapoutput.value.class</name>
                    <value>com.rytong.mdap.analytics.source.Message</value>
                </property>
                <property>
                    <name>mapred.output.key.class</name>
                    <value>org.apache.hadoop.hbase.io.ImmutableBytesWritable</value>
                </property>
                <property>
                    <name>mapred.output.value.class</name>
                    <value>org.apache.hadoop.io.LongWritable</value>
                </property>
                <property>
                    <name>mapreduce.outputformat.class</name>
                    <value>org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat</value>
                </property>
                <property>
                    <name>mapred.reduce.tasks</name>
                    <value>2</value>
                </property>
                <property>
                    <name>mapred.map.tasks</name>
                    <value>2</value>
                </property>
                <property>
                    <name>mapred.input.dir</name>
                    <value>/zqc/264/clean/session/20180415</value>
                </property>
                <property>
                    <name>mapred.output.dir</name>
                    <value>/zqc/264/tmp/${outputDir}</value>
                </property>
            </configuration>
            <file>/user/root/examples/apps/java-main/config</file>
        </map-reduce>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Map/Reduce failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>
```