Fixing Spark's "No suitable driver found" (JDBC jar not found) error

Today, while using Spark's DataFrame API to write an RDD into MySQL, the job kept failing with the following exception:

[itelbog@iteblog ~]$ bin/spark-submit --master local[2] \
	--jars lib/mysql-connector-java-5.1.35.jar \
	--class spark.SparkToJDBC ./spark-test_2.10-1.0.jar

Spark assembly has been built with Hive, including Datanucleus jars on classpath
Exception in thread "main" java.sql.SQLException: No suitable driver found for 
jdbc:mysql://www.iteblog.com:3306/spark?user=root&password=123&useUnicode=
true&characterEncoding=utf8&autoReconnect=true
	at java.sql.DriverManager.getConnection(DriverManager.java:602)
	at java.sql.DriverManager.getConnection(DriverManager.java:207)
	at org.apache.spark.sql.DataFrame.createJDBCTable(DataFrame.scala:1189)
	at spark.SparkToJDBC$.toMysqlFromJavaBean(SparkToJDBC.scala:20)
	at spark.SparkToJDBC$.main(SparkToJDBC.scala:47)
	at spark.SparkToJDBC.main(SparkToJDBC.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$
$runMain(SparkSubmit.scala:569)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
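
For reference, the write path that hits this exception looks roughly like the sketch below (the object, sample data, and table name are illustrative; per the stack trace, the real job built its DataFrame from a Java bean). createJDBCTable is the Spark 1.3-era DataFrame method shown in the trace, and it calls java.sql.DriverManager.getConnection inside the driver JVM:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object SparkToJDBC {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("SparkToJDBC"))
    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._

    // Illustrative data; the real job converted an RDD of a Java bean into a DataFrame.
    val df = sc.parallelize(Seq((1, "iteblog"), (2, "spark"))).toDF("id", "name")

    val url = "jdbc:mysql://www.iteblog.com:3306/spark?user=root&password=123" +
      "&useUnicode=true&characterEncoding=utf8&autoReconnect=true"

    // Spark 1.3 API from the stack trace: creates the table and inserts the rows.
    // DriverManager.getConnection runs in the driver JVM, so the MySQL connector
    // must be visible on the driver's classpath.
    df.createJDBCTable(url, "iteblog_test", false)

    sc.stop()
  }
}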

This seemed strange: I had already passed the MySQL driver when submitting the job, so why the exception? After some digging, it turned out that adding the MySQL connector via --jars does not help here; jars passed with --jars are loaded through Spark's own classloader, while java.sql.DriverManager only recognizes drivers visible to the system classloader, so the driver JVM still cannot find the MySQL driver class. Instead, the connector has to be placed on the driver's classpath with the --driver-class-path option at submit time. After adding it, the error was gone:

[itelbog@iteblog ~]$ bin/spark-submit --master local[2] \
	--driver-class-path lib/mysql-connector-java-5.1.35.jar \
	--class spark.SparkToJDBC ./spark-test_2.10-1.0.jar

Alternatively, we can put the jar on the driver's classpath by setting SPARK_CLASSPATH in conf/spark-env.sh under the Spark installation directory, like this:

export SPARK_CLASSPATH=$SPARK_CLASSPATH:/iteblog/com/mysql-connector-java-5.1.35.jar

This also resolves the exception above. However, we must not configure SPARK_CLASSPATH in conf/spark-env.sh and pass --driver-class-path at submit time at the same time; otherwise the job fails with the following exception:

[itelbog@iteblog ~]$ bin/spark-submit --master local[2] \
	--driver-class-path lib/mysql-connector-java-5.1.35.jar \
	--class spark.SparkToJDBC ./spark-test_2.10-1.0.jar

Spark assembly has been built with Hive, including Datanucleus jars on classpath
Exception in thread "main" org.apache.spark.SparkException: 
	Found both spark.driver.extraClassPath and SPARK_CLASSPATH. Use only the former.
	at org.apache.spark.SparkConf$$anonfun$validateSettings$6$$anonfun$apply
$7.apply(SparkConf.scala:339)
	at org.apache.spark.SparkConf$$anonfun$validateSettings$6$$anonfun$apply
$7.apply(SparkConf.scala:337)
	at scala.collection.immutable.List.foreach(List.scala:318)
	at org.apache.spark.SparkConf$$anonfun$validateSettings$6.apply(SparkConf.scala:337)
	at org.apache.spark.SparkConf$$anonfun$validateSettings$6.apply(SparkConf.scala:325)
	at scala.Option.foreach(Option.scala:236)
	at org.apache.spark.SparkConf.validateSettings(SparkConf.scala:325)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:197)
	at spark.SparkToJDBC$.main(SparkToJDBC.scala:41)
	at spark.SparkToJDBC.main(SparkToJDBC.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$
		deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
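
As the error message itself notes, --driver-class-path is simply the command-line form of the spark.driver.extraClassPath property. If you prefer not to repeat the flag on every submit, an equivalent option is to set the property once in conf/spark-defaults.conf (the jar path below is the same illustrative location used earlier):

# conf/spark-defaults.conf
spark.driver.extraClassPath    /iteblog/com/mysql-connector-java-5.1.35.jar

Like --driver-class-path, this setting conflicts with SPARK_CLASSPATH, so pick only one of these ways to extend the driver's classpath.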

Reposted from 过往记忆 (http://www.iteblog.com/).

Original article: http://www.iteblog.com/archives/1300
