The import in my Python code:

    from spark_learning.utils.default_utils import setDefaultEncoding, initSparkContext, ensureOffset

The spark-submit command:

    bin/spark-submit --jars /home/jabo/software/spark-1.5.2-bin-hadoop2.6/lib/spark-streaming-kafka-assembly_2.10-1.5.2.jar \
        /home/jabo/spark-by-python/spark_learning/third_day/streaming_kafka_avg.py \
        --py-files /home/jabo/spark-by-python/spark_learning/utils/default_utils.py

The official docs say: "For Python applications, simply pass a .py file in the place of <application-jar> instead of a JAR, and add Python .zip, .egg or .py files to the search path with --py-files."

But the job fails, saying the imported module cannot be found:

    Traceback (most recent call last):
      File "/home/jabo/spark-by-python/spark_learning/third_day/streaming_kafka_avg.py", line 10, in <module>
        import spark_learning.utils.default_utils
    ImportError: No module named spark_learning.utils.default_utils

How can this be fixed?
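For reference, the import path and the file locations above imply a package layout roughly like the following (the __init__.py files are an assumption here, but they must exist for the package-style import to work under Python 2):

    /home/jabo/spark-by-python/
        spark_learning/
            __init__.py
            utils/
                __init__.py
                default_utils.py
            third_day/
                streaming_kafka_avg.py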
2 Answers

烙印99
Try moving the --py-files option so that it comes before the script you are submitting. We just ran into the same problem, and that is how we solved it!
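In other words, spark-submit only treats flags placed before the application .py as its own options; anything after the script is passed along as application arguments, so in the original command --py-files was never seen by spark-submit at all. A reordered version of the original command, keeping the same paths, would look roughly like this:

    bin/spark-submit \
        --jars /home/jabo/software/spark-1.5.2-bin-hadoop2.6/lib/spark-streaming-kafka-assembly_2.10-1.5.2.jar \
        --py-files /home/jabo/spark-by-python/spark_learning/utils/default_utils.py \
        /home/jabo/spark-by-python/spark_learning/third_day/streaming_kafka_avg.py

If the package-style import (spark_learning.utils.default_utils) still fails after that, note that the docs quoted in the question also accept a .zip, so zipping the whole spark_learning directory and passing that zip to --py-files is another option.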