Analyzing a Spark job submission failure

Copyright notice: this is an original post by the author; do not reproduce without permission. http://www.lxweimin.com/p/a4bf0f7173f5

Submitting a Spark job failed with:

ERROR cluster.YarnClientSchedulerBackend: The YARN application has already ended! It might have been killed or the Application Master may have failed to start.

ERROR spark.SparkContext: Error initializing SparkContext.

To narrow it down, I then submitted a plain YARN job:

hadoop jar /opt/cloudera/parcels/CDH-6.1-/jars/hadoop-examples.jar pi 10 1

It failed with:

Application application_1560473958049_0005 failed 2 times due to AM Container for appattempt_1560473958049_0005_000002 exited with exitCode: -1000

Diagnostics: Not able to initialize app directories in any of the configured local directories for app application_1560473958049_0005.

This shows the failure happens while YARN is initializing the application's directories, and it is in fact a permissions problem. Before Kerberos was enabled, the NodeManager local directories were owned by yarn:yarn; after enabling it, the per-user cache directories ended up owned by wangkuan:yarn, so the ownership no longer matched what the NodeManager expected. The fix: look up the paths configured in yarn.nodemanager.local-dirs, run rm -rf on <local-dir>/usercache/* under each of them, then restart YARN.
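The cleanup above can be sketched as a small shell script. The path assigned to LOCAL_DIRS here is an assumption for illustration: on a real node you would substitute the comma-separated value of yarn.nodemanager.local-dirs from yarn-site.xml and run this on every NodeManager host.

```shell
# Minimal sketch of the usercache cleanup, pointed at a throwaway directory
# so it is safe to run as-is. On a real node, set LOCAL_DIRS to the actual
# yarn.nodemanager.local-dirs value instead (an assumption, not from the post).
demo=$(mktemp -d)
mkdir -p "$demo/usercache/wangkuan/appcache"   # simulates a stale per-user cache
LOCAL_DIRS="$demo"

# Split the comma-separated directory list and wipe each usercache directory.
# YARN recreates these with the correct ownership on the next container launch.
IFS=','
for d in $LOCAL_DIRS; do
  rm -rf "$d"/usercache/*
done
unset IFS
```

After running this on all NodeManager hosts, restart the YARN service (e.g. from Cloudera Manager on a CDH cluster) and resubmit the job.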
