hdp.version is not found when running Spark under HDP?

Submitting a Spark program to the YARN cluster on HDP 3.0.1 fails with the following error:

    08-12 17:53:55,006 [icuService-akka.actor.default-dispatcher-21] ERROR c.s.i.s.server.job.SparkJobSubmitter.apply$mcV$sp(Slf4jLogger.scala:66) - error occurred while submit job
    org.apache.spark.SparkException: 
    hdp.version is not found,
    Please set HDP_VERSION=xxx in spark-env.sh,
    or set -Dhdp.version=xxx in spark.{driver|yarn.am}.extraJavaOptions
    or set SPARK_JAVA_OPTS="-Dhdp.verion=xxx" in spark-env.sh
    If you're running Spark under HDP.
            
            at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:986)
            at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:169)
            at com.szl.icu.spark.server.submit.SparkClient.run(SparkClient.java:14)
            at com.szl.icu.spark.server.client.InsecureSparkYarnSubmitClient.submit(SparkSubmitClient.scala:87)
            at com.szl.icu.spark.server.job.SparkJobSubmitter.submitToYarn(SparkJobSubmitter.scala:170)
            at com.szl.icu.spark.server.job.SparkJobSubmitter.submitAndReport(SparkJobSubmitter.scala:95)
            at com.szl.icu.spark.server.job.SparkJobSubmitter.com$szl$icu$spark$server$job$SparkJobSubmitter$$submitByAuthMethod(SparkJobSubmitter.scala:81)
            at com.szl.icu.spark.server.job.SparkJobSubmitter$$anonfun$receive$1.applyOrElse(SparkJobSubmitter.scala:57)
            at akka.actor.Actor$class.aroundReceive(Actor.scala:467)
            at com.szl.icu.spark.server.job.SparkJobSubmitter.aroundReceive(SparkJobSubmitter.scala:28)
            at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
            at akka.actor.ActorCell.invoke(ActorCell.scala:487)
            at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
            at akka.dispatch.Mailbox.run(Mailbox.scala:220)
            at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
            at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
            at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
            at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
            at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
    2019-08-12 17:53:55,007 [icuService-akka.actor.default-dispatcher-21] DEBUG com.szl.icu.service.proxy.MsgBox.apply$mcV$sp(Slf4jLogger.scala:77) - received job status JobFailed(fbc5b8f8-bee4-4cb5-961e-a1f8456e694d,fbc5b8f8-bee4-4cb5-961e-a1f8456e694d-0,1565603635005,0,
    hdp.version is not found,
    Please set HDP_VERSION=xxx in spark-env.sh,
    or set -Dhdp.version=xxx in spark.{driver|yarn.am}.extraJavaOptions
    or set SPARK_JAVA_OPTS="-Dhdp.verion=xxx" in spark-env.sh
    If you're running Spark under HDP.
            )

I searched around online, and the advice invariably boils down to configuring spark-env.sh:

export HDP_VERSION=3.0.1.0-187
spark.driver.extraJavaOptions -Dhdp.version=3.0.1.0-187
spark.yarn.am.extraJavaOptions -Dhdp.version=3.0.1.0-187
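
Note that only the export line above is valid spark-env.sh syntax; the two spark.*.extraJavaOptions lines are Spark properties and only take effect in spark-defaults.conf. Split across the two files, the intended setup would look roughly like this (a sketch; the version string must match your cluster's HDP build):

    # spark-env.sh — shell syntax, exported into the submitter's environment
    export HDP_VERSION=3.0.1.0-187

    # spark-defaults.conf — property syntax, one key/value pair per line
    spark.driver.extraJavaOptions   -Dhdp.version=3.0.1.0-187
    spark.yarn.am.extraJavaOptions  -Dhdp.version=3.0.1.0-187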

and to configuring Custom spark2-defaults in Ambari:

HDP_VERSION=3.0.1.0-187
spark.driver.extraJavaOptions -Dhdp.version=3.0.1.0-187
spark.yarn.am.extraJavaOptions -Dhdp.version=3.0.1.0-187
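
The same two properties can also be passed per submission on the spark-submit command line, which is a quick way to check whether they reach the YARN client at all (a sketch; com.example.Main and app.jar are placeholders for your own class and jar):

    # one-off check that -Dhdp.version actually reaches the YARN client
    spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --conf spark.driver.extraJavaOptions=-Dhdp.version=3.0.1.0-187 \
      --conf spark.yarn.am.extraJavaOptions=-Dhdp.version=3.0.1.0-187 \
      --class com.example.Main \
      app.jar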

But none of it ever took effect. I also came across the post "Submitting a Spark program from IDEA fails in yarn-client mode (environment set up with Ambari)", but it contains no solution either.
My environment info is shown in the screenshots below:

[screenshots: CLUSTER_VERSION, the ERROR output, spark-env.sh, spark-defaults]

1 Answer

Hey, I got this sorted out on my end: after packaging, I added export HDP_VERSION=3.0.1.0-187 to the startup script (the .sh file).
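
For reference, a minimal sketch of that fix, assuming the service that calls org.apache.spark.deploy.yarn.Client is started by a shell script (the jar name below is a placeholder):

    #!/bin/bash
    # Export HDP_VERSION before the submitting JVM starts, so the YARN
    # client can resolve ${hdp.version} when building the container launch
    # context (the step that fails in Client.createContainerLaunchContext).
    export HDP_VERSION=3.0.1.0-187

    # Placeholder for the actual service start command:
    java -jar icu-spark-server.jar "$@"

This likely also explains why the spark-env.sh settings never took effect here: the job is submitted programmatically (SparkClient.java calls Client.submitApplication directly), so spark-env.sh is never sourced, and the variable has to be set in the environment of whatever process actually runs the YARN client.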
