Failed to submit sparksql task #950
kl243818244 started this conversation in General
Your environment
Linkis version used: release-1.1.3-rc1
Environment name and version:
hdp-3.1.5
hive-3.1.0.3.1.4.0-315
hadoop-3.1.1.3.1.4.0-315
scala-2.11.12
jdk 1.8.0_121
spark-2.3.2.3.1.4.0-315
Describe your questions
I installed the Linkis platform and the management console standalone, and verified the basic functionality after the installation. Verification with the sh bin/linkis-cli -submitUser ... spark command fails, and the same error is also thrown when submitting with Hive.
(error screenshot attached: WeChat image 20220903121602)
Eureka service list
(screenshot attached)
Some logs info or attached files
stdout.log:
1b3/tmp/blockmgr-1098b279-2df6-4fc0-8c3a-5ecb57b2f5ab
2022-09-03 11:57:10.746 [INFO ] [main ] o.a.s.s.m.MemoryStore (54) [logInfo] - MemoryStore started with capacity 434.4 MB
2022-09-03 11:57:10.813 [INFO ] [main ] o.a.s.SparkEnv (54) [logInfo] - Registering OutputCommitCoordinator
2022-09-03 11:57:11.142 [INFO ] [main ] o.a.s.u.Utils (54) [logInfo] - Successfully started service 'SparkUI' on port 4040.
2022-09-03 11:57:11.333 [INFO ] [main ] o.a.s.u.SparkUI (54) [logInfo] - Bound SparkUI to 0.0.0.0, and started at http://:4040
2022-09-03 11:57:11.349 [INFO ] [main ] o.a.s.SparkContext (54) [logInfo] - Added JAR file:/appcom/tmp/hadoop/20220903/spark/ff2cebf5-72d7-4ef2-9254-d43ca498c1b3/lib/linkis-engineplugin-spark-1.1.3.jar at spark://:6476/jars/linkis-engineplugin-spark-1.1.3.jar with timestamp 1662177431347
2022-09-03 11:57:11.433 [WARN ] [main ] o.a.s.s.FairSchedulableBuilder (66) [logWarning] - Fair Scheduler configuration file not found so jobs will be scheduled in FIFO order. To use fair scheduling, configure pools in fairscheduler.xml or set spark.scheduler.allocation.file to a file that contains the configuration.
2022-09-03 11:57:11.439 [INFO ] [main ] o.a.s.s.FairSchedulableBuilder (54) [logInfo] - Created default pool: default, schedulingMode: FIFO, minShare: 0, weight: 1
2022-09-03 11:57:12.634 [INFO ] [main ] o.a.h.y.c.ConfiguredRMFailoverProxyProvider (100) [performFailover] - Failing over to rm2
2022-09-03 11:57:12.673 [INFO ] [main ] o.a.s.d.y.Client (54) [logInfo] - Requesting a new application from cluster with 6 NodeManagers
2022-09-03 11:57:12.755 [INFO ] [main ] o.a.h.c.Configuration (2757) [getConfResourceAsInputStream] - found resource resource-types.xml at file:/etc/hadoop/3.1.4.0-315/0/resource-types.xml
2022-09-03 11:57:12.784 [INFO ] [main ] o.a.s.d.y.Client (54) [logInfo] - Verifying our application has not requested more than the maximum memory capability of the cluster (73728 MB per container)
2022-09-03 11:57:12.786 [INFO ] [main ] o.a.s.d.y.Client (54) [logInfo] - Will allocate AM container, with 896 MB memory including 384 MB overhead
2022-09-03 11:57:12.787 [INFO ] [main ] o.a.s.d.y.Client (54) [logInfo] - Setting up container launch context for our AM
2022-09-03 11:57:12.795 [INFO ] [main ] o.a.s.d.y.Client (54) [logInfo] - Setting up the launch environment for our AM container
2022-09-03 11:57:12.806 [INFO ] [main ] o.a.s.d.y.Client (54) [logInfo] - Preparing resources for our AM container
2022-09-03 11:57:14.556 [INFO ] [main ] o.a.s.d.y.Client (54) [logInfo] - Use hdfs cache file as spark.yarn.archive for HDP, hdfsCacheFile:hdfs:///hdp/apps/3.1.4.0-315/spark2/spark2-hdp-yarn-archive.tar.gz
2022-09-03 11:57:14.563 [INFO ] [main ] o.a.s.d.y.Client (54) [logInfo] - Source and destination file systems are the same. Not copying hdfs:///hdp/apps/3.1.4.0-315/spark2/spark2-hdp-yarn-archive.tar.gz
2022-09-03 11:57:14.672 [INFO ] [main ] o.a.s.d.y.Client (54) [logInfo] - Distribute hdfs cache file as spark.sql.hive.metastore.jars for HDP, hdfsCacheFile:hdfs:///hdp/apps/3.1.4.0-315/spark2/spark2-hdp-hive-archive.tar.gz
2022-09-03 11:57:14.673 [INFO ] [main ] o.a.s.d.y.Client (54) [logInfo] - Source and destination file systems are the same. Not copying hdfs:///hdp/apps/3.1.4.0-315/spark2/spark2-hdp-hive-archive.tar.gz
2022-09-03 11:57:14.690 [INFO ] [main ] o.a.s.d.y.Client (54) [logInfo] - Source and destination file systems are the same. Not copying hdfs:///home/spark_conf/hive-site.xml
2022-09-03 11:57:14.713 [WARN ] [main ] o.a.s.d.y.Client (66) [logWarning] - Same path resource hdfs:///home/spark_conf/hive-site.xml added multiple times to distributed cache.
2022-09-03 11:57:14.723 [INFO ] [main ] o.a.s.d.y.Client (54) [logInfo] - Deleted staging directory hdfs:///user/hadoop/.sparkStaging/application_1661773015716_0041
2022-09-03 11:57:14.726 [ERROR] [main ] o.a.s.SparkContext (91) [logError] - Error initializing SparkContext. java.lang.IllegalArgumentException: Attempt to add (hdfs:///home/spark_conf/hive-site.xml) multiple times to the distributed cache.
at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$17$$anonfun$apply$6.apply(Client.scala:660) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$17$$anonfun$apply$6.apply(Client.scala:651) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at scala.collection.mutable.ArraySeq.foreach(ArraySeq.scala:74) ~[scala-library-2.11.12.jar:?]
at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$17.apply(Client.scala:651) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$17.apply(Client.scala:650) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at scala.collection.immutable.List.foreach(List.scala:392) ~[scala-library-2.11.12.jar:?]
at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:650) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:921) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:169) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:57) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:164) ~[spark-core_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at org.apache.spark.SparkContext.&lt;init&gt;(SparkContext.scala:500) ~[spark-core_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2498) ~[spark-core_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:934) ~[spark-sql_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:925) ~[spark-sql_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at scala.Option.getOrElse(Option.scala:121) ~[scala-library-2.11.12.jar:?]
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:925) ~[spark-sql_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at org.apache.linkis.engineplugin.spark.factory.SparkEngineConnFactory.createSparkSession(SparkEngineConnFactory.scala:117) ~[linkis-engineplugin-spark-1.1.3.jar:1.1.3]
:
2022-09-03 11:57:14.752 [INFO ] [main ] o.a.s.u.SparkUI (54) [logInfo] - Stopped Spark web UI at http://:4040
2022-09-03 11:57:14.773 [WARN ] [dispatcher-event-loop-9 ] o.a.s.s.c.YarnSchedulerBackend$YarnSchedulerEndpoint (66) [logWarning] - Attempted to request executors before the AM has registered!
2022-09-03 11:57:14.781 [INFO ] [main ] o.a.s.s.c.YarnClientSchedulerBackend (54) [logInfo] - Stopped
2022-09-03 11:57:14.797 [INFO ] [dispatcher-event-loop-11 ] o.a.s.MapOutputTrackerMasterEndpoint (54) [logInfo] - MapOutputTrackerMasterEndpoint stopped!
2022-09-03 11:57:14.810 [INFO ] [main ] o.a.s.s.m.MemoryStore (54) [logInfo] - MemoryStore cleared
2022-09-03 11:57:14.812 [INFO ] [main ] o.a.s.s.BlockManager (54) [logInfo] - BlockManager stopped
2022-09-03 11:57:14.832 [INFO ] [main ] o.a.s.s.BlockManagerMaster (54) [logInfo] - BlockManagerMaster stopped
2022-09-03 11:57:14.834 [WARN ] [main ] o.a.s.m.MetricsSystem (66) [logWarning] - Stopping a MetricsSystem that is not running
2022-09-03 11:57:14.850 [INFO ] [dispatcher-event-loop-16 ] o.a.s.s.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint (54) [logInfo] - OutputCommitCoordinator stopped!
2022-09-03 11:57:14.876 [INFO ] [main ] o.a.s.SparkContext (54) [logInfo] - Successfully stopped SparkContext
2022-09-03 11:57:14.877 [ERROR] [main ] o.a.l.e.l.EngineConnServer$ (58) [error] - EngineConnServer Start Failed. java.lang.IllegalArgumentException: Attempt to add (hdfs:///home/spark_conf/hive-site.xml) multiple times to the distributed cache.
at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$17$$anonfun$apply$6.apply(Client.scala:660) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$17$$anonfun$apply$6.apply(Client.scala:651) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at scala.collection.mutable.ArraySeq.foreach(ArraySeq.scala:74) ~[scala-library-2.11.12.jar:?]
at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$17.apply(Client.scala:651) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$17.apply(Client.scala:650) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at scala.collection.immutable.List.foreach(List.scala:392) ~[scala-library-2.11.12.jar:?]
at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:650) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:921) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:169) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:57) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:164) ~[spark-core_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at org.apache.spark.SparkContext.&lt;init&gt;(SparkContext.scala:500) ~[spark-core_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2498) ~[spark-core_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:934) ~[spark-sql_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:925) ~[spark-sql_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at scala.Option.getOrElse(Option.scala:121) ~[scala-library-2.11.12.jar:?]
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:925) ~[spark-sql_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at org.apache.linkis.engineplugin.spark.factory.SparkEngineConnFactory.createSparkSession(SparkEngineConnFactory.scala:117) ~[linkis-engineplugin-spark-1.1.3.jar:1.1.3]
at org.apache.linkis.engineplugin.spark.factory.SparkEngineConnFactory.createEngineConnSession(SparkEngineConnFactory.scala:74) ~[linkis-engineplugin-spark-1.1.3.jar:1.1.3]
at org.apache.linkis.manager.engineplugin.common.creation.AbstractEngineConnFactory$class.createEngineConn(EngineConnFactory.scala:48) ~[linkis-engineconn-plugin-core-1.1.3.jar:1.1.3]
at org.apache.linkis.engineplugin.spark.factory.SparkEngineConnFactory.createEngineConn(SparkEngineConnFactory.scala:42) ~[linkis-engineplugin-spark-1.1.3.jar:1.1.3]
at org.apache.linkis.engineconn.core.engineconn.DefaultEngineConnManager.createEngineConn(EngineConnManager.scala:45) ~[linkis-engineconn-core-1.1.3.jar:1.1.3]
at org.apache.linkis.engineconn.launch.EngineConnServer$.main(EngineConnServer.scala:64) ~[linkis-engineconn-core-1.1.3.jar:1.1.3]
at org.apache.linkis.engineconn.launch.EngineConnServer.main(EngineConnServer.scala) ~[linkis-engineconn-core-1.1.3.jar:1.1.3]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_202]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_202]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_202]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_202]
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52) ~[spark-core_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:904) ~[spark-core_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198) ~[spark-core_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228) ~[spark-core_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137) ~[spark-core_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) ~[spark-core_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
2022-09-03 11:57:14.891 [ERROR] [main ] o.a.l.e.c.s.EngineConnAfterStartCallback (46) [callback] - protocol will send to em: EngineConnStatusCallback(ServiceInstance(linkis-cg-engineconn, :8681),ff2cebf5-72d7-4ef2-9254-d43ca498c1b3,Failed,ServiceInstance(linkis-cg-engineconn, :8681): log dir: /appcom/tmp/hadoop/20220903/spark/ff2cebf5-72d7-4ef2-9254-d43ca498c1b3/logs,IllegalArgumentException: Attempt to add (hdfs:///home/spark_conf/hive-site.xml) multiple times to the distributed cache.)
2022-09-03 11:57:14.912 [ERROR] [main ] o.a.l.e.c.e.h.ComputationEngineConnHook (58) [error] - EngineConnSever start failed! now exit. java.lang.IllegalArgumentException: Attempt to add (hdfs:///home/spark_conf/hive-site.xml) multiple times to the distributed cache.
:
2022-09-03 11:57:14.917 [ERROR] [main ] o.a.l.e.c.h.ShutdownHook (58) [error] - process exit reason: java.lang.IllegalArgumentException: Attempt to add (hdfs://******/home/spark_conf/hive-site.xml) multiple times to the distributed cache.
at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$17$$anonfun$apply$6.apply(Client.scala:660) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$17$$anonfun$apply$6.apply(Client.scala:651) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at scala.collection.mutable.ArraySeq.foreach(ArraySeq.scala:74) ~[scala-library-2.11.12.jar:?]
at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$17.apply(Client.scala:651) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$17.apply(Client.scala:650) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at scala.collection.immutable.List.foreach(List.scala:392) ~[scala-library-2.11.12.jar:?]
at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:650) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:921) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:169) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:57) ~[spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:164) ~[spark-core_2.11-2.3.2.3.1.4.0-315.jar:2.3.2.3.1.4.0-315]
log file:
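For context on the failure itself: the WARN at 11:57:14.713 ("Same path resource ... added multiple times to distributed cache.") and the IllegalArgumentException that kills the SparkContext come from the same duplicate check in Spark's YARN client, which the stack trace places in org.apache.spark.deploy.yarn.Client.prepareLocalResources (Client.scala:650-660). Below is a minimal, runnable sketch of that check, with names simplified from the Spark 2.3.x source; the HDP 3.1.4 build is patched, so the real code differs in detail:

```scala
import java.net.URI
import scala.collection.mutable

// Minimal sketch of the duplicate check in
// org.apache.spark.deploy.yarn.Client.prepareLocalResources (Spark 2.3.x).
// Simplified and renamed; the HDP build is patched, so details differ.
object DistCacheCheckSketch {

  private val distributedUris = mutable.HashSet.empty[String]

  // Mirrors Client.addDistributedUri: a repeated URI only warns and returns false.
  private def addDistributedUri(uri: URI): Boolean = {
    val uriStr = uri.toString
    if (distributedUris.contains(uriStr)) {
      println(s"WARN: Same path resource $uri added multiple times to distributed cache.")
      false
    } else {
      distributedUris += uriStr
      true
    }
  }

  // Mirrors Client.distribute(): the localized path is null for a duplicate URI.
  private def distribute(path: String): String =
    if (addDistributedUri(new URI(path))) path else null

  // Files from --files / spark.yarn.dist.files pass through a loop like this;
  // a duplicate that is not being added to the classpath becomes a hard error.
  def register(files: Seq[String]): Unit = files.foreach { file =>
    val localizedPath = distribute(file)
    if (localizedPath == null) {
      throw new IllegalArgumentException(
        s"Attempt to add ($file) multiple times to the distributed cache.")
    }
  }

  def main(args: Array[String]): Unit = {
    // hive-site.xml arriving twice reproduces the WARN and then the exception
    // seen at 11:57:14.713 and 11:57:14.726 in stdout.log above.
    register(Seq(
      "hdfs:///home/spark_conf/hive-site.xml",
      "hdfs:///home/spark_conf/hive-site.xml"))
  }
}
```

In other words, hdfs:///home/spark_conf/hive-site.xml reaches the YARN client twice, once via the HDP-specific hive-archive handling visible in the log and once more via the job's file list, and the second registration is fatal. Checking whether the Linkis Spark engine configuration also passes hive-site.xml through --files / spark.yarn.dist.files would be a reasonable first step.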