RuntimeException after running SocialGraphJob #4

Open
mh-github opened this issue Jun 27, 2019 · 0 comments
spark-graphx $ sbt "runMain com.github.graphx.pregel.jobs.social.SocialGraphJob"
[info] Loading settings from plugins.sbt ...
[info] Loading project definition from /home/mahboob/Code/latest/spark/introduction-to-spark/spark-graphx/project
[info] Loading settings from build.sbt ...
[info] Set current project to spark-graphx (in build file:/home/mahboob/Code/latest/spark/introduction-to-spark/spark-graphx/)
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[info] Running com.github.graphx.pregel.jobs.social.SocialGraphJob
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/tmp/sbt_1eaa94af/target/dc366f22/hadoop-auth-2.6.5.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Top 10 most-connected users:
[error] (run-main-0) java.lang.IllegalArgumentException
[error] java.lang.IllegalArgumentException
[error] at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
[error] at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
[error] at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
[error] at org.apache.spark.util.ClosureCleaner$.getClassReader(ClosureCleaner.scala:46)
[error] at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:443)
[error] at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:426)
[error] at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
[error] at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:103)
[error] at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:103)
[error] at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
[error] at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
[error] at scala.collection.mutable.HashMap$$anon$1.foreach(HashMap.scala:103)
[error] at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
[error] at org.apache.spark.util.FieldAccessFinder$$anon$3.visitMethodInsn(ClosureCleaner.scala:426)
[error] at org.apache.xbean.asm5.ClassReader.a(Unknown Source)
[error] at org.apache.xbean.asm5.ClassReader.b(Unknown Source)
[error] at org.apache.xbean.asm5.ClassReader.accept(Unknown Source)
[error] at org.apache.xbean.asm5.ClassReader.accept(Unknown Source)
[error] at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:257)
[error] at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:256)
[error] at scala.collection.immutable.List.foreach(List.scala:381)
[error] at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:256)
[error] at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:156)
[error] at org.apache.spark.SparkContext.clean(SparkContext.scala:2294)
[error] at org.apache.spark.rdd.RDD$$anonfun$zipPartitions$1.apply(RDD.scala:877)
[error] at org.apache.spark.rdd.RDD$$anonfun$zipPartitions$1.apply(RDD.scala:877)
[error] at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
[error] at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
[error] at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)
[error] at org.apache.spark.rdd.RDD.zipPartitions(RDD.scala:876)
[error] at org.apache.spark.graphx.impl.VertexRDDImpl.aggregateUsingIndex(VertexRDDImpl.scala:215)
[error] at org.apache.spark.graphx.impl.GraphImpl.aggregateMessagesWithActiveSet(GraphImpl.scala:243)
[error] at org.apache.spark.graphx.Graph.aggregateMessages(Graph.scala:378)
[error] at org.apache.spark.graphx.GraphOps.degreesRDD(GraphOps.scala:75)
[error] at org.apache.spark.graphx.GraphOps.degrees$lzycompute(GraphOps.scala:62)
[error] at org.apache.spark.graphx.GraphOps.degrees(GraphOps.scala:61)
[error] at com.github.graphx.pregel.social.SocialGraph.getMostConnectedUsers(SocialGraph.scala:28)
[error] at com.github.graphx.pregel.jobs.social.SocialGraphJob$.main(SocialGraphJob.scala:16)
[error] at com.github.graphx.pregel.jobs.social.SocialGraphJob.main(SocialGraphJob.scala)
[error] at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[error] at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[error] at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[error] at java.base/java.lang.reflect.Method.invoke(Method.java:564)
[error] at sbt.Run.invokeMain(Run.scala:93)
[error] at sbt.Run.run0(Run.scala:87)
[error] at sbt.Run.execute$1(Run.scala:65)
[error] at sbt.Run.$anonfun$run$4(Run.scala:77)
[error] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:12)
[error] at sbt.util.InterfaceUtil$$anon$1.get(InterfaceUtil.scala:10)
[error] at sbt.TrapExit$App.run(TrapExit.scala:252)
[error] at java.base/java.lang.Thread.run(Thread.java:844)
19/06/27 14:31:51 ERROR ContextCleaner: Error in cleaning thread
java.lang.InterruptedException
at java.base/java.lang.Object.wait(Native Method)
at java.base/java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:151)
at org.apache.spark.ContextCleaner$$anonfun$org$apache$spark$ContextCleaner$$keepCleaning$1.apply$mcV$sp(ContextCleaner.scala:181)
at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1279)
at org.apache.spark.ContextCleaner.org$apache$spark$ContextCleaner$$keepCleaning(ContextCleaner.scala:178)
at org.apache.spark.ContextCleaner$$anon$1.run(ContextCleaner.scala:73)
19/06/27 14:31:51 ERROR Utils: uncaught error in thread SparkListenerBus, stopping SparkContext
java.lang.InterruptedException
at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1025)
at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1331)
at java.base/java.util.concurrent.Semaphore.acquire(Semaphore.java:318)
at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(LiveListenerBus.scala:80)
at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(LiveListenerBus.scala:79)
at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(LiveListenerBus.scala:79)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply$mcV$sp(LiveListenerBus.scala:78)
at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1279)
at org.apache.spark.scheduler.LiveListenerBus$$anon$1.run(LiveListenerBus.scala:77)
19/06/27 14:31:51 ERROR Utils: throw uncaught fatal error in thread SparkListenerBus
java.lang.InterruptedException
at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1025)
at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1331)
at java.base/java.util.concurrent.Semaphore.acquire(Semaphore.java:318)
at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(LiveListenerBus.scala:80)
at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(LiveListenerBus.scala:79)
at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(LiveListenerBus.scala:79)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply$mcV$sp(LiveListenerBus.scala:78)
at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1279)
at org.apache.spark.scheduler.LiveListenerBus$$anon$1.run(LiveListenerBus.scala:77)
[error] java.lang.RuntimeException: Nonzero exit code: 1
[error] at sbt.Run$.executeTrapExit(Run.scala:124)
[error] at sbt.Run.run(Run.scala:77)
[error] at sbt.Defaults$.$anonfun$bgRunMainTask$6(Defaults.scala:1146)
[error] at sbt.Defaults$.$anonfun$bgRunMainTask$6$adapted(Defaults.scala:1141)
[error] at sbt.internal.BackgroundThreadPool.$anonfun$run$1(DefaultBackgroundJobService.scala:366)
[error] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:12)
[error] at scala.util.Try$.apply(Try.scala:209)
[error] at sbt.internal.BackgroundThreadPool$BackgroundRunnable.run(DefaultBackgroundJobService.scala:289)
[error] at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1167)
[error] at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:641)
[error] at java.base/java.lang.Thread.run(Thread.java:844)
[error] (Compile / runMain) Nonzero exit code: 1
[error] Total time: 6 s, completed 27-Jun-2019, 2:31:51 PM
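Not confirmed by the issue itself, but the failure mode above (`IllegalArgumentException` thrown from `org.apache.xbean.asm5.ClassReader.<init>` inside Spark's `ClosureCleaner`) is the classic symptom of running Spark 2.x on a JDK newer than 8: the bundled ASM 5 cannot parse Java 9+ class files, and the `java.base/` frames plus `Thread.java:844` in this trace indicate a Java 10 runtime. Assuming that is the cause, a minimal sketch of a guard that could be added to `build.sbt` to fail fast with a clear message (the setting body is hypothetical, not part of this repository):

```scala
// build.sbt sketch: fail early when sbt is launched on a JDK that
// Spark 2.x's ASM 5 cannot handle, instead of dying in ClosureCleaner.
initialize := {
  val _ = initialize.value  // keep any previously defined initialization
  val spec = sys.props("java.specification.version")
  assert(
    spec == "1.8",
    s"Spark 2.x requires Java 8; found Java $spec. Re-run sbt with a Java 8 JDK."
  )
}
```

Alternatively, pointing sbt at a Java 8 installation (for example via `sbt -java-home /path/to/jdk8`) or upgrading to a Spark version with Java 9+ support would avoid the crash without a build change.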

@mh-github mh-github changed the title IllegalArgumentException after running SocialGraphJob RuntimeException after running SocialGraphJob Jun 27, 2019